Learning decision trees in continuous space

  • József Dombi
  • Á. Zsiros

Abstract

Two problems of the ID3 and C4.5 decision tree building methods will be discussed and solutions will be suggested for them. First, in both methods a Gain-type criterion, derived from the entropy function, is used to compare the applicability of possible tests. We propose a new measure in place of the entropy function, which comes from the measure of fuzziness using a monotone fuzzy operator. It is more natural and much simpler to compute in the case of concept learning (when elements belong to only two classes: positive and negative). Second, the well-known extension of the ID3 method for handling continuous attributes (C4.5) is based on the discretization of attribute values, and in it the decision space is separated with axis-parallel hyperplanes. In our proposed new method (CDT), continuous attributes are handled without discretization, and arbitrary geometric figures are used to separate the decision space, such as hyperplanes in general position, spheres and ellipsoids. The power of the new method is demonstrated on a few examples.
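
The sketch below is not the authors' implementation; it only illustrates, under stated assumptions, the two ideas the abstract names: a Gain-type criterion where the entropy function can be swapped for a simpler two-class impurity (here `fuzziness` is a stand-in, min(p, 1-p), not the paper's fuzzy-operator-based measure), and a split test that is a hyperplane in general position rather than an axis-parallel threshold.

```python
import math


def entropy(p_pos: float) -> float:
    """Shannon entropy of a two-class (positive/negative) distribution."""
    if p_pos in (0.0, 1.0):
        return 0.0
    p_neg = 1.0 - p_pos
    return -(p_pos * math.log2(p_pos) + p_neg * math.log2(p_neg))


def fuzziness(p_pos: float) -> float:
    """Placeholder impurity: maximal at p = 0.5, zero for pure nodes.

    Stand-in only; the paper derives its measure from a monotone fuzzy operator.
    """
    return min(p_pos, 1.0 - p_pos)


def gain(parent_pos, parent_total, children, impurity=entropy):
    """Gain-type criterion: parent impurity minus weighted impurity of children.

    `children` is a list of (n_pos, n_total) pairs produced by a candidate test.
    """
    parent_impurity = impurity(parent_pos / parent_total)
    weighted = sum(n_total / parent_total * impurity(n_pos / n_total)
                   for n_pos, n_total in children)
    return parent_impurity - weighted


def axis_parallel_test(x, attr_index, threshold):
    """C4.5-style test: compare one attribute against a threshold."""
    return x[attr_index] <= threshold


def hyperplane_test(x, weights, bias):
    """Test against a hyperplane in general position (spheres and
    ellipsoids would be handled analogously with quadratic forms)."""
    return sum(w * xi for w, xi in zip(weights, x)) + bias <= 0.0


# Example: split 10 positive / 10 negative examples into two branches.
children = [(8, 10), (2, 10)]
print(gain(10, 20, children, impurity=entropy))    # Gain with the entropy function
print(gain(10, 20, children, impurity=fuzziness))  # Gain with the stand-in measure
```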

Published
2001-01-01
How to Cite
Dombi, J., & Zsiros, Á. (2001). Learning decision trees in continuous space. Acta Cybernetica, 15(2), 213-224. Retrieved from https://cyber.bibl.u-szeged.hu/index.php/actcybern/article/view/3575
Section
Regular articles
