International Research journal of Management Science and Technology

ISSN 2250-1959 (Online) | ISSN 2348-9367 (Print) | DOI: 10.32804/IRJMST

Impact Factor* - 6.2311



COMPARISON OF INDUCTION METHODS FOR DECISION TREE

    1 Author(s):  VANDNA DAHIYA

Vol. 5, Issue 11, Page(s): 141-148 (2014). DOI: https://doi.org/10.32804/IRJMST

Abstract

Induction of decision trees is learning from examples. A decision tree is a flowchart-like structure in which the root node and every internal node represent an attribute with an associated question, and the arcs represent the possible outcomes of that question. Class-labeled tuples are used in this learning to build the decision tree, which is then applied to classify further data in data mining. The goal is to create a model that can predict the class of unseen tuples. Several methods have been developed for constructing decision trees. This paper compares two of these attribute-selection measures, Information Gain and Gain Ratio, along with their advantages and shortcomings.
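The two measures compared in the paper can be illustrated concretely. The sketch below is a minimal, self-contained implementation (not the author's code): Information Gain is the reduction in class entropy achieved by splitting on an attribute, and Gain Ratio normalizes that gain by the split information (the entropy of the attribute's own value distribution), which penalizes attributes with many distinct values. The function and variable names are illustrative choices, not from the source.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (in bits) of a list of class labels."""
    total = len(labels)
    return -sum((count / total) * math.log2(count / total)
                for count in Counter(labels).values())

def information_gain(rows, labels, attr):
    """Entropy reduction from partitioning the tuples on attribute index attr."""
    total = len(labels)
    partitions = {}
    for row, label in zip(rows, labels):
        partitions.setdefault(row[attr], []).append(label)
    # Weighted entropy of the partitions induced by the attribute's values.
    remainder = sum((len(part) / total) * entropy(part)
                    for part in partitions.values())
    return entropy(labels) - remainder

def gain_ratio(rows, labels, attr):
    """Information gain divided by the split information of the attribute."""
    values = [row[attr] for row in rows]
    split_info = entropy(values)  # entropy of the attribute-value distribution
    if split_info == 0:
        return 0.0  # attribute has a single value; no useful split
    return information_gain(rows, labels, attr) / split_info
```

For a toy dataset of four tuples where an "outlook" attribute perfectly separates the classes, both measures evaluate to 1.0; they diverge when an attribute has many values, where Gain Ratio's denominator grows and tempers the bias that plain Information Gain shows toward such attributes.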

