YaBeSH Engineering and Technology Library


    Wrapper Methods for Inductive Learning: Example Application to Bridge Decks

    Source: Journal of Computing in Civil Engineering, 2003, Volume 17, Issue 1
    Author: Hani G. Melhem, Yousheng Cheng, Deb Kossler, Dan Scherschligt
    DOI: 10.1061/(ASCE)0887-3801(2003)17:1(46)
    Publisher: American Society of Civil Engineers
    Abstract: The decision tree algorithm is one of the most common techniques of inductive learning. This paper investigates the use of wrapper methods for bagging, boosting, and feature selection to improve the prediction accuracy of the decision tree algorithm. A set of concrete bridge decks is extracted from the Kansas bridge database, and the deterioration of the health index is selected as the decision/class value for induction. From the conducted experiments, the decision tree accuracy obtained is 67.7%, whereas bagging and boosting gave 73.4% and 72.7%, respectively. Wrapping with a feature selection method gave an accuracy of 75.0%. If the feature selection method is applied first, bagging and boosting do not provide any further improvement to the decision tree algorithm. A series of tests was conducted in which the selected features were examined and manually eliminated from the data set. This revealed that the improvement obtained by the feature selection method can be misleading. For the problem at hand, the attributes selected were not the most important ones to the problem domain. Therefore, what may be an improvement from the machine learning or data mining viewpoint can turn out to be a mistake from an engineering perspective. Automatically selected attributes should be checked carefully. Feature selection is not recommended in this case.
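    The wrapper setups compared in the abstract can be sketched in a few lines. The snippet below is a minimal illustration, not the authors' code: it assumes scikit-learn, substitutes synthetic data for the Kansas bridge-deck records (not included here), and uses SequentialFeatureSelector as a stand-in for whatever wrapper feature-selection search the paper actually employed, so it will not reproduce the reported accuracies.

```python
# Hypothetical sketch of the compared setups (decision tree, bagging, boosting,
# and wrapper feature selection), assuming scikit-learn; not the authors' code.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.tree import DecisionTreeClassifier

# Placeholder for deck attributes (X) and the health-index deterioration class (y);
# the real study extracted these from the Kansas bridge database.
X, y = make_classification(n_samples=500, n_features=20, n_informative=6, random_state=0)

tree = DecisionTreeClassifier(random_state=0)
setups = {
    "decision tree": tree,
    # BaggingClassifier bags full decision trees by default; AdaBoostClassifier
    # boosts depth-1 trees (stumps) by default.
    "bagging": BaggingClassifier(n_estimators=50, random_state=0),
    "boosting": AdaBoostClassifier(n_estimators=50, random_state=0),
    # Wrapper feature selection: candidate attribute subsets are scored with the
    # tree itself, then a tree is trained on the selected attributes only.
    "feature selection + tree": make_pipeline(
        SequentialFeatureSelector(tree, n_features_to_select=6, cv=3),
        DecisionTreeClassifier(random_state=0),
    ),
}

for name, model in setups.items():
    accuracy = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name}: {accuracy:.3f}")
```

    Whatever the wrapper search selects, the abstract's caveat stands: the chosen attributes should still be reviewed for engineering relevance rather than accepted on accuracy alone.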
    • Download: (790.0 Kb)
    • Price: 5000 Rial


    URI: http://yetl.yabesh.ir/yetl1/handle/yetl/43118
    Collections
    • Journal of Computing in Civil Engineering

    Full item record

    contributor author: Hani G. Melhem
    contributor author: Yousheng Cheng
    contributor author: Deb Kossler
    contributor author: Dan Scherschligt
    date accessioned: 2017-05-08T21:13:00Z
    date available: 2017-05-08T21:13:00Z
    date copyright: January 2003
    date issued: 2003
    identifier other: %28asce%290887-3801%282003%2917%3A1%2846%29.pdf
    identifier uri: http://yetl.yabesh.ir/yetl/handle/yetl/43118
    publisher: American Society of Civil Engineers
    title: Wrapper Methods for Inductive Learning: Example Application to Bridge Decks
    type: Journal Paper
    journal volume: 17
    journal issue: 1
    journal title: Journal of Computing in Civil Engineering
    identifier doi: 10.1061/(ASCE)0887-3801(2003)17:1(46)
    tree: Journal of Computing in Civil Engineering, 2003, Volume 17, Issue 1
    contenttype: Fulltext