| contributor author | William D. Lawson | |
| date accessioned | 2017-05-08T21:20:50Z | |
| date available | 2017-05-08T21:20:50Z | |
| date copyright | October 2007 | |
| date issued | 2007 | |
| identifier other | (asce)1052-3928(2007)133:4(320).pdf | |
| identifier uri | http://yetl.yabesh.ir/yetl/handle/yetl/47879 | |
| description abstract | This study explores whether Fundamentals of Engineering (FE) exam scores are reliable and valid measures for three purposes they commonly serve: assessing individual competence, supporting program accreditation, and gauging college performance. Findings indicate that reliability and validity erode as the unit of assessment moves further from the individual level. That is, FE exam scores are probably reliable and valid indicators of minimal technical competency at the individual level. However, program-level assessment requires a careful, fine-grained comparison of the subject-content statistics reported by NCEES against stated program objectives, and in certain cases FE exam data will not serve as reliable and valid assessment indicators for some engineering programs. Further, assessing entire engineering schools on the basis of college-wide FE exam pass rates is inappropriate and cannot be rationally supported. | |
| publisher | American Society of Civil Engineers | |
| title | Reliability and Validity of FE Exam Scores for Assessment of Individual Competence, Program Accreditation, and College Performance | |
| type | Journal Paper | |
| journal volume | 133 | |
| journal issue | 4 | |
| journal title | Journal of Professional Issues in Engineering Education and Practice | |
| identifier doi | 10.1061/(ASCE)1052-3928(2007)133:4(320) | |
| tree | Journal of Professional Issues in Engineering Education and Practice; 2007; Volume 133; Issue 4 | |
| contenttype | Fulltext | |