Title: A Monte Carlo Evaluation of the Application of Variance Partitioning to the Assessment of Construct-related Validity
Contributors: Bowler, Mark C.; Barbee, Ashley A.
Dates: 2012-05-20; 2014-05-31; 2012
URI: http://hdl.handle.net/10342/3817
Type: Master's Thesis
Extent: 47 p.

Abstract: This study presents a Monte Carlo evaluation of the application of variance partitioning to the assessment of the construct-related validity of assessment center (AC) post-exercise dimension ratings (PEDRs). Data were generated from sixteen population models, representing a variety of AC configurations, by varying dimension factor loadings, exercise factor loadings, dimension intercorrelations, and exercise intercorrelations. Analyses demonstrated that variance partitioning differentiated among all sixteen varieties of AC models. Variance partitioning also detected other sources of variance, including person effects, person-by-dimension effects, and person-by-exercise effects. These findings suggest that variance partitioning may be a more appropriate method than the traditional confirmatory factor analysis (CFA) approach for analyzing AC multitrait-multimethod (MTMM) data.

Subjects: dissertations, academic; Psychology; Assessment centers; Confirmatory factor analysis; Construct-related validity; Multitrait-multimethod matrix; Variance partitioning; Management--Research; Executives--Rating of; Monte Carlo method; Analysis of variance
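The approach the abstract describes can be sketched as a small simulation: generate ratings from a crossed person x dimension x exercise model and then partition variance among person, person-by-dimension, and person-by-exercise effects. This is a minimal illustrative sketch, not the study's actual simulation code; the population values, sample sizes, and the moment-based estimation are all assumptions made for the example.

```python
# Hypothetical sketch (not the thesis's simulation): simulate AC PEDRs from a
# crossed random-effects model, then estimate variance components by
# double-centering marginal means. All parameter values are illustrative.
import numpy as np

rng = np.random.default_rng(0)
P, D, E = 2000, 4, 4                # persons, dimensions, exercises

# Assumed population standard deviations of each variance source
sd_person, sd_pd, sd_pe, sd_err = 1.0, 0.5, 0.5, 0.3

person = rng.normal(0, sd_person, (P, 1, 1))   # person main effect
pd_int = rng.normal(0, sd_pd, (P, D, 1))       # person-by-dimension effect
pe_int = rng.normal(0, sd_pe, (P, 1, E))       # person-by-exercise effect
err    = rng.normal(0, sd_err, (P, D, E))      # residual

Y = person + pd_int + pe_int + err             # simulated ratings, shape (P, D, E)

# Moment-based effect estimates: marginal means minus lower-order means
grand  = Y.mean()
p_mean = Y.mean(axis=(1, 2))
d_mean = Y.mean(axis=(0, 2))
e_mean = Y.mean(axis=(0, 1))

person_hat = p_mean - grand
pd_hat = Y.mean(axis=2) - p_mean[:, None] - d_mean[None, :] + grand
pe_hat = Y.mean(axis=1) - p_mean[:, None] - e_mean[None, :] + grand

components = {
    "person": person_hat.var(),
    "person x dimension": pd_hat.var(),
    "person x exercise": pe_hat.var(),
}
for name, v in components.items():
    print(f"{name}: {v:.3f}")
```

With these assumed parameters, the recovered person component dominates the two interaction components, mirroring the kind of pattern the abstract says variance partitioning can detect. The simple moment estimates here are slightly biased (e.g., averaged interaction and error variance leaks into the person component); a full analysis would use a proper random-effects ANOVA.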