Curriculum Vitae

Matthew M. Gushta

Education

Ph.D. Candidate (Expected 2011) University of Maryland – College Park, Education: Measurement, Evaluation, and Statistics

M.Ed. (2003) University of Alberta, Educational Psychology, Educational Measurement and Evaluation

B.A. (2001) University of Alberta, Psychology (Co-operative), with distinction

Professional Experience

Employment Details

2004-Present Research Analyst/Psychometrician, American Institutes for Research (AIR)

  • Educational Assessment Technical Team (2004-Present):

    • Performs quality control checks on data received from subcontractors, item parameter estimation, student ability estimation, forms equating, and vertical linking. Contributes to writing and revising technical reports and conducts additional analyses as requested by clients.

    • Provides psychometric services including classical item analysis, modern item analysis based on item response theory (IRT), differential item functioning (DIF) analysis, and support for standard-setting activities for large-scale, statewide educational assessment programs.

    • Provides educational assessment and psychometric research services including technical contributions for proposals, manuscript editing, monitoring document publication, conducting literature searches, drafting research reports, etc.

    • Responds to internal and external client requests for psychometric services (e.g., data simulation, quality control procedures, test equating, and classical and item response theory analyses).

  • No Child Left Behind Peer Review Team (2006-2007):

    • Under the direction of Dr. Harold Doran, assisted state assessment programs in satisfying federal peer review requirements. This included, but was not limited to, coordination of state assessment documentation, analysis of federal education policy, and supplementary psychometric analyses. Many of the tasks conducted for peer review centered on reporting validity evidence, such as the results of performance classification analyses and item-content alignment studies.

  • Depth of Knowledge Taxonomies Literature Review (2006):

    • In response to federal demands for evidence of the breadth and depth of cognitive processing represented in state educational achievement tests, a client solicited a comprehensive review of learning theories and taxonomies that have been, and could be, used to define the cognitive processing demands of test items. Referencing over seventy publications, this literature review applied eight theories from the areas of cognitive psychology, educational psychology, psychometrics, and educational policy to make recommendations affecting the design of state achievement tests.

  • Ohio Success Suite (2006):

    • As the Statistical Task Leader for the Ohio Success Suite, I ensured the data quality for an internet-based student roster and test information tracking system. The system dynamically assigned students to educational units across the state and aggregated demographic and test information at various levels. After designing the statistical data quality control test plan, I executed the plan and coordinated the efforts of junior quality assurance staff.

  • Analysis of District of Columbia Public Schools Special Education Programs (2005):

    • Coordinated multiple data sources from the District of Columbia Public Schools (DCPS) in analyzing trends in student referrals and requests for referrals to special education programs across multiple years.

    • Assured data quality.

    • Consulted with the client and senior analysts on the design of the analytic procedures and contributed to the subsequent technical report, which was delivered to the client.

  • Study of Young Adults in Education Programs (2005):

    • Led psychometric tasks and data maintenance in a study examining achievement gains within Adult Basic Education (ABE) and English as a Second Language (ESL) programs. Coordinated the construction of a large-scale, four-state, multi-year database containing student characteristic information and multiple outcome measures. Designed the analytic approach used in interpreting longitudinal test information. Consulted with senior analysts regarding content and psychometric issues. Contributed to the final report, which was subsequently delivered to the Department of Education.

  • Principal Technology Leadership Assessment (2005):

    • Assured data quality and performed the reliability analysis for the prototype of an online survey administered to educational technology leaders.

  • Ohio Personal Digital Assistant Pilot Project (2004-2005):

    • Developed the project plan to computerize a classroom-based Grade 1 reading diagnostic assessment to be administered via Personal Digital Assistants (PDAs).

    • Collaborated on the development of a pseudo-adaptive assessment algorithm, effectively shortening administration time.

    • Managed the adaptation, development, and deployment of over 100 pilot PDA instruments used by teachers across the state.

    • Managed a project team composed of content and professional development experts, software programmers and testers, and research assistants.

    • Produced a policy and research report for the Ohio Department of Education demonstrating the effectiveness of the PDA as a method for delivering the reading diagnostic instrument.

  • NAEP State Analysis Project (2004):

    • Conducted quantitative analyses examining the effects of disparate sampling, calibration, and linking procedures on cross-year linking estimates.

    • Assisted in the design and revision of a standardized tool for assessing the quality of state assessment programs.

    • Conducted a review of technical characteristics of educational assessment programs across the nation.

    • Contributed to the technical report.

2001-2003 Graduate Research Assistant, Centre for Research in Applied Measurement and Evaluation, University of Alberta

  • Provided statistical and test development consultation services.

  • Updated and maintained the CRAME website.

1999-2000, 2001-2002 Research Assistant, Academic Technologies for Learning, University of Alberta

  • Performed quantitative and qualitative data analysis.

  • Facilitated focus groups and interviews.

  • Served as lead author of a proposal to develop an institutional centre for evaluation, instruction, innovation, and research.

  • Collaborated on a variety of evaluation projects at the institutional, provincial, and national levels.

2000-2001 Associate Director, Student Distress Centre, University of Alberta

  • Maintained community referral information.

  • Ensured availability of the service through volunteer recruitment and scheduling. Coordinated and participated in various volunteer development and training activities.

Software

General MS Office Suite, Visual Basic, SQL, HTML

Statistics SAS, SPSS, Stata, Mplus, Winsteps, R, WinBUGS, LISREL, AM, ConQuest

Publications

Peer Reviewed Journals

Rupp, A. A., Gushta, M., Mislevy, R. J., & Shaffer, D. W. (2010). Evidence-centered design of epistemic games: Measurement principles for complex learning environments. Journal of Technology, Learning, and Assessment, 8(4).

Varnhagen, C. K., Gushta, M., Daniels, J., Peters, T. C., Parmar, N., Law, D., Hirsch, R., Sadler Takach, B., & Johnson, T. (2005). How informed is online informed consent? Ethics & Behavior, 15(1).

Estabrooks, C. A., Floyd, J. A., Scott-Findlay, S., O’Leary, K. A., & Gushta, M. (2003). Individual determinants of research utilization: a systematic review. Journal of Advanced Nursing, 43(5), 506-520.

Books and Book Chapters

Gushta, M. M., and Rupp, A. A. (2010). Reliability. In N. J. Salkind (Ed.), Encyclopedia of Research Design (pp. XXX-XXX). Thousand Oaks, CA: Sage.

Technical Reports

American Institutes for Research (2006). The Ohio Department of Education Literature Review #1: Depth of Knowledge Taxonomies. Washington, DC: American Institutes for Research.

Gushta, M., Seburn, M., and Tolosa, S. (2005). District of Columbia Public Schools: Special Education Analysis, Final Report. Washington, DC: Computer and Statistical Sciences Center, American Institutes for Research.

American Institutes for Research (2005). A Study of Young Adults in Education Programs. Washington, DC: American Institutes for Research.

Gushta, M. (2005). Principal Technology Leadership Assessment (PTLA), Reliability Analysis. Washington, DC: American Institutes for Research.

Gushta, M. M., Gottesman, J., Hardwick, B., and Cohen, J. (2005). Ohio personal digital assistant pilot project: Final report. Washington, DC: American Institutes for Research.

Cohen, J., Seburn, M., Gushta, M., Chan, T., and Jiang, T. (2004). What Can NAEP and State Assessments Learn from Each Other about Measuring Progress? Washington, DC: American Institutes for Research.

Gushta, M. M. (2003). Working memory and syllogistic reasoning in computer-based testing. Unpublished master’s thesis, University of Alberta, Edmonton, Alberta, Canada.

Professional Presentations

Gushta, M. M., Yumoto, F., and Williams, A. (2009). Separating Item Difficulty and Cognitive Complexity in Educational Achievement Testing. Paper presented at the annual meeting of the American Educational Research Association, San Diego, CA.

Yumoto, F. and Gushta, M. M. (2008). Generating Multivariate Dependent Data From Non-Normal Distributions: Copula Approach. Paper presented at the annual meeting of the American Educational Research Association, New York City, New York.

Cohen, J., Gushta, M., Gottesman, J., Feil, J., and Xu, M. (2006). Pen and Paper versus PDA: A Comparison of a Teacher Administered Reading Diagnostic. Paper presented at the annual meeting of the American Educational Research Association, San Francisco, California.

Doran, H. C., Cohen, J., and Gushta, M. (April, 2005). From Saint to Sinner: Should value-added models assume a vertical scale? Paper presented at the annual meeting of the American Educational Research Association, Montreal, Quebec, Canada.

Doran, H. C., Cohen, J., and Gushta, M. (2004). From Saint to Sinner and Back Again: The Confounding Effect of Linking Error on Gains Estimated from Value-Added Models. Paper presented at the conference on Value-Added Modeling: Issues with Theory and Application, University of Maryland, College Park, Maryland.

Gushta, M. M. (April, 2004). Equivalence across modes of administration: An item-level analysis of computer- and paper-based test versions. Paper presented at the annual meeting of the National Council on Measurement in Education, San Diego, California.

Sadesky, G. S., and Gushta, M. M. (April, 2004). Applying Rule-Space Methodology to the problem of standard setting. Paper presented at the annual meeting of the National Council on Measurement in Education, San Diego, California.

Varnhagen, C. K., Gushta, M., Daniels, J., Peters, T. C., Parmar, N., Law, D., Hirsch, R., Sadler Takach, B., and Johnson, T. (November, 2003). How Informed is Online Informed Consent? Paper presented at the annual meeting of the Society for Computers in Psychology, Vancouver, British Columbia, Canada.

Gushta, M. M. (May, 2003). New and Improved: Standard Setting Issues in Computerized Adaptive Testing. Paper presented at the annual meeting of the Canadian Society for the Study of Education, Halifax, Nova Scotia, Canada.

Gushta, M. M. (March, 2003). Computer-based testing: Cognitive factors and equivalence. Paper presented at the seventeenth annual Joseph R. Royce Research Conference, Edmonton, Alberta, Canada.

Gushta, M., Varnhagen, C., Daniels, J., Peters, T., Parmar, N., Law, D., Hirsch, R., Sadler Takach, B., and Johnson, T. (March, 2003). How informed is online informed consent? Paper presented at the seventeenth annual Joseph R. Royce Research Conference, Edmonton, Alberta, Canada.

Gushta, M. M. and Varnhagen, S. J. (November, 2002). Using Formative Evaluation for Institutional Development: Proposing a Center for Evaluation at the University of Alberta. Paper presented at the annual meeting of the American Evaluation Association, Washington, D. C.

Gushta, M., Grace, D., and Varnhagen, S. (June, 2002). Pedagogy, Performance, and Evaluation: Issues in Web-Enhanced Instruction. Paper presented at the annual meeting of the Alberta Teachers of Psychology, Camrose, Alberta, Canada.

Grace, D., and Gushta, M. (May, 2002). Directions for Faculty Evaluations at Post-Secondary Institutions: Faculty Evaluation, Instructional Technology, & The Scholarship of Teaching. Paper presented at the meeting of the Canadian Evaluation Society, Halifax, Nova Scotia, Canada.

Sears, M., Grace, D., and Gushta, M. (May, 2002). Unique Issues When Evaluating Distance Courses Using Instructional Technology. Workshop presented at the meeting of the International Council for Open and Distance Education (ICDE) and the Canadian Association for Distance Education (CADE-ACED), Calgary, Alberta, Canada.

Professional Affiliations

National Council on Measurement in Education

American Educational Research Association

Honors and Awards

Alberta Learning Graduate Student Scholarship, 2003

Faculty of Education Travel Grant, 2003

Mary Louise Imrie Graduate Student Award, 2003

Lambda Chi Alpha Educational Scholarship, 2000