Automated Analysis of Constructed Response

Traditionally, resource constraints in large introductory STEM courses at the university level have limited student assessments (quizzes, tests, pre-instructional surveys, etc.) to multiple-choice questions. Open-ended written assessments like short-answer questions and essay prompts are known to provide deeper insight into student understanding of STEM concepts, but they are often too labor intensive for university courses with student-to-faculty ratios of 100s-to-1. The Automated Analysis of Constructed Response (AACR) research group is working to make the insights provided by student writing available to STEM instructors independent of course size. Below is a summary of some of my contributions to the AACR effort.

 
 
Machine Learning Predictive Scoring of Student Writing

Predictive Analysis of Student Responses

The tools of data science provide an excellent solution to the resource-intensive nature of analyzing student writing. By leveraging natural language processing techniques and machine learning algorithms, we have built a system that can score open-ended, short-answer questions (referred to as constructed response items) for a range of STEM concepts spanning disciplines from biology to statistics. The system uses a heterogeneous ensemble of machine learning algorithms to handle a diverse range of question structures and contents, with performance that rivals that of human content experts.
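As a rough illustration of the approach, the sketch below combines several dissimilar classifiers over TF-IDF text features and scores responses by majority vote. It uses scikit-learn for concreteness; the actual AACR models, features, and training data are not shown here, and the example responses and labels are hypothetical.

```python
# Minimal sketch of a heterogeneous ensemble for scoring short-answer
# text. Illustrative only, not the AACR production pipeline; the
# responses and labels below are invented.
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import Pipeline

responses = [
    "Evaporation removes the liquid water, leaving the salt behind.",
    "The water disappears because salt absorbs it.",
    "Heating turns the water to vapor; the dissolved salt remains.",
    "Salt makes water heavier so it sinks away.",
]
labels = [1, 0, 1, 0]  # 1 = scientifically correct, 0 = misconception

# Combine dissimilar base learners so their errors are less correlated;
# a majority vote over their predictions scores each response.
ensemble = Pipeline([
    ("tfidf", TfidfVectorizer(ngram_range=(1, 2), stop_words="english")),
    ("vote", VotingClassifier(
        estimators=[
            ("logreg", LogisticRegression(max_iter=1000)),
            ("nb", MultinomialNB()),
            ("rf", RandomForestClassifier(n_estimators=100)),
        ],
        voting="hard",  # majority vote across heterogeneous models
    )),
])

ensemble.fit(responses, labels)
print(ensemble.predict(["Boiling drives off the water and the salt remains."]))
```

Hard voting is used in this sketch because dissimilar learners tend to make less correlated errors, so the majority is often right even when an individual model is not; the combination strategy in the production system may differ.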

 

Automated Interactive Reporting

Interactive Feedback for Course Instructors

Automated scoring alone only goes so far. To improve instruction in STEM courses, the insights provided by scoring must be communicated clearly and efficiently to participating instructors. To that end, we have built a set of interactive web apps that present the automated scoring of student writing, statistical analyses of the results, and the ability to explore the underlying data. With these tools, instructors can quickly discover patterns in student writing across large data sets.
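As a hedged sketch of what such a report might look like, the snippet below uses Streamlit (chosen here purely for brevity, not necessarily the framework behind the AACR apps) to let an instructor pick a question and view the distribution of scoring categories. The field names and counts are invented for illustration.

```python
# Minimal sketch of an interactive scoring report; hypothetical data
# and layout, not the actual AACR web apps.
import pandas as pd
import streamlit as st

# Hypothetical per-question output from the automated scoring system.
scores = pd.DataFrame({
    "question": ["Q1", "Q1", "Q2", "Q2"],
    "category": ["correct", "misconception", "correct", "misconception"],
    "count": [180, 45, 130, 95],
})

st.title("Constructed Response Scoring Report")

# Let the instructor drill into a single item...
question = st.selectbox("Question", scores["question"].unique())
subset = scores[scores["question"] == question]

# ...and see how responses distribute across scoring categories.
st.bar_chart(subset.set_index("category")["count"])
st.dataframe(subset)
```

Saved as, say, report.py, this runs with `streamlit run report.py` and serves the interactive page in a browser.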

 
NARST 2016

Physics Content Knowledge Items

The AACR system provides an excellent set of tools for exploring student thinking and writing, but for it to be a useful pedagogical tool, instructors must know what to ask their students. To help build a body of constructed response items that can be used with the AACR system, we are working to develop a set of physics content items ranging from astronomy to thermodynamics.