The Journal Paper
Here is an article focused on the assessment results of interest to educators.
- Feng, M., Heffernan, N.T., & Koedinger, K.R. (2009). Addressing the assessment challenge in an Intelligent Tutoring System that tutors as it assesses. The Journal of User Modeling and User-Adapted Interaction, 19, 243-266.
Award for Best Paper of the Year at the Journal
This is the Press release that WPI put out.
I was informed by the editor of UMUAI of the following: “I am pleased to inform you that your recent UMUAI article was selected as the winner of the 2009 James Chen Annual Award for Best UMUAI Paper. A prize committee of three editorial board members reviewed all nominated articles from the 2009 production volume and selected your paper as the winner.” UMUAI is among the journals with the highest impact factor ratings.
Follow up paper
The following paper follows up on this work and shows an even more impressive result: in this work we controlled for time and showed that ASSISTments is a better assessor than traditional paper-and-pencil testing.
- Feng, M. & Heffernan, N. (2010). Can We Get Better Assessment From A Tutoring System Compared to Traditional Paper Testing? Can We Have Our Cake (Better Assessment) and Eat It Too (Student Learning During the Test)? Educational Data Mining 2010.
Follow-up to the Follow-up Paper
I have a paper accepted at AIED that improves even further on the results in Feng & Heffernan (2010). Trivedi, S., Pardos, Z. & Heffernan, N. (2011). Clustering Students to Generate an Ensemble to Improve Standard Test Score Predictions. Proceedings of the Artificial Intelligence in Education Conference. (Accepted) http://nth.wpi.edu/pubs_and_grants/papers/2011/AIED/Clustering%20Students%20to%20Generate%20an%20Ensemble.pdf
Cited in the National Educational Technology Plan
This article was cited in the National Educational Technology Plan (NETP). Read the full text here; the relevant excerpt appears below.
- “The ASSISTment system, currently used by more than 4,000 students in Worcester County Public Schools in Massachusetts, is an example of a web-based tutoring system that combines online learning and assessment activities (Feng, Heffernan, & Koedinger, 2009). The name “ASSISTment” is a blend of tutoring “assistance” with “assessment” reporting to educators. The ASSISTment system was designed by researchers at Worcester Polytechnic Institute and Carnegie Mellon University to teach middle school math concepts and to provide educators with a detailed assessment of students’ developing math skills and their skills as learners. It gives educators detailed reports of students’ mastery of 100 math skills, as well as their accuracy, speed, help-seeking behavior, and number of problem-solving attempts. The ASSISTment system can identify the difficulties that individual students are having and the weaknesses demonstrated by the class as a whole so that educators can tailor the focus of their upcoming instruction. When students respond to ASSISTment problems, they receive hints and tutoring to the extent they need them. At the same time, how individual students respond to the problems and how much support they need from the system to generate correct responses constitute valuable assessment information. Each week, when students work on the ASSISTment website, the system “learns” more about the students’ abilities and thus can provide increasingly appropriate tutoring and can generate increasingly accurate predictions of how well the students will do on the end-of-year standardized test. In fact the ASSISTment system has been found to be more accurate at predicting students’ performance on the state examination than the pen-and-paper benchmark tests developed for that purpose (Feng, Heffernan, & Koedinger, 2009).”