The Alaska Measures of Progress test has been met with disappointment by some school district superintendents, lawmakers, and even the state’s Department of Education. Some were distressed by the delayed release of results, and others say the scores don’t give educators specific-enough information about students’ progress. The Kansas-based contractor that created AMP says it can resolve those issues.
The state of Alaska has a $25 million, five-year contract with the Achievement and Assessment Institute (AAI) at the University of Kansas to develop and score AMP.
The scores were a few weeks late in being released this fall.
AMP project director Marianne Perie at AAI says much of that delay was due to Alaska-specific issues that caught her team off-guard.
“For example, Alaska has a lot more mobility than Kansas, where we were used to working,” says Perie. “So we had quite a few students who changed schools within the testing windows; they would start the test in one school and finish in another school. And for us it actually created two separate records that we had to reconcile.”
While Perie apologized that the data cleanup took longer than anticipated, she says the scores themselves were always valid.
“The scores were triple-checked; we know those were right,” says Perie. “It was just trying to make sure we had the right kids associated with the right buildings and got to the right people to get out score reports. And we have learned a lot this year that we will implement next year, and we will get better.”
Technical issues aside, Perie says AAI also plans to address educators’ concerns about the usefulness of the test scores.
The plan involves adding another type of question to the test.
The first round of AMP was made up only of test items that could be scored by machines – multiple-choice or simple short-answer questions.
“But in order to get at some of these other sub-scores we want to produce, we have to have the student actually develop a mathematical model, or explain and communicate their reasoning,” explains Perie. “‘Why do you think this is the right answer’ or ‘How did you come to this answer?’ And actually have them write a response or show their work.”
These new questions, called “performance tasks,” will have to be hand-scored by people, not machines.
And since the performance tasks ask students to write essays or show their work, they may add an hour or two to overall testing time.
But Perie says they should get at the kind of information educators want about how their students are performing.
“One of the exciting things is that we will actually release the performance task when we release the score report,” says Perie. “So everybody can see, what was the student responding to, what were the questions? We will return to schools the students’ responses, so that teachers can sit down with students and say, ‘Here was the question, here was your response, let’s talk about why you didn’t get a perfect score…’ So hopefully it’ll have more instructional value.”
For mathematics, the performance tasks will allow AAI to report out more scoring categories.
Perie says the performance tasks were originally meant to be added during the second annual AMP, in the 2015-2016 school year. But Alaska’s Education Commissioner Mike Hanley pushed that back a year, not wanting to burden superintendents with another test component just yet.
Hanley will meet with AAI Tuesday to discuss how to fix this year’s AMP issues. He’ll then meet with a working group of superintendents later this week.