
MSPAP grading shocks teachers - text version



Here is the text of the Washington Post article on MSPAP scoring procedures. The issues raised by teachers may not be at all unique to Maryland. Monty

MSPAP Grading Shocked Teachers
Even Wrong Answers Could Earn Points

By Brigid Schulte
Washington Post Staff Writer
Monday, February 4, 2002; Page B01

Hour after hour, Martha Stevens would scour the childish scrawl for the "key words."

This was the Maryland School Performance Assessment Program, the annual essay-based test that would determine which schools received extra cash from the state and which faced takeover, which principals and teachers received a grateful clap on the back and which shuffled off to other jobs.

Yet as a scorer for the high-stakes MSPAP last summer, Stevens found herself with little time to read the hundreds of thousands of student essays. Rather, she and other teachers were trained to look for key words. If the students used the words, they would get credit, even if the answer was wrong or made no sense. Answers that were perfectly sound but lacking the key words got a zero.

"We're professionals," said Stevens, a third-grade teacher at Oak View Elementary School in Silver Spring. "Yet when we go in and score, everything we value is thrown out the window."

Stevens and several other Montgomery County teachers have stepped forward in the past few weeks with horror stories about grading the MSPAP. They tell of being pushed to complete as many as 100 test booklets a day and of needing to be accurate only 70 percent of the time. They also tell of scoring criteria that kept changing midstream with no apparent effort to recheck the earlier scores.

But officials with the state Department of Education and the testing firm that supervises the grading dismissed such claims and said the process is rigorously controlled. They questioned the complaints, coming in a year when most of the state's school districts saw test scores drop. Montgomery County's declines were among the biggest.

"It never has seemed to be as big a deal as it has been this year," said Jo Davidson, who manages MSPAP scoring for Measurement Inc., a 20-year-old North Carolina testing company. "And I don't know why. Because some people don't like the scores, I guess."

The teachers' complaints would not necessarily explain this year's drop in MSPAP scores; some scorers say such problems arise every year. But they drew the attention of Montgomery County School Superintendent Jerry D. Weast, who has spent the past few months trying to understand why 100 of his 118 elementary schools posted declines, including 26 with drops of more than 10 percentage points. Many of the schools are considered among the best in the state.

His protests about the scores helped persuade the state to delay releasing results for six weeks while consultants looked for problems. State officials say they found no problems, but Weast continues to question the test.

Since its inception in 1992, the MSPAP has been controversial. It is not designed to test how well a student remembers facts, as most standardized tests do. Rather, it asks students to apply facts and perform tasks. Students work in groups and talk about answers for some portions. And knowing how to write essay responses, even in third grade, is critical.

The test was designed to change the way teachers teach, and some maintain that it has revolutionized the classroom by moving it away from the old "drill and kill" style of teaching. The approach has drawn praise nationally, as educators extol its emphasis on critical learning skills rather than rote memorization.

But critics say that anyone who can read and write can do well on the test and that it measures process, not content. Also, parents have long complained that the test is shrouded in secrecy. Some students say they don't take it seriously because the test measures schoolwide improvement and does not give individual scores. Indeed, many of the scorers coming forward report seeing such responses as "I don't care" and "Who knows" from third-graders and lengthy essays from eighth-graders on how the test is designed to give teachers summer jobs.

But Stevens, along with Oak View teachers Shelly Turi and Myrna Schwadron, said they were committed to the test when they signed up to score the MSPAP last summer. After six grueling weeks, though, they lost faith.

"Scoring the test is the complete opposite of what we're told to do in the classroom," said Turi, a third-grade teacher. "I refuse to go back."

They described an assembly-line atmosphere where candy bars and other incentives were given to teachers who scored more than 100 booklets a day. "It's a sweatshop," said Jennifer Kawar, a third-grade teacher at Flower Valley Elementary in Rockville who helped score tests last summer. "You're constantly under pressure, pressure, pressure."

One middle school reading specialist who scored lengthy and often complicated eighth-grade English essays said she was surprised that many of the scorers in her room were not English teachers. And she was surprised at their speed.

"One scorer read an average of 90 booklets a day," recalled the grader, who spoke on condition of anonymity. "I asked him on a break what his secret was. He said, 'Just look at the length. You don't have to read all of it.' "

Davidson said Measurement Inc. tried to find teachers certified in specific subjects to score those test subjects but was often unable to. But she and Mark Moody, who heads the state's testing program, said teachers are trained to score no more than 60 books a day.

"Look, we do know in the beginning how many papers we have to score, how many people we have and how many days there are to do it, and mathematically we can come up with a number," Davidson said. "It isn't profitable -- and Measurement Incorporated is a profit-making company -- for us to keep on someone who is scoring less than half of what the average room rate is. There are deadlines we must meet."

Moody and Davidson said the speed did not lead to inaccuracies. Both said the 70 percent accuracy rate required of the MSPAP is the industry standard for such tests. Moody said MSPAP's average accuracy rate was about 76 percent. Teachers are checked three times a week, and scoring leaders -- who are also teachers -- review the graders' work to make sure they are hitting that mark, Moody added.

But what the state considers the correct answer is what has so many teachers upset. To a person, eight teachers interviewed by The Washington Post and about 30 others who brought their concerns to Mark Simon, head of the local teachers union, said they were told to give credit based on certain "key words."

Zena Rollins, who has 34 years' experience in the classroom and scored Montgomery County's local tests for several years, said she became so demoralized by the process that she quit halfway through.

"If a child had an excellent answer but didn't use a buzzword, or didn't use material from the text, they could conceivably get zero. But if a child used the buzzword but didn't make much sense, they'd get a point," she said. "I couldn't believe the zeros I had to give out because they didn't have the buzzword.

Moody and Davidson said they had never heard of scoring by key words. "I just can't imagine such a thing, because that's not the design of the scoring tools," Moody said.

Each teacher interviewed said questions and disagreements about what should be counted were raised continually with their scoring leaders. And every teacher reported that at least once, the scoring criteria changed in the midst of the process. "It's common in all the centers, and it happens throughout the period," said Stevens, who has scored the MSPAP for two summers. "It was horrible."

Moody called that assertion "patently untrue." Once every few years, he said, teachers raise concerns that cause the criteria to change and questions to be rescored. Last year, one question on 1,500 tests was rescored. "We do go back and rescore. Does the same person see the papers again? Very unlikely."

Teachers were also upset that on math tests, students could end up with the wrong answer but receive credit for using math terms in their answer or explaining how they arrived at their answer.

Moody and Davidson called that complaint more philosophical. The test measures different skills, and one is the ability to explain. "If a kid is good at explaining what they've done, even if what they've done is incorrect, then he has done well," Davidson said. "That's one of the things we like so much about MSPAP -- it gives credit for doing the things kids know how to do and doesn't punish them for things they don't."

Peggy Salazar, principal of Oak View Elementary, said many educators appreciate the MSPAP. Part of the reason is political -- so much weight is put on the tests -- and part is because it has made some positive changes in classrooms. "It's been a constructive teaching tool because we're asking children to think at higher levels and come up with answers after thinking on their own," she said.

But her teachers' scoring experiences have turned her into a "MSPAP dissident."

"After listening to them, I knew we had a big problem on our hands -- egregious enough that it finally needed to be said aloud," said Salazar, who contacted Weast's staff with the complaints. "If the MSPAPs are not scored the way we're supposed to teach, why even put forth the effort? What's the point?"

© 2002 The Washington Post Company




Monty Neill, Ed.D.
Executive Director
FairTest
342 Broadway
Cambridge, MA 02139
617-864-4810; fax 617-497-2224
email monty@fairtest.org
web: http://www.fairtest.org