Re: Fwd: washingtonpost.com: Testing Errors Didn't Cause MSPAP Swings, Panel Says
- Subject: Re: Fwd: washingtonpost.com: Testing Errors Didn't Cause MSPAP Swings, Panel Says
- From: William Cala <Wcala@ROCHESTER.RR.COM>
- Date: Fri, 25 Jan 2002 14:29:27 -0500
- Reply-to: Assessment Reform Network Mailing List <ARN-L@LISTS.CUA.EDU>
- Sender: Assessment Reform Network Mailing List <ARN-L@LISTS.CUA.EDU>
Can't we FOIL that question to SED?
----- Original Message -----
From: "Ed Levine" <eddie185@YAHOO.COM>
Sent: Friday, January 25, 2002 9:23 AM
Subject: Re: Fwd: washingtonpost.com: Testing Errors Didn't Cause MSPAP Swings, Panel Says
The cut point between a "2" ("Basic" level) and a "3"
("Meets the Standard") on the NYS Grade 8 Math exam
last year is rumored to have been around the 75th
percentile. No wonder all the kids "got dumb." Of
course, there's no way to confirm this, since all the
psychometrics on NYS tests are considered proprietary
information, belonging to CTB/McGraw Hill (AKA "The
Ministry of Education.")
--- William Cala <Wcala@ROCHESTER.RR.COM> wrote:
> Simply amazing. Yup, in one year's time (we should
> believe) countless
> schools forgot how to teach. Same thing happened
> here in NY. Isn't it
> incredible how we all can do the wrong things at the
> same time? This year,
> 8th graders in ALL NY schools "got dumb." It
> couldn't have been the test or
> the scoring or the cut scores, they are put together
> by "experts."
> ----- Original Message -----
> From: "Victor Steinbok"
> To: <ARN-L@LISTS.CUA.EDU>
> Sent: Friday, January 25, 2002 3:41 AM
> Subject: Fwd: washingtonpost.com: Testing Errors
> Didn't Cause MSPAP Swings,
> Panel Says
> Testing Errors Didn't Cause MSPAP Swings, Panel Says
> By Nurith C. Aizenman
> Washington Post Staff Writer
> Thursday, January 24, 2002; Page B01
> A panel of experts has concluded that testing errors
> were not to
> blame for unexpected swings in scores on Maryland's
> elementary and middle school exams, clearing the way
> for state
> education officials to release the results Monday
> after a two-month
> delay, state authorities said yesterday.
> However, officials said they did not know whether
> the score
> fluctuations in the Maryland School Performance
> Assessment Program
> tests were caused by changes in the quality of
> instruction at the
> hundreds of schools affected, or by other, unrelated factors.
> "We're hoping that once the individual schools get
> their data,
> they'll be able to unravel some of this," said Ron
> Peiffer, assistant
> state superintendent of schools.
> Scores for the 2001 MSPAP were originally scheduled
> to be released
> Nov. 27. But State Superintendent of Schools Nancy
> S. Grasmick
> decided to postpone their release after preliminary results
> showed double-digit increases or declines in the
> marks of about 200
> schools statewide, as well as a drop in Montgomery
> County's overall
> score (officials would not release the amount of the decline).
> Although such shifts are not unusual -- and although
> overall student performance did not substantially
> differ from that in
> the previous year -- officials were concerned
> because many of the
> swings could not be readily explained by changes in
> the staff,
> student demographics or instructional programs at
> the schools
> involved, Peiffer said. For instance, Prince
> George's County saw
> declines at even its most affluent schools, which
> have consistently
> posted above-average scores.
> Under pressure from about a half-dozen county
> superintendents, state
> officials hired consultants from the nonprofit, New
> Hampshire-based National Center for the Improvement of Educational
> Assessment, at a
> cost of $40,000, to determine whether something had
> gone amiss during
> the complex process of developing and grading the tests.
> According to center researcher Richard Hill,
> officials from
> Montgomery initially presented him with evidence
> that the score
> swings were the result of problems with the MSPAP.
> However, Hill said
> he found flaws in Montgomery's studies.
> After an afternoon of meetings about the matter
> yesterday, Montgomery
> schools spokesman Brian Porter said, "We have no
> comment at this time."
> Through a spokeswoman, Prince George's School
> Superintendent Iris T.
> Metts said Grasmick "has communicated well with all
> superintendents in the state and has included us
> throughout this
> process." Metts declined to comment further until
> the MSPAP scores
> are officially released.
> The score swings are troubling to local school
> officials because the
> MSPAP is Maryland's most important benchmark for
> assessing the
> quality of public schools. The tests are given
> annually to third-,
> fifth- and eighth-graders in reading, writing,
> language usage, math,
> science and social studies. They do not grade
> individual students.
> Instead, they attempt to measure how well schools
> are training
> students to solve problems and analyze information.
> Schools that do
> well are rewarded with cash. Those that repeatedly
> perform poorly
> face takeover by the state.
> Over a six-week period, the outside consultant
> examined the
> procedures and mathematical formulas used to draw up
> the MSPAP's
> questions, grade students' responses, and ensure
> that the 2001 tests
> were comparable to versions used in previous years.
> "We found no problems in any of those areas," Hill
> said. "In fact,
> our conclusion was that Maryland's program is quite
> strong. They have
> a lot of quality-control checks, and they were done well."
> Hill's findings were supported by a national panel
> of testing experts
> who regularly advise Maryland on testing issues, as
> well as a group
> of researchers at the University of Maryland,
> Peiffer said.
> Nonetheless, Peiffer and Hill cautioned against
> reading too much into
> the fluctuations until they can be analyzed on a
> case-by-case basis.
> For instance, many of the seemingly large score
> changes at individual
> schools may prove to be statistically insignificant
> because the
> number of children taking the test at the school was small.
> "At a school with 100 students taking the test, a
> change of 14 points
> ought to be interpreted as essentially no change,"
> Hill said.
> The best way to judge a school's performance is to
> consider long-term
> trends, Hill added. "You shouldn't be looking at how
> you did this
> year compared to last year. You should look at how
> you compare to the
> last six years, which allows all the sampling errors
> to balance out."
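[Hill's back-of-envelope point can be sketched numerically. The per-student score spread below is an assumed value chosen only so the result echoes his "14 points at a school of 100" example; the article does not report MSPAP's actual figures.]

```python
import math

# Hypothetical spread of individual student scores (NOT from the article).
sd = 50.0
n = 100          # students tested at the school
z = 1.96         # 95% confidence multiplier

# Margin of error for the CHANGE between two independent yearly school
# averages: z * sd * sqrt(1/n + 1/n). With these assumed numbers, a
# year-over-year swing of about 14 points is within pure sampling noise.
change_margin = z * sd * math.sqrt(2.0 / n)
print(f"year-over-year noise: +/- {change_margin:.1f} points")

# Averaging six years of results shrinks the sampling error by sqrt(6),
# which is why Hill says long-term trends are the better yardstick.
six_year_margin = change_margin / math.sqrt(6)
print(f"six-year-average noise: +/- {six_year_margin:.1f} points")
```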
> Furthermore, even in cases where the change is
> outside the margin of
> error -- such as the countywide drop in Montgomery's
> scores -- the
> shift does not necessarily reflect changes in school
> quality, he said.
> "It could also be due to student motivation in
> taking the test that
> year, methods of administering the test, or
> demographic changes
> within the school district," Hill said.
> © 2002 The Washington Post Company
=== message truncated ===
Edward J. Levine, Ed.D.
NYC Board of Education
110 Livingston St. (306A)
Brooklyn, NY 11201
phone: (718) 935-3044
To unsubscribe from the ARN-L list, send command SIGNOFF ARN-L