- Subject: writing
- From: "Gerald W. Bracey" <gbracey@EROLS.COM>
- Date: Tue, 28 Sep 1999 16:57:13 -0400
- Comments: To: firstname.lastname@example.org, email@example.com, firstname.lastname@example.org, email@example.com, firstname.lastname@example.org, email@example.com, firstname.lastname@example.org, Ethan Bronner <email@example.com>, Randal Archibold <firstname.lastname@example.org>, email@example.com, firstname.lastname@example.org, email@example.com, firstname.lastname@example.org
- Reply-to: Assessment Reform Network Mailing List <ARN-L@LISTS.CUA.EDU>
- Sender: Assessment Reform Network Mailing List <ARN-L@LISTS.CUA.EDU>
QUESTION FOR MEDIA RECEIVING THIS ALERT: Why are you so uncritical about the NAEP proficiency levels whose problems I have relayed to you on various occasions?
Thanks to Alan Flanigan for calling attention to an AP story that turned up in online editions of the Washington Post. Don't know if it made any print editions or if it will tomorrow--likely it will: Claudio Sanchez carried it on All Things Considered around 4:35 p.m. EDT.
The story, by AP reporter Joseph Schuman, about whom I know nothing, begins with "About three-fourths of the nation's school children demonstrated only partial mastery of the knowledge and skills needed to write proficiently for their grade level, the Education Department reported today."
This is how the Department presented the results from the latest NAEP writing assessment, the first such since 1992 (the results are not comparable). The results are Disinformation because they are reported in terms of the percent attaining the various NAEP "proficiency levels"--basic, proficient and advanced. These levels were initially established when Checker Finn and his gang of ideologues ran NAGB (the National Assessment Governing Board). They have been attacked as misleading since their inception. One study reaching this conclusion was conducted by the GAO; another by CRESST (the National Center for Research on Evaluation, Standards, and Student Testing at UCLA and the University of Colorado at Boulder). The standards have generally been the object of scorn and derision from the psychometric community.
In a speech a couple of years ago, Lyle Jones of the University of North Carolina observed that in a NAEP mathematics assessment, only 18% of fourth graders attained the proficient level and only 2% attained the advanced level. Jones then pointed out that these same 4th graders were above average among students from the 26 nations that participated in TIMSS. He asked if it made sense to characterize American students so drearily when they performed so well in comparison to their international peers. He didn't answer the question. Didn't have to.
Perhaps the most thorough evaluation of NAEP performance levels was conducted by the National Research Council in a study published this spring. The council concluded that "the process for setting NAEP achievement levels is fundamentally flawed... [It] should be replaced" (Grading the Nation's Report Card: Evaluating NAEP and Transforming the Assessment of Educational Progress).
The National Academy of Education has made similar recommendations.
When apprised of these recommendations, Checker Finn is said to have replied, "I think the National Research Council and the National Academy of Education are fundamentally flawed." I can't vouch that it happened, but the source is pretty good: Paul Barton of ETS--not one to make this sort of thing up.
I expect that there is great room for improvement in writing. For a long time, we stopped teaching writing, relying instead on the "editing" tests that came as part of commercial achievement tests and came, of course, in multiple-choice format. But we need a valid means of describing how students write. The NAEP proficiency levels do not give us this.