PISA for higher ed?
- To: firstname.lastname@example.org, email@example.com, ndsg <firstname.lastname@example.org>, epata <email@example.com>, firstname.lastname@example.org
- Subject: PISA for higher ed?
- From: Monty Neill <email@example.com>
- Date: Sun, 2 Jan 2011 14:34:22 -0500
Pondering PISA’s Promise for Higher Ed
December 13, 2010, 8:24 am
By Ben Wildavsky <http://chronicle.com/blogs/worldwise/author/bwildavsky/>
How should the United States interpret last week’s international PISA test
scores? And do the results of the assessment, which is administered to
15-year-olds around the world in reading, math, and science, have
significant implications for higher education? Some thoughts:
The stellar scores of students from Shanghai on the exams, which are
sponsored by the Organisation for Economic Co-operation and Development
(OECD), instantly fed into U.S. angst about our place in the worldwide
educational pecking order. But they shouldn’t necessarily be read that way.
Shanghai simply isn’t representative of China as a whole: It’s a talent
magnet and the beneficiary of extensive government investment in education.
Scores for the United States and other nations, by contrast, reflect the
performance of a cross-section of teenagers. The *New York Times* commendably
made this point in the second and third paragraphs of its story, but the
inevitable "The Chinese Are Eating Our Lunch" meme may simply be too hard for commentators
and policy makers to resist.
Note, too, that the performance of U.S. students hasn't declined.
They actually made some gains in science and math, rising to the
international average in the former while remaining below average in the
latter among the 34 OECD member countries. In reading, scores were more or
less unchanged, leaving American teenagers in the middle of the pack.
These results will of course serve as a Rorschach test for everybody’s views
on what kinds of education policy are most successful or deleterious,
whether in high-performing places like China, Finland, South Korea, or here
in the lackluster United States. Probably the best and funniest take on the
phenomenon came in this laconic early-morning blog post from Kevin
Huffman. "The cool thing about this is that we can do this again in
2012-13," he concluded. "It's so awesome to have a Sputnik moment every
few years."
As I've argued in the context of universities,
the high performance of other nations ought not to be cause for
hand-wringing in the United States. Educational improvement is not a
zero-sum phenomenon—we’re all better off in a world in which more countries
successfully build human capital.
All this said, whether or not one believes that we need a Manhattan Project,
a Marshall Plan, or some equivalent, there is little question that the
United States is underperforming vis-à-vis its potential. The mediocre
showing of U.S. students is not something about which we ought to be
complacent. And our students’ disappointing results on previous PISA tests
have had a useful effect in galvanizing continuing efforts at elementary
and secondary education reform.
PISA results also matter for higher education, in the United States and
everywhere else, because strong secondary-school preparation is vital to
creating successful universities. Without improvements throughout the
educational pipeline, even better access to postsecondary education may not
be accompanied by a corresponding level of degree completion, as the U.S.
experience, unfortunately, demonstrates.
International comparisons of universities on measures such as graduation
rates can, like the PISA results, be misread as winner-take-all contests.
Nevertheless, we could use new assessments that compare higher-education
systems around the world and, where necessary, spur underperforming nations
into action. I believe global university rankings can be more helpful than
critics acknowledge <http://www.youtube.com/watch?v=OS2wbgf-Uus>, but they
compare individual institutions rather than entire countries, and they do
nothing to measure what undergraduates actually learn on campus.
As it happens, the OECD is also playing an important role in improving this
state of affairs with its relatively new AHELO initiative,
short for Assessment of Higher Education Learning Outcomes. This academic
year and next, AHELO is carrying out a feasibility study of tests in
specific subjects—economics and engineering—and in “generic skills” such as
analytical reasoning in 15 OECD countries, including Mexico, the United
States, Sweden, Egypt, South Korea, Japan, and Australia. The initiative as
currently envisioned will examine a modest number of universities in each
country, not national samples of students. No wonder. A 2006 background memo
describing the nascent project as “PISA for Higher Education” generated
considerable controversy. Ever since, the OECD has backpedaled furiously,
taking pains to note that AHELO isn’t intended to rank nations at all.
Still, it isn’t hard to see why a substantive comparison of teaching and
learning in universities around the world could be very useful. That
contentious 2006 memo (no longer available online, alas) made the case very well:
*a direct assessment of the learning outcomes of higher education could
provide governments with a powerful instrument to judge the effectiveness
and international competitiveness of their higher education institutions,
systems and policies in the light of other countries’ performance, in ways
that better reflect the multiple aims and contributions of tertiary
education to society.*
The AHELO project faces considerable methodological challenges. But if it
moves ahead successfully, as I hope it does, its architects hope eventually
to produce value-added results, looking not only at a snapshot of what
students know but at how much they improve during their time at university.
This would be a breakthrough in global higher-education assessment.
Moreover, protestations from the OECD notwithstanding, it’s easy enough to
see how a cross-national comparison of individual institutions today could
eventually become a comparison of representative samples of student
performance in different countries, just like … PISA! That would mean, yes,
a global ranking of postsecondary learning outcomes.
Like last week’s international test results for 15-year-olds, such measures
might give rise to misinterpretation, misplaced zero-sum alarmism, and so
forth. But new and improved tools for self-assessment and global comparisons
on important dimensions of higher education, whether created as an outgrowth
of AHELO or through other initiatives, could serve as catalysts for
improvement that are useful—and overdue.
Permalink: <http://chronicle.com/blogs/worldwise/pondering-pisas-promise-for-higher-ed/27705>
Monty Neill, Ed.D.
Interim Executive Director
15 Court Sq, Ste 820
Boston, MA 02108
857-350-8207; fax 850-357-8209