Top U.S. students, those who score in the top 10 percent nationally, lag far behind the top students in the highest-achieving countries, a gap that is far bigger than the gap between the bottom students in the United States and elsewhere. That is, at least, if you measure it by the results of the 2012 PISA test, given to 15-year-olds across the world.
I wanted to dig deeper into the 2012 PISA test results, released Dec. 3, 2013, to see not just how the average American 15-year-old performs, but how both extremes are faring. First, I isolated the top 10 percent of students (i.e., the 90th percentile) in each of the 65 countries or subregions tested by the OECD and ranked their math scores. The top 10 percent of American students earned a score of 600, on average, putting the US in 34th place. That’s about the same ranking as the average US student, whose score of 481 places the country 36th among nations. But the score gaps at the top are bigger than I expected. Top US students are more than 100 points below the top students in Shanghai, Singapore and Taipei. That’s the equivalent of several years of schooling (the OECD often equates roughly 40 points with a year of school). By contrast, the bottom 10 percent of US students lag the bottom students of the top-achieving regions by much less than 100 points (except in Shanghai, where the bottom 10 percent of students approach the score of the average American 15-year-old!)
I’d also like to emphasize that the scores of the top 10 percent in the United States have been declining over the past decade. See this post.
Below are charts for the top 10 percent and the bottom 10 percent. I took spreadsheets from the PISA report annex, filtered for the 90th and 10th percentile scores, ranked them and deleted everything else. For reference, here’s a link to the global rankings, which lists the average scores for each nation. See Table 1A, page 19.
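For anyone who wants to replicate the ranking step, here’s a minimal sketch of the sort-and-rank logic. The Shanghai and Singapore values below are illustrative placeholders, not the actual annex figures; only the U.S. 90th-percentile score of 600 comes from this post.

```python
# Rank countries by their 90th-percentile (top 10 percent) math score.
# Shanghai and Singapore values are illustrative placeholders; the U.S.
# figure (600) is from the post. Real values are in the OECD annex tables.
p90_scores = {
    "Shanghai": 713,       # placeholder
    "Singapore": 707,      # placeholder
    "United States": 600,  # from the post
}

ranked = sorted(p90_scores.items(), key=lambda kv: kv[1], reverse=True)
for rank, (country, score) in enumerate(ranked, start=1):
    print(f"{rank}. {country}: {score}")
```

The same sort applies unchanged to the 10th-percentile column for the bottom-of-the-distribution ranking.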
2012 PISA Math Scores for the Top 10 Percent (90th Percentile)
| Ranking | Country or Subregion | 90th Percentile Score |
| --- | --- | --- |
| 47 | United Arab Emirates – Ex. Dubai | 538 |
2012 PISA Math Scores for the Bottom 10 Percent (10th Percentile)
| Ranking | Country or Subregion | 10th Percentile Score |
| --- | --- | --- |
| 47 | United Arab Emirates – Ex. Dubai | 318 |
Source: OECD, PISA 2012, Annex B1 (Table I.2.3d)
Overall, the test score of the average US 15-year-old hasn’t changed much in the past decade. The average PISA math score for 2012 was 481, not that different from the 483 posted in 2003.
But that masks movement among different student groups. It turns out that the weakest 10 percent of US students have been steadily improving, gaining 11 points over the past decade. Meanwhile, American students in the top 10 percent and the top 25 percent of the nation are performing worse today than they did a decade ago, losing 7 points and 6 points respectively.
So I stand corrected from an earlier post where I wondered whether US students were simply stagnating while students in other nations, especially in Asia, were learning more and improving year after year. Top US students are not only being outrun by their Asian peers, they’re sliding backwards.
I wonder if the pressure on US teachers to bring as many students as possible up to some sort of arbitrary “proficient” level on state assessments is exacerbating the problem at the top. In order to maximize the number of students who can hit a minimum threshold, instruction inevitably gets directed down toward the students who are just below average, to help prop them up. Meanwhile, advanced students are asked to repeat the basics over and over again and aren’t being pushed. It’s another argument for measuring learning gains or educational growth among all students, not just how many kids can hit a certain threshold.
These PISA results, broken down by percentile, have me scratching my head a bit. Earlier national test data from NAEP showed that top students were improving more than bottom students. There were many examples of the bottom students slipping.
In the 2012 international test that measures what 15-year-old students know, called PISA, private school students did only a smidgen better than public school students on the math test. Almost seven percent of American 15-year-olds attend private school, and they scored an average of 486 in math, only four points more than the average public school student and still below the international average of 494. Private school students did do a bit better in science at 508, surpassing the international average of 501.
Where private school students shine is in reading, outperforming their public school peers by 22 points. Private school students, if they formed a separate nation, would rank at #10 behind Ireland in this subject. However, if we broke out the private school students for each nation, their scores would be higher too and American private school kids would no longer be among the top 10 readers. Indeed, US private school students would be no better than average.
2012 PISA test scores of public school and private school students

| Country | School type | % of students | Reading | Math | Science |
| --- | --- | --- | --- | --- | --- |
| United States of America | public | 93.01 | 497 | 482 | 498 |
| United States of America | private | 6.99 | 519 | 486 | 508 |
OECD Total (OECD as a single entity): each country contributes in proportion to the number of 15-year-olds enrolled in its schools.
OECD Average (country average): the mean for all OECD countries; each country contributes equally.
Data generated from http://pisa2012.acer.edu.au
Top US students fare poorly in international PISA test scores, Shanghai tops the world, Finland slips
Conventional wisdom is that top U.S. students fare well compared to their peers across the globe. According to this line of reasoning, the US doesn’t make it on the list of the top 25 countries in math (or top 15 in reading) because America has higher poverty and racial diversity than other countries do, which drags down the national average.
The latest 2012 PISA test results, released Dec. 3, 2013, show that the U.S. lags among 65 countries (or sub-country entities) even after adjusting for poverty. Top U.S. students are falling behind even average students in Asia. I emphasize Asia because Asian countries (or sub-entities) now dominate the top 10 in all subjects: math, reading and science.
In descending order from the top spot in math, they are (1) Shanghai, (2) Singapore, (3) Hong Kong, (4) Taipei, (5) Korea, (6) Macao, (7) Japan, (8) Liechtenstein, (9) Switzerland and (10) the Netherlands. Most of these countries are also posting top-of-the-charts reading scores. (Here’s the global list. See Table 1.A on page 19. I also paste the list, in two parts, at the bottom of the post for those who are having trouble clicking on the pdf file. Click on it to see a larger full-screen version.)
Let’s break down the data from the 2012 PISA (Program for International Student Assessment), a test conducted by the Organization for Economic Cooperation and Development (OECD) and taken by 15-year-olds around the world.
* The United States has a below average share of top performers in mathematics. Only 2% of students in the United States reached the highest level (Level 6) of performance in mathematics, compared with an OECD average of 3% and 31% of students in Shanghai, the top performing entity in this year’s PISA test.
* Students at the 90th percentile in the United States — the very top — are below the average student in Shanghai. Top U.S. students scored 600 in math. The average score in Shanghai was 613. (Click on chart at the top right of the page to see this in more detail).
* Massachusetts, the top performing state in the nation, did not come close to the top 10 in math. Its 15-year-olds scored 514 in mathematics, placing the state even with Germany at number 16. (To put this in context, Germany is alarmed by how low its PISA scores are.) Massachusetts fared better in reading. Only three education systems scored higher.
* Poverty rates alone do not explain low U.S. test scores. In a telephone briefing, Andreas Schleicher explained that the OECD attempted to adjust test scores for income, putting all the students of the world on a level playing field. It turns out that the US has slightly lower poverty and diversity than other OECD countries on average, and the average U.S. test score dropped after making this adjustment.
* There is also a problem at the bottom end in the United States. The scores of low-income Americans are exceedingly low. The U.S. has a higher percentage of kids who can’t even hit the lowest levels on the math test than other OECD countries do on average. So it is true that the scores of poor U.S. students are dragging the average down. Still, even without its poorest students, U.S. scores would be low.
* Finnish slide. Seven years ago, U.S. educators and policy makers were all traveling to Finland, trying to understand the secrets behind its high achievement. But Finland’s scores declined between 2006 and 2009 and again between 2009 and 2012, falling from 548 in math in 2006 to 519 in 2012. Finland is firmly out of the top 10 in math and science, although its reading scores are still high. The OECD’s Andreas Schleicher says that demographic changes and immigration have not been large enough to explain the slide, and it’s a bit of a mystery.
* Poland is showing substantial increases in test scores on all three tests, rising well above the United States.
* Vietnam’s debut on the list is very impressive. This high-poverty nation falls between Austria and Germany at #17.
* Stagnation. U.S. scores on PISA exams haven’t improved over the past decade. See here. That’s a bit of a contrast with the NAEP exam, where American students have been showing modest improvement. I believe the NAEP exam plays to U.S. strengths in simple equation solving; it has fewer word problems in which students must apply their knowledge to a new circumstance and write their own equations and models.
* Shanghai was also the top performer in 2009. Other provinces in China are expected to start reporting PISA results with the next test in 2015, and are said to have similarly high scores.
* Asia rising. Notice the strong gains among the top performing countries. Shanghai, Singapore, and the next four education systems are all posting strong annual gains on their PISA tests. It may be that top US students aren’t getting weaker, but stagnating, while the rest of the world, especially Asia, is getting stronger.
* $$$$: The OECD data show almost no link between spending on education and PISA test results. Wealthier nations tend to score better, but the amount of money a nation spends on education doesn’t seem to matter much. The United States is one of the biggest spenders in education, spending $115,000 per student, on average, between the ages of 6 and 15. The Slovak Republic spends less than half that amount and achieves similar test scores. Only four countries spend more than the United States: Austria, Norway, Luxembourg and Switzerland.
* Test quality. I tried sample questions from the 2012 PISA math test and was impressed. Many are not multiple choice, so you can’t always use a Princeton Review technique of eliminating answer choices; you have to calculate the answers yourself. I was also surprised by how many word problems there were in which you had to come up with models and equations yourself, not just solve for x in a given equation.
* Cheating. In previous posts and among colleagues, questions have come up about cheating, especially in China. I haven’t seen evidence of widespread cheating on PISA tests that would affect a nation’s score. I do know that an outside Australian contractor is involved in administering the PISA tests in China. But please comment if you have any information on PISA cheating.
Yes, the United States has an achievement gap. Poor students are doing poorly. But our top students are nothing to brag about.
Unclear where U.S. students stand in math and science (Oct. 25, 2013)
A new statistical analysis by the National Center for Education Statistics sheds some light on why so few Americans pursue STEM subjects (Science, Technology, Engineering and Math) in college. “Some 28 percent of beginning bachelor’s degree students and 20 percent of beginning associate’s degree students entered a STEM field at some point during their enrollment between 2003 and 2009. As of 2009, 48 percent of the bachelor’s degree STEM entrants and 69 percent of the associate’s degree STEM entrants had left these fields by either changing majors or leaving college altogether without completing a degree or certificate.”
What really surprised me was that women have more staying power in STEM subjects than men do.
“Bachelor’s degree STEM entrants who were male or who came from low-income backgrounds had a higher probability of leaving STEM by dropping out of college than their peers who were female or came from high-income backgrounds, net of other factors.”
The Data Quality Campaign issued Data for Action 2013, its annual survey of how states are collecting and using education data, on Nov. 19, 2013. The advocacy group argues that greater use of data would improve education policy and classroom instruction. It reported that two states, Arkansas and Delaware, are using data the most, but it is also seeing widespread growth in data collection and crunching around the country.
High school feedback reports are a good example. These reports show how graduates from a particular high school fare when they go on to college. The bar chart I created (using data that DQC helped me pluck from several years of surveys) shows that more than 80 percent of U.S. states are now producing a publicly available high school feedback report. Not all of them are useful, high-quality ones. (Only seven states are producing great ones, according to DQC.) The group argues these reports are important because they help parents dig deeper than graduation rates and learn whether graduates of a particular high school were able to handle college math right away or had to take remedial classes first.
Another measure of how data use is becoming institutionalized in education is that the data systems themselves are becoming part of state budgets. Back in 2009, only nine states were funding their own student data systems that can track students from kindergarten through college. Now, 41 states are funding them.
The group also reported that 35 states now give teachers access to student data through some sort of computer dashboard. But it’s still hard for teachers to use this data to target academic weaknesses and help change their instruction on a daily basis.
Paige Kowalski, director of state policy and advocacy at the Data Quality Campaign, said that “these systems are pretty new” (built within the last six years). Thus far, she said, states have been focusing on easy-to-produce aggregate reports, such as the high school feedback reports. “When you’re talking about student-level data, it gets trickier. Privacy. Log-ons. And there’s so much data. No teacher wants to look at 500 data points on a screen. There’s a lot more to figure out,” Kowalski explained.
I remain obsessed with trying to understand the gigantic seven-point surge in scores that Washington DC posted on the 2013 NAEP national assessment, which I first reported on Nov. 7, 2013. Last week, on Nov. 15, I broke the test data down by race and noticed that while black scores did improve, the seven-point increase was influenced more by the growing populations of whites and Hispanics, both of which, on average, have higher test scores than blacks do. A blog reader asked me whether I had looked at socioeconomic status. Unfortunately, NAEP doesn’t have an SES variable, but it does identify which students are low income, as measured by whether they qualify for a free or reduced-price lunch.
Average Fourth Grade Mathematics Scores in Washington DC on the 2013 NAEP, broken down by whether students qualify for a free or reduced-price lunch
So it’s interesting to see that the lowest-income students are not driving the gains as much as the middle-class and upper-income students are. That jibes with what NCES Commissioner Jack Buckley noted nationwide: the bottom students are not making the same incremental progress that the top students are.
I also broke down the DC results by percentile, but didn’t see the same stark trend. Not sure what to make of this…
Average Fourth Grade Mathematics Scores in Washington DC on the 2013 NAEP by Percentile
Back in 2010, experts were stunned when 15-year-olds in Shanghai, China earned the top scores in reading, math and science on the 2009 PISA exams, also known as the Program for International Student Assessment. And when the 2012 results come out on Dec. 3, it seems that Shanghai may be poised to do it again, according to researchers who are familiar with the preliminary results.
Education testing experts caution against comparing Shanghai to an entire nation, such as Japan or the United States. The megalopolis of 23 million is one of the wealthiest, most cosmopolitan cities in China. Still, low-income residents are part of the sample of students who are tested*. And this year, we will be able to compare Shanghai with sub-regions of other countries. (My prediction: Massachusetts does miserably compared to Shanghai.)
Researchers say they are also seeing high test scores in other Chinese provinces where PISA trials are taking place, but official scores from regions outside of Shanghai won’t officially be reported until 2015. “You will be surprised at how strong some of the results are in the provinces,” said Andreas Schleicher of the Organization for Economic Cooperation and Development, which administers the PISA tests.
Shanghai’s replication of its results, combined with strong test scores in the provinces, makes me want to ask this question: Does China have the best educational system in the world?
Some might dismiss the test results and say that Chinese students are good test takers but don’t necessarily have the higher-order thinking skills and creativity that other education systems try to cultivate; China’s national curriculum is built around exam preparation. On the other hand, China is clearly doing something right, and it’s worth understanding the nuts and bolts of its system. In this write-up about the Chinese educational system by the International Center for Educational Benchmarking, two things popped out at me: 1) large class sizes (50 students per class); 2) specialized teachers, who might teach only one particular class, such as “Senior Secondary 2 Physics,” but teach it multiple times a day.
Schleicher adds that China differs from other top performing countries in that its teacher workforce isn’t drawn from the top students in society, the way teaching ranks are drawn from the top third in Japan, Finland or Singapore#. Rather, in China, the average teacher was an average student in high school. Instead, China boosts the professionalism of its teachers through constant training: about 30% of a teacher’s time every year is spent on professional development.
*The children of migrant workers, who number about 9 million in Shanghai, are believed to be largely excluded from the PISA results. Migrants who don’t have Shanghai residency are not entitled to education there, though Shanghai relaxed its residency policies this past summer. Future tests might include migrant children.
#By contrast, teachers tend to come from the bottom third in the United States.
As I wrote on Nov. 7, 2013, Washington DC posted one of the strongest test score gains in the nation on the 2013 National Assessment of Educational Progress, and I wanted to look at how demographic shifts in the nation’s capital might be influencing these test results. I began by constructing this table.
Washington DC NAEP Test Results for Fourth Grade Mathematics
Sources: Table A-12. Percentage distribution of fourth-grade public school students assessed in NAEP mathematics, by race/ethnicity, eligibility for free/reduced-price school lunch, and state/jurisdiction: 1992, 2003, and 2011
So my question is, what drove the 7-point increase in test scores? When you break the test scores down by ethnicity and weight them by each group’s percentage of the student population, it’s interesting to see how white and Hispanic test gains both contributed more to the average score than black gains did. True, black test scores increased by 6 points on average, but blacks’ share of the population is falling. Meanwhile, whites and Hispanics are growing populations in the city, so their smaller test score gains account for proportionally more of the overall change.
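To make the weighting concrete: the citywide average is the share-weighted mean of group scores, so the overall change reflects both within-group gains and shifting population shares. The sketch below uses hypothetical shares and scores; only the 6-point black gain and the 276 white average come from this post.

```python
# The citywide average is a share-weighted mean of group scores, so the
# overall change mixes within-group gains with shifting population shares.
# All shares and most scores here are hypothetical illustrations; only the
# 6-point black gain and the 276 white average are from the post.
before = {  # group: (population share, average score)
    "black": (0.80, 215),
    "white": (0.10, 270),
    "hispanic": (0.10, 220),
}
after = {
    "black": (0.72, 221),    # +6 points, but a shrinking share
    "white": (0.14, 276),    # growing share at a high score
    "hispanic": (0.14, 225),
}

def weighted_avg(groups):
    return sum(share * score for share, score in groups.values())

print(f"average before: {weighted_avg(before):.1f}")
print(f"average after:  {weighted_avg(after):.1f}")
print(f"change: {weighted_avg(after) - weighted_avg(before):+.1f}")
```

Even with these made-up numbers, you can see how a group with a modest gain but a growing share can move the average more than a group with a bigger gain and a shrinking share.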
I also found it interesting that the average white test score in the District is 276. That’s a very high number, surpassing the average white test score of Massachusetts by 16 points. I think that shows just how wealthy and highly educated the average white family in Washington DC is. But the whites of Northwest DC shouldn’t be too smug: only 6 percent of the District’s students tested at or above the “advanced” threshold, compared with 16 percent of fourth graders in Massachusetts.
As always, comments welcome.
I was just looking at some updated “State Education Reform” statistics from the National Center for Education Statistics and trying to make sense of the number of charter schools in each state. California has the most charter schools by far, at 908, but it’s also the most populous state. So I decided to rank some of the states with high numbers of charter schools by the number of charter schools per capita. I was surprised to see that Louisiana doesn’t rise to the top, but Wisconsin does. The original figures are pasted below.
| State | Number of Charter Schools Operating 2010-11 | Population (US Census 2010) | Charter Schools Per Capita |
| --- | --- | --- | --- |
Source: National Center for Education Statistics (NCES) Table 4.3 of State Education Reforms
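The per-capita calculation itself is just a division and a sort. A quick sketch follows: California’s school count is from the post and the populations are 2010 Census figures, but the Wisconsin school count is a placeholder, not the NCES number.

```python
# Rank states by charter schools per capita (schools / population).
# California's 908 schools come from the post; populations are 2010
# Census figures; Wisconsin's school count is a placeholder, not NCES data.
states = {
    # state: (charter schools operating 2010-11, 2010 Census population)
    "California": (908, 37_253_956),
    "Wisconsin": (240, 5_686_986),  # school count is a placeholder
}

per_capita = {state: schools / pop for state, (schools, pop) in states.items()}
for state, rate in sorted(per_capita.items(), key=lambda kv: kv[1], reverse=True):
    # Schools per 100,000 residents is easier to read than a raw ratio.
    print(f"{state}: {rate * 100_000:.2f} charter schools per 100,000 residents")
```

Expressing the rate per 100,000 residents keeps the numbers legible; the raw ratios are tiny fractions.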