India data show test scores rise when students are automatically promoted to the next grade
A controversial 2009 law in India outlawed the practice of holding failing students back and making them repeat an entire year of school in classes 1 through 8. In India, the practice is called “detention,” and at least one student union staged a protest this spring to bring detention back, arguing that automatic promotion undermines academic quality and standards. But the Times of India published a story on June 4, 2013 suggesting that automatic promotion might be working. It showed that average test scores were rising in states that had been following an automatic promotion policy prior to 2009, but falling in states that were still holding kids back. The national dropout rate also fell by almost 30 percent after the law went into effect.
In the United States we call it “retention,” and the practice is similarly controversial. A 2012 Wall Street Journal story by Stephanie Banchero cited studies by the Consortium on Chicago School Research at the University of Chicago. According to Banchero, the research found that retained students did no better in later years than promoted students with nearly identical academic achievement. Retained students were also more likely to drop out.
Nonetheless, there’s still a big push to hold kids back, especially in third grade, when they aren’t up to snuff in reading.
Teachers better at rating schools than parents and students
Schools are kind of like Congress. Most people claim they hate Capitol Hill, but they like their own representative. Similarly, people say the U.S. education system is broken, but they like the school that their kids go to. I’ve been doing alumni interviews for Brown for more than 15 years and my first question is always, “So, how do you like your high school?” One would think this is an opportunity to show off some critical thinking. But the answer is invariably something like, “I love it. My school is great.”
Nonetheless, more and more school districts around the country are spending gobs of money and time on surveying parents, students and teachers about what they think of their schools. In education lingo, these are called school climate surveys. The federal government is helping to fund them in 11 states. The idea is to have another data point besides test scores by which to assess schools and, hopefully, help identify areas that need improvement.
But how useful are these surveys really?
The Research Alliance for New York City Schools at the New York University Steinhardt School of Culture, Education, and Human Development took a hard look at the NYC survey to see what lessons could be learned.
The NYC School Survey is the second largest survey in the country, after the US Census. All parents and teachers, along with students in grades 6-12, are asked to complete it. In 2012, 476,567 parents, 428,327 students, and 62,115 teachers did.
The Research Alliance found that the survey results mostly capture “differences between individuals within a school,” offering “less information about how that school differs from other schools.” In other words, parents tend to give similar scores to whatever school their kid is attending. Students respond differently than parents, but they also tend to give bad schools marks as high as those they give good schools.
Teachers were much better at distinguishing which schools are higher quality. The Research Alliance suggests that teachers’ scores should be weighted more heavily.
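The within-school vs. between-school distinction can be made concrete with the intraclass correlation: the share of rating variance that lies between schools rather than between individuals. Here is a minimal Python sketch with simulated ratings (the school count, rating scale and spreads are all hypothetical, chosen only to illustrate the idea):

```python
import random

random.seed(0)

# Simulate parent ratings for 50 schools where school identity matters little:
# small between-school differences, large parent-to-parent differences.
n_schools, n_parents = 50, 100
school_effect = [random.gauss(0, 0.3) for _ in range(n_schools)]
ratings = [[8 + school_effect[s] + random.gauss(0, 1.5) for _ in range(n_parents)]
           for s in range(n_schools)]

# Intraclass correlation: between-school variance / total variance.
school_means = [sum(r) / n_parents for r in ratings]
grand_mean = sum(school_means) / n_schools
between_var = sum((m - grand_mean) ** 2 for m in school_means) / n_schools
all_ratings = [x for r in ratings for x in r]
total_var = sum((x - grand_mean) ** 2 for x in all_ratings) / len(all_ratings)
icc = between_var / total_var
print(f"intraclass correlation: {icc:.2f}")  # low: ratings reveal little about the school
```

A low value mirrors the finding for parent and student responses; teacher responses, by contrast, would show a larger share of the variance sitting between schools.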
For those who are interested in constructing these opinion surveys, the Research Alliance found that the NYC survey is too long and reports the results in an unnecessarily complex way. Many of the satisfaction questions were so highly correlated with one another that the survey could be cut in half and still produce results just as reliable. The authors also recommended that NYC just report a single “school environment” measure and stop reporting scores in four different categories: academic expectations, communication, engagement and safety. The four categories were deemed to be “statistically indistinguishable” from each other.
Number of young American adults with college degrees jumps 36 percent
A New York Times front page story and a Lumina report released Thursday, June 13, 2013 examine the sharp increase in college graduates. In 2012, more than a third of young American adults (25 to 29 years old) had at least a bachelor’s degree, compared with less than 25 percent in 1995. That’s a 36 percent jump. Economists say a more educated workforce bodes well for the U.S. economy. But the Indiana-based Lumina Foundation, which is pushing for more Americans to get college degrees, argues that the demand for high-skilled talent is still outpacing our ability to produce educated workers.
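As a quick back-of-the-envelope check on that 36 percent figure (using rounded shares of roughly 34 percent in 2012 and 25 percent in 1995 — my approximations, not figures from the report):

```python
# Relative change between two rounded shares of young adults with degrees.
share_2012 = 34.0   # "more than a third" of 25-to-29-year-olds (approximate)
share_1995 = 25.0   # "less than 25 percent" (approximate)
relative_jump = (share_2012 - share_1995) / share_1995 * 100
print(f"relative jump: {relative_jump:.0f}%")
```

The share rose about 9 percentage points, but measured relative to the 1995 base that is a jump of roughly a third.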
There are interesting regional variations. Almost half of young Massachusetts adults have a bachelor’s degree, compared with just 20 percent in Nevada. The Lumina report details graduation rates by metropolitan region and state.
Of course, a sharp rise in college graduates makes one wonder about the quality of these new bachelor’s degrees. Indeed, many of them have been minted by the for-profit sector. And it seems that so many college graduates are unemployed, drowning in debt and living at home. But the New York Times, citing the Census Bureau, points out that only 3.3 percent of college graduates aged 24-34 are currently unemployed.
(Disclosure: The Lumina Foundation has supported The Hechinger Report in the past.)
Less math is more: data supports Saxon Math curriculum
A math curriculum that reduces how much new content elementary students are exposed to each day was found to be effective, according to an analysis by Mathematica Policy Research. Mathematica looked at two studies that focused on Saxon Math, a curriculum designed by Houghton Mifflin Harcourt. The studies covered more than 8,000 students in 11 states and found that students who used the curriculum scored, on average, 3 percentile points higher on math assessments than those who did not.
In addition to introducing new concepts gradually, the Saxon curriculum allows students to do practice homework in the classroom and constantly review old content. In third grade, the instruction shifts from teacher-directed lessons to independent, student-driven work. Kids frequently take assessments to help teachers identify and assist struggling students.
Data analysis of MOOCs shows that many students skip videos
The MIT Technology Review posted “As Data Floods In, Massive Open Online Courses Evolve” on June 5, 2013. Writer Tom Simonite reports that data from both Coursera and Udacity show “a large subset of students who prefer to skip videos and fast-forward as much as possible.” Udacity is already restructuring courses to reduce the amount of video and is rerecording old videos.
I would encourage MOOCs to retain videos but to improve their liveliness and humor, and pack them with content. The production quality doesn’t need to be fancy. I suspect it’s not that students hate videos per se; they hate boring videos.
The big challenge for MOOCs is how to get more people to finish the online courses that they start. Only 10 percent of students complete them now.
Report urges that federal funds for class-size reduction instead go to training teachers in data analysis
The New America Foundation, a non-partisan think tank in Washington headed by Anne-Marie Slaughter, is calling for more federal funds and school time for teachers to use student data to change how they teach. The report, “Promoting Data in the Classroom,” written by Clare McCann and Jennifer Cohen Kabaker, was published on June 4, 2013.
There’s a ton of education data out there now. Every state in the nation now maintains a longitudinal data system that tracks each student’s test scores year after year. (That’s thanks to more than $620 million in federal funds for setting up state data systems since 2005, plus additional Race to the Top grants.) But McCann and Kabaker make the argument that, for the most part, these systems aren’t being used by teachers to figure out how to teach their students better.
McCann and Kabaker describe recent efforts in Oregon and Delaware to get teachers to actually use the data. In both cases, it was time-consuming. One Oregon school district got the school board to approve a later start time to the school day so that teachers could pore over data in the morning. Other schools set aside regular blocks of time during the school day for teachers and administrators to meet. Delaware hired professional data coaches from Wireless Generation, a private company now owned by Rupert Murdoch’s News Corp. The idea was to examine the data and plan instructional changes, such as when to use whole group versus small group or individual instruction, or which students needed additional help. Some teachers were resentful that it was taking away from conventional lesson planning. Many teachers struggled to find the required hours. And McCann and Kabaker point out there’s only so much you can do with year-end test data.
What data freaks really want are so-called “formative” tests that children take many times throughout the year, so that you can see how much they’re learning while there’s still time to make adjustments. But formative tests are in their infancy, and there’s a lot of pushback against adding more tests to the school year. Oregon didn’t have any formative assessments, but it is now starting some pilots.
The results?
In Oregon, schools that participated in the data program tended to see their reading scores increase more than schools that didn’t participate. It’s important to note that the participating schools tended to be lower performing at the start, back in 2008. By 2012, the students of the data-analyzers had, on average, surpassed the reading scores of the non-analyzers, but just by a hair (80.52 vs. 79.62). It was not as rosy in math. Students of the data-analyzers did improve and narrowed the achievement gap, but the non-analyzers were still outperforming the analyzers at the end.
In Delaware there is no data on whether the data coach program is working. That’s because Delaware was experimenting with other reforms at the same time, and it’s difficult to say how much of a role data analysis alone played in student achievement. But the state department of education is working to produce an independent analysis.
Delaware benefited from Race to the Top grants to fund its data coach program, but McCann and Kabaker are worried that Congress will cease funding it in 2014. Even President Obama’s budget doesn’t include any Race to the Top money for elementary and high schools.
Instead, the authors point advocates of data-driven instruction to the “Improving Teacher Quality State Grants” program, which is currently used for class-size reduction and teacher training. They want Congress, when it reauthorizes the No Child Left Behind Act, to explicitly promote the use of these funds for data training projects instead. The authors also want the Department of Education to redesign the statewide longitudinal data systems grants to reward proposals that focus on the use of data, not just its existence.
Not much educational data is yet improving classroom instruction
A May 28, 2013 blog post from the Michael and Susan Dell Foundation by Micah Sagebiel notes that after a decade of collecting and analyzing education data, since the No Child Left Behind Act of 2001, classroom instruction is no better for it. So far, all this education data has mostly been used for “accountability” purposes, that is, to show how bad teachers are or how little students are learning.
The foundation argues that the data community needs to rally behind producing data that teachers will want to use. For example, daily reports could summarize what kids have and haven’t learned, and lessons or homework could be tailored to help students learn what they don’t yet understand.
Presumably, that would be helpful. But when I think of data-driven ways to improve instruction, I think more of clinical studies. Just like academic doctors study whether it’s better to give a kid antibiotics or let an ear infection clear on its own, I’d like to see similar studies in education. I suspect, in education, that there are several good ways to teach a particular topic, such as how to do long division or how to teach Shakespeare. If I were a new teacher, I’d like to have, say, the five best ways described for me, maybe with a video example of each one. And it would be fascinating to know what kinds of student populations responded well to each of the methods.
Is anyone doing this kind of clinical trial work in education with control groups?
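To make the clinical-trial idea concrete, here is a minimal sketch of how such a two-group comparison could be analyzed with a randomization test. All of the scores are simulated and the group sizes, means and spreads are purely hypothetical; this is the analysis shape, not real results:

```python
import random

random.seed(1)

# Simulated end-of-unit scores for students randomly assigned to two
# hypothetical teaching methods (a control group and a new approach).
method_a = [random.gauss(72, 10) for _ in range(60)]   # standard lesson
method_b = [random.gauss(75, 10) for _ in range(60)]   # new approach

observed = sum(method_b) / len(method_b) - sum(method_a) / len(method_a)

# Permutation test: how often does a random reshuffling of students into
# two groups produce a score gap at least as large as the observed one?
pooled = method_a + method_b
trials, count = 2000, 0
for _ in range(trials):
    random.shuffle(pooled)
    diff = sum(pooled[60:]) / 60 - sum(pooled[:60]) / 60
    if diff >= observed:
        count += 1
p_value = count / trials
print(f"score gap = {observed:.1f} points, p = {p_value:.3f}")
```

A small p-value would suggest the gap is unlikely to be random-assignment noise; the harder part of any real trial is, of course, recruiting comparable classrooms and keeping the intervention faithful.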
Education of girls and youth literacy vary widely in Africa, according to new data on developing nations
On May 23, 2013, the Global Partnership for Education launched an Open Data Project that consolidates education indicators from 29 developing nations, from Afghanistan to Zimbabwe. The World Bank Development Data Group and the aid data organization Development Gateway are supporting it. The data posted so far are uneven and scanty; for many nations, a lot of data is simply unavailable.
But it’s interesting to compare youth literacy rates across countries and the extent to which girls are educated. For example, in Ghana, 95 girls complete primary school for every 100 boys who do. And 81% of the nation’s youth (between the ages of 15 and 24) are deemed literate. In Rwanda, more girls are educated than boys: 110 girls complete primary school for every 100 boys who do. But overall, fewer kids are educated in Rwanda. Only 77 percent of Rwanda’s youth are deemed literate. Unfortunately, there are too many countries where fewer than 40 percent of the youth can read. That includes Afghanistan, Burkina Faso and Niger.
I wish enough data were available so that I could do some mash-ups looking at how youth literacy and the education of girls correlates with economic growth and political stability. Is it really true that the more educated your populace, the more your economy grows? Some developing nations like Albania and Tajikistan claim a 100% youth literacy rate, but I suspect these two nations are growing more slowly than others with lower literacy rates. On the other hand, sometimes conventional wisdom is right. Ethiopia saw huge gains in youth literacy between 2005 and 2007, growing from 45 percent to 55 percent. Indeed, per capita GDP grew from $169 to $253 during this period.
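A tiny helper makes the Ethiopia comparison explicit (the literacy and GDP figures are the ones quoted above; the helper function is mine):

```python
def pct_change(old, new):
    """Relative change from old to new, in percent."""
    return (new - old) / old * 100

# Ethiopia, 2005 -> 2007, figures as quoted above.
literacy_gain = pct_change(45, 55)    # youth literacy rate: 45% -> 55%
gdp_gain = pct_change(169, 253)       # per capita GDP: $169 -> $253
print(f"literacy up {literacy_gain:.0f}%, per-capita GDP up {gdp_gain:.0f}%")
```

Two data points are nowhere near a correlation, of course, which is exactly why fuller country coverage would make the mash-up worth doing.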
You can access the data here.
This World Bank data blog, GPE launches open data project to better measure education progress and make it transparent, explains more about the data project.
The number of high-poverty schools increases by about 60 percent
Poverty is getting so concentrated in America that one out of five public schools was classified as a “high-poverty” school in 2011 by the U.S. Department of Education. To win this unwelcome designation, 75 percent or more of an elementary, middle or high school’s students qualified for free or reduced-price lunch. About a decade earlier, in 2000, only one in eight public schools was deemed to be high poverty. That’s about a 60 percent increase in the number of very poor schools!
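The arithmetic behind that 60 percent figure, using the one-in-five and one-in-eight shares:

```python
# Relative increase in the share of public schools that are high-poverty.
share_2011 = 1 / 5     # one in five schools in 2011
share_2000 = 1 / 8     # one in eight schools in 2000
increase = (share_2011 - share_2000) / share_2000 * 100
print(f"increase in the share of high-poverty schools: {increase:.0f}%")
```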
This figure was part of a large data report, The Condition of Education 2013, released by the National Center for Education Statistics on May 23, 2013. There’s a lot to chew on in it. But school poverty jumped out at me as a really depressing data point showing the growing income inequality in America.
Qualifying for free or reduced-price lunch is an imperfect measure of poverty. A mother with two kids who makes under $35,000 a year would be in this group. Certainly, that’s a poor family in New York City, but maybe not destitute in Utah. I’ve also heard that many poor families feel that it is such a stigma to accept a discounted or free lunch that they don’t sign up for the program. So the poverty rates in many schools are probably much higher than the official statistics say they are.
Here is the chart of income thresholds to qualify for free and reduced-price lunch.
Public-school spending dropped for the first time
The Census Bureau reported on May 21, 2013 that spending in public elementary, middle and high schools fell 0.4 percent in fiscal 2011, to $10,560 per student, compared with fiscal 2010. That was the first drop in per-student spending since the bureau began tracking the figure in 1977. Here is the press release and here is the full report, titled Public Education Finances: 2011. The figures are not adjusted for inflation; the drop would be much more dramatic if they were.
So the question is why? I’m pondering that today.
The report ranks large school districts and states by how much they spend per student. Among the 50 largest school districts in the United States, New York City spends the most per student at $19,770. The State of Utah spends the least per student at $6,212.