Three lessons from the science of how to teach writing

What’s the best way to teach writing? The experts have many answers — and they often contradict each other.

In contrast to the thousands of studies on effective methods for teaching reading and mathematics, there are relatively few rigorous studies on writing instruction. That’s partly because it’s time-consuming and expensive to assess writing quality in a way that can be quantitatively measured. Commonly, researchers come up with an eight-point scale. They write descriptions and sample essays to show what each score involves. Then they train teams of graders to score properly and consistently. But writing quality is ultimately a subjective judgment. What you consider to be well-written, I might not.

Steve Graham, a professor of education at Arizona State University, has made a career of tracking research on teaching writing to figure out which methods actually work. For a forthcoming article*, Graham and two colleagues, Karen Harris of ASU and Tanya Santangelo of Arcadia University, reviewed approximately 250 of the most prominent studies on how to teach writing to students from kindergarten through 12th grade.

This article also appeared here.

Graham’s review of the research doesn’t resolve the age-old debate of whether students learn writing best naturally — just by doing it — or through explicit writing instruction.

But there are effective practices where the research is unequivocal. Distressingly, many teachers aren’t using them. “We have confirmation of things we know that work, but are not applied in the classroom,” said Graham.

Here are three:

1. Spend more time writing

To teach kids to write well, you need to ask them to write a lot. You’re not going to become a great basketball player unless you play a lot of basketball, and the evidence is strong that the same holds for writing. Five studies of exceptional literacy teachers found that great teachers ask their students to write frequently. In nine separate experiments with students, 15 additional minutes of writing time a day in grades two through eight produced better writing. Seventy-eight percent of studies testing the impact of extra writing found that students’ writing quality improved.

Several studies found unexpected bonuses from extra writing time. Not only did writing quality improve, but so did reading comprehension. Another cluster of studies showed that writing improves a student’s mastery of the subject; the act of writing helps you learn. (Another reason for teachers to refrain from spoon-feeding printed notes to students.)

However, surveys of U.S. teachers reveal that after third grade, very little time is spent writing in classrooms. In fourth through sixth grade, on average, 20-25 minutes a day is spent on writing, according to Graham. Writing assignments rarely extend beyond a page; sometimes they’re not more than a paragraph. This is what teachers self-report, and if anything they’re probably overstating how much writing they’re asking of students.

In a 2011 survey of classroom writing instruction, “A Snapshot of Writing Instruction in Middle Schools and High Schools,” published in English Journal, Arthur Applebee and Judith Langer at SUNY Albany found that U.S. students were expected to write only a total of 1.6 pages of extended prose for English a week, and another 2.1 pages for all their other subjects combined. Applebee and Langer also observed classrooms across the four core subjects (English, science, math and social science/history) and found that, on average, only 7.7 percent of classroom time was devoted to writing a paragraph or more. Applebee and Langer called that “distressingly low.”

Why so little writing? Graham hypothesizes that many English language arts teachers are more passionate about literature than teaching writing. But in surveys teachers often say they don’t assign more writing because they don’t have the time to read and provide feedback on frequent long assignments. I can sympathize with a high school English teacher who has 37 kids in her class.

One could argue that a few high-quality writing assignments might be better than a bunch of low-quality ones. But the teacher surveys and classroom observations reveal that students are most commonly asked to write summaries. “We don’t see a high level of writing activities that involve analysis and interpretation,” said Graham. “We’re not seeing development of skills you need for college and the workplace.”

Common Core may change things, as the standards ask for more writing and analysis, not just in English class but also in the social sciences, hard sciences and math.

It’s unclear what the ideal amount of time for writing is. Graham, who wrote a teachers’ guide of evidence-based techniques for teaching writing for the What Works Clearinghouse unit of the Department of Education, recommends one hour a day. He admits he doesn’t have research to substantiate that number. But he may be onto something: When Poland increased its language arts classes to more than four hours a week for each student, its scores on international tests began to soar.   

2. Write on a computer

In 83 percent of the 30 studies on the use of word processing software, students’ writing quality improved when they wrote their papers on a computer instead of by hand. The impact was largest for middle school students, but younger students benefited, too. The theory is that students feel freer to edit their sentences because it’s so easy to delete, add and move text on a computer. The more editing, the better the final essay.

I was concerned about how these experiments were constructed. Could graders have been biased toward the word-processor essays because typed text is more legible than handwriting? In most cases, the hand-written essays were retyped before the graders scored them. So graders had no idea which essays had been drafted by computer and which by hand, and still the word-processor essays were rated higher.

It’s also possible that the spell checkers and grammar checkers that are sometimes bundled with word processing software enable students to submit cleaner drafts, which are perceived to be of higher quality.

Some educators feel passionately about the importance of writing by hand, convinced that the act of writing neurologically imprints stronger memories. And there’s some early evidence that note-taking might be more effective by hand. But if your goal is writing quality and not memorization, the evidence points to word processing, especially beginning in middle school.

Another benefit for educators who believe that students should write not just for teachers: computerized text files are easier to share with classmates, providing more opportunity for a real audience and feedback.

Despite this evidence, teacher observations and surveys reveal that teachers have been slow to adopt this basic technology. In Arthur Applebee and Judith Langer’s observations, students used word processing software in only 5.1 percent of the classes. In separate 2008 and 2010 surveys, Graham found that “too many schools still use pencil and paper as the primary or only writing medium.”

3. Grammar instruction doesn’t work

This article also appeared here.

Traditional grammar instruction isn’t effective. Period. Six studies with children in grades three to seven showed that writing quality actually deteriorated when kids were taught grammar. That is, graders scored the essays of students who’d been taught traditional grammar lower than those of students who had not received the lessons.

Three studies did show that teaching kids how to combine two simple sentences into a single complex sentence was beneficial. (As a writer, I find that baffling as I am always trying to shorten my sentences! That makes me question the judgment of the essay graders.)

But traditional grammar — diagramming sentences or teaching grammar rules — didn’t help. Graham suspects that’s because grammar lessons often feel disconnected from actual writing. He found one study that showed great improvement in student writing quality when teachers modeled correct usage, showing how to apply grammar rules in sentences the students were drafting. But few experimental studies have looked at effective procedures for teaching grammar.

In this case, classroom practice isn’t totally at odds with the research. Grammar instruction has declined in U.S. classrooms over the last 40 years. But that might be because there isn’t much writing instruction going on at all.

* “Research-Based Practices and the Common Core: Meta-Analysis and Meta Synthesis,” (in press for The Elementary School Journal)

Related stories:

With new standards, can schools find room for creative writing?

Education Nation: Revived support for grammar instruction

Robo-readers aren’t as good as human readers — they’re better

Twenty-five percent of low-income urban high schools beat the odds

It won’t surprise anyone to learn that wealthier high schools send more students to college than low-income high schools. But an October 2014 report from the research arm of the National Student Clearinghouse, which tracks college students, reveals that a quarter of low-income urban high schools are doing better than a quarter of their high-income counterparts.
This article also appeared here.

On average, low-income urban high schools with high concentrations of minority students sent about half, or 51 percent, of their 2013 graduates to college in the fall immediately following graduation. That could be either a two-year or a four-year college or university. By contrast, 70 percent of the students from high-income urban high schools with few minority students were enrolled in college in the fall. (Only high-income, mostly white suburban high schools send more kids to college, with 73 percent of their students enrolled in the fall.)

But these averages mask big differences among public city high schools. The same data show that the best 25 percent of the low-income, minority schools — about 130 high schools in the data sample — sent at least 60 percent of their 2013 graduates to college in the fall of 2013. The number was much higher at some. By contrast, the worst 25 percent of the high-income schools — about 60 of them — sent fewer than 60 percent of their graduates to college in the fall. (Low-income means that more than 50 percent of the students qualify for free and reduced-price lunch. High minority means that more than 40 percent of the students are black and/or Latino.)

“In every category of high school, there are clearly schools that are beating the odds,” said Doug Shapiro, director of the National Student Clearinghouse Research Center, which published the report, “High School Benchmarks 2014.”

This chart shows that 38 percent or fewer of the 2013 high school graduates from the bottom quarter of low-income high schools with high concentrations of minorities went to college in the fall of 2013. But among the top quarter of these low-income high schools, 60 percent or more of the students went to college in the fall. 

It’s no surprise that some successful low-income schools would be doing much better than the average low-income school. And I wouldn’t be surprised to hear about a handful of low-income schools, perhaps small magnet schools that cream off the top students, that are doing as well as higher income schools. But what is surprising is that so many low-income schools — 25 percent of them — are doing better than so many of the higher-income schools. You wouldn’t expect such a big overlap, especially when the means for each group are 20 percentage points apart.

This data speaks to both sides of the debate on education reform. Those who say that income determines educational outcomes argue that you can’t reasonably ask schools to overcome a student’s family background. And they can point to the data here, which show that students who attend higher-income city high schools are, on average, 37 percent more likely to go straight to college than students from low-income high schools. Indeed, the fact that 75 percent of the high-income schools have more students going to college than 75 percent of the low-income schools is great evidence for those who say that income matters.

But the beating-the-odds data is music to the ears of the so-called school reformers, who argue that better schools and teachers can get better student results. The fact that a quarter of the low-income schools are outperforming a quarter of the high-income schools is exactly the kind of data that supports their cause. Interestingly, none of these are charter high schools. The National Student Clearinghouse excluded charters from the analysis because it was concerned that its charter school participants were too few to be nationally representative.

The National Student Clearinghouse isn’t revealing which high schools are outperforming or underperforming. But it hopes to publish the names of the high-performing high schools in the future so that their practices can be studied and replicated. “Possibly next year,” said Afet Dundar, associate director of the National Student Clearinghouse Research Center.

While the National Student Clearinghouse is now tracking a giant data set of 3.5 million high school graduates from 2010 to 2013, a big shortcoming is that the data isn’t a nationally representative sample. It includes only data from high schools that voluntarily participate in StudentTracker, a service that the National Student Clearinghouse markets to high schools so that they can see where their graduates end up. More than 3,000 high schools in all 50 states participate, covering 25 percent of all high school graduates in the country. The participation rate is even higher in urban districts, where 65 percent of the largest 100 districts participate, covering 40 percent of all urban high school graduates. The National Student Clearinghouse didn’t reveal which districts are or are not participating, so it is unclear how the missing high schools might be skewing the data.

Another shortcoming is that the data don’t give you a sense of how the students are faring in college. It does not reveal if the students are taking remedial classes, essentially repeating what they should have learned in high school. And we don’t have data yet on whether these students are eventually graduating from college.

This is the second year of the National Student Clearinghouse’s high school report and there aren’t enough years yet to show trends as to whether more kids are going to college than in the past. But it’s a welcome additional data point, beyond standardized test scores, to see which high schools are doing a good job.


New research suggests repeating elementary school grades — even kindergarten — is harmful

The already muddy research on whether it’s better to hold back struggling students or promote them to the next grade just got muddier. A new study, “The Scarring Effects of Primary-Grade Retention? A Study of Cumulative Advantage in the Educational Career,” by Notre Dame sociologist Megan Andrew, published Sept. 26, 2014, in the journal Social Forces, is an empirically solid analysis that adds more weight to the argument that retention — what education wonks call repeating a grade — is ultimately harmful.

Andrew mined two large data sets in a way no researcher has done before and concludes that kids who repeat a year between kindergarten and fifth grade are 60 percent less likely to graduate high school than kids with similar backgrounds, and even 60 percent less likely to graduate high school than siblings in the same family.

Before I discuss Andrew’s paper in more detail, it’s helpful to understand some history. Most early research overstated how harmful it is to be held back a grade. It tended to point out that the struggling kids who repeat a grade don’t fare as well as kids who stay with their class, most of whom are not struggling. But that’s shoddy research. These studies didn’t compare the held-back kids with the kids who were also failing, but were promoted nonetheless.

Related story:  Why Los Angeles sends failing students on to the next grade

This article also appeared here.

In data analysis terms, this early research conflated the bad effects of being held back with the bad effects of the underlying issue that led a school (or a parent) to hold the child back in the first place. Consider a child who has trouble paying attention, can’t read by the end of fourth grade and is held back. Say this child continues to get bad grades, tests poorly and eventually drops out of high school. Did the stigma of repeating fourth grade cause the child to become demoralized and to perform worse at school? Or was it his ongoing struggle with attention deficit disorder? If he had been promoted, would his academic career have turned out differently? These early studies don’t say.

Even as the low-quality research kept showing that holding kids back was bad, a growing chorus of critics urged schools to end “social promotion,” the practice of passing failing students on to the next grade. As my Hechinger Report colleague Molly Callister wrote here, 15 states and the District of Columbia have adopted policies requiring third-grade reading proficiency before a student can move to fourth grade. Two big cities, Chicago and New York City, undertook ambitious experiments in ending social promotion.

Those urban experiments attracted sophisticated researchers. Brian Jacob and Lars Lefgren studied students in Chicago, where the decision to hold a student back was based on a test score. The researchers were able to compare the experience of students who scored just below the threshold for passing with the experience of students who scored just above the threshold. Because of test measurement errors, these students were effectively testing at the same level — academically identical. But half were held back and half were promoted. In a 2009 paper, Jacob and Lefgren found that the harmful effects of retention largely melted away when comparing these two groups of students. Students held back in older grades still suffered a bit, but there was no decrease in high school graduation for students who’d been held back young. (Jacob, Brian A., and Lars Lefgren. 2009. “The Effect of Grade Retention on High School Completion.” American Economic Journal: Applied Economics, 1(3): 33-58.)

Four years later in 2013, a RAND study looking at New York City’s experiment with ending social promotion came to a similar conclusion — retention isn’t harmful.  It also found that the kids who repeated fifth grade were better off than kids who just squeaked by and passed the test and moved on to sixth grade. (Study: “The Academic Effects of Summer Instruction and Retention in New York City.”  Educational Evaluation and Policy Analysis, v. 35, no. 1, Mar. 2013, p. 96-117)

So a growing consensus was emerging in the research community that holding a kid back in younger grades isn’t harmful and sometimes helpful if accompanied by support services, such as summer school, tutoring and advising.

And now Andrew’s paper — contradicting the new consensus — lands. It’s a quantitatively rigorous study finding harmful effects for younger children. She looked at more than 37,000 children across the United States from two older multi-year surveys (NLSY 1979 and NELS 1988) and found that about 10 percent had been held back at school, most of them during the 1980s. The surveys included details of the family characteristics of the children. That allowed Andrew to create 6,500 matched pairs of students, where the retained and non-retained students had similar backgrounds. Their mothers had attained the same level of education and their families had the same household income. The students had scored the same on a pre-school cognitive test. (In layman’s terms, they started school with similar IQs). The matched students also had similar behavioral problems, as reported on the surveys. Home environment, gender and race were factored in, too. In other words, Andrew matched the held-back students with students who were equally “at risk” for being held back, but weren’t.

Related story: India data show test scores rise when students are automatically promoted to the next grade

Then Andrew looked at whether these matched students eventually graduated from high school. And that’s where she found that the held-back children were 60 percent less likely to have graduated from high school than their matched “partners” who stayed on grade level. Andrew went one step further to see if she could reproduce the results in a different way. Using the 1979 survey, which included sibling information, she compared children who were held back with their siblings who weren’t. Again, she found the same result. Even in the same family, held-back kids were 60 percent less likely to graduate high school than their brothers and sisters. Astonishing!

Andrew acknowledges that held-back students often show a short-term boost in their grades and test scores, but she believes this boost “disappears” after just a few years. A sociologist by training, Andrew hypothesizes that being held back is so psychologically scarring that many students fail to regain their confidence in the long term. In her paper, Andrew argues that being held back is one of the biggest negative events of a child’s life. “In surveys, students rank being retained in grade second only to a parent’s death in seriousness in some cases,” Andrew wrote.

At first blush, the data seem to defy common sense. (Data have a way of doing that!) Kids, especially boys with fall birthdays, are commonly held back in kindergarten as they get another year to mature. I have a hard time believing that they’re 60 percent less likely to graduate from high school than the kid who stayed with his class and moved on to first grade.

Unfortunately, Andrew wasn’t able to test whether kindergarten retention was less scarring than say, fourth grade retention. But by email she explained that the majority of the students were held back in the earliest grades, confirming that she found even held-back kindergarteners less likely to graduate from high school.

How much you buy Andrew’s conclusions depends on how similar you think her paired children are. If there were a characteristic that prompted a parent to hold back one child that his statistical “partner” doesn’t have, then the analysis isn’t clean. Her control group (the promoted partner) isn’t otherwise identical to the treatment group (the retained child). Andrew’s data sets didn’t list every behavioral problem and learning disability, so she couldn’t control for Attention Deficit Hyperactivity Disorder (ADHD) and many other conditions. It’s quite possible that some of the held-back children had behavior issues or a mild learning disability and the promoted partner didn’t. Years later, when Andrew found that the held-back child didn’t graduate from high school, it’s possible that factors related to the student’s behavior or learning issues — being placed in an alternative academic track, for example — impeded his academic career, and not the psychological scarring of being held back in first grade.

I don’t want to suggest that ADHD makes it hard to graduate from high school, but I am trying to explain how Andrew’s research can fall into the same trap that the early research on retention fell into. It can accidentally conflate the bad effects connected to a behavioral or learning problem with the bad effects of the retention.

I asked Andrew how a parent should factor in her research when deciding whether to hold a student back. “My study is not a parent’s how-to guide on retention,” she replied by email, explaining that holding a child back is a very personal decision. The most important thing is to address your child’s underlying academic problems, whether you’re holding him back or passing him on to the next grade.

She explained that her study is aimed at education policy officials who are deciding whether to have high-stakes tests that determine who moves on and who is held back. “My study is an argument about how a very expensive policy, grade retention, may actually undermine our shared goals of ensuring every child gets a quality education,” she replied. “I would argue that my study is evidence that we might take funds used for an expensive and likely deleterious policy and use them for earlier, pre-school interventions and …supplemental services… to help get a student up to speed.”

Even education data geeks agree that education data is completely inscrutable and inaccessible to parents

Example of a graphic on a Hawaii school report card


One of the many provisions of the 2001 federal education act, known as No Child Left Behind, was a requirement that states issue a “report card” for every public school. The report cards include things you might expect, like student test scores and test-score changes, but also a laundry list of data, from graduation rates to school demographics.

Part of the purpose of making this data available was to help parents see how the students in their children’s school were faring and make more informed choices, whether it’s pressuring the school and district to do better, or taking their children elsewhere.

More than a decade later, much of this data remains inaccessible and inscrutable to parents — even to education experts.

To see the report cards in Florida, for example, you’d have to download Excel spreadsheets, or you can try clicking through a series of user-unfriendly screens. In Hawaii, each school’s report includes a baffling distribution chart (see graphic on the right). There is a separate document, twice as long as the report card, that explains how to interpret all the figures and acronyms. Minnesota’s report cards disclose the number of students who are eligible for “celebration.” Of what, one might wonder, birthdays? Another state awarded an elementary school 17.29 points for an “average growth z score” without further explanation. The Education Commission of the States (ECS) issued an August 2014 report, “Rating States, Grading Schools,” and concluded that too many report cards are hard to find and hard to understand.

Related story: NCLB co-author says he never anticipated federal law would force testing obsession

It’s a sad outcome because the parents who could take advantage of this data the most, where school improvement is most needed, tend to be from the most disadvantaged and disenfranchised communities. These report cards need to be easily digestible.

“There are smart statistical people good at cranking out data, but they are not known for design. What is good design for a policy wonk is not good design for parents and policy officials,” said John Bailey, a former Bush White House official who now wears several hats in the education policy world, including vice president at the Foundation for Excellence in Education. (The conservative organization, headed by former Florida Gov. Jeb Bush, supports the greater use of education technology, promotion of charter schools and measuring schools by student test performance.)

By coincidence, in June 2014, Bailey had just collaborated on a paper about the role of prizes and competitions in generating good ideas. And he was inspired by a health design challenge where visual graphic designers reimagined patient information records. “There’s all this very wonky complicated data that’s given to patients that doesn’t make a lot of sense. It’s the same in education,” he said.

So Bailey came up with the idea to use $35,000 of his organization’s money to launch an education data design competition. Ed Excellence has reached out to the design community both for contestants and for judges.  The winner will be announced in December.

This article also appeared here.

It sounds like a fun fall project for a Rhode Island School of Design student. But it’ll be a tough challenge for even a seasoned graphics expert to put a school’s test scores in the context of its student demographics in an understandable way. If you just report straightforward test scores, schools with rich students will likely have higher scores than those with poor students. Showing how much students learn each year is better. But measuring academic growth is a complicated task and it’s hard to explain simply. Using regression analysis to adjust for poverty — even more complicated!

Hallmarks of good design are simplicity and minimalism. But state report cards are required to include lots of data. One of the problems mentioned in the ECS report is that parents are already overwhelmed by data. It cited one parent who complained that a report card was “[l]ike reading a corporate financial report of 20 pages.” Bailey hopes that designers will come up with a simple starting point but give parents an intuitive way to dig deeper if they’re interested. “You don’t have to show all the proficiencies by subgroup on the same page,” Bailey suggested.

Some states and urban school districts have tried to improve their report cards. Washington, D.C., recently revamped its school report cards on its LearnDC website, and they’re an inspiring model of simple design. Here’s an example. (Still, there are numbers that lack context. What’s that school classification score based on?) And New York City announced on Oct. 1, 2014, that it was revamping its school report cards. In addition to jettisoning simple letter grades for each school, the city is trying to make them parent-friendly. Here’s a model example with fictitious numbers. It’s written in plain English, but it’s way too wordy. And for a city that’s filled with brilliant designers, this document lacks inspiration, color and graphics. Probably not a design competition winner.

Bailey says his ultimate goal is to raise the quality of discourse on education. “The more information that parents and policy makers have, the more informed the debate is going to be,” he said.

Useful report cards also won’t hurt the cause of data proponents, who’ve recently been burnt by the demise of the national student data warehouse inBloom. If ordinary parents start to see data as something useful and not just a threat to their children’s privacy, perhaps the data geeks will have enough public support to resurrect their dream of mining vast amounts of student data to improve education.
Related story: Big data and schools: Education nirvana or privacy nightmare?

Homeless students increase by 58 percent in past six years

Despite signs of a national economic recovery, homelessness in U.S. public schools increased 8 percent in the 2012-13 school year from the previous year, to 1.26 million students. That may not sound terrible, but consider that it is part of a 58 percent jump in the number of homeless students in the six years since the start of the 2007-08 recession.

Percent change in the number of homeless students in U.S. public schools over six years (2007/08 to 2012/13)

(Zoom in and click on any state to see actual numbers of homeless students and annual percentage changes for each state. Interactive map created by Jill Barshay and Sarah Butrymowicz of The Hechinger Report.)

“It’s safe to say there’s been a significant increase in homelessness in schools,” said Diana Bowman, director of the National Center for Homeless Education. Her organization, funded by the U.S. Department of Education, provides technical assistance for the federal Education for Homeless Children and Youth Program.

The U.S. Department of Education quietly released this data on homeless students, in grades pre-K through 12, without issuing a press release or detailed report. The new data were added to a publicly accessible database on September 22, 2014 as part of its annual Consolidated State Performance Report Data.

Related story: Poverty among school-age children increases by 40 percent since 2000

Some states saw much larger than average one-year increases in homelessness. Student homelessness in New Jersey grew by 77 percent and in Alabama by 68 percent over the most recent one-year period. Washington, D.C., Maine, Montana and New York also experienced sharp increases in the number of homeless students.

But Bowman cautioned against putting too much stock in sharp one-year fluctuations. States sometimes change counting methodologies; longer multi-year trends are more reliable.

More important, and distressing, is the data for the six-year period. Some less populous states saw some of the largest percentage increases in student homelessness. The number of homeless students grew by more than 140 percent in Oklahoma, Hawaii, Alabama, West Virginia, Montana, Idaho, North Dakota and Washington, D.C. The color-coded map above highlights which states have suffered the greatest increases in student homelessness since 2007.

The majority of homeless students are not sleeping outside on park benches. According to the Department of Education’s data, three-quarters of homeless children are temporarily living “doubled up” with extended family members or neighbors. (Table 3 on page 2 of this report, “Education for Homeless Children and Youth, Consolidated State Performance Report Data, School Years 2010-11, 2011-12, and 2012-13”  shows where homeless school children spend the night.)

“A lot of people think of families living in shelters,” said Bowman. “But it’s really a lot of other situations where a lot of homeless children live.”

The Department of Housing and Urban Development defines homelessness more narrowly, often not including people who are living with others. But Bowman explained that it makes sense for the Department of Education to have a more expansive definition, because the children of these families have still lost their primary residence and are often switching homes and changing schools every few months. “The education disruption makes it hard for them to perform academically. They’re losing friends and teacher connections. They also have greater health problems and emotional stresses,” said Bowman.


 Data source: U.S. Department of Education Consolidated State Performance Report Data, 2007-13. Google chart created by Jill Barshay, The Hechinger Report


Many researchers have documented how devastating episodes of homelessness are for a student’s academic performance, both in the short term and over the long term. McKinney-Vento funds were established by Congress in 1987 to support homeless programs. A portion of these funds goes to school districts based on the percentage of poverty in each district. But, as Table 2 here shows, more than a third of the nation’s 1.3 million homeless children are enrolled in school districts that haven’t received any of these McKinney-Vento funds.

A 2014 University of Pennsylvania study found that homelessness was the third most important risk factor to consider when thinking about support programs for disadvantaged children, and that poverty alone wasn’t necessarily harmful to a child’s academic career.

Perhaps with this well-documented rise of student homelessness, lawmakers will start to think about better ways to strategically allocate Title I education dollars —  not just to low-income children, but to the low-income children who need them the most.

Data analysis methodology and explanation: Original source data is from the U.S. Department of Education’s Consolidated State Performance Report Data, in which states are required to report a variety of figures, including counts of homeless school children. To locate this data, go to, then click on “Build a State Table,” then “Build Table Now.” That will take you to a “State Tables” page. I clicked all states and then selected data under the “Homeless Program (McKinney-Vento)” heading. I selected “Total Number of Homeless Students Enrolled in LEAs with or without McKinney-Vento Subgrants – Total” for all six years available: 2007-08, 2008-09, 2009-10, 2010-11, 2011-12 and 2012-13. “LEA” is a local educational agency, commonly known as a school district.

Related story:  The number of high-poverty schools increases by about 60 percent

Education researchers don’t check for errors — dearth of replication studies

Education theories come and go. Experts seem to advocate for polar opposites, from student discovery to direct teacher instruction, from typing to cursive handwriting, and from memorizing times tables to using calculators. Who can blame a school system for not knowing what works?

One big problem is that education scholars don’t bother to replicate each other’s studies. And you can’t figure out which teaching methods are most effective unless a method can be reproduced in more than one setting and produce the same results. A new study, “Facts Are More Important Than Novelty: Replication in the Education Sciences,” published August 14, 2014 in Educational Researcher, found that education researchers have attempted to replicate other researchers’ results only a scant 0.13 percent of the time. Compare that with the field of psychology, where the replication rate is 1.07 percent, eight times as high as in education. By contrast, replications within the field of medicine are commonplace and expected.

“When we teach science, we teach students that it’s important for other people to get the same findings as you,” said Matthew C. Makel of Duke University, one of the study’s co-authors. “Replication is a key part of the error-finding process. In education, if our findings cannot be replicated, we lose a lot of credibility with the scientific world and the greater public.”

“Error  — or limited generalizability — won’t be found if no one looks,” he added. “And our findings show that, for the most part, in education research, we aren’t looking.”

Makel and his University of Connecticut colleague Jonathan Plucker conducted a text search through the entire publication history of the top 100 education journals and found that only 221 out of more than 165,000 scholarly articles were replication studies, in which researchers tried to reproduce the results of earlier studies. (The 221 includes both exact replications and approximate ones where the experiments were tweaked a bit, say, to see if the intervention would work with a different type of student).
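The rates above follow directly from the study’s counts. A quick arithmetic check (using only the figures as reported here, not the study’s underlying data):

```python
# Figures as reported above from Makel and Plucker's journal search.
replications = 221       # replication studies found
articles = 165_000       # total articles in the top 100 education journals

# Education's replication rate, as a percentage of all articles.
edu_rate = replications / articles * 100
print(f"education replication rate: {edu_rate:.2f}%")  # 0.13%

# Compare with the cited psychology rate of 1.07 percent.
psych_rate = 1.07
print(f"psychology replicates ~{psych_rate / edu_rate:.0f}x as often")  # ~8x
```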

You’d think in education, where best practices could actually help millions of children, there would be a priority on reproducing results. So why so little replication?

Part of the explanation is unique to education. In psychology, for example, you can reproduce results fairly easily with another group of 25 undergraduates in a laboratory or clinic setting. In education, it’s far more complicated to find similar groups of students in similar school settings; poverty levels and racial makeup often vary. And no two teachers are the same. Each will invariably put their own spin on the teaching method being tested. Many parents and school leaders are understandably reluctant to experiment on children at all.

The culture of the Ivory Tower is also to blame. Professors live by a “publish or perish” mentality. Their tenure, prestige and research funding are often based on how many articles they can get published in leading journals. And the editors and reviewers of these journals (staffed by fellow university professors) have a bias toward the new and the novel.

I talked with Steve Graham, an Arizona State University professor of education who has edited five education journals. He says he gets 600 submissions a year for Educational Psychology alone, and thus has “the luxury to be very choosy.” He says he doesn’t publish replication studies “unless they cover new ground” (a sort of contradiction in terms). “We want studies that have significant new impact,” he said. “There’s a bias built in. I’m not saying it’s a good thing. It’s a problem. I recognize it.”

The problem affects Graham directly because his own research involves meta-analyses of how to teach writing — that is, he synthesizes other researchers’ papers on effective writing instruction to figure out what works. The lack of replications makes his work difficult. “I got a lot of noise in my meta-analyses,” he explains.

Graham suspects that if there were more research funds earmarked for replications, more academics would apply for them and conduct replication studies. (Foundations out there: hark, there’s a new way to fix education!) The American Psychological Association, also worried about the dearth of replication in its field, is looking to launch a new journal exclusively devoted to publishing replication studies. Perhaps education can create one too.

To be sure, Makel and Plucker’s word-search methodology — they looked only for variants of the word “replicate” — may exaggerate the lack of scientific process in education research. Neil Seftor, an economist at Mathematica, runs the What Works Clearinghouse for the Department of Education, which examines what the majority of scientific studies say about the best ways to teach, or about a particular curriculum or textbook. He admits that exact replications are rare, but says he wouldn’t be interested in exact replications such as those done in a laboratory setting. “What you want in education is evidence over a variety of settings in the real world — urban areas, special ed,” Seftor said. When he searches for studies on new interventions, he said, he often finds dozens of papers on each one, but they might not have the word “replication” anywhere in their text.

“I don’t think it would be fair to say that there are all these educational approaches out there and they’ve only been studied one time,” Seftor said. (Admittedly, many of the studies Seftor looks at are unpublished and financed by the developer of the curriculum.)

Seftor, of course, would welcome more scientific studies on education theories. Often, when he is developing practice guides for teachers, the teaching methods recommended by experts don’t have much scientific evidence to support them.

In the meantime, the American Educational Research Association (AERA), which publishes a number of top education journals including Educational Researcher, decided last year (2013) to publish AERA Open as a new open-access research journal, and is specifically encouraging the publication of peer-reviewed replication studies in it. It just began accepting submissions on Sept. 15.

Maybe, once we see more replication studies in print, we’ll be able to judge whether they can filter down to the classroom and improve instruction.

Related stories:

Study finds taking intro statistics class online does no harm

US DOE evaluation of Gates-funded Early College High Schools shows that low-income males more likely to graduate from high school and enroll in college afterward

Bonus pay for teachers thoroughly discredited

Less math is more: data supports Saxon Math curriculum

What U.S. schools can learn from Poland

Source: Encyclopaedia Britannica, Inc.

By any measure, Poland has made remarkable education progress since the fall of the Berlin Wall. On the most recent 2012 international tests of 15-year-olds, known as PISA tests, Poland ranked 9th in reading and 14th in math among all 65 countries and sub-regions that took the test. It used to be on par with the United States, a mediocre performer. In math, for example, Poland gained 2.6 points a year between 2003 and 2012 while the rest of the world, on average, remained unchanged.

And on Sept. 9, 2014, when the Organization for Economic Co-operation and Development (OECD) released its annual indicators, “Education at a Glance 2014,”  another important indicator appeared: Poland’s college graduation rate is soaring.  In 2012, 25 percent of Poland’s adults held a college degree, up from only 11 percent in 2000. At that rate, it could soon eclipse the United States, where more than 40 percent of adults have a college degree (this includes two-year degrees).

“Poland is an interesting case study,” said Andreas Schleicher, director of education at the OECD. “It used to be modest. It is now at the frontier, in little more than a decade.”

This article also appeared here.


How did Poland do it? Its political leaders scrapped their Communist-era education system back in 1998. Instead of the state sorting students into vocational tracks, it opened the system up and allowed students to make their own choices. Dismantling a central command system is not required in most countries, but other countries can learn from Poland, Schleicher says, in two other ways: educational improvements can happen relatively quickly and relatively cheaply.

“For other countries, Poland highlights what’s really possible in a relatively short period,” Schleicher said. That upends the conventional wisdom in education that real progress is slow and incremental. Also, “none of this has been achieved by putting more money into the system,” he said.

Although spending per student has in fact gone up in Poland (largely because of declining birth rates and a shrinking student population), the growth in per-student spending remains well below the growth in education spending in other countries. (Click on the chart below, also from the 2014 Education at a Glance publication, to see a larger version.)


OECD spending per student


Fast education results on a modest budget are alluring. And it will take more scrutiny of the Polish education system to understand what teachers are doing there. Classroom culture and student behavior may play a role. Polish teachers spend less time keeping order in their classrooms than teachers in any other nation, according to the OECD’s teacher survey (TALIS 2013).

But instructional time seems to be a key factor. A 2006 World Bank analysis credited the initial rise in Poland’s PISA reading scores, in part, to increased hours of classroom instruction. It noted that back in 2000, Polish students spent fewer than four hours a week reading and writing, but that by 2006, more than three-quarters of Polish students were spending more than four hours a week on reading and writing.

Schools in the United States experimented with increased instruction time in the basics during the Bush administration, but have since retreated. Back in the 2000s, No Child Left Behind policies that mandated student testing prompted a majority of school districts to increase instruction time devoted to the two tested subjects: reading and math. (Source: Center on Education Policy, NCLB Year 5: Choices, Changes, and Challenges: Curriculum and Instruction in the NCLB Era). But parents protested that other subjects, especially science, social studies and the arts, suffered. A majority of states have since received waivers from the testing requirements.

Apparently, parents in Poland did not feel the same way.


The teaching profession is becoming less gray and less green, but more teachers are leaving poor schools

Much ink has been devoted to the teaching profession’s increasingly gray and green complexion — the profusion of teachers at the two extremes of the age spectrum. There are lots of veteran teachers older than 50. Meanwhile, school systems have hired hundreds of thousands of cheaper newbies without much experience in the classroom. That leaves the U.S. school system without as much weight in the happy middle of mid-career, experienced teachers.

But new data from the National Center for Education Statistics, “Teacher Attrition and Mobility: Results From the 2012–13 Teacher Follow-up Survey, First Look,” released Thursday, Sept. 4, 2014, shows that these troubling trends may be abating. In the 2012-13 school year, only 12 percent of the nation’s 3.4 million public school teachers (a figure that includes public charter schools) had less than four years of teaching experience. Compare that with an earlier NCES report that put the share of rookie teachers with one to three years of experience at 17 percent. That’s a 5 percentage point decline in the share of the most inexperienced teachers.

Similarly, there’s good news at the opposite end. Back in 2008-09, a third of the teaching force was 50 years or older. That’s dropped slightly to 31 percent in 2012-13.

What that means, according to Richard Ingersoll, a University of Pennsylvania professor of education who studies teacher turnover, is that the graying of the teaching force is over. “The graying — which was a big story — that is done,” he said. That’s because older teachers have been retiring and are continuing to retire. And there isn’t a huge group of teachers in their forties just behind them.

According to Ingersoll’s analysis, the most common teacher in 1987 had 15 years of teaching experience. But because of two decades of rapid hiring in school districts around the country, by 2008 the most common teacher had only one year of teaching experience. That has again changed because of the decrease in teacher hirings since the recession. “Now that’s not quite true; now the most common teacher is someone in their fifth year,” he said.

Whether rookie teachers will remain a smaller part of the teaching force is unclear. The recent reduction in new hires could be a momentary blip from the 2008 recession, when school systems around the country were scaling back on hiring and laying off teachers (usually those with the least seniority were laid off first). So far, there aren’t any indications that school districts are hiring again. But Ingersoll argues that the two decades of massive hiring until 2008 — with 48 percent growth in the teaching force compared with only a 19 percent increase in the student population — mean that it will take a much larger decline in the teaching force to rebalance the teacher-student ratio to what it used to be. (The total number of U.S. public and charter school teachers fell only slightly, by 2,400 teachers, to 3,377,900 in 2012-13.)
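Ingersoll’s rebalancing point can be sketched from the two growth figures he cites. The implied decline below is my own back-of-the-envelope illustration, not a number from the survey:

```python
# Growth over the two decades of heavy hiring, as cited above.
teacher_growth = 1.48   # teaching force grew 48 percent
student_growth = 1.19   # student population grew only 19 percent

# Teachers per student rose by the ratio of the two growth factors.
ratio_change = teacher_growth / student_growth
print(f"teachers per student rose ~{ratio_change - 1:.0%}")   # ~24%

# Restoring the old ratio would take roughly this decline in teachers.
decline_needed = 1 - 1 / ratio_change
print(f"implied decline to rebalance: ~{decline_needed:.0%}")  # ~20%
```

A 2,400-teacher drop is less than 0.1 percent of the 3.4 million-strong workforce, which is why Ingersoll calls the rebalancing far from done.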

The new teacher turnover data also reveal that the charter school sector, often criticized for hiring young teachers who change schools frequently or leave the profession, is becoming more stable. Back in 2008-09, 23.9 percent of charter school teachers either changed schools or left the profession. In 2012-13, only 18.4 percent of charter school teachers had changed schools or left the profession. That’s a 5.5 percentage point decrease in charter school faculty turnover. By contrast, teacher turnover in traditional public schools was virtually flat, at about 15.5 percent, during the same time period.

It’s also unclear whether faculty stability at charter schools is here to stay. It could be that these non-unionized teachers were affected by the recession and didn’t leave their jobs because there weren’t as many job prospects elsewhere. Interestingly, the public school teaching profession is otherwise impervious to economic cycles. Most other professions see a decrease in turnover during recessions because there aren’t as many job prospects elsewhere. But annual teacher turnover has barely budged during the past 15 years over various business cycles, hovering between 15 and 16 percent.

Although the overall picture looks sunnier, with fewer inexperienced rookies and more mid-career teachers in the ranks, one alarming data point emerges. Teacher turnover has grown at schools with high poverty levels. Among schools where more than 75 percent of the students qualified for free or reduced-price lunch, many of them in large urban districts, teacher turnover hit 22 percent in 2012-13. To produce an average that high, some schools likely saw 40 percent of their teachers leave in a single year. In the 2008-09 school year, by contrast, average teacher turnover in high-poverty schools was 15 percent. (See Table 2 in both studies, here and here.)


Lessons from Hawaii: tracking the right data to fix absenteeism

This article also appeared here.


Good school attendance is associated with all sorts of good educational outcomes, especially higher grades and higher test scores. It’s obvious: if you’re not showing up for school, you’re not going to learn as much. But only 17 states track and report chronic absenteeism data, according to the Data Quality Campaign and Attendance Works, a non-profit organization that advocates for more focus on absenteeism data and ideas for getting students to come to school.

“People aren’t tracking the right data now. They’re paying attention to average daily attendance and truancy, but not the kids who are at academic risk,” said Phyllis Jordan, a spokeswoman for Attendance Works. Truancy is generally defined as unexcused absences, but many chronically absent children don’t get captured in the truancy data because they had a reason for missing school or a parent-signed slip excusing their absence.

A recent presentation by a state education official from Hawaii, one of the 17 states that does track chronic absenteeism, showed just how misleading it is to focus on average daily attendance rates. Hawaii schools boast of 95 percent daily attendance rates. But Dave Moyer, speaking at the National Center for Education Statistics data conference on July 31, 2014, found that even at schools where 95 percent of the students show up every day, chronic absenteeism can be a gigantic problem, with as many as one in four kids missing 15 or more school days a year. When Moyer first drilled down into the data, he found that more than one in five students throughout the state were chronically absent.

It’s worth pausing a moment to understand how these seemingly opposing statistics — high daily attendance and high chronic absenteeism — can coexist. Imagine a school with 100 students and a 95 percent daily attendance rate. On day one, 95 of them show up and five play hooky. Then imagine that the same five students play hooky 15 days in a row. Already, you have 5 percent chronic absenteeism just 15 days into the school year. Now pretend those truant children decide to mend their ways and attend school again, and a new group of five kids (from the 95 that had been attending every day) starts to skip school. You still have a 95 percent attendance rate. But if this second group of five skips school for 15 days, you’d have a total of 10 kids, or 10 percent of the student body, considered chronically absent. That’s after only 30 days. You could theoretically reach 20 percent chronic absenteeism just 60 days into the school year.
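The thought experiment above can be run as a quick simulation (a hypothetical 100-student school using the article’s numbers; nothing here comes from Hawaii’s actual data):

```python
# Hypothetical school of 100 students over the first 60 days of the year.
# Each 15-day stretch, a different group of five students skips school,
# so exactly five students are absent on any given day.
STUDENTS, DAYS, STRETCH, GROUP = 100, 60, 15, 5

absences = [0] * STUDENTS
for day in range(DAYS):
    g = day // STRETCH                      # which group skips today
    for s in range(g * GROUP, g * GROUP + GROUP):
        absences[s] += 1

daily_attendance = (STUDENTS - GROUP) / STUDENTS       # 0.95 every day
chronic = sum(a >= 15 for a in absences)               # missed 15+ days
print(f"daily attendance: {daily_attendance:.0%}")     # 95%
print(f"chronically absent: {chronic} of {STUDENTS}")  # 20 of 100
```

Attendance never dips below 95 percent on any single day, yet 20 percent of the student body ends up chronically absent.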

In other words, it’s a small group of, say, 20 students who are frequently missing school. Maybe only five of them miss school on any particular day. Most of the remaining 80 percent have fairly stellar attendance records. And the school can still boast of a 95 percent attendance rate overall.

Since too few states track it, it’s hard to say if Hawaii’s absenteeism problem is worse than, better than or about the same as the national average. Even the states that track absenteeism have different definitions for what it means to be chronically absent. Hawaii’s threshold of 15 days is believed to be one of the lowest in the nation. Most other states or districts wait until 18 days, or until 10 percent of the 180-day school year is missed, before labeling a student “chronically absent.” A 2012 Johns Hopkins study estimated that 10 to 15 percent of students in the U.S. are chronically absent each year.

Solving absenteeism is another matter. “I don’t know how to fix the problem,” Moyer said. Moyer found that the reasons that kids don’t show up for school are many and varied. Some students suffer from asthma and have trouble coming to school on what Hawaiians call “voggy” days, when volcanic particles are thick in the air. On the big island of Hawaii, a two-mile hike down a steep mountain to the bus stop can be too arduous in bad weather. Bullied kids can be too scared to go to school. Others simply cut school to go to the beach.

Race and ethnicity seem to be a factor, too. Native Hawaiian, Micronesian and Samoan students were disproportionately represented among the chronically absent population.

But Hawaii has had some success in lowering chronic absenteeism statewide, from 21.8 percent to 19.7 percent over the last few years, after making principals accountable for it. Five percent of an elementary school’s performance rating is now based on its chronic absenteeism rate. One community noticed that students were hanging out at the 7-Eleven instead of showing up for school on time, so it persuaded the convenience store to shut down at 7:30 in the morning. “One 7-Eleven closing had a big effect,” said Moyer. “The best solutions are local.”

So which days of the year are students least likely to cut school? Moyer counted attendance on every day of the school year and found two of the highest attendance rates on Halloween and Valentine’s Day. “Really, the takeaway here is that candy drives attendance,” Moyer jokingly concluded. (Click on Moyer’s chart below to see a larger version).

Source: NCES STATS-DC Dave Moyer presentation on Chronic Absenteeism in Hawaii



How much did students really gain on Common Core tests in New York? Data doesn’t say

The main reason for annual standardized tests is to figure out how much kids are learning each year. But when New York released its 2014 Common Core test results on August 14, state education officials were selective in their data reporting and did not disclose actual student scores. Instead they released only the percentage of children hitting various proficiency thresholds. That makes it difficult for outsiders to understand how much New York students improved after their second year of Common Core curriculum and testing.

“Performance levels can be misleading. They can mask the actual score changes that students are making,” said Robert Rothman, senior fellow at the Alliance for Excellent Education in Washington, D.C.

Rothman explained that you can have a big jump in proficiency with only a small test score gain if there are a lot of students close to the proficiency cut point. Conversely, students who are far behind can post giant test score gains without clearing the proficiency threshold. That makes it particularly difficult to see whether the most vulnerable students, those in the bottom 10 or 25 percent, are improving.
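Rothman’s point is easy to see with made-up numbers (a hypothetical cut score of 300; none of these figures come from the New York tests):

```python
CUT = 300  # hypothetical proficiency cut score

near_cut = [298, 299, 299]    # students sitting just below the threshold
far_behind = [250, 255, 260]  # students well below it

# A tiny 2-point gain pushes every near-cut student over the line...
proficient_near = sum(score + 2 >= CUT for score in near_cut)
# ...while a huge 30-point gain still leaves the far-behind group at zero.
proficient_far = sum(score + 30 >= CUT for score in far_behind)

print(proficient_near, proficient_far)  # 3 0
```

The first group’s proficiency rate jumps from 0 to 100 percent on a 2-point gain; the second group improves 15 times as much and shows up as no progress at all.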

Despite confusion over how much New York students are improving, New York City’s small gains in proficiency (almost 5 percentage points in math and 1 percentage point in reading) appear to be real progress, experts say, because they mirror similar improvements on national tests (specifically the Trial Urban District Assessment portion of the National Assessment of Educational Progress (NAEP)). On the NAEP, New York City has typically shown a point or two annual gain in math, but flat reading scores. Prior to 2013, when Common Core testing was introduced, New York students had posted much larger gains on the local state tests than on national tests, calling into question how valid test score gains were during the Bloomberg administration.

“No assessment system can tell you the whole story. When you have two different systems, and they’re telling you the same thing, then you have more confidence that the story they’re telling you is correct,” said Henry Braun, the Boisi Professor of education and public policy at Boston College. He concluded that it was a good thing to see small improvement in the second year of Common Core testing, but that it’s going to be a “hard, slow slog” to get the majority of students to the proficient level with only about a third hitting that threshold now.

At the same time, the test-score gap between New York City and the other four largest cities in the state jumps out. New York City has embraced the concept of higher Common Core education standards, and experts think its enthusiasm may be behind the city’s decent showing on the tests. Although the city has outperformed the state’s other cities for years, experts believe that the New York City district is further along than other metropolitan areas in introducing new Common Core lessons into the classroom.

While many political leaders around the country are bowing to public pressure to retreat from Common Core, both Mayor Bill de Blasio and the New York City schools chancellor, Carmen Farina, pointedly confirmed their commitment to Common Core when they announced the test results. “We want to aim high,” said Mayor de Blasio. Both promised to invest more in teacher training to help implement the new standards in the classroom.

The chart below shows how New York City’s students, the majority of whom are low-income minorities, score near the state average, with roughly a third of students proficient in math and reading. But students in Rochester, Syracuse, Buffalo and Yonkers (where most students are also low-income) are well behind. In Rochester, the weakest-performing big city, less than 7 percent of the student population in grades four through eight hit the proficient mark.

Chart created by Jill Barshay using Google Spreadsheets. Data from pages 20 and 32 in the Engage NY August 2014 PowerPoint presentation, Measuring Student Progress in Grades 3-8 English Language Arts and Mathematics.

“The cities in the northern tier are so far below New York City, way, way behind. That ought to be a clarion call to bring to those cities the secret sauce that New York has been using for the past several years,” said Braun.

The city’s Common Core roll out hasn’t been without problems, though. Last school year, teachers throughout the state, including in New York City, complained that curriculum and textbooks were late to arrive and that they had not received enough training in the new standards to teach them effectively.
