Dissecting New Tech Network Numbers

The New Tech Network issued a press release on Oct. 2, 2013, trumpeting that its high school seniors outperformed 68% of four-year college freshmen with similar backgrounds and abilities on key indicators of higher order thinking skills, according to the College and Work Readiness Assessment (CWRA). The New Tech Network employs a great deal of technology in its classrooms; its website is full of pictures of students smiling in front of laptops and desktop computers. So it would be amazing to learn that New Tech’s unusual mix of computerized instruction and project-based learning was really producing such results.

I clicked through to the underlying report, and the numbers were even more confusing. Instead of the “68%” figure above, the report said that “NTN seniors outperform 77% of college freshman (sic) and 60% of other high school seniors when controlling for academic ability.” I added the italics for emphasis.

I’d never seen test scores controlled for “academic ability” before. Usually I see controls for socio-economic status, income or race so that you’re comparing kids with similar backgrounds. But isn’t it bizarre to say that NTN kids got higher test scores than other kids with similar test scores?

It was also strange to me that NTN seniors performed so much better relative to college students than relative to other students their own age (77% outperformance vs. 60%).

A footnote explained, “On average, students in New Tech Network schools have lower academic skills than those in the comparison groups; this is possibly explained by the fact that the CWRA sample of high schools consists largely of private schools and CWRA does not control for ethnicity or socio-economic status in their analysis.”

So if I understand correctly, NTN seniors are testing WORSE than the comparison groups. But when you cherry-pick the test questions that deal with “higher order thinking skills,” NTN students do better on those questions than other students who did as badly on the overall test as they did. Do I have something wrong here?


POSTED BY Jill Barshay ON October 3, 2013

Comments & Trackbacks (3)

Sherrie Reed

Thank you for engaging in this important conversation concerning student outcomes data. Your questions and observations contain some common misperceptions that we would like to address.

To clarify: the figures you cite come from two different reports. The “underlying report” you reference is, I believe, the 2013 Annual Student Outcomes report (http://www.newtechnetwork.org/sites/default/files/news/2013_annual_data_v14-01.pdf). That report includes the CWRA results from 2012, when NTN students outperformed 77% of college freshmen with similar backgrounds and academic ability. The CWRA report released Oct. 3, 2013 (http://www.newtechnetwork.org/sites/default/files/dr/cwra2013.pdf) addresses the 2013 results, in which NTN students outperformed 68% of college freshmen.

New Tech Network relies upon the College and Work Readiness Assessment (CWRA) as one measure of student success. CWRA results provide an estimate of students’ growth in analytical reasoning and evaluation, problem solving, writing effectiveness, and writing mechanics, as well as comparisons of their performance to that of other high school seniors and college freshmen.

When samples of students differ substantially, researchers often use regression to control for those differences, which in turn allows for adjusted comparisons. This is exactly what NTN, in collaboration with the Council for Aid to Education (CAE), has done. The statistical control in this study is academic ability. It is an established practice in the value-added literature (teacher evaluation and program evaluation) to control for prior academic performance, because prior performance incorporates background characteristics known to influence achievement, such as socio-economic status and ethnicity, the very factors you point out as important controls.
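To make the logic concrete, here is a stylized numerical illustration (all numbers invented, not NTN data) of how a group can trail on raw scores yet lead once prior ability is taken into account:

```python
# Invented illustration: group A enters weaker but gains more.
import numpy as np

rng = np.random.default_rng(1)
n = 1000

# Prior ability (an SAT-like score): group A starts lower than group B.
prior_a = rng.normal(900, 100, n)
prior_b = rng.normal(1100, 100, n)

# Outcomes depend on prior ability, but group A receives a 30-point boost.
outcome_a = 0.5 * prior_a + 30 + rng.normal(0, 40, n)
outcome_b = 0.5 * prior_b + rng.normal(0, 40, n)

# Raw comparison: group A appears roughly 70 points behind.
print(outcome_a.mean() - outcome_b.mean())   # ~ -70

# Adjusted comparison: subtract each student's expected score given prior
# ability (here the true slope 0.5 is known; in practice it is estimated
# by regression). Group A now leads by ~30 points.
print((outcome_a - 0.5 * prior_a).mean() - (outcome_b - 0.5 * prior_b).mean())  # ~ +30
```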

You may have noticed that the sample of NTN students participating in CWRA is substantially different from the comparison group of students. While 36% of NTN students have parents with a college degree or higher, almost twice as many students in the entire CWRA sample have parents with a college degree. We know that parents’ level of education is a strong predictor of college success and a frequently used proxy for socio-economic status in academic research. You may also note that the NTN sample includes five times as many students of color as the CWRA sample.

While NTN high school freshmen and NTN high school seniors score lower than other high school students participating in the CWRA, what is compelling is that NTN students grow substantially more than students in the comparison groups. This means that even though students come into NTN schools with lower academic skills (a greater percentage of them are students of color, and a lower percentage have parents with college degrees), we are able to help them develop the higher order thinking skills necessary for college by the time they graduate. Further, when we use the statistical controls described above, NTN students are in fact outperforming peers with similar backgrounds in higher order thinking skills AND closing a gap that exists between students from different socio-economic backgrounds.

Sherrie Reed
Manager of Research and Evaluation
New Tech Network

Jill Barshay

@Sherrie Reed. Thank you for your thorough explanations. I like it when readers properly take me to task, and you give me pause about attempting to write about reports before interviewing the researchers who conducted them. Indeed, I was not initially looking at the correct underlying New Tech Network report, and I will fix that in the blog post. But both reports make the same point: your high school students are performing much better than the majority of college freshmen. I remain a bit stuck on the idea of controlling for academic ability. I understand that you are using it as a proxy for socio-economic status. That kind of control would make sense to me if you were trying to measure growth during the year, say, by comparing students’ scores in the spring with how they started out in the fall. But on a single test, my brain keeps spinning into tautologies. Can you explain to me how you calculated academic ability for each student? Was it a score on the same CWRA test on which your students outperformed others? Did you create buckets with ranges of scores? Thank you.

Sherrie Reed

Jill,

Thank you for your thoughtful response. I appreciate your probing questions, as this conversation is critical to understanding how we measure student outcomes beyond the typical measures of academic achievement. New Tech Network recognizes the importance of multiple measures of student outcomes, and CWRA is one such measure.

CWRA measures students’ higher order thinking skills. It presents students with realistic problems that require them to analyze complex materials. Several different types of materials are used that vary in credibility, relevance to the task, and other characteristics. Students’ written responses to the task are graded to assess their abilities to think critically, reason analytically, solve problems, and write clearly and persuasively.

When comparing the performance of high school students to college freshmen, academic ability is defined by CWRA as SAT or ACT scores. As we know, SAT and ACT scores are widely used measures of students’ academic readiness for college and are based on academic skills in areas such as language arts and math. The CWRA differs in that it requires students to think critically about complex problems, interact with disciplinary content, and apply knowledge through realistic performance tasks of the kind students may be asked to do in college or the workplace. We also know that academic performance indicators such as SAT or ACT scores incorporate background characteristics which are known to influence academic performance, such as socio-economic status and ethnicity. Controlling for academic ability (defined as SAT or ACT scores), therefore, allows us to measure students’ ability to perform complex tasks given their current academic skill set and backgrounds. From the CWRA study, we know that students in New Tech schools outperform 68% of college freshmen with similar backgrounds and abilities.

With regard to your methodological question, we use regression to control for academic ability (SAT or ACT score). Regression allows us to compute expected and observed scores on the CWRA given a student’s performance on the SAT or ACT. The difference between expected and observed scores is then standardized to allow for comparisons of similar students. This statistical technique eliminates the need for grouping students into buckets of similar scores. The method is common in quantitative analysis. Using SAT or ACT as a control in regression is the same as using prior years’ test scores to determine program impact. For example, controlling for 8th grade math assessment scores is common when comparing students’ high school math performance across different school models (e.g., charter vs. non-charter schools, or early-college programs vs. traditional high schools).
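For readers who want to see the mechanics, here is a rough sketch of this residual approach in code. The data, the “NTN-like” group label, and the size of the effect are all invented for illustration; this is not NTN’s or CAE’s actual analysis.

```python
# Hypothetical sketch of the expected-vs-observed residual comparison.
import numpy as np
from scipy.stats import norm
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500

# Simulated SAT scores and a 0/1 group flag (1 = hypothetical "NTN-like").
sat = rng.normal(1000, 150, n)
group = rng.integers(0, 2, n)
# CWRA loosely tracks SAT; the NTN-like group gets a built-in advantage.
cwra = 0.5 * sat + 25 * group + rng.normal(0, 60, n)

# Step 1: regress CWRA on SAT to get each student's *expected* CWRA score.
X = sm.add_constant(sat)
fit = sm.OLS(cwra, X).fit()
expected = fit.predict(X)

# Step 2: the residual (observed - expected) is performance relative to
# students with the same SAT score; standardize it for comparability.
z = (cwra - expected) / (cwra - expected).std(ddof=1)

# Step 3: compare standardized residuals across groups.
gap = z[group == 1].mean() - z[group == 0].mean()
print(f"standardized gap after controlling for SAT: {gap:.2f} SD")
print(f"average student outperforms ~{norm.cdf(gap):.0%} of similar peers")
```

A standardized gap can then be read off the normal curve (Cohen’s U3), which is one common way a statement like “outperforms 68% of college freshmen with similar ability” can be derived.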
I hope this helps clarify the methodology. I am happy to discuss further with you by email or phone.

Sherrie Reed
Manager of Research and Evaluation
New Tech Network
