By Qinling Li
Eduardo Porter’s “School vs. Society in America’s Failing Students” is a fine piece of data journalism.
There is a clear controversy over which methodology should be used to scientifically evaluate the success or failure of the American education system, and over whether there is a cause-and-effect relation between students’ performance on the PISA tests and the country’s level of education. The controversy is newsworthy, since it directly affects education policymaking, but it is hard to reach an indisputable conclusion. To investigate a cause-and-effect relation, experimental studies offer far more control, but the reports here are all observational studies, owing to practical constraints. Many variables beyond students’ academic performance bear on a country’s educational effort, and different interpretations of those variables can ultimately reverse the conclusion of an observational study.

While an earlier study by the Organization for Economic Cooperation and Development (OECD) concluded that American 15-year-olds performed poorly on the PISA standardized tests compared with their peers in other countries, a recent report by the Economic Policy Institute (EPI) questioned the methodology the OECD adopted. The EPI adjusted for socioeconomic variables such as gender, age, mother’s education level, and the number of books at home, which, surprisingly, narrowed the score gap between America and other countries reported in the OECD study. One can argue that the OECD report neglects the influence of socioeconomic variables, but one can equally argue that the EPI report overestimates that influence by relying on crude indicators such as books at home. Each side can be considered reasonable, which is why the author, Porter, did not state a preference in the end. To fully assess the state of American education, financial investment in education and students’ extracurricular and physical activities should also be considered.
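To make the adjustment idea concrete: the following sketch is a hypothetical illustration, not the EPI’s actual method, and every number in it is invented. It shows how comparing countries within socioeconomic strata (a simple form of post-stratification, here using a crude “books at home” proxy) can shrink, or even reverse, a raw score gap.

```python
# Hypothetical illustration of covariate adjustment in an observational
# comparison. Scores, group sizes, and the SES proxy are all invented.

def mean(xs):
    return sum(xs) / len(xs)

# Each record: (country, ses, score). "ses" is a crude proxy such as
# "many books at home" (high) vs. "few books at home" (low).
records = [
    # Country A: mostly high-SES test-takers
    *[("A", "high", 550)] * 8, *[("A", "low", 450)] * 2,
    # Country B: mostly low-SES test-takers
    *[("B", "high", 560)] * 2, *[("B", "low", 460)] * 8,
]

def raw_gap(recs):
    """Unadjusted difference in mean score, A minus B."""
    a = [s for c, _, s in recs if c == "A"]
    b = [s for c, _, s in recs if c == "B"]
    return mean(a) - mean(b)

def adjusted_gap(recs):
    """Average the A-minus-B gap within each SES stratum."""
    gaps = []
    for ses in ("high", "low"):
        a = [s for c, g, s in recs if c == "A" and g == ses]
        b = [s for c, g, s in recs if c == "B" and g == ses]
        gaps.append(mean(a) - mean(b))
    return mean(gaps)

print(raw_gap(records))       # +50: A looks far ahead before adjustment
print(adjusted_gap(records))  # -10: within each stratum, A is slightly behind
```

In this toy data, the raw comparison favors country A only because its test-takers skew high-SES; within each stratum the sign flips. That is exactly the kind of reversal that makes the choice of variables, and the choice of proxy, so consequential.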
The PISA tests evaluate only students’ math, reading, and science skills, which cannot simply be equated with education, especially considering that each country has different priorities in its curriculum design.
To question the OECD’s report, there are many other angles a journalist can examine. Here, the PISA standardized tests function as a kind of questionnaire: analysts interpret students’ answers (the responses) to infer their academic performance. So can students’ academic performance be fully captured by these “questionnaires”? What knowledge does the exam cover? How were the tested students selected from the country’s entire student body? What random sampling method was used? For instance, the OECD report draws its Chinese sample from Shanghai, which has the best educational resources in China and is therefore an outlier. Many other statistical questions can thus be raised about the OECD report as well as the EPI report. The EPI analysis may not be the most powerful rebuttal of the OECD report, but it certainly serves as a warning: do not underestimate the complexity of data. It also reminds us, as journalists, to question variables and methodologies when analyzing and reporting data stories.