Thursday, 3 April 2014
9 things you should know about the new PISA "creative problem-solving" test
Today sees the launch of the first international test of "creative problem-solving". It is the latest addition to the suite of PISA tests run by the OECD, which have become hugely influential in global education policy-making.
This test was taken by pupils in late 2012 at the same time as PISA tests in maths, science and reading but the results were held back for a separate launch. I was invited to a pre-embargo briefing yesterday and the information here is taken from a mix of the published reports and answers given by OECD experts at the briefing.
1. The purpose of the test was to measure students' ability to solve problems which do not require technical knowledge. The PISA subject tests in maths, science and reading are also based around problem-solving, but they do require knowledge in these subject areas (e.g. mathematical concepts and mental arithmetic). Examples of questions include working out which ticket to buy at a vending machine, given a list of constraints, or finding the most efficient place for three people to meet. Unlike the subject PISA tests it was completed on computers, which allowed for more sophisticated interactive assessment.
2. Overall the results correlated fairly closely with the PISA subject tests. Unsurprisingly students who are good at maths problems are also good at ones involving general reasoning. The correlation with maths results was 0.8 and with reading was 0.75.
3. But England was one of the countries that did significantly better in this test than in the subject ones. It came 11th overall but the individual rankings are misleading. It makes more sense to think of clusters of countries that did about as well as each other. The leading group of seven consists entirely of Far East countries and jurisdictions. England is in the second group with countries that traditionally do well in PISA, like Australia, Canada, Finland and Estonia. Then comes a third group made up of other larger European countries and the United States. The countries below the OECD average are primarily smaller European countries and developing nations.
4. This is unhelpful for a number of the big narratives in English education policy. It undercuts the "England is falling behind in the world" narrative so beloved of right-wing newspapers. On a test of intellectual reasoning (which is what this is) our 15-year-olds do as well as any other nation bar a small group of Far East jurisdictions (only two of which - Japan and Korea - are not cities or city states).
5. But it's also perhaps unhelpful for those who argue that our education system is dominated by an obsession with tests and narrow curriculum knowledge. It turns out we're actually pretty good at "21st century skills" already. Our students performed better in this test than you would expect based on their maths, science and reading ability. Likewise all the employers arguing that our system isn't delivering the kind of problem-solving skills they need should reflect on these results.
6. The reason England outperformed its subject PISA scores is that students at the top end did better on the problem-solving test than on the subject ones. Students at the bottom end did no better. This suggests that we're doing something with our more gifted students that we're not doing with our weaker ones. In other countries - e.g. Japan - the opposite was true: weaker students did better in problem-solving than in subject tests, but the strongest ones didn't.
7. In England there was no statistically significant gender difference in performance on this test (in maths and science boys do better; in reading girls do). Interestingly, immigrants scored below non-immigrants, which is a change from the maths and reading tests, where there is no significant difference.
8. The domination of Far East countries puts paid to the notion that their success in PISA subject tests is somehow down to rote-learning or fact-cramming. It also puts paid to the idea that all Far East systems are the same. While Shanghai and Hong Kong are still in the top group, they did much worse on this test than would be expected given their stellar scores in maths, science and reading. Conversely Korea, Japan and Singapore all did better than would be expected.
9. While the test results are interesting they don't tell us why some countries do better than others. Singapore and Korea - who came top - have both tried over the past few years to add "21st century competencies" to their curricula to make them less purely focused on academic knowledge. But it's unclear whether their high scores in this test are due to that or because their traditional strength in the academic basics transfers to "creative problem-solving" tests of this type. The OECD presenters were clear that they thought it was impossible to teach problem-solving skills in the abstract without content, but they also felt it was possible to embed them in a knowledge-based curriculum.