Programme for International Student Assessment

Abbreviation: PISA
Formation: 1997
Purpose: Comparison of educational attainment across the world
Headquarters: OECD Headquarters
Region served: World
Membership: 59 government education departments
Head of the Early Childhood and Schools Division: Michael Davidson
Main organ: PISA Governing Body (Chair: Lorna Bertrand, England)
Parent organization: OECD

The Programme for International Student Assessment (PISA) is a worldwide study by the Organisation for Economic Co-operation and Development (OECD) of 15-year-old school pupils' scholastic performance in mathematics, science, and reading, conducted in both member and non-member nations. It was first performed in 2000 and has been repeated every three years since, with a view to improving education policies and outcomes. It measures problem solving and cognition in daily life.[1]

The 2012 version of the test involved 34 OECD countries and 31 partner countries, with a total of 510,000 participating students.[2] The results of the 2015 test will be published in December 2016.[3]

The Trends in International Mathematics and Science Study (TIMSS) and the Progress in International Reading Literacy Study (PIRLS) by the International Association for the Evaluation of Educational Achievement are similar studies.

Framework

PISA stands in a tradition of international school studies, undertaken since the late 1950s by the International Association for the Evaluation of Educational Achievement (IEA). Much of PISA's methodology follows the example of the Trends in International Mathematics and Science Study (TIMSS, started in 1995), which in turn was much influenced by the U.S. National Assessment of Educational Progress (NAEP). The reading component of PISA is inspired by the IEA's Progress in International Reading Literacy Study (PIRLS).

PISA aims to test literacy in three competence fields: reading, mathematics and science, each reported on a 1000-point scale.[4]

The PISA mathematics literacy test asks students to apply their mathematical knowledge to solve problems set in real-world contexts. To solve the problems students must activate a number of mathematical competencies as well as a broad range of mathematical content knowledge. TIMSS, on the other hand, measures more traditional classroom content such as an understanding of fractions and decimals and the relationship between them (curriculum attainment). PISA claims to measure education's application to real-life problems and lifelong learning (workforce knowledge).

In the reading test, "OECD/PISA does not measure the extent to which 15-year-old students are fluent readers or how competent they are at word recognition tasks or spelling." Instead, they should be able to "construct, extend and reflect on the meaning of what they have read across a wide range of continuous and non-continuous texts."[5]

Implementation

PISA is sponsored, governed, and coordinated by the OECD.

Method of testing

Sampling

The students tested by PISA are aged between 15 years and 3 months and 16 years and 2 months at the beginning of the assessment period. The school year pupils are in is not taken into consideration. Only students at school are tested, not home-schoolers. In PISA 2006, however, several countries also used a grade-based sample of students. This made it possible to study how age and school year interact.

To fulfill OECD requirements, each country must draw a sample of at least 5,000 students. In small countries like Iceland and Luxembourg, where there are fewer than 5,000 students per year, an entire age cohort is tested. Some countries used much larger samples than required to allow comparisons between regions.

Test

PISA test documents on a school table (Neues Gymnasium, Oldenburg, Germany, 2006)

Each student takes a two-hour handwritten test. Part of the test is multiple-choice and part involves fuller answers. There are six and a half hours of assessment material, but each student is not tested on all the parts. Following the cognitive test, participating students spend nearly one more hour answering a questionnaire on their background, including learning habits, motivation, and family. School directors fill in a questionnaire describing school demographics, funding, etc. In 2012 the participants were, for the first time in the history of large-scale testing and assessments, offered a new type of problem: interactive (complex) problems requiring exploration of a novel virtual device.[6][7]

In selected countries, PISA has begun experimenting with computer adaptive testing.

National add-ons

Countries are allowed to combine PISA with complementary national tests.

Germany does this in a very extensive way: On the day following the international test, students take a national test called PISA-E (E=Ergänzung=complement). Test items of PISA-E are closer to TIMSS than to PISA. While only about 5,000 German students participate in the international and the national test, another 45,000 take only the latter. This large sample is needed to allow an analysis by federal states. Following a clash about the interpretation of 2006 results, the OECD warned Germany that it might withdraw the right to use the "PISA" label for national tests.[8]

Data scaling

From the beginning, PISA has been designed with one particular method of data analysis in mind. Since students work on different test booklets, raw scores must be 'scaled' to allow meaningful comparisons. Scores are scaled so that the OECD average in each domain (mathematics, reading and science) is 500 and the standard deviation is 100.[9] This holds only for the initial PISA cycle, when the scale was first introduced; subsequent cycles are linked to the previous ones through IRT scale-linking methods.[10]
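The core of that rescaling is a linear transformation. The sketch below illustrates only that final step, with hypothetical ability estimates in logits; the official procedure additionally involves IRT calibration, sampling weights and cycle-to-cycle linking, which are omitted here.

```python
import numpy as np

# Hypothetical latent ability estimates in logits for a reference
# population (not actual PISA data).
rng = np.random.default_rng(0)
theta = rng.normal(loc=0.2, scale=1.1, size=10_000)

def to_pisa_scale(theta, ref_mean, ref_std):
    """Linearly map logit scores so that the reference population
    has mean 500 and standard deviation 100."""
    return 500 + 100 * (theta - ref_mean) / ref_std

scores = to_pisa_scale(theta, theta.mean(), theta.std())
print(f"{scores.mean():.0f} {scores.std():.0f}")  # 500 100
```

Because the transformation is linear, rank order and relative distances between students are preserved; only the origin and unit of the scale change.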

Proficiency estimates are generated using a latent regression extension of the Rasch model, a model of item response theory (IRT), also known as the conditioning model or population model. The proficiency estimates are provided in the form of so-called plausible values, which allow unbiased estimates of differences between groups. The latent regression, together with a Gaussian prior probability distribution of student competencies, allows estimation of the proficiency distributions of groups of participating students.[11] The scaling and conditioning procedures are described in nearly identical terms in the Technical Reports of PISA 2000, 2003 and 2006. NAEP and TIMSS use similar scaling methods.
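The idea behind plausible values can be sketched in a few lines. The example below uses made-up Rasch item difficulties and a single response pattern, combines the Rasch likelihood with a standard-normal prior on a grid, and draws random values from the resulting posterior; the actual PISA population model also conditions on student background variables, which is omitted here.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical Rasch item difficulties (logits) and one student's
# response pattern (1 = correct); not actual PISA items.
b = np.array([-1.5, -0.5, 0.0, 0.8, 1.6])
x = np.array([1, 1, 1, 0, 0])

# Posterior over a grid of abilities theta: Rasch likelihood
# times a standard-normal prior.
theta = np.linspace(-4, 4, 401)
p = 1 / (1 + np.exp(-(theta[:, None] - b)))        # P(correct | theta, b)
lik = np.prod(np.where(x == 1, p, 1 - p), axis=1)  # response-pattern likelihood
prior = np.exp(-theta**2 / 2)
post = lik * prior
post /= post.sum()

# "Plausible values" are random draws from this posterior rather than a
# single point estimate, so group-level statistics remain unbiased.
pvs = rng.choice(theta, size=5, p=post)
```

Reporting several draws per student, instead of one best estimate, carries the measurement uncertainty forward into any secondary analysis of group differences.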

Results

All PISA results are tabulated by country; recent PISA cycles have separate provincial or regional results for some countries. Most public attention concentrates on just one outcome: the mean scores of countries and their rankings against one another. In the official reports, however, country-by-country rankings are given not as simple league tables but as cross tables indicating for each pair of countries whether or not mean score differences are statistically significant (unlikely to be due to random fluctuations in student sampling or in item functioning). In favorable cases, a difference of 9 points is sufficient to be considered significant.
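Why a 9-point gap is significant only "in favorable cases" can be seen with a simple two-sample z-test. The standard errors below are hypothetical; PISA's actual standard errors additionally incorporate measurement and sampling variance via replicate weights.

```python
import math

def significant(mean_a, se_a, mean_b, se_b, z_crit=1.96):
    """Two-sided z-test on the difference between two independent
    country means, at the 5% level."""
    z = (mean_a - mean_b) / math.sqrt(se_a**2 + se_b**2)
    return abs(z) > z_crit

# With favorably small standard errors of ~3 points per country,
# a 9-point gap clears the 5% threshold ...
print(significant(503, 3.0, 494, 3.0))  # True
# ... but with larger standard errors the same 9-point gap does not.
print(significant(503, 4.5, 494, 4.5))  # False
```

This is why the official cross tables report pairwise significance rather than treating every rank difference as meaningful.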

PISA never combines mathematics, science and reading domain scores into an overall score. However, commentators have sometimes combined test results from all three domains into an overall country ranking. Such meta-analysis is not endorsed by the OECD, although official summaries sometimes use scores from a testing cycle's principal domain as a proxy for overall student ability.

PISA 2012

The results for the 2012 "Maths" section on a world map.
The results for the 2012 "Science" section on a world map.
The results for the 2012 "Reading" section on a world map.

PISA 2012 was presented on 3 December 2013, with results for around 510,000 participating students in all 34 OECD member countries and 31 partner countries.[2] This testing cycle had a particular focus on mathematics, where the mean score was 494. A sample of 1,688 students from Puerto Rico took the assessment, scoring 379 in math, 404 in reading and 401 in science.[12] A subgroup of 44 countries and economies with about 85 000 students also took part in an optional computer-based assessment of problem solving.[13]

Shanghai had the highest score in all three subjects. It was followed by Singapore, Hong Kong, Chinese Taipei and Korea in mathematics; Hong Kong, Singapore, Japan and Korea in reading and Hong Kong, Singapore, Japan and Finland in science.

The participating students were drawn from a population of about 28 million 15-year-olds in 65 countries and economies,[14] including the OECD countries, several Chinese cities, Vietnam, Indonesia and several countries in South America.[2]

The test lasted two hours, was paper-based and included both open-ended and multiple-choice questions.[14]

The students and school staff also answered a questionnaire to provide background information about the students and the schools.[2][14]

The mean score in mathematics was 494, in reading 496 and in science 501.

The results show distinct groups of high performers in mathematics: the East Asian countries, with Shanghai scoring the best result of 613, followed closely by Hong Kong, Japan, Chinese Taipei and South Korea. Among the Europeans, Liechtenstein and Switzerland performed best, with the Netherlands, Estonia, Finland, Poland, Belgium, Germany and Austria all posting mathematics scores "not significantly statistically different from" one another. The United Kingdom, Ireland, Australia and New Zealand were similarly clustered around the OECD average of 494, with the USA trailing this group at 481.[2]

Qatar, Kazakhstan and Malaysia were the countries which showed the greatest improvement in mathematics. The USA and the United Kingdom showed no significant change.[15] Sweden had the greatest fall in mathematics performance over the last ten years, with a similar falling trend also in the two other subjects, and leading politicians in Sweden expressed great worry over the results.[16][17]

On average boys scored better than girls in mathematics, girls scored better than boys in reading and the two sexes had quite similar scores in science.[15]

Indonesia, Albania, Peru, Thailand and Colombia were the countries where most students reported being happy at school, while students in Korea, the Czech Republic, the Slovak Republic, Estonia and Finland reported least happiness.[14]

By subject

OECD members as of the time of the study are in boldface.
Mathematics
1  Shanghai, China 613
2  Singapore 573
3  Hong Kong, China 561
4  Taiwan 560
5  South Korea 554
6  Macau, China 538
7  Japan 536
8  Liechtenstein 535
9   Switzerland 531
10  Netherlands 523
11  Estonia 521
12  Finland 519
13=  Canada 518
13=  Poland 518
15  Belgium 515
16  Germany 514
17  Vietnam 511
18  Austria 506
19  Australia 504
20=  Ireland 501
20=  Slovenia 501
22=  Denmark 500
22=  New Zealand 500
24  Czech Republic 499
25  France 495
26  United Kingdom 494
27  Iceland 493
28  Latvia 491
29  Luxembourg 490
30  Norway 489
31  Portugal 487
32  Italy 485
33  Spain 484
34=  Russia 482
34=  Slovakia 482
36  United States 481
37  Lithuania 479
38  Sweden 478
39  Hungary 477
40  Croatia 471
41  Israel 466
42  Greece 453
43  Serbia 449
44  Turkey 448
45  Romania 445
46  Cyprus 440
47  Bulgaria 439
48  United Arab Emirates 434
49  Kazakhstan 432
50  Thailand 427
51  Chile 423
52  Malaysia 421
53  Mexico 413
54  Montenegro 410
55  Uruguay 409
56  Costa Rica 407
57  Albania 394
58  Brazil 391
59=  Argentina 388
59=  Tunisia 388
61  Jordan 386
62=  Colombia 376
62=  Qatar 376
64  Indonesia 375
65  Peru 368
Science
1  Shanghai, China 580
2  Hong Kong, China 555
3  Singapore 551
4  Japan 547
5  Finland 545
6  Estonia 541
7  South Korea 538
8  Vietnam 528
9  Poland 526
10=  Liechtenstein 525
10=  Canada 525
12  Germany 524
13  Taiwan 523
14=  Netherlands 522
14=  Ireland 522
16=  Macau, China 521
16=  Australia 521
18  New Zealand 516
19   Switzerland 515
20=  Slovenia 514
20=  United Kingdom 514
22  Czech Republic 508
23  Austria 506
24  Belgium 505
25  Latvia 502
26  France 499
27  Denmark 498
28  United States 497
29=  Spain 496
29=  Lithuania 496
31  Norway 495
32=  Italy 494
32=  Hungary 494
34=  Luxembourg 491
34=  Croatia 491
36  Portugal 489
37  Russia 486
38  Sweden 485
39  Iceland 478
40  Slovakia 471
41  Israel 470
42  Greece 467
43  Turkey 463
44  United Arab Emirates 448
45  Bulgaria 446
46=  Serbia 445
46=  Chile 445
48  Thailand 444
49  Romania 439
50  Cyprus 438
51  Costa Rica 429
52  Kazakhstan 425
53  Malaysia 420
54  Uruguay 416
55  Mexico 415
56  Montenegro 410
57  Jordan 409
58  Argentina 406
59  Brazil 405
60  Colombia 399
61  Tunisia 398
62  Albania 397
63  Qatar 384
64  Indonesia 382
65  Peru 373
Reading
1  Shanghai, China 570
2  Hong Kong, China 545
3  Singapore 542
4  Japan 538
5  South Korea 536
6  Finland 524
7=  Taiwan 523
7=  Canada 523
7=  Ireland 523
10  Poland 518
11=  Liechtenstein 516
11=  Estonia 516
13=  Australia 512
13=  New Zealand 512
15  Netherlands 511
16=  Macau, China 509
16=   Switzerland 509
16=  Belgium 509
19=  Germany 508
19=  Vietnam 508
21  France 505
22  Norway 504
23  United Kingdom 499
24  United States 498
25  Denmark 496
26  Czech Republic 493
27=  Austria 490
27=  Italy 490
29  Latvia 489
30=  Luxembourg 488
30=  Portugal 488
30=  Spain 488
30=  Hungary 488
34  Israel 486
35  Croatia 485
36=  Iceland 483
36=  Sweden 483
38  Slovenia 481
39=  Lithuania 477
39=  Greece 477
41=  Russia 475
41=  Turkey 475
43  Slovakia 463
44  Cyprus 449
45  Serbia 446
46  United Arab Emirates 442
47=  Thailand 441
47=  Chile 441
47=  Costa Rica 441
50  Romania 438
51  Bulgaria 436
52  Mexico 424
53  Montenegro 422
54  Uruguay 411
55  Brazil 410
56  Tunisia 404
57  Colombia 403
58  Jordan 399
59  Malaysia 398
60=  Argentina 396
60=  Indonesia 396
62  Albania 394
63  Kazakhstan 393
64  Qatar 388
65  Peru 384

Previous years

Period Focus OECD countries Partner countries Participating students Notes
2000 Reading 28 4 + 11 265,000 The Netherlands disqualified from data analysis. 11 additional non-OECD countries took the test in 2002.
2003 Mathematics 30 11 275,000 UK disqualified from data analysis. Also included test in problem solving.
2006 Science 30 27 400,000 Reading scores for US disqualified from analysis due to misprint in testing materials.[18]
2009[19] Reading 34 41 + 10 470,000 10 additional non-OECD countries took the test in 2010.[20][21]
2012[2] Mathematics 34 31 510,000

Reception

China

China did not participate as a nation in the 2012 test, but Shanghai, Hong Kong and Macau participated as separate entities. Shanghai participated for the second time, topping the rankings in all three subjects and improving on its scores from the 2009 tests. Shanghai's score of 613 in mathematics was 113 points above the average score, putting the performance of Shanghai pupils about 3 school years ahead of pupils in average countries. Educational experts debated the degree to which the result reflected the quality of the general educational system in China, pointing out that Shanghai has greater wealth and better-paid teachers than the rest of China.[22] Hong Kong placed second in reading and science and third in maths.

China is expected to participate as a country in 2018, and will participate with 4 provinces in 2015. The four provinces will be Jiangsu, Guangdong, Beijing, and Shanghai with a total population of over 230 million.[23]

Critics of PISA counter that in Shanghai and other Chinese cities, most children of migrant workers can only attend city schools up to the ninth grade, and must return to their parents' hometowns for high school due to hukou restrictions, thus skewing the composition of the city's high school students in favor of wealthier local families. A population chart of Shanghai reproduced in The New York Times shows a steep drop off in the number of 15-year-olds residing there.[24] According to Schleicher, 27% of Shanghai's 15-year-olds are excluded from its school system (and hence from testing). As a result, the percentage of Shanghai's 15-year-olds tested by PISA was 73%, lower than the 89% tested in the US.[25]

Finland

Finland, which received several top positions in the first tests, fell in all three subjects, but remained the best performing country overall in Europe, achieving their best result in science with 545 points (5th) and worst in mathematics with 519 (12th) in which the country was outperformed by four other European countries. The drop in mathematics was 25 points since 2003, the last time mathematics was the focus of the tests. For the first time Finnish girls outperformed boys in the subject, but only narrowly. It was also the first time pupils in Finnish-speaking schools did not perform better than pupils in Swedish-speaking schools. Minister of Education and Science Krista Kiuru expressed concern for the overall drop, as well as the fact that the number of low-performers had increased from 7% to 12%.[26]

India

India pulled out of the 2012 round of PISA testing in August 2012, with the Indian government attributing its action to the unfairness of PISA testing to Indian students.[27] The Indian Express reported on 3 September 2012 that "The ministry (of education) has concluded that there was a socio-cultural disconnect between the questions and Indian students. The ministry will write to the OECD and drive home the need to factor in India's 'socio-cultural milieu'. India's participation in the next PISA cycle will hinge on this".[28] The Indian Express also noted that "Considering that over 70 nations participate in PISA, it is uncertain whether an exception would be made for India".

In June 2013, the Indian government, still concerned with the future prospect of fairness of PISA testing relating to Indian students, again pulled India out from the 2015 round of PISA testing.[29]

Sweden

Sweden's result dropped in all three subjects in the 2012 test, continuing a trend from 2006 and 2009. The nation had the sharpest fall in mathematics performance over 10 years among the countries that have participated in all tests, with a drop in score from 509 in 2003 to 478 in 2012. The score in reading fell from 516 in 2000 to 483 in 2012. The country performed below the OECD average in all three subjects.[30] The leader of the opposition, Social Democrat Stefan Löfven, described the situation as a national crisis.[31] Along with the party's spokesperson on education, Ibrahim Baylan, he pointed to the downward trend in reading as most severe.[31]

UK

In the 2012 test, as in 2009, the result was slightly above average for the United Kingdom, with the science ranking being highest (20th).[32] England, Wales, Scotland and Northern Ireland also participated as separate entities, with Wales posting the worst result, ranking 43rd of the 65 countries and economies in mathematics. Minister of Education in Wales Huw Lewis expressed disappointment in the results, said that there were no "quick fixes", but hoped that several educational reforms implemented in recent years would give better results in the next round of tests.[33] The United Kingdom had a greater gap between high- and low-scoring students than the average. There was little difference between public and private schools when adjusted for the socio-economic background of students. The gender difference in favour of girls was smaller than in most other countries, as was the difference between natives and immigrants.[32]

Writing in the Daily Telegraph, Ambrose Evans-Pritchard warned against putting too much emphasis on the UK's international ranking, arguing that an overfocus on scholarly performance in East Asia might have contributed to the region's low birthrate, which he argued could do more harm to future economic performance than a good PISA score could offset.[34]

In 2013, the Times Educational Supplement (TES) published an article, "Is PISA Fundamentally Flawed?" by William Stewart, detailing serious critiques of PISA's conceptual foundations and methods advanced by statisticians at major universities.[35]

In the article, Professor Harvey Goldstein of the University of Bristol was quoted as saying that when the OECD tries to rule out questions suspected of bias, it can have the effect of "smoothing out" key differences between countries. "That is leaving out many of the important things,” he warned. "They simply don't get commented on. What you are looking at is something that happens to be common. But (is it) worth looking at? PISA results are taken at face value as providing some sort of common standard across countries. But as soon as you begin to unpick it, I think that all falls apart."

Queen's University Belfast mathematician Dr. Hugh Morrison stated that he found the statistical model underlying PISA to contain a fundamental, insoluble mathematical error that renders PISA rankings "valueless".[36] Goldstein remarked that Dr. Morrison's objection highlights "an important technical issue" if not a "profound conceptual error". However, Goldstein cautioned that PISA has been "used inappropriately", contending that some of the blame for this "lies with PISA itself. I think it tends to say too much for what it can do and it tends not to publicise the negative or the weaker aspects." Professors Morrison and Goldstein expressed dismay at the OECD's response to criticism. Morrison said that when he first published his criticisms of PISA in 2004 and also personally queried several of the OECD's "senior people" about them, his points were met with "absolute silence" and have yet to be addressed. "I was amazed at how unforthcoming they were," he told TES. "That makes me suspicious." "Pisa steadfastly ignored many of these issues," he says. "I am still concerned."[37]

Professor Svend Kreiner, of the University of Copenhagen, agreed: "One of the problems that everybody has with PISA is that they don't want to discuss things with people criticising or asking questions concerning the results. They didn't want to talk to me at all. I am sure it is because they can't defend themselves."[37]

US

The American result of 2012 was average in science and reading, but lagged behind in mathematics compared to other developed nations. There was little change from the previous test in 2009.[38] The result was described as “a picture of educational stagnation” by Education Secretary Arne Duncan,[39] who said the result was not compatible with the American goal of having the world's best educated workers. Randi Weingarten of the American Federation of Teachers stated that an overemphasis on standardised tests contributed to the lack of improvement in education performance.[40] Dennis Van Roekel of the National Education Association said a failure to address poverty among students had hampered progress.[38]

About 9% of the U.S. students scored in the top two mathematics levels compared to 13% in all countries and economies.[38]

For the first time, three U.S. states participated in the tests as separate entities, with Massachusetts scoring well above both the American and international average, particularly in reading.[40] An approximate corresponding OECD ranking is shown along with the United States average.[41]

Maths
16=  Massachusetts 514
18=  Connecticut 506
36  U.S. Average 481
41~  Florida 467
Science
9~  Massachusetts 527
16=  Connecticut 521
28  U.S. Average 497
38=  Florida 485
Reading
6~  Massachusetts 527
10~  Connecticut 521
24  U.S. Average 498
26~  Florida 492

Research on possible causes of PISA disparities in different countries

Although PISA and TIMSS officials and researchers themselves generally refrain from hypothesizing about the large and stable differences in student achievement between countries, since 2000, literature on the differences in PISA and TIMSS results and their possible causes has emerged.[42] Data from PISA have furnished several economists, notably Eric Hanushek, Ludger Woessmann, Heiner Rindermann, and Stephen J. Ceci, with material for books and articles about the relationship between student achievement and economic development,[43] democratization, and health;[44] as well as the roles of such single educational factors as high-stakes exams,[45] the presence or absence of private schools, and the effects and timing of ability tracking.[46]

References

  1. Berger, Kathleen. Invitation to The Life Span (2nd ed.). Worth Publishers. ISBN 978-1-4641-7205-2.
  2. 1 2 3 4 5 6 7 PISA 2012 Results in Focus (PDF), OECD, 3 December 2013, retrieved 4 December 2013
  3. "Programme for International Student Assessment (PISA)". The Council of Ministers of Education, Canada. Retrieved 2016-06-05.
  4. Hefling, Kimberly. "Asian nations dominate international test". Yahoo!.
  5. "Chapter 2 of the publication 'PISA 2003 Assessment Framework'" (pdf). Pisa.oecd.org.
  6. Keeley B. PISA, we have a problem… OECD Insights, April 2014.
  7. Poddiakov A.N. Complex Problem Solving at PISA 2012 and PISA 2015: Interaction with Complex Reality. // Translated from Russian. Reference to the original Russian text: Poddiakov, A. (2012.) Reshenie kompleksnykh problem v PISA-2012 i PISA-2015: vzaimodeistvie so slozhnoi real'nost'yu. Obrazovatel'naya Politika, 6, 34-53.
  8. C. Füller: Pisa hat einen kleinen, fröhlichen Bruder. taz, 5 December 2007.
  9. Stanat, P; Artelt, C; Baumert, J; Klieme, E; Neubrand, M; Prenzel, M; Schiefele, U; Schneider, W (2002), PISA 2000: Overview of the study—Design, method and results, Berlin: Max Planck Institute for Human Development
  10. Mazzeo, John; von Davier, Matthias (2013), Linking Scales in International Large-Scale Assessments, chapter 10 in Rutkowski, L. von Davier, M. & Rutkowski, D. (eds.) Handbook of International Large-Scale Assessment: Background, Technical Issues, and Methods of Data Analysis., New York: Chapman and Hall/CRC.
  11. von Davier, Matthias; Sinharay, Sandip (2013), Analytics in International Large-Scale Assessments: Item Response Theory and Population Models, chapter 7 in Rutkowski, L. von Davier, M. & Rutkowski, D. (eds.) Handbook of International Large-Scale Assessment: Background, Technical Issues, and Methods of Data Analysis., New York: Chapman and Hall/CRC.
  12. CB Online Staff. "PR scores low on global report card", Caribbean Business, September 26, 2014. Retrieved on January 3, 2015.
  13. OECD (2014): PISA 2012 results: Creative problem solving: Students’ skills in tackling real-life problems (Volume V), http://www.oecd-ilibrary.org/education/pisa-2012-results-skills-for-life-volume-v_9789264208070-en
  14. 1 2 3 4 PISA 2012 Results OECD. Retrieved 4 December 2013
  15. 1 2 Sedghi, Ami; Arnett, George; Chalabi, Mona (2013-12-03), Pisa 2012 results: which country does best at reading, maths and science?, The Guardian, retrieved 2013-02-14
  16. Adams, Richard (2013-12-03), Swedish results fall abruptly as free school revolution falters, The Guardian, retrieved 2013-12-03
  17. Kärrman, Jens (2013-12-03), Löfven om Pisa: Nationell kris, Dagens Nyheter, retrieved 2013-12-03
  18. Baldi, Stéphane; Jin, Ying; Skemer, Melanie; Green, Patricia J; Herget, Deborah; Xie, Holly (2007-12-10), Highlights From PISA 2006: Performance of U.S. 15-Year-Old Students in Science and Mathematics Literacy in an International Context (PDF), NCES, retrieved 2013-12-14, PISA 2006 reading literacy results are not reported for the United States because of an error in printing the test booklets. Furthermore, as a result of the printing error, the mean performance in mathematics and science may be misestimated by approximately 1 score point. The impact is below one standard error.
  19. PISA 2009 Results: Executive Summary (PDF), OECD, 2010-12-07
  20. ACER releases results of PISA 2009+ participant economies, ACER, 2011-12-16
  21. Walker, Maurice (2011), PISA 2009 Plus Results (PDF), OECD, retrieved 2012-06-28
  22. Tom Phillips (3 December 2013) OECD education report: Shanghai's formula is world-beating The Telegraph. Retrieved 8 December 2013
  23. http://www.bbc.com/news/education-28937662
  24. Helen Gao, "Shanghai Test Scores and the Mystery of the Missing Children", New York Times, January 23, 2014. For Schleicher's initial response to these criticisms see his post, "Are the Chinese Cheating in PISA Or Are We Cheating Ourselves?" on the OECD's website blog, Education Today, December 10, 2013.
  25. William Stewart, "More than a quarter of Shanghai pupils missed by international Pisa rankings", Times Educational Supplement, March 6, 2014.
  26. PISA 2012: Proficiency of Finnish youth declining University of Jyväskylä. Retrieved 9 December 2013
  27. Hemali Chhapia, TNN (3 August 2012). "India backs out of global education test for 15-year-olds". The Times of India.
  28. "Poor PISA score: Govt blames 'disconnect' with India". The Indian Express. 3 September 2012.
  29. "India chickens out of international students assessment programme again". The Times of India. 1 June 2013.
  30. Lars Näslund (3 December 2013) Svenska skolan rasar i stor jämförelse Expressen. Retrieved 4 December 2013 (Swedish)
  31. 1 2 Jens Kärrman (3 December 2013) Löfven om Pisa: Nationell kris Dagens Nyheter. Retrieved 8 December 2013 (Swedish)
  32. 1 2 Adams, Richard (2013-12-03), UK students stuck in educational doldrums, OECD study finds, The Guardian, retrieved 2013-12-04
  33. Pisa ranks Wales' education the worst in the UK BBC. 3 December 2013. Retrieved 4 December 2013.
  34. Ambrose Evans-Pritchard (3 December 2013) Ambrose Evans-Pritchard Telegraph.co.uk. Retrieved 4 December 2013.
  35. William Stewart, "Is Pisa fundamentally flawed?" Times Educational Supplement, July 26, 2013.
  36. http://www.qub.ac.uk/schools/SchoolofEducation/AboutUs/Staff/Academic/DrHughMorrison/Filestore/Filetoupload,387514,en.pdf
  37. 1 2 Stewart, "Is PISA fundamentally flawed?" TES (2013).
  38. 1 2 3 Motoko Rich (3 December 2013) American 15-Year-Olds Lag, Mainly in Math, on International Standardized Tests New York Times. Retrieved 4 December 2013
  39. Simon, Stephanie (2013-12-03), PISA results show "educational stagnation" in US, Politico, retrieved 2013-12-03
  40. 1 2 Vaznis, James (2013-12-03), Mass. students excel on global examinations, Boston Globe, retrieved 2013-12-14
  41. 2012 Program for International Student Assessment (PISA) Results (PDF), Massachusetts Department of Education, retrieved 2014-12-11
  42. Hanushek, Eric A., and Ludger Woessmann. 2011. "The economics of international differences in educational achievement." In Handbook of the Economics of Education, Vol. 3, edited by Eric A. Hanushek, Stephen Machin, and Ludger Woessmann. Amsterdam: North Holland: 89–200.
  43. Hanushek, Eric; Woessmann, Ludger (2008), "The role of cognitive skills in economic development" (PDF), Journal of Economic Literature, 46 (3): 607–668, doi:10.1257/jel.46.3.607
  44. Rindermann, Heiner; Ceci, Stephen J (2009), "Educational policy and country outcomes in international cognitive competence studies", Perspectives on Psychological Science, 4 (6): 551–577, doi:10.1111/j.1745-6924.2009.01165.x
  45. Bishop, John H (1997), "The effect of national standards and curriculum-based exams on achievement", American Economic Review, 87 (2): 260–264
  46. Hanushek, Eric; Woessmann, Ludger (2006), "Does educational tracking affect performance and inequality? Differences-in-differences evidence across countries" (PDF), Economic Journal, 116 (510): C63–C76, doi:10.1111/j.1468-0297.2006.01076.x


This article is issued from Wikipedia - version of 30 November 2016. The text is available under the Creative Commons Attribution/Share Alike licence, but additional terms may apply for the media files.