Programme for International Student Assessment

From Wikipedia, the free encyclopedia

Programme for International Student Assessment
Purpose: Comparison of educational attainment across the world
Headquarters: OECD Headquarters
Region served: World
Membership: 59 government education departments
Head of the Early Childhood and Schools Division: Michael Davidson
Main organ: PISA Governing Body (Chair: Lorna Bertrand, England)
Parent organization: OECD
"PISA" redirects here. For other uses, see Pisa (disambiguation).

The Programme for International Student Assessment (PISA) is a worldwide study by the Organisation for Economic Co-operation and Development (OECD) of 15-year-old school pupils' scholastic performance in mathematics, science and reading, conducted in both member and non-member nations. First administered in 2000 and repeated every three years, it is intended to improve education policies and outcomes.

In PISA 2009, 470,000 15-year-old students representing 65 nations and territories participated; an additional 50,000 students representing nine nations were tested in 2010.[1]

The Trends in International Mathematics and Science Study (TIMSS) and the Progress in International Reading Literacy Study (PIRLS) by the International Association for the Evaluation of Educational Achievement are similar studies.


Framework[edit]

PISA stands in a tradition of international school studies, undertaken since the late 1950s by the International Association for the Evaluation of Educational Achievement (IEA). Much of PISA's methodology follows the example of the Trends in International Mathematics and Science Study (TIMSS, started in 1995), which in turn was much influenced by the U.S. National Assessment of Educational Progress (NAEP). The reading component of PISA is inspired by the IEA's Progress in International Reading Literacy Study (PIRLS).

PISA aims to test literacy in three competence fields: reading, mathematics and science, each reported on a 1000-point scale.[2]

The PISA mathematics literacy test asks students to apply their mathematical knowledge to solve problems set in real-world contexts. To solve the problems students must activate a number of mathematical competencies as well as a broad range of mathematical content knowledge. TIMSS, on the other hand, measures more traditional classroom content such as an understanding of fractions and decimals and the relationship between them (curriculum attainment). PISA claims to measure education's application to real-life problems and lifelong learning (workforce knowledge).

In the reading test, "OECD/PISA does not measure the extent to which 15-year-old students are fluent readers or how competent they are at word recognition tasks or spelling." Instead, they should be able to "construct, extend and reflect on the meaning of what they have read across a wide range of continuous and non-continuous texts."[3]

Development and implementation[edit]

Developed from 1997, the first PISA assessment was carried out in 2000. Analysing the results of each assessment takes about a year and a half: the first results were published in November 2001, while the raw data, technical report and data handbook were released in spring 2002. The triennial repeats follow a similar schedule; seeing a single PISA cycle through from start to finish always takes over four years.

Each assessment focuses on one of the three competence fields of reading, mathematics and science, although the other two are tested as well. A full rotation is completed every nine years: reading, the main domain in 2000, was the main domain again in 2009.
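The nine-year rotation described above can be sketched as a small lookup (a hypothetical illustration, not OECD code):

```python
# The main domain rotates through the three fields on a nine-year cycle:
# reading (2000), mathematics (2003), science (2006), reading again (2009), ...
DOMAINS = ["reading", "mathematics", "science"]

def main_domain(year):
    """Return the main assessed domain for a PISA cycle year (2000, 2003, ...)."""
    return DOMAINS[((year - 2000) // 3) % 3]

print(main_domain(2000))  # reading
print(main_domain(2009))  # reading
print(main_domain(2012))  # mathematics
```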

Period | Focus | OECD countries | Partner countries | Participating students | Notes
2000 | Reading | 28 | 4 + 11 | 265,000 | The Netherlands disqualified from data analysis. 11 additional non-OECD countries took the test in 2002.
2003 | Mathematics | 30 | 11 | 275,000 | UK disqualified from data analysis. Also included a test in problem solving.
2006 | Science | 30 | 27 | 400,000 | Reading scores for the US disqualified from analysis due to a misprint in testing materials.[4]
2009[5] | Reading | 34 | 41 + 10 | 470,000 | 10 additional non-OECD countries took the test in 2010.[6]

PISA is sponsored, governed, and coordinated by the OECD. Test design, implementation, and data analysis are delegated to an international consortium of research and educational institutions led by the Australian Council for Educational Research (ACER). ACER leads the development and implementation of sampling procedures and assists with monitoring sampling outcomes across participating countries. The assessment instruments underlying PISA's reading, mathematics, science, problem-solving, computer-based testing, and background and contextual questionnaires are likewise constructed and refined by ACER, which also develops purpose-built software to assist in sampling and data capture, and analyses all the data. The source code of the data analysis software is not made public.

Method of testing[edit]


Students tested by PISA are aged between 15 years 3 months and 16 years 2 months at the beginning of the assessment period; the school year they are in is not taken into account. Only students at school are tested, not home-schoolers. In PISA 2006, however, several countries also used a grade-based sample of students, which made it possible to study how age and school year interact.

To fulfill OECD requirements, each country must draw a sample of at least 5,000 students. In small countries like Iceland and Luxembourg, where there are fewer than 5,000 students per year, an entire age cohort is tested. Some countries used much larger samples than required to allow comparisons between regions.


PISA test documents on a school table (Neues Gymnasium, Oldenburg, Germany, 2006)

Each student takes a two-hour handwritten test. Part of the test is multiple-choice and part requires fuller answers. There are six and a half hours of assessment material in total, but no student is tested on all of it. Following the cognitive test, participating students spend nearly another hour answering a questionnaire on their background, including learning habits, motivation and family. School directors fill in a questionnaire describing school demographics, funding, and so on.
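As a rough sketch of how six and a half hours of material can be spread so that no student sees all of it, item clusters can be rotated across booklets. The cluster count and simple cyclic design below are hypothetical; the real PISA booklet design is more elaborate:

```python
N_CLUSTERS = 13  # hypothetical number of half-hour item clusters

def booklet(i, size=4):
    """Clusters assigned to booklet i under a simple cyclic rotation,
    so each booklet holds roughly two hours of material."""
    return [(i + k) % N_CLUSTERS for k in range(size)]

# With one booklet per starting cluster, every cluster appears
# in the same number of booklets:
booklets = [booklet(i) for i in range(N_CLUSTERS)]
```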

In selected countries, PISA began experimenting with computer-based adaptive testing.

National add-ons[edit]

Countries are allowed to combine PISA with complementary national tests.

Germany does this in a very extensive way: On the day following the international test, students take a national test called PISA-E (E=Ergänzung=complement). Test items of PISA-E are closer to TIMSS than to PISA. While only about 5,000 German students participate in the international and the national test, another 45,000 take only the latter. This large sample is needed to allow an analysis by federal states. Following a clash about the interpretation of 2006 results, the OECD warned Germany that it might withdraw the right to use the "PISA" label for national tests.[8]

Data scaling[edit]

From the beginning, PISA has been designed with one particular method of data analysis in mind. Since students work on different test booklets, raw scores must be 'scaled' to allow meaningful comparisons. Scores are thus scaled so that the OECD average in each domain (mathematics, reading and science) is 500 and the standard deviation is 100.[9]

This scaling is done using the Rasch model of item response theory (IRT). According to IRT, it is not possible to assess the competence of students who solved none or all of the test items. This problem is circumvented by imposing a Gaussian prior probability distribution of competences. The scaling procedure is described in nearly identical terms in the Technical Reports of PISA 2000, 2003, 2006. NAEP and TIMSS use similar scaling methods.
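A minimal sketch of the two ingredients described above, with hypothetical values: the Rasch item-response probability, and a linear rescaling of ability estimates so that the pooled mean is 500 and the standard deviation is 100:

```python
import math
import statistics

def rasch_p(theta, b):
    """Rasch model: probability that a student of ability theta answers
    an item of difficulty b correctly (both on a logit scale)."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def to_pisa_scale(thetas, mean=500.0, sd=100.0):
    """Linearly rescale logit-scale ability estimates to the PISA metric."""
    m, s = statistics.mean(thetas), statistics.pstdev(thetas)
    return [mean + sd * (t - m) / s for t in thetas]

# Hypothetical ability estimates for five students:
scores = to_pisa_scale([-1.2, -0.4, 0.0, 0.5, 1.1])
```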


Results[edit]

All PISA results are tabulated by country; recent PISA cycles have separate provincial or regional results for some countries. Most public attention concentrates on just one outcome: the mean scores of countries and the rankings that follow from them. In the official reports, however, country-by-country comparisons are given not as simple league tables but as cross tables indicating, for each pair of countries, whether or not the mean score difference is statistically significant (that is, unlikely to be due to random fluctuations in student sampling or in item functioning). In favorable cases, a difference of 9 points is sufficient to be considered significant.[citation needed]
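The pairwise significance comparison can be sketched as a two-sided z-test on the difference of two country means; the standard errors below are hypothetical, since actual PISA standard errors vary by country:

```python
import math

def significantly_different(mean_a, se_a, mean_b, se_b, z_crit=1.96):
    """Two-sided z-test at the 5% level for a difference of independent
    country means with given standard errors."""
    z = (mean_a - mean_b) / math.sqrt(se_a ** 2 + se_b ** 2)
    return abs(z) > z_crit

# With standard errors of about 3 points each, a 9-point gap is significant:
print(significantly_different(509, 3.0, 500, 3.0))  # True
```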

PISA never combines mathematics, science and reading domain scores into an overall score. However, commentators have sometimes combined test results from all three domains into an overall country ranking. Such meta-analysis is not endorsed by the OECD, although official summaries sometimes use scores from a testing cycle's principal domain as a proxy for overall student ability.


PISA 2012[edit]

Main article: PISA 2012 Tests

PISA 2012 was presented on 3 December 2013, with results for around 510,000 participating students in all 34 OECD member countries and 31 partner countries.[7] This testing cycle had a particular focus on mathematics, where the mean score was 494.

OECD members as of the time of the study are in boldface.
Mathematics
1 Shanghai, China 613
2 Singapore573
3 Hong Kong, China561
4 Taiwan560
5 South Korea554
6 Macau, China538
7 Japan536
8 Liechtenstein535
9 Switzerland531
10 Netherlands523
11 Estonia521
12 Finland519
13= Canada518
13= Poland518
15 Belgium515
16 Germany514
17 Vietnam511
18 Austria506
19 Australia504
20= Ireland501
20= Slovenia501
22= Denmark500
22= New Zealand500
24 Czech Republic499
25 France495
26 United Kingdom494
27 Iceland493
28 Latvia491
29 Luxembourg490
30 Norway489
31 Portugal487
32 Italy485
33 Spain484
34= Russia482
34= Slovakia482
36 United States481
37 Lithuania479
38 Sweden478
39 Hungary477
40 Croatia471
41 Israel466
42 Greece453
43 Serbia449
44 Turkey448
45 Romania445
46 Cyprus440
47 Bulgaria439
48 United Arab Emirates434
49 Kazakhstan432
50 Thailand427
51 Chile423
52 Malaysia421
53 Mexico413
54 Montenegro410
55 Uruguay409
56 Costa Rica407
57 Albania394
58 Brazil391
59= Argentina388
59= Tunisia388
61 Jordan386
62= Colombia376
62= Qatar376
64 Indonesia375
65 Peru368
Science
1 Shanghai, China 580
2 Hong Kong, China555
3 Singapore551
4 Japan547
5 Finland545
6 Estonia541
7 South Korea538
8 Vietnam528
9 Poland526
10= Liechtenstein525
10= Canada525
12 Germany524
13 Taiwan523
14= Netherlands522
14= Ireland522
16= Macau, China521
16= Australia521
18 New Zealand516
19 Switzerland515
20= Slovenia514
20= United Kingdom514
22 Czech Republic508
23 Austria506
24 Belgium505
25 Latvia502
26 France499
27 Denmark498
28 United States497
29= Spain496
29= Lithuania496
31 Norway495
32= Italy494
32= Hungary494
34= Luxembourg491
34= Croatia491
36 Portugal489
37 Russia486
38 Sweden485
39 Iceland478
40 Slovakia471
41 Israel470
42 Greece467
43 Turkey463
44 United Arab Emirates448
45 Bulgaria446
46= Serbia445
46= Chile445
48 Thailand444
49 Romania439
50 Cyprus438
51 Costa Rica429
52 Kazakhstan425
53 Malaysia420
54 Uruguay416
55 Mexico415
56 Montenegro410
57 Jordan409
58 Argentina406
59 Brazil405
60 Colombia399
61 Tunisia398
62 Albania397
63 Qatar384
64 Indonesia382
65 Peru373
Reading
1 Shanghai, China 570
2 Hong Kong, China545
3 Singapore542
4 Japan538
5 South Korea536
6 Finland524
7= Taiwan523
7= Canada523
7= Ireland523
10 Poland518
11= Liechtenstein516
11= Estonia516
13= Australia512
13= New Zealand512
15 Netherlands511
16= Macau, China509
16= Switzerland509
16= Belgium509
19= Germany508
19= Vietnam508
21 France505
22 Norway504
23 United Kingdom499
24 United States498
25 Denmark496
26 Czech Republic493
27= Austria490
27= Italy490
29 Latvia489
30= Luxembourg488
30= Portugal488
30= Spain488
30= Hungary488
34 Israel486
35 Croatia485
36= Iceland483
36= Sweden483
38 Slovenia481
39= Lithuania477
39= Greece477
41= Russia475
41= Turkey475
43 Slovakia463
44 Cyprus449
45 Serbia446
46 United Arab Emirates442
47= Thailand441
47= Chile441
47= Costa Rica441
50 Romania438
51 Bulgaria436
52 Mexico424
53 Montenegro422
54 Uruguay411
55 Brazil410
56 Tunisia404
57 Colombia403
58 Jordan399
59 Malaysia398
60= Argentina396
60= Indonesia396
62 Albania394
63 Kazakhstan393
64 Qatar388
65 Peru384


PISA 2009[edit]

The PISA 2009 cycle included results in mathematics, science and reading for all 36 OECD member countries and 37 partner countries.[5][10][11]

Of the partner countries, only selected areas of three countries—India, Venezuela and China—were assessed. PISA 2009+, released in December 2011, included data from 10 additional partner countries which had testing delayed from 2009 to 2010 because of scheduling constraints.[6][12]

OECD members as of the time of the study are in boldface. Participants in PISA 2009+, which were tested in 2010 after the main group of 65, are italicized.
Mathematics
1 Shanghai, China 600
2 Singapore562
3 Hong Kong, China555
4 South Korea546
5 Taiwan543
6 Finland541
7 Liechtenstein536
8 Switzerland534
9 Japan529
10 Canada527
11 Netherlands526
12 Macau, China525
13 New Zealand519
14 Belgium515
15 Australia514
16 Germany513
17 Estonia512
18 Iceland507
19 Denmark503
20 Slovenia501
21 Norway498
22 France497
23 Slovakia497
24 Austria496
25 Poland495
26 Sweden494
27 Czech Republic493
28 United Kingdom492
29 Hungary490
30 Luxembourg489
31 United States487
32 Portugal487
33 Ireland487
34 Spain483
35 Italy483
36 Latvia482
37 Lithuania477
38 Russia468
39 Greece466
40 Malta463
41 Croatia460
42 Israel447
43 Turkey445
44 Serbia442
45 Azerbaijan431
46 Bulgaria428
47 Uruguay427
48 Romania427
49 United Arab Emirates421
50 Chile421
51 Mauritius420
52 Thailand419
53 Mexico419
54 Trinidad and Tobago414
55 Costa Rica409
56 Kazakhstan405
57 Malaysia404
58 Montenegro403
59 Moldova397
60 Miranda, Venezuela 397
61 Argentina388
62 Jordan387
63 Brazil386
64 Colombia381
65 Georgia379
66 Albania377
67 Tunisia371
68 Indonesia371
69 Qatar368
70 Peru365
71 Panama360
72 Tamil Nadu, India 351
73 Himachal Pradesh, India 338
74 Kyrgyzstan331
Science
1 Shanghai, China 575
2 Finland554
3 Hong Kong, China549
4 Singapore542
5 Japan539
6 South Korea538
7 New Zealand532
8 Canada529
9 Estonia528
10 Australia527
11 Netherlands522
12 Liechtenstein520
13 Germany520
14 Taiwan520
15 Switzerland517
16 United Kingdom514
17 Slovenia512
18 Macau, China511
19 Poland508
20 Ireland508
21 Belgium507
22 Hungary503
23 United States502
24 Norway500
25 Czech Republic500
26 Denmark499
27 France498
28 Iceland496
29 Sweden495
30 Latvia494
31 Austria494
32 Portugal493
33 Lithuania491
34 Slovakia490
35 Italy489
36 Spain488
37 Croatia486
38 Luxembourg484
39 Russia478
40 Greece470
41 Malta461
42 Israel455
43 Turkey454
44 Chile447
45 Serbia443
46 Bulgaria439
47 United Arab Emirates438
48 Costa Rica430
49 Romania428
50 Uruguay427
51 Thailand425
52 Miranda, Venezuela 422
53 Malaysia422
54 Mauritius417
55 Mexico416
56 Jordan415
57 Moldova413
58 Trinidad and Tobago410
59 Brazil405
60 Colombia402
61 Tunisia401
62 Montenegro401
63 Argentina401
64 Kazakhstan400
65 Albania391
66 Indonesia383
67 Qatar379
68 Panama376
69 Georgia373
70 Azerbaijan373
71 Peru369
72 Tamil Nadu, India 348
73 Kyrgyzstan330
74 Himachal Pradesh, India 325
Reading
1 Shanghai, China 556
2 South Korea539
3 Finland536
4 Hong Kong, China533
5 Singapore526
6 Canada524
7 New Zealand521
8 Japan520
9 Australia515
10 Netherlands508
11 Belgium506
12 Norway503
13 Estonia501
14 Switzerland501
15 Poland500
16 Iceland500
17 United States500
18 Liechtenstein499
19 Sweden497
20 Germany497
21 Ireland496
22 France496
23 Taiwan495
24 Denmark495
25 United Kingdom494
26 Hungary494
27 Portugal489
28 Macau, China487
29 Italy486
30 Latvia484
31 Greece483
32 Slovenia483
33 Spain481
34 Czech Republic478
35 Slovakia477
36 Croatia476
37 Israel474
38 Luxembourg472
39 Austria470
40 Lithuania468
41 Turkey464
42 Russia459
43 Chile449
44 Costa Rica443
45 Malta442
46 Serbia442
47 United Arab Emirates431
48 Bulgaria429
49 Uruguay426
50 Mexico425
51 Romania424
52 Miranda, Venezuela 422
53 Thailand421
54 Trinidad and Tobago416
55 Malaysia414
56 Colombia413
57 Brazil412
58 Montenegro408
59 Mauritius407
60 Jordan405
61 Tunisia404
62 Indonesia402
63 Argentina398
64 Kazakhstan390
65 Moldova388
66 Albania385
67 Georgia374
68 Qatar372
69 Panama371
70 Peru370
71 Azerbaijan362
72 Tamil Nadu, India 337
73 Himachal Pradesh, India 317
74 Kyrgyzstan314


PISA 2006[edit]

OECD members as of the time of the study are in boldface. Reading scores for the United States were disqualified.
Mathematics
1 Taiwan 549
2 Finland548
3 Korea547
4 Hong Kong, China547
5 Netherlands531
6 Switzerland530
7 Canada527
8 Macau, China525
9 Liechtenstein525
10 Japan523
11 New Zealand522
12 Belgium520
13 Australia520
14 Estonia515
15 Denmark513
16 Czech Republic510
17 Iceland506
18 Austria505
19 Slovenia504
20 Germany504
21 Sweden502
22 Ireland501
23 France496
24 United Kingdom495
25 Poland495
26 Slovakia492
27 Hungary491
28 Norway490
29 Luxembourg490
30 Lithuania486
31 Latvia486
32 Spain480
33 Russia476
34 Azerbaijan476
35 United States474
36 Croatia467
37 Portugal466
38 Italy462
39 Greece459
40 Israel442
41 Serbia435
42 Uruguay427
43 Turkey424
44 Thailand417
45 Romania415
46 Bulgaria413
47 Chile411
48 Mexico406
49 Montenegro399
50 Indonesia391
51 Jordan384
52 Argentina381
53 Colombia370
54 Brazil370
55 Tunisia365
56 Qatar318
57 Kyrgyzstan311
Science
1 Finland 563
2 Hong Kong, China542
3 Canada534
4 Taiwan532
5 Japan531
6 Estonia531
7 New Zealand530
8 Australia527
9 Netherlands525
10 Liechtenstein522
11 Korea522
12 Slovenia519
13 Germany516
14 United Kingdom515
15 Czech Republic513
16 Switzerland512
17 Austria511
18 Macau, China511
19 Belgium510
20 Ireland508
21 Hungary504
22 Sweden503
23 Poland498
24 Denmark496
25 France495
26 Croatia493
27 Iceland491
28 Latvia490
29 United States489
30 Slovakia488
31 Spain488
32 Lithuania488
33 Norway487
34 Luxembourg486
35 Russia479
36 Italy475
37 Portugal474
38 Greece473
39 Israel454
40 Chile438
41 Serbia436
42 Bulgaria434
43 Uruguay428
44 Turkey424
45 Jordan422
46 Thailand421
47 Romania418
48 Montenegro412
49 Mexico410
50 Indonesia393
51 Argentina391
52 Brazil390
53 Colombia388
54 Tunisia386
55 Azerbaijan382
56 Qatar349
57 Kyrgyzstan322
Reading
1 Korea 556
2 Finland547
3 Hong Kong, China536
4 Canada527
5 New Zealand521
6 Ireland517
7 Australia513
8 Liechtenstein510
9 Poland508
10 Sweden507
11 Netherlands507
12 Belgium501
13 Estonia501
14 Switzerland499
15 Japan498
16 Taiwan496
17 United Kingdom495
18 Germany495
19 Denmark494
20 Slovenia494
21 Macau, China492
22 Austria490
23 France488
24 Iceland484
25 Norway484
26 Czech Republic483
27 Hungary482
28 Latvia479
29 Luxembourg479
30 Croatia477
31 Portugal472
32 Lithuania470
33 Italy469
34 Slovakia466
35 Spain461
36 Greece460
37 Turkey447
38 Chile442
39 Russia440
40 Israel439
41 Thailand417
42 Uruguay413
43 Mexico410
44 Bulgaria402
45 Serbia401
46 Jordan401
47 Romania396
48 Indonesia393
49 Brazil393
50 Montenegro392
51 Colombia385
52 Tunisia380
53 Argentina374
54 Azerbaijan353
55 Qatar312
56 Kyrgyzstan285


PISA 2003[edit]

The results for PISA 2003 were released on 14 December 2004. This cycle tested 275,000 15-year-olds in mathematics, science, reading and problem solving, involving schools from 30 OECD member countries and 11 partner countries.[13] Note that the science and reading means displayed are for "All Students", even though not all students answered questions in these two domains; the 2003 OECD Technical Report (pages 208-209) gives different country means for the students who were actually exposed to these domains.[14]

OECD members at the time of the study are in boldface. The United Kingdom was disqualified due to a low response rate.
Mathematics
1 Hong Kong, China 550
2 Finland544
3 Korea542
4 Netherlands538
5 Liechtenstein536
6 Japan534
7 Canada532
8 Belgium529
9 Macau, China527
10 Switzerland527
11 Australia524
12 New Zealand523
13 Czech Republic516
14 Iceland515
15 Denmark514
16 France511
17 Sweden509
18 Austria506
19 Germany503
20 Ireland503
21 Slovakia498
22 Norway495
23 Luxembourg493
24 Poland490
25 Hungary490
26 Spain485
27 Latvia483
28 United States483
29 Russia468
30 Portugal466
31 Italy466
32 Greece445
33 Serbia437
34 Turkey423
35 Uruguay422
36 Thailand417
37 Mexico385
38 Indonesia360
39 Tunisia359
40 Brazil356
Science
1 Finland 548
2 Japan548
3 Hong Kong, China539
4 Korea538
5 Liechtenstein525
6 Australia525
7 Macau, China525
8 Netherlands524
9 Czech Republic523
10 New Zealand521
11 Canada519
12 Switzerland513
13 France511
14 Belgium509
15 Sweden506
16 Ireland505
17 Hungary503
18 Germany502
19 Poland498
20 Slovakia495
21 Iceland495
22 United States491
23 Austria491
24 Russia489
25 Latvia489
26 Spain487
27 Italy486
28 Norway484
29 Luxembourg483
30 Greece481
31 Denmark475
32 Portugal468
33 Uruguay438
34 Serbia436
35 Turkey434
36 Thailand429
37 Mexico405
38 Indonesia395
39 Brazil390
40 Tunisia385
Reading
1 Finland 543
2 Korea534
3 Canada528
4 Australia525
5 Liechtenstein525
6 New Zealand522
7 Ireland515
8 Sweden514
9 Netherlands513
10 Hong Kong, China510
11 Belgium507
12 Norway500
13 Switzerland499
14 Japan498
15 Macau, China498
16 Poland497
17 France496
18 United States495
19 Denmark492
20 Iceland492
21 Germany491
22 Austria491
23 Latvia491
24 Czech Republic489
25 Hungary482
26 Spain481
27 Luxembourg479
28 Portugal478
29 Italy476
30 Greece472
31 Slovakia469
32 Russia442
33 Turkey441
34 Uruguay434
35 Thailand420
36 Serbia412
37 Brazil403
38 Mexico400
39 Indonesia382
40 Tunisia375
Problem solving
1 Korea 550
2 Hong Kong, China548
3 Finland548
4 Japan547
5 New Zealand533
6 Macau, China532
7 Australia530
8 Liechtenstein529
9 Canada529
10 Belgium525
11 Switzerland521
12 Netherlands520
13 France519
14 Denmark517
15 Czech Republic516
16 Germany513
17 Sweden509
18 Austria506
19 Iceland505
20 Hungary501
21 Ireland498
22 Luxembourg494
23 Slovakia492
24 Norway490
25 Poland487
26 Latvia483
27 Spain482
28 Russia479
29 United States477
30 Portugal470
31 Italy469
32 Greece448
33 Thailand425
34 Serbia420
35 Uruguay411
36 Turkey408
37 Mexico384
38 Brazil371
39 Indonesia361
40 Tunisia345


PISA 2000[edit]

The results for the first cycle of the PISA survey were released on 14 November 2001. 265,000 15-year-olds were tested in 28 OECD countries and 4 partner countries in mathematics, science and reading. An additional 11 countries were tested in 2002.[15]

OECD members as of the time of the study are in boldface. The 11 partner countries tested in 2002, after the main group of 32, are italicized.
Mathematics
1 Hong Kong, China 560
2 Japan557
3 Korea547
4 New Zealand537
5 Finland536
6 Australia533
7 Canada533
8 Switzerland529
9 United Kingdom529
10 Belgium520
11 France517
12 Austria515
13 Denmark514
14 Iceland514
15 Liechtenstein514
16 Sweden510
17 Ireland503
18 Norway499
19 Czech Republic498
20 United States493
21 Germany490
22 Hungary488
23 Russia478
24 Spain476
25 Poland470
26 Latvia463
27 Italy457
28 Portugal454
29 Greece447
30 Luxembourg446
31 Israel433
32 Thailand432
33 Bulgaria430
34 Argentina388
35 Mexico387
36 Chile384
37 Albania381
38 Macedonia381
39 Indonesia367
40 Brazil334
41 Peru292
Science
1 Korea 552
2 Japan550
3 Hong Kong, China541
4 Finland538
5 United Kingdom532
6 Canada529
7 New Zealand528
8 Australia528
9 Austria519
10 Ireland513
11 Sweden512
12 Czech Republic511
13 France500
14 Norway500
15 United States499
16 Hungary496
17 Iceland496
18 Belgium496
19 Switzerland496
20 Spain491
21 Germany487
22 Poland483
23 Denmark481
24 Italy478
25 Liechtenstein476
26 Greece461
27 Russia460
28 Latvia460
29 Portugal459
30 Bulgaria448
31 Luxembourg443
32 Thailand436
33 Israel434
34 Mexico422
35 Chile415
36 Macedonia401
37 Argentina396
38 Indonesia393
39 Albania376
40 Brazil375
41 Peru333
Reading
1 Finland 546
2 Canada534
3 New Zealand529
4 Australia528
5 Ireland527
6 Hong Kong, China525
7 Korea525
8 United Kingdom523
9 Japan522
10 Sweden516
11 Austria507
12 Belgium507
13 Iceland507
14 Norway505
15 France505
16 United States504
17 Denmark497
18 Switzerland494
19 Spain493
20 Czech Republic492
21 Italy487
22 Germany484
23 Liechtenstein483
24 Hungary480
25 Poland479
26 Greece474
27 Portugal470
28 Russia462
29 Latvia458
30 Israel452
31 Luxembourg441
32 Thailand431
33 Bulgaria430
34 Mexico422
35 Argentina418
36 Chile410
37 Brazil396
38 Macedonia373
39 Indonesia371
40 Albania349
41 Peru327

Comparison with other studies[edit]

The correlation between PISA 2003 and TIMSS 2003 grade 8 country means is 0.84 in mathematics and 0.95 in science. The values drop to 0.66 and 0.79 if the two worst-performing developing countries are excluded. Correlations between different scales and studies are around 0.80. Such high correlations indicate common causes of country differences (e.g. educational quality, culture, wealth or genes) or a homogeneous underlying factor of cognitive competence. European Economic Area countries perform slightly better in PISA; Commonwealth of Independent States and Asian countries do slightly better in TIMSS. Content balance and years of schooling explain most of the variation.[16]
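The figures quoted above are ordinary Pearson correlations over paired country means. A small sketch with hypothetical paired means (not the actual 2003 data):

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical paired country means (PISA maths, TIMSS maths):
pisa  = [550, 544, 540, 498, 466, 423, 385, 360]
timss = [586, 544, 552, 508, 480, 430, 390, 397]
r = pearson(pisa, timss)  # strongly positive, as in the studies cited
```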


Reception[edit]

For many countries, the results from PISA 2000 were surprising. In Germany and the United States, for example, the comparatively low scores set off a heated debate about how the school system should be changed.[citation needed]

The results from PISA 2003 and PISA 2006 were featured in the 2010 documentary Waiting for "Superman".[21]


China[edit]

Students from Shanghai, China, had the top scores in every category (mathematics, reading and science) in PISA 2009 and 2012. In discussing these results, PISA spokesman Andreas Schleicher, Deputy Director for Education and head of the analysis division at the OECD's directorate for education, described Shanghai as a pioneer of educational reform in which "there has been a sea change in pedagogy". Schleicher stated that Shanghai abandoned its "focus on educating a small elite, and instead worked to construct a more inclusive system. They also significantly increased teacher pay and training, reducing the emphasis on rote learning and focusing classroom activities on problem solving."[22]

Schleicher also states that PISA tests administered in rural China have produced some results approaching the OECD average: Citing further, as-yet-unpublished OECD research, Schleicher said, "We have actually done Pisa in 12 of the provinces in China. Even in some of the very poor areas you get performance close to the OECD average."[23] Schleicher says that for a developing country, China's 99.4% enrollment in primary education is "the envy of many countries". He maintains that junior secondary school participation rates in China are now 99%; and in Shanghai, not only has senior secondary school enrollment attained 98%, but admissions into higher education have achieved 80% of the relevant age group. Schleicher believes that this growth reflects quality, not just quantity, which he contends the top PISA ranking of Shanghai's secondary education confirms.[23] Schleicher believes that China has also expanded school access and has moved away from learning by rote.[24] According to Schleicher, Russia performs well in rote-based assessments, but not in PISA, whereas China does well in both rote-based and broader assessments.[23]

Critics of PISA counter that in Shanghai and other Chinese cities, most children of migrant workers can only attend city schools up to the ninth grade, and must return to their parents' hometowns for high school due to hukou restrictions, thus skewing the composition of the city's high school students in favor of wealthier local families. A population chart of Shanghai reproduced in The New York Times shows a steep drop off in the number of 15-year-olds residing there.[25] According to Schleicher, 27% of Shanghai's 15-year-olds are excluded from its school system (and hence from testing). As a result, the percentage of Shanghai's 15-year-olds tested by PISA was 73%, lower than the 89% tested in the US.[26]

Education professor Yong Zhao has noted that PISA 2009 did not receive much attention in the Chinese media, and that the high scores in China are due to excessive workload and testing, adding that it's "no news that the Chinese education system is excellent in preparing outstanding test takers, just like other education systems within the Confucian cultural circle: Singapore, Korea, Japan, and Hong Kong."[27]


Finland[edit]

The stable, high marks of Finnish students have attracted much attention. According to Hannu Simola,[28] the results reflect a paradoxical mix of progressive policies implemented in a rather conservative pedagogical setting: teachers' high levels of academic preparation, social status, professionalism and motivation coexist with adherence to traditional roles and methods by both teachers and pupils in Finland's changing, but still quite paternalistic, culture. Others point to Finland's low poverty rate as a reason for its success.[29][30] Finnish education reformer Pasi Sahlberg attributes Finland's high educational achievements to its emphasis on social and educational equality and its stress on cooperation and collaboration, as opposed to the competition among teachers and schools that prevails in other nations.[31]


India[edit]

Of the 74 countries tested in the PISA 2009 cycle, including the "+" nations, the two Indian states came 72nd and 73rd out of 74 in both reading and mathematics, and 73rd and 74th in science. India's poor performance may not be linguistic, as some suggested: 12.87% of US students indicated that the language of the test differed from the language spoken at home, while a significantly higher 30.77% of Himachal Pradesh students did so.[32] However, unlike American students, the Indian students with a different language at home did better on the PISA test than those with the same language.[32] India's poor performance on PISA is consistent with its poor performance in the only other instance in which India's government allowed an international organization to test its students,[33] and with India's own testing of its elite students in a study titled Student Learning in the Metros 2006.[34] These studies were conducted using TIMSS questions. The poor result in PISA was greeted with dismay in the Indian media.[35] The BBC reported that, as of 2008, only 15% of India's students reach high school.[36]

India pulled out of the 2012 round of PISA testing, in August 2012, with the Indian government attributing its action to the unfairness of PISA testing to Indian students.[37] The Indian Express reported on 9/3/2012 that "The ministry (of education) has concluded that there was a socio-cultural disconnect between the questions and Indian students. The ministry will write to the OECD and drive home the need to factor in India's "socio-cultural milieu". India's participation in the next PISA cycle will hinge on this".[38] The Indian Express also noted that "Considering that over 70 nations participate in PISA, it is uncertain whether an exception would be made for India".

In June 2013, the Indian government, still concerned about the fairness of PISA testing for Indian students, withdrew India from the 2015 round of PISA testing as well.[39]


Criticism[edit]

In 2013, the Times Educational Supplement (TES) published an article, "Is PISA Fundamentally Flawed?" by William Stewart, detailing serious critiques of PISA's conceptual foundations and methods advanced by statisticians at major universities.[40]

In the article, Professor Harvey Goldstein of the University of Bristol was quoted as saying that when the OECD tries to rule out questions suspected of bias, it can have the effect of "smoothing out" key differences between countries. "That is leaving out many of the important things,” he warned. "They simply don't get commented on. What you are looking at is something that happens to be common. But (is it) worth looking at? PISA results are taken at face value as providing some sort of common standard across countries. But as soon as you begin to unpick it, I think that all falls apart."

University of Copenhagen Professor Svend Kreiner, who examined in detail PISA's 2006 reading results, noted that in 2006 only about ten percent of the students who took part in PISA were tested on all 28 reading questions. "This in itself is ridiculous,” Kreiner told Stewart. "Most people don't know that half of the students taking part in PISA (2006) do not respond to any reading item at all. Despite that, PISA assigns reading scores to these children."[41]

Queen's University Belfast mathematician Dr. Hugh Morrison stated that he found the statistical model underlying PISA to contain a fundamental, insoluble mathematical error that renders PISA rankings "valueless".[42] Goldstein remarked that Dr. Morrison's objection highlights "an important technical issue" if not a "profound conceptual error". However, Goldstein cautioned that PISA has been "used inappropriately", contending that some of the blame for this "lies with PISA itself. I think it tends to say too much for what it can do and it tends not to publicise the negative or the weaker aspects." Both Morrison and Goldstein expressed dismay at the OECD's response to criticism. Morrison said that when he first published his criticisms of PISA in 2004 and also personally queried several of the OECD's "senior people" about them, his points were met with "absolute silence" and have yet to be addressed. "I was amazed at how unforthcoming they were," he told TES. "That makes me suspicious." "PISA steadfastly ignored many of these issues," he says. "I am still concerned."[43]

Professor Kreiner agreed: "One of the problems that everybody has with PISA is that they don't want to discuss things with people criticising or asking questions concerning the results. They didn't want to talk to me at all. I am sure it is because they can't defend themselves."[43]

United States

Two studies have compared high achievers in mathematics on the PISA with those on the U.S. National Assessment of Educational Progress (NAEP). Comparisons were made between students scoring at the "advanced" and "proficient" levels in mathematics on the NAEP and the corresponding performance on the PISA. Overall, 30 nations had higher percentages of students at the "advanced" level of mathematics than the U.S.; the only OECD countries with worse results were Portugal, Greece, Turkey, and Mexico. Six percent of U.S. students were "advanced" in mathematics, compared to 28 percent in Taiwan. The highest-ranked U.S. state, Massachusetts, would have placed just 15th in the world had it been compared with the nations participating in the PISA. 31 nations had higher percentages of "proficient" students than the U.S.; Massachusetts, again the best-performing U.S. state, would have ranked just ninth.[44][45]

Comparisons with results from the Trends in International Mathematics and Science Study (TIMSS) appear to give different results, suggesting that U.S. states actually do better in world rankings.[46] This can likely be traced to differences in the material covered and to the United States teaching mathematics in a style less harmonious with the "Realistic Mathematics Education" that forms the basis of the PISA exam.[47] Countries that commonly use this teaching method score higher on PISA and lower on TIMSS and other assessments.[48]


Stephen Krashen, professor emeritus at the University of Southern California,[49] and Mel Riddile of the NASSP attributed the relatively low performance of U.S. students to the country's high rate of child poverty, which exceeds that of other OECD countries.[29][30] However, individual U.S. schools with poverty rates comparable to Finland's (below 10%), as measured by reduced-price school lunch participation, outperform Finland, and U.S. schools in the 10–24% reduced-price lunch range are not far behind.[50]

Reduced-price school lunch participation is the only poverty indicator available for individual U.S. schools. In the United States, schools in which fewer than 10% of students qualified for free or reduced-price lunch averaged PISA scores of 551, higher than any other OECD country. This can be compared with the other OECD countries, which have published figures on children living in relative poverty:[30]

Country             Percent of students with reduced-price lunches (US)[30]
                    or in relative child poverty (other OECD countries)[51]   PISA score[52]
United States       < 10%                                                     551
United States       10%–24.9%                                                 527
New Zealand         16.3%                                                     521
United States       25%–49.9%                                                 502
United States       50%–74.9%                                                 471
Russian Federation  58.3%                                                     459
United States       > 75%                                                     446

Sampling errors

In 2013 Martin Carnoy of the Stanford University Graduate School of Education and Richard Rothstein of the Economic Policy Institute released a report, "What do international tests really show about U.S. student performance?", analyzing the 2009 PISA database. Their report found that U.S. PISA test scores had been lowered by a sampling error that over-represented adolescents from the most disadvantaged American schools in the test-taking sample.[53] The authors cautioned that international test scores are often "interpreted to show that American students perform poorly when compared to students internationally" and that school reformers then conclude that "U.S. public education is failing." Such inferences, made before the data has been carefully analyzed, they say, "are too glib"[54] and "may lead policymakers to pursue inappropriate and even harmful reforms."[55]

Carnoy and Rothstein observe that in all countries, students from disadvantaged backgrounds perform worse than those from advantaged backgrounds, and that the U.S. has a greater percentage of students from disadvantaged backgrounds. The sampling error in the PISA results lowered U.S. scores for 15-year-olds even further, they say. The authors add, however, that in countries such as Finland the scores of disadvantaged students tend to be stagnant, whereas in the U.S. the scores of disadvantaged students have been steadily rising over time, albeit still lagging behind those of their more advantaged peers. When the figures are adjusted for social class, the PISA scores of all U.S. students would still remain behind those of the highest-scoring countries; nevertheless, the scores of U.S. students of all social backgrounds have shown a trajectory of improvement over time, notably in mathematics, a circumstance PISA's report fails to take into account.

Carnoy and Rothstein write that PISA spokesman Schleicher has been quoted saying that "international education benchmarks make disappointing reading for the U.S." and that "in the U.S. in particular, poverty was destiny. Low-income American students did (and still do) much worse than high-income ones on PISA. But poor kids in Finland and Canada do far better relative to their more privileged peers, despite their disadvantages" (Ripley 2011).[56] Carnoy and Rothstein state that their report's analysis shows Schleicher and Ripley's claims to be untrue. They further fault the way PISA's results have persistently been released to the press before experts have time to evaluate them, and they charge the OECD reports with inconsistency in explaining such factors as the role of parental education. Carnoy and Rothstein also note with alarm that U.S. Secretary of Education Arne Duncan regularly consults with PISA's Andreas Schleicher in formulating educational policy before other experts have been given a chance to analyze the results.[57] Carnoy and Rothstein's report (written before the release of the 2011 database) concludes:

We are most certain of this: To make judgments only on the basis of national average scores, on only one test, at only one point in time, without comparing trends on different tests that purport to measure the same thing, and without disaggregation by social class groups, is the worst possible choice. But, unfortunately, this is how most policymakers and analysts approach the field.

The most recent test for which an international database is presently available is PISA, administered in 2009. A database for TIMSS 2011 is scheduled for release in mid-January 2013. In December 2013, PISA will announce results and make data available from its 2012 test administration. Scholars will then be able to dig into TIMSS 2011 and PISA 2012 databases so they can place the publicly promoted average national results in proper context. The analyses we have presented in this report should caution policymakers to await understanding of this context before drawing conclusions about lessons from TIMSS or PISA assessments.[58]

Research on possible causes of PISA disparities in different countries

Although PISA and TIMSS officials and researchers themselves generally refrain from hypothesizing about the large and stable differences in student achievement between countries, a large literature on the differences in PISA and TIMSS results and their possible causes has emerged since 2000.[59] Data from PISA have furnished several researchers, notably Eric Hanushek, Ludger Woessmann, Heiner Rindermann, and Stephen J. Ceci, with material for books and articles about the relationship between student achievement and economic development,[60] democratization, and health,[61] as well as the roles of such single educational factors as high-stakes exams,[62] the presence or absence of private schools, and the effects and timing of ability tracking.[63]

References

  1. ^ PISA 2009 Technical Report, 2012, OECD,
  2. ^ Hefling, Kimberly. "Asian nations dominate international test". Yahoo!. 
  3. ^ "Chapter 2 of the publication 'PISA 2003 Assessment Framework'" (pdf). 
  4. ^ Baldi, Stéphane; Jin, Ying; Skemer, Melanie; Green, Patricia J; Herget, Deborah; Xie, Holly (2007-12-10), Highlights From PISA 2006: Performance of U.S. 15-Year-Old Students in Science and Mathematics Literacy in an International Context, NCES, retrieved 2013-12-14, PISA 2006 reading literacy results are not reported for the United States because of an error in printing the test booklets. Furthermore, as a result of the printing error, the mean performance in mathematics and science may be misestimated by approximately 1 score point. The impact is below one standard error. 
  5. ^ a b PISA 2009 Results: Executive Summary, OECD, 2010-12-07 
  6. ^ a b ACER releases results of PISA 2009+ participant economies, ACER, 2011-12-16 
  7. ^ a b PISA 2012 Results in Focus, OECD, 3 December 2013, retrieved 4 December 2013 
  8. ^ C. Füller: "Pisa hat einen kleinen, fröhlichen Bruder" ["PISA has a happy little brother"]. taz, 5 December 2007 [1]
  9. ^ Stanat, P; Artelt, C; Baumert, J; Klieme, E; Neubrand, M; Prenzel, M; Schiefele, U; Schneider, W (2002), PISA 2000: Overview of the study—Design, method and results, Berlin: Max Planck Institute for Human Development 
  10. ^ Multi-dimensional Data Request, OECD, 2010, retrieved 2012-06-28 
  11. ^ PISA 2009 Results: Executive Summary (Figure 1 only), OECD, 2010, retrieved 2012-06-28 
  12. ^ Walker, Maurice (2011), PISA 2009 Plus Results, OECD, retrieved 2012-06-28 
  13. ^ Learning for Tomorrow’s World First Results from PISA 2003, OECD, 2004-12-14, retrieved 2014-01-06 
  14. ^ PISA 2003 Technical Report, OECD 
  15. ^ Literacy Skills for the World of Tomorrow: Further Results from PISA 2000, OECD, 2003, retrieved 2014-01-06 
  16. ^ M. L. Wu: A Comparison of PISA and TIMSS 2003 achievement results in Mathematics. Paper presented at the AERA Annual Meeting, New York, March 2008.
  17. ^ Kronholz, June (2004-11-07), "Economic Time Bomb: U.S. Teens Are Among Worst at Math", The Wall Street Journal, retrieved 2014-01-06 
  18. ^ "Preocupe-se. Seu filho é mal educado" ["Worry: your child is badly educated"], Veja, 2007-11-07, retrieved 2013-04-13 
  19. ^ Aunión, Juan Antonio (2007-12-05), "La educación española retrocede" ["Spanish education is falling behind"], El País, retrieved 2014-01-06 
  20. ^ "Finnish teens score high marks in latest PISA study", Helsingin Sanomat, 2007-11-30, retrieved 2014-01-06 
  21. ^ "Waiting for "Superman" trailer". Retrieved 8 October 2010. 
  22. ^ Gumbel, Peter (7 December 2010), "China Beats Out Finland for Top Marks in Education", Time, retrieved 27 June 2012 
  23. ^ a b c Cook, Chris (7 December 2010), "Shanghai tops global state school rankings", Financial Times, retrieved 28 June 2012 
  24. ^ Mance, Henry (7 December 2010), "Why are Chinese schoolkids so good?", Financial Times, retrieved 28 June 2012 
  25. ^ Helen Gao, "Shanghai Test Scores and the Mystery of the Missing Children", New York Times, January 23, 2014. For Schleicher's initial response to these criticisms see his post, "Are the Chinese Cheating in PISA Or Are We Cheating Ourselves?" on the OECD's website blog, Education Today, December 10, 2013.
  26. ^ William Stewart, "More than a quarter of Shanghai pupils missed by international Pisa rankings", Times Educational Supplement, March 6, 2014.
  27. ^ Yong Zhao (10 December 2010), A True Wake-up Call for Arne Duncan: The Real Reason Behind Chinese Students Top PISA Performance 
  28. ^ Simola, Hannu (2005), "The Finnish miracle of PISA: Historical and sociological remarks on teaching and teacher education", Comparative Education 41 (4): 455–470, doi:10.1080/03050060500317810 
  29. ^ a b "The Economics Behind International Education Rankings" National Educational Association
  30. ^ a b c d Riddile, Mel (15 December 2010), PISA: It's Poverty Not Stupid, National Association of Secondary School Principals 
  31. ^ Cleland, Elizabeth. "What Americans Keep Ignoring About Finland's School Success – Anu Partanen". The Atlantic. 
  32. ^ a b "Database – PISA 2009". 
  33. ^
  34. ^ Initiatives, Educational (November 2006), "Student Learning in the Metros", Educational Initiatives 
  35. ^ Vishnoi, Anubhuti (7 January 2012), "Poor PISA ranks: HRD seeks reason", The Indian Express 
  36. ^ Masani, Zareer (27 February 2008). "India still Asia's reluctant tiger". BBC News. 
  37. ^ Hemali Chhapia, TNN (3 August 2012). "India backs out of global education test for 15-year-olds". The Times of India. 
  38. ^ "Poor PISA score: Govt blames ‘disconnect’ with India". The Indian Express. 3 September 2012. 
  39. ^ "India chickens out of international students assessment programme again". The Times of India. 1 June 2013. 
  40. ^ William Stewart, "Is Pisa fundamentally flawed?" Times Educational Supplement, July 29, 2013.
  41. ^
  42. ^,387514,en.pdf
  43. ^ a b Stewart, "Is PISA fundamentally flawed?" TES (2013).
  44. ^ Paul E. Peterson, Ludger Woessmann, Eric A. Hanushek, and Carlos X. Lastra-Anadón (2011) "Are U.S. students ready to compete? The latest on each state's international standing." Education Next 11:4 (Fall): 51–59.
  45. ^ Eric A. Hanushek, Paul E. Peterson, and Ludger Woessmann (2011) "Teaching math to the talented." Education Next 11, no. 1 (Winter): 10–18.
  46. ^ Gary W. Phillips (2007) Chance favors the prepared mind: Mathematics and science indicators for comparing states. Washington: American Institutes for Research (14 November); Gary W. Phillips (2009) The Second Derivative:International Benchmarks in Mathematics For U.S. States and School Districts. Washington, DC: American Institutes for Research (June).
  47. ^ "PISA Mathematics: A Teacher’s Guide". 
  48. ^ Loveless, Tom. "International Tests Are Not All the Same". Brookings Institution. 
  49. ^ quoted in Valerie Strauss, "How poverty affected U.S. PISA scores", The Washington Post, December 9, 2010.
  50. ^ "Stratifying PISA scores by poverty rates suggests imitating Finland is not necessarily the way to go for US schools". Simply Statistics. 23 August 2013. 
  51. ^ "Child poverty statistics: how the UK compares to other countries", The Guardian. The same UNICEF figures were used by Riddile.
  52. ^ Highlights From PISA 2009, Table 3.
  53. ^ See, Martin Carnoy and Richard Rothstein, "What do international tests really show about U.S. student performance?", Economic Policy Institute, January 28, 2013.
  54. ^ Valerie Strauss, "U.S. scores on international test lowered by sampling error: report", Washington Post, January 15, 2013.
  55. ^ Carnoy and Rothstein, "What do international tests really show about U.S. student performance?", Economic Policy Institute, January 28, 2013
  56. ^ Schleicher was quoted by Amanda Ripley to this effect in her 2011 book, The Smartest Kids in The World (Simon and Schuster).
  57. ^ Carnoy and Rothstein, "What do international tests really show about U.S. student performance?", EPI, January 28, 2013. Another scholar, Matthew di Carlo of the Albert Shanker Institute, criticized PISA for reporting its results in the form of national rankings, since rankings can give a misleading impression that differences between countries' scores are far larger than is actually the case. Di Carlo also faulted PISA's methodology for disregarding factors such as margin of error. See Matthew di Carlo, "Pisa For Our Time: A Balanced Look", Albert Shanker Institute website, January 10, 2011.
  58. ^ Carnoy and Rothstein, "What do international tests really show about U.S. student performance?", EPI, January 28, 2013.
  59. ^ Hanushek, Eric A., and Ludger Woessmann. 2011. "The economics of international differences in educational achievement." In Handbook of the Economics of Education, Vol. 3, edited by Eric A. Hanushek, Stephen Machin, and Ludger Woessmann. Amsterdam: North Holland: 89–200.
  60. ^ Hanushek, Eric; Woessmann, Ludger (2008), "The role of cognitive skills in economic development", Journal of Economic Literature 46 (3): 607–668, doi:10.1257/jel.46.3.607 
  61. ^ Rindermann, Heiner; Ceci, Stephen J (2009), "Educational policy and country outcomes in international cognitive competence studies", Perspectives on Psychological Science 4 (6): 551–577, doi:10.1111/j.1745-6924.2009.01165.x 
  62. ^ Bishop, John H (1997), "The effect of national standards and curriculum-based exams on achievement", American Economic Review 87 (2): 260–264 
  63. ^ Hanushek, Eric; Woessmann, Ludger (2006), "Does educational tracking affect performance and inequality? Differences-in-differences evidence across countries", Economic Journal 116 (510): C63–C76, doi:10.1111/j.1468-0297.2006.01076.x 
