WLD_2006_PISA_v01_M
Programme for International Student Assessment 2006
Name | Country code |
---|---|
Argentina | ARG |
Australia | AUS |
Austria | AUT |
Azerbaijan | AZE |
Belgium | BEL |
Bulgaria | BGR |
Brazil | BRA |
Canada | CAN |
Switzerland | CHE |
Chile | CHL |
Colombia | COL |
Czech Republic | CZE |
Germany | DEU |
Denmark | DNK |
Spain | ESP |
Estonia | EST |
Finland | FIN |
France | FRA |
United Kingdom | GBR |
Greece | GRC |
Hong Kong SAR, China | HKG |
Croatia | HRV |
Hungary | HUN |
Indonesia | IDN |
Ireland | IRL |
Iceland | ISL |
Israel | ISR |
Italy | ITA |
Jordan | JOR |
Japan | JPN |
Kyrgyz Republic | KGZ |
Korea, Rep. | KOR |
Liechtenstein | LIE |
Lithuania | LTU |
Luxembourg | LUX |
Latvia | LVA |
Macao SAR, China | MAC |
Mexico | MEX |
Montenegro | MNE |
Netherlands | NLD |
Norway | NOR |
New Zealand | NZL |
Poland | POL |
Portugal | PRT |
Qatar | QAT |
Romania | ROU |
Russian Federation | RUS |
Serbia | SRB |
Slovak Republic | SVK |
Slovenia | SVN |
Sweden | SWE |
Thailand | THA |
Tunisia | TUN |
Turkiye | TUR |
Taiwan, China | TWN |
Uruguay | URY |
United States | USA |
The OECD Programme for International Student Assessment (PISA) is a collaborative effort undertaken by all member countries and a number of non-member partner countries to measure how well students, at age 15, are prepared to meet the challenges they may encounter in future life. Age 15 is chosen because, in most OECD countries, students at this age are approaching the end of compulsory schooling, so an assessment at this point captures some measure of the knowledge, skills and attitudes accumulated over approximately ten years of education. The PISA assessment takes a broad approach to assessing knowledge, skills and attitudes that reflect current changes in curricula, moving beyond the school-based approach towards the use of knowledge in everyday tasks and challenges. The skills acquired reflect the ability of students to continue learning throughout their lives by applying what they learn in school to non-school environments, evaluating their choices and making decisions. The assessment, jointly guided by the participating governments, brings together the policy interests of countries by applying scientific expertise at both national and international levels.
PISA combines the assessment of domain-specific cognitive areas such as science, mathematics and reading with information on students' home background, their approaches to learning, their perceptions of their learning environments and their familiarity with computers. A high priority in PISA 2006 was an innovative assessment of student attitudes towards science: questions about this were contextualised within the cognitive part of the test. Bringing the attitude items closer to the cognitive questions allowed questions to be targeted at specific areas, with the focus on interest in science and students' support for scientific enquiry. Student outcomes are then associated with these background factors.
PISA uses: i) strong quality assurance mechanisms for translation, sampling and test administration; ii) measures to achieve cultural and linguistic breadth in the assessment materials, particularly through countries' participation in the development and revision processes for the production of the items; and iii) state-of-the-art technology and methodology for data handling. The combination of these measures produces high-quality instruments and outcomes with superior levels of validity and reliability to improve the understanding of education systems as well as students' knowledge, skills and attitudes.
PISA is based on a dynamic model of lifelong learning in which new knowledge and skills necessary for successful adaptation to a changing world are continuously acquired throughout life. PISA focuses on things that 15-year-old students will need in the future and seeks to assess what they can do with what they have learned. The assessment is informed, but not constrained, by the common denominator of national curricula. Thus, while it does assess students' knowledge, PISA also examines their ability to reflect, and to apply their knowledge and experience to real-world issues. For example, in order to understand and evaluate scientific advice on food safety, an adult would need not only to know some basic facts about the composition of nutrients, but also to be able to apply that information. The term "literacy" is used to encapsulate this broader concept of knowledge and skills.
PISA is designed to collect information through three-yearly cycles and presents data on the reading, mathematical and scientific literacy of students, schools and countries. It provides insights into the factors that influence the development of skills and attitudes at home and at school, and examines how these factors interact and what the implications are for policy development.
PISA 2006 is the third cycle of a data strategy defined in 1997 by participating countries. The results allow national policy makers to compare the performance of their education systems with those of other countries. Similar to the previous cycles, the 2006 assessment covers the domains of reading, mathematical and scientific literacy, with the major focus on scientific literacy. Students also respond to a background questionnaire, and additional supporting information is gathered from the school authorities. Fifty-seven countries and economies, including all 30 OECD member countries, are taking part in the PISA 2006 assessment; together, they comprise almost 90% of the world's economy.
Since the aim of PISA is to assess the cumulative yield of education systems at an age where compulsory schooling is still largely universal, testing focuses on 15-year-olds enrolled in both school-based and work-based educational programmes. Between 5 000 and 10 000 students from at least 150 schools are typically tested in each country, providing a good sampling base from which to break down the results according to a range of student characteristics.
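As a rough illustration of how a sample of that size might be drawn, the sketch below implements a simplified two-stage selection (schools first, then students within schools) in Python. The function name, students-per-school figure and synthetic data are assumptions made for illustration only; actual PISA sampling uses stratified, probability-proportional-to-size school selection and formal weighting, which this sketch omits.

```python
import random

# Simplified two-stage sample with the magnitudes quoted above: at least 150
# schools per country and roughly 5 000-10 000 students in total. Real PISA
# sampling stratifies schools and selects them with probability proportional
# to size; this sketch only illustrates the two stages. (Illustrative only.)

def draw_sample(schools, n_schools=150, students_per_school=35, seed=2006):
    """schools maps a school id to its list of eligible 15-year-old students."""
    rng = random.Random(seed)
    chosen_schools = rng.sample(list(schools), min(n_schools, len(schools)))
    sample = []
    for school_id in chosen_schools:
        eligible = schools[school_id]
        take = min(students_per_school, len(eligible))
        sample.extend((school_id, student) for student in rng.sample(eligible, take))
    return sample

# Synthetic example: 600 schools with 20-60 eligible students each.
sizes = random.Random(0).choices(range(20, 61), k=600)
schools = {f"school_{i}": [f"student_{i}_{j}" for j in range(n)] for i, n in enumerate(sizes)}
sample = draw_sample(schools)
print(len(sample), "students sampled from", len({sid for sid, _ in sample}), "schools")
```

With 150 schools and about 35 students per school, the resulting sample of roughly 5 000 students falls at the lower end of the range quoted above.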
The primary aim of the PISA assessment is to determine the extent to which young people have acquired the wider knowledge and skills in reading, mathematical and scientific literacy that they will need in adult life. The assessment of cross-curricular competencies continues to be an integral part of PISA 2006. The main reasons for this broadly oriented approach are:
• Although specific knowledge acquisition is important in school learning, the application of that knowledge in adult life depends crucially on the acquisition of broader concepts and skills. In science, having specific knowledge, such as the names of plants and animals, is of less value than understanding broad topics such as energy consumption, biodiversity and human health in thinking about the issues under debate in the adult community. In reading, the capacity to develop interpretations of written material and to reflect on the content and qualities of text is a central skill. In mathematics, being able to reason quantitatively and to represent relationships or dependencies is more apt than the ability to answer familiar textbook questions when it comes to deploying mathematical skills in everyday life.
• In an international setting, a focus on curriculum content would restrict attention to curriculum elements common to all or most countries. This would force many compromises and result in an assessment too narrow to be of value for governments wishing to learn about the strengths and innovations in the education systems of other countries.
• Certain broad, general skills are essential for students to develop. They include communication, adaptability, flexibility, problem solving and the use of information technologies. These skills are developed across the curriculum, and an assessment of them requires a broad cross-curricular focus.
PISA is not a single cross-national assessment of the reading, mathematics and science skills of 15-year-old students. It is an ongoing programme that, over the longer term, will lead to the development of a body of information for monitoring trends in the knowledge and skills of students in various countries as well as in different demographic subgroups of each country. On each occasion, one domain will be tested in detail, taking up nearly two-thirds of the total testing time. The major domain was reading literacy in 2000 and mathematical literacy in 2003, and is scientific literacy in 2006. This will provide a thorough analysis of achievement in each area every nine years and a trend analysis every three. Similar to previous cycles of PISA, the total time spent on the PISA 2006 tests by each student is two hours, but information is obtained on about 390 minutes' worth of test items. The total set of questions is packaged into 13 linked testing booklets. Each booklet is taken by a sufficient number of students for appropriate estimates to be made of the achievement levels on all items by students in each country and in relevant sub-groups within a country (such as males and females, and students from different social and economic contexts). Students also spend 30 minutes answering questions for the context questionnaire.
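To illustrate how 13 linked booklets can cover about 390 minutes of material while each student sits only a two-hour test, the sketch below builds a hypothetical rotated design in Python, assuming the item pool is split into 13 half-hour clusters with four clusters per booklet. The cyclic offsets and the resulting cluster-to-booklet map are assumptions chosen for illustration; they are not the actual PISA 2006 booklet layout.

```python
from collections import Counter

# Hypothetical rotated booklet design: 13 clusters x 30 min = 390 minutes of
# items, 4 clusters x 30 min = 120 minutes per booklet. The offsets below are
# an illustrative cyclic construction, not the real PISA 2006 booklet map.
N_CLUSTERS = 13
CLUSTERS_PER_BOOKLET = 4
OFFSETS = (0, 1, 3, 9)  # assumed offsets, chosen only so that booklets overlap

booklets = {b: [(b + o) % N_CLUSTERS for o in OFFSETS] for b in range(N_CLUSTERS)}

# Each cluster appears in the same number of booklets, so responses collected
# from different booklets can be linked onto a common scale for all items.
coverage = Counter(c for clusters in booklets.values() for c in clusters)
assert set(coverage.values()) == {CLUSTERS_PER_BOOKLET}

for b, clusters in sorted(booklets.items()):
    print(f"Booklet {b + 1:2d}: clusters {[c + 1 for c in clusters]}")
```

Because every cluster is answered by the students assigned to several different booklets, achievement estimates can be produced for the whole item pool even though no individual student sees more than roughly a third of it.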
The PISA assessment provides three main types of outcomes:
• Basic indicators that provide a baseline profile of the knowledge and skills of students.
• Contextual indicators that show how such skills relate to important demographic, social, economic and educational variables.
• Indicators on trends that emerge from the on-going nature of the data collection and that show changes in outcome levels and distributions, and in relationships between student-level and school-level background variables and outcomes.
The scope of the Programme for International Student Assessment (PISA) 2006 includes:
• OECD countries
• Partner countries/economies
Name |
---|
Organisation for Economic Co-operation and Development (OECD) |
The questionnaires seek information about:
• Students and their family backgrounds, including their economic, social and cultural capital
• Aspects of students' lives, such as their attitudes towards learning, their habits and life inside school, and their family environment
• Aspects of schools, such as the quality of the schools' human and material resources, public and private control and funding, decision-making processes, and staffing practices
• Context of instruction, including institutional structures and types, class size, and the level of parental involvement
• Strategies of self-regulated learning, motivational preferences and goal orientations, self-related cognition mechanisms, action control strategies, preferences for different types of learning situations, learning styles, and social skills required for co-operative or competitive learning
• Aspects of learning and instruction in science, including students' motivation, engagement and confidence with science, and the impact of learning strategies on achievement related to the teaching and learning of science
Two additional questionnaires are offered as international options:
• A computer familiarity questionnaire focusing on: i) availability and use of information and communications technology (ICT), including the location where ICT is mostly used as well as the type of use; ii) ICT confidence and attitudes, including self-efficacy and attitudes towards computers; and iii) learning background of ICT, focusing on where students learned to use computers and the Internet. The OECD published a report resulting from analysis of data collected via this questionnaire in 2003, Are Students Ready for a Technology-Rich World? What PISA Studies Tell Us (OECD, 2005).
• A parent questionnaire focusing on a number of topics including the student's past science activities, parents' views on the student's school, parents' views on science in the student's intended career and the need for scientific knowledge and skills in the job market, parents' views on science and the environment, the cost of education services, and parents' education and occupation.
Start | End |
---|---|
2006 | 2006 |
The design and implementation of PISA for the 2000, 2003 and 2006 data collections has been the responsibility of an international consortium led by the Australian Council for Educational Research (ACER) with Ray Adams as international project director. The other partners in this consortium have been the National Institute for Educational Measurement (Cito Group) in the Netherlands, Unité d’analyse des systèmes et des pratiques d’enseignement (aSPe) at Université de Liège in Belgium, Westat and the Educational Testing Service (ETS) in the United States and the National Institute for Educational Research (NIER) in Japan.
The consortium implements PISA within a framework established by the PISA Governing Board (PGB), which includes representation from all participating countries at senior policy levels. The PGB established policy priorities and standards for developing indicators, for establishing assessment instruments, and for reporting results. Experts from participating countries served on working groups linking the programme policy objectives with the best internationally available technical expertise in the three assessment areas. These expert groups were referred to as Subject Matter Expert Groups (SMEGs) (see Appendix 8 for members). By participating in these expert groups and regularly reviewing outcomes of the groups' meetings, countries ensured that the instruments were internationally valid, that they took into account the cultural and educational contexts of the different OECD member countries, that the assessment materials had strong measurement potential, and that the instruments emphasised authenticity and educational validity.
Each of the participating countries appointed a National Project Manager (NPM) to implement PISA nationally. The NPM ensured that internationally agreed common technical and administrative procedures were employed. These managers played a vital role in developing and validating the international assessment instruments and ensured that PISA implementation was of high quality. The NPMs also contributed to the verification and evaluation of the survey results, analyses and reports.
The OECD Secretariat had overall responsibility for managing the programme. It monitored its implementation on a day-to-day basis, served as the secretariat for the PGB, fostered consensus building between the countries involved, and served as the interlocutor between the PGB and the international consortium.
Use of the dataset must be acknowledged using a citation.
Example:
Organisation for Economic Co-operation and Development (OECD). Programme for International Student Assessment (PISA) 2006. Ref. WLD_2006_PISA_v01_M. Dataset downloaded from www.microdata.worldbank.org on [date].
The user of the data acknowledges that the original collector of the data, the authorized distributor of the data, and the relevant funding agency bear no responsibility for use of the data or for interpretations or inferences based upon such uses.
DDI_WLD_2006_PISA_v02_M
DDI Document - Version 02 - (04/21/21)
This version is identical to DDI_WLD_2006_PISA_v01_M, except that the country field has been updated to capture all the countries covered by the survey.