
Dr Jarrod Hingston
Head of Division
Our division is driven by a clear purpose: to give educators the evidence they need to understand each learner’s progress and to act on it with confidence. Every day, we see this purpose come to life in classrooms. We see school leaders applying evidence to guide teaching, educators gaining clarity through our workshops, and students demonstrating measurable progress in their learning informed by evidence.
This year, our focus was on deepening the impact of our programs. We prioritised research-informed design to enhance our assessments and tools so they better support the needs of educators and learners.
We recorded strong growth in engagement across our programs – with increased participation in workshops, broader adoption of our assessment tools, and more conversations about evidence-informed practice. These are tangible indicators of the expanding reach and influence of our work in Australia and around the world.
To support the needs of early years teachers, we reimagined the assessment experience through the development of PAT Early Literacy, a new addition to our flagship PAT suite. Grounded in a new framework, PAT Early Literacy invites young learners on an engaging quest, guided by friendly characters. Its creation brought together our user experience, assessment development, and delivery teams to rethink how assessments can be designed and delivered. Early trial feedback has been encouraging, with students responding enthusiastically to the new format, and we look forward to sharing its impact as the assessment rolls out in 2026.
Work is also underway for the delivery of the Early Numeracy Screener at the beginning of 2027, with trialling to start in 2026. This important tool will provide teachers of year 1 students with information on number representation, relations and operations – areas where limited understanding is known to be a risk factor for challenges in future maths learning. The approach aligns with Australia’s Better and Fairer Schools Agreement which requires schools to identify student learning needs early and employ effective interventions for improvement.
We have also continued to refine the educator experience across our platforms, addressing key pain points such as candidate management and session invigilation in close collaboration with our User Experience team. We look forward to further enhancing our services to reduce the administrative load on educators and to ensure we deliver clear insights quickly.
This year we also successfully delivered fair and reliable assessments for several admissions processes, including entry into Victorian Selective Entry High Schools on behalf of the Victoria Department of Education, and the Western Australian Academic Selective Entrance Test on behalf of the Western Australian Department of Education. These exams sit within our wider selection and scholarships portfolio that provides schools across Australia with the data needed to make informed decisions.
Our work with international schools also continues to grow, reinforcing the relevance of our assessments across diverse curricula and contexts. Educators globally face similar questions: Where are my students in their learning? What are their next steps? How much progress have they made?
In this sector, we support schools through the long-standing International Schools’ Assessment (ISA) program and the newer Progressive Achievement for International Schools (PAIS) assessment, which adapts PAT’s research and technology for international settings.
The education sector is continuing to shift from a narrow focus on marks and rankings toward broader demonstrations of student capability. We aim to contribute to this shift by providing tools that support meaningful learning outcomes.
We recognise the increasing demands on educators to collect and interpret data. Our focus remains on delivering high-quality, purposeful data – not data for its own sake. Through ACER’s psychometric expertise, we help educators understand where a student is on a learning continuum, measuring and informing their progress over time.
Dayton Primary School in Western Australia uses PAT as a core diagnostic tool to understand student progress and strengthen teaching across the school. Principal Dr Ray Boyd describes PAT as “a muscle biopsy” that provides a precise snapshot of where learners are, so teachers can target instruction where it is needed most.
As a new school that first opened in 2023, Dayton Primary established PAT Reading and PAT Maths as baseline measures, testing early in the year and again each November to track growth. The school analyses scale-score patterns and cohort trends to determine expected progress and compares each cohort’s development against national benchmarks to identify strengths and gaps.
PAT helps ensure no students are left behind. Dayton Primary School has seen some of its most vulnerable learners make substantial scale-score gains within a single year, demonstrating the impact of targeted teaching and well-structured intervention.
Teachers use the school’s data dashboards to identify common learning needs, such as shared gaps in grammar, reading or mathematical concepts, and adjust upcoming instruction accordingly. “PAT opens up the lines of inquiry,” says Dr Boyd. “It tells us what to look for when we walk into classrooms.”
By combining explicit teaching with nationally normed evidence, Dayton is strengthening consistency, raising expectations, and ensuring every learner shows measurable progress.
countries represented
schools participated
assessments delivered
tests delivered
students assessed
educators accessed teacher resources
positive feedback from educators
tests administered
Lisa Norris
Head of Division
Our Tertiary and Industry Tools (TaIT) team ensures that adult learners can be fairly and equitably assessed so they can succeed in their chosen profession, whatever it may be. We specialise in the development and delivery of high-quality educational products and services for tertiary and vocational learners and those seeking professional entry or accreditation.
Our assessments are designed by ACER’s leading experts and validated by psychometricians to ensure confidence, rigour and fairness. We work with an incredibly diverse range of clients around the world and pride ourselves on service that is reliable, responsive and built on long-term partnership. Whether supporting correctional facilities in Australia or universities in Denmark, Malaysia, Ireland, the UK or Papua New Guinea, our purpose remains the same: to give every candidate a genuine opportunity to demonstrate their capability.
This year, TaIT delivered more than 328,000 assessments across the globe, including nearly 92,000 sittings in test centres and more than 101,000 sessions via remote proctoring. Our candidates came from over 100 countries, reflecting the scale and reach of our work. What drives us is knowing that every assessment represents a person taking an important step toward their future.
We also continued to support police recruitment processes in Queensland and Victoria. Through ACER’s police entrance exams, applicants from more than 65 countries were assessed for their readiness to join Queensland Police via remote proctoring, ensuring a consistent and reliable assessment experience regardless of location. Our Financial Adviser Exam also remained a key professional gateway, with more than 1,000 candidates participating across four test sessions this year. Since its introduction in 2018, more than 20,000 financial advisers have successfully completed the exam.
The Skills for Tertiary Admissions Test (STAT) remains an important alternative pathway into tertiary study for mature-age applicants, those without an ATAR, or those seeking a more affordable entry route. TaIT supported around 3,000 candidates in Australia and around 17,000 in Papua New Guinea to sit the relevant STAT in 2024/25.
A key focus this year has been on strengthening our systems, processes and operating rhythm. We undertook a multi-project process audit, progressed improvements to core IT systems and, with the support of colleagues in ACER India and the UK, expanded our remote-proctoring assistance model. This has allowed us to provide 24/7 support during test windows and significantly improve response times for candidates, enhancing their overall testing experience.
A strong education system that ensures all Australian students benefit from outstanding teaching is built on teachers who are well trained, competent and informed. TaIT continues to play a critical national role through the Literacy and Numeracy Test for Initial Teacher Education Students (LANTITE). The test supports classroom readiness by ensuring that all new teachers are in the top 30 per cent of the Australian adult population in literacy and numeracy achievement.
In 2025, following the National Teacher Workforce Action Plan trial, education ministers agreed to remove limits on the number of test attempts for LANTITE candidates. The trial found that removing test limits and offering improved feedback encouraged more candidates to attempt and achieve the required standard.
Our work in health also continues to expand. The Graduate Medical School Admissions Test (GAMSAT) is the leading graduate medical and allied health entrance assessment in Australia, Ireland and the UK, and in 2024/25, candidates sat the test remotely from more than 50 countries.
Our focus in the year ahead is on continuing to strengthen the systems, technology and candidate experience that underpin TaIT’s global operations. As demand for flexible, secure and globally accessible assessment grows, TaIT will continue to innovate while staying anchored in ACER’s commitment to evidence, quality and service.
STAT was introduced in Papua New Guinea in 2016 and is held in November every year.
Combined with Grade 12 National Exam results, it is used to gain admission to the Papua New Guinea University of Technology, The University of Goroka and The Pacific Adventist University.
By evaluating broad academic abilities instead of specialised subject knowledge, STAT enables our partner universities to identify candidates who possess the skills essential for success in higher education.
In 2024/25, STAT-P received 16,800 registrations, with 86 test sittings across 19 venues. Drawing on deep in-country experience in PNG and strong partnerships with local providers, ACER successfully navigated logistical challenges – from transport delays to supply disruptions – to ensure secure and timely delivery of testing materials, providing thousands of aspiring students with a reliable and fair pathway into higher education.
assessments delivered
centre-based sittings
remote-proctored sittings
teachers
health professionals
police recruits
people attempted one or both LANTITE components since 2016
countries sat GAMSAT
STAT candidates in Australia
Dr Eveline Gebhardt
Head of Division
The purpose of the Assessment Development, Implementation and Reporting (ADIR) Division is to improve learning by developing, delivering and reporting high-quality national and international assessments. Our work gives teachers, schools and policymakers clear and reliable evidence about what students know and can do—and where additional support is needed. We partner with governments, education departments and international organisations to design strong assessment frameworks, run assessments that are fair and secure, and produce reporting that can be trusted. Everything we do aims to make assessment useful, meaningful and accessible for the people who rely on it.
This year, ADIR delivered a broad suite of national and international assessments. Our teams worked across sampling, framework design, item and questionnaire development, test delivery, coding, scaling and reporting to ensure each assessment met high technical standards and produced evidence that helps improve learning.
A major milestone was the release of the Trends in International Mathematics and Science Study (TIMSS) 2023 results for Australia. Australian year 4 students achieved their best-ever performance in mathematics and science, while year 8 performance remained stable. The findings also reinforce that strong teaching, positive school climate, student wellbeing and equity remain central to improving outcomes. These insights will guide improvement efforts across Australian schooling systems.
We also completed the analysis and prepared the Australian national report for the Teaching and Learning International Survey (TALIS) 2024, released in early FY25/26. TALIS highlighted key strengths such as high teacher job satisfaction, strong collegiality and thoughtful use of AI for adapting classroom materials. It also identified key areas of stress for teachers, including heavy administrative workload, teacher shortages and increasingly diverse classrooms. These findings will inform policy and practice around Australian teacher preparation, professional support and retention.
Another significant achievement has been supporting the roll-out of the Programme for International Student Assessment (PISA) 2025 – the world’s largest assessment project. With 93 countries participating, new online delivery systems and an innovative domain focused on how students use digital and computational tools, PISA 2025 required years of planning and coordination. We provided guidance on sampling, quality assurance and platform delivery to national centres across the globe.
Alongside these major programs, work progressed on item and questionnaire development for national sample assessments in ICT Literacy and Science Literacy. Across all of this, I am proud of our team’s commitment to accuracy, fairness and providing evidence that strengthens teaching and supports students.
In the coming year, ADIR will continue planning for PISA 2029, including development of frameworks, technical standards and operational timelines. At the same time, we will analyse the data for Australia’s national PISA 2025 report, which will be released in September 2026. This will include reporting on the 9 interactive modules ACER designed for the innovative domain Learning in the Digital World. We will also progress the national sample assessments in ICT Literacy and Science Literacy, and prepare for PIRLS fieldwork ahead of year 4 reading literacy testing in 2026.
By Dr Goran Lazendic PISA 2025 Chief International Survey Director
When I think about PISA 2025, I’m reminded of the scale of what we’ve undertaken – not just in size, but in significance. This cycle brings together 93 countries and economies from every region of the world, including 10 taking part for the first time. For many of them, this is a major step toward understanding how their systems are preparing young people for life beyond school, and toward improving learning outcomes on the basis of insights from PISA. This is not about ranking countries; it is a global conversation about the future of learning and about ensuring that young people have a chance to thrive regardless of where they live. It is a privilege to help make sure this dialogue is built on sound evidence.
One of the biggest milestones for this cycle has been the transition to fully online delivery. Now, for the first time, everything runs through a central platform. This is a major shift. It means more consistency in test delivery, stronger quality control and faster access to data.
Because of this change in ground operations, a major focus this cycle has been supporting national centres in preparing for and running testing in their schools. That meant working closely with National Programme Managers and other colleagues from national centres and responding quickly to any changes to schools and test sessions they had to make. It’s intense work, but when you know what’s at stake, you do it.
The numbers give you a sense of the scale: 76 national centres completed testing; more than 50 languages were accommodated; around 681,000 online sessions and 68,000 offline sessions were completed. And less than 1 per cent of sessions had any technical issues, which shows what is possible when the commitment of participating countries combines with ACER’s expertise and care.
We also developed a new science test for PISA 2025. This assessment reflects an updated understanding that science is not just knowledge, but the ability to search for, evaluate and use information to make good decisions. Young people face complex issues – from health to climate to technology – and science education needs to prepare them to navigate these.
One of the highlights of this cycle was hosting the global meeting of National Project Managers in Melbourne in 2024. About 200 colleagues came together – some new to PISA, others with decades of experience. Countries like Rwanda are joining PISA for the first time, using it as part of their journey toward rebuilding and strengthening their education systems. Others, like Chile, have used PISA data for many years to drive improvements for their most vulnerable learners. Hearing their stories reminded me again why this work matters.
PISA gives countries a clearer picture of how their students are doing and where they need to focus. It’s not perfect, and it’s never simple, but when the evidence is solid, decisions get better. And when decisions get better, young people throughout the world benefit.
countries featured
national centres completed testing
languages accommodated
online sessions completed
offline sessions, including student assessments, teacher and school questionnaires, completed
Dr Dan Edwards
Head of Division
Education Research, Policy and Development (ERPD) leads ACER’s research agenda on educational improvement and learning. We partner with governments, NGOs, multilateral organisations and education providers around the world to build evidence-informed policies and practices that drive system transformation. At the heart of our work is a belief that education should create equity and opportunity for all learners. By working closely with educators, leaders and policy makers, and by deeply understanding the contexts they operate in, we develop insights that guide practical improvement and build local capability. Our team is grounded in ACER’s 95-year history of independent research and is made up of subject matter experts who are passionate about helping education systems work better for every learner.
ERPD comprises four specialist programs:
This year, we delivered more than 80 projects for clients around the world, spanning early childhood through to tertiary and professional education. Our clients include national ministries, development agencies, foundations and service providers, and the breadth of work we deliver reflects their confidence in ACER’s evidence, independence and practical expertise.
A major highlight was the progress of the Assessment for Minimum Proficiency Levels (AMPL), developed with the UNESCO Institute for Statistics to support SDG 4.1.1. AMPL is now implemented in 15 low- and middle-income countries, including 4 new pilot programs in 2025. For many, AMPL is the first step toward building national assessment capability and creating fairer, more transparent systems.
In Southeast Asia, we partnered with ASEAN and the UK Government on the SAGE program, developing a pioneering data framework and thematic studies to better understand the barriers faced by girls, women and marginalised groups. This work provides a strong evidence base for targeted regional action on gender equity.
In Australia, our ERPD team also contributed to a nationally consistent framework for university admissions, strengthening equity, transparency and student pathways at a time of rapid change across the tertiary landscape.
Our researchers also undertook major studies in specialist areas. In New South Wales, a survey of 706 primary teachers across 95 schools is already shaping improvements in initial teacher education for music.
Across early learning, we led the second cycle of the International Early Learning and Child Well-being Study, surveying more than 25,000 five-year-olds in nine countries. The study provides deep insights into children’s foundational learning, executive function and social-emotional development.
Our Preschool Outcomes Measure (POM) project also advanced significantly, moving Australia closer to a national formative assessment tool for the year before school (see case study).
In the coming year, we will continue partnering with governments and development agencies to build strong, evidence-driven systems that improve learning for all children. We will deepen our focus on foundational skills, teacher capability and early learning, where practical insights can make the greatest difference. A major priority will be the 2025 National Applied Trial of the Preschool Outcomes Measure (POM), bringing Australia closer to a consistent, culturally safe assessment tool that supports educators and families to understand children’s development before school. Across all programs, our goal remains the same: to translate high-quality evidence into improvements that educators and learners can feel in their everyday experience.
The Preschool Outcomes Measure (POM) is being developed to give every child in Australia a fair, consistent and culturally safe start to their learning journey. As a national formative assessment tool for the year before full-time school, it helps teachers understand each child’s strengths, needs and progress so they can tailor support with confidence.
Equity sits at the centre of POM. Developed through a partnership between ACER, Ninti One and Goodstart Early Learning, the tool has been shaped through deep engagement with First Nations communities and educators in every state and territory. This ensures that the tool reflects diverse cultural contexts, is respectful and inclusive, and supports teachers working with children from remote, regional and culturally diverse communities.
For educators, POM offers practical, observation-based insights that align with real practice and children’s natural learning. For families, it provides clear, strengths-focused information that supports meaningful conversations about their child’s development. For systems, it creates a consistent national picture of children’s early learning, enabling better planning, targeted support and improved transition to school.
By laying strong foundations during the preschool year, POM has the potential to positively shape long-term learning outcomes – helping ensure that every child begins school with confidence, and that communities and educators have the tools they need to support them.
children and
services participated in the Small-Scale Trial of the Preschool Outcomes Measure
five-year-olds surveyed across 9 countries in the International Early Learning and Child Well-being Study
projects delivered globally
AMPL
implemented in
countries
Strong collaborations with
DFAT, UNESCO, SEAMEO
and national ministries
Dr Nathan Zoanetti
Head of Division
The Measurement, Analytics and Technologies (MAT) Division applies advanced statistical, psychometric and analytical techniques to produce robust, fair and trustworthy assessment results that are fit-for-purpose. Our capabilities are highly specialised and reflect best-practice approaches in psychometrics, sampling, statistical modelling and data processing. Clients rely on us because they understand the importance of meaningful, defensible results – especially when learner outcomes are at stake. We love our work. It’s intellectually challenging and involves constant problem-solving and optimisation. When we apply our methods and processes well, we generate high-quality evidence that supports educators and systems, strengthens policy decisions and ultimately improves outcomes for learners.
This year, we supported 90 ACER projects spanning every corner of our work, from school-based assessments to large-scale international programs. Much of our work happens behind the scenes, but its impact is far-reaching: if ACER delivers high-quality results, it is because the data has been prepared, validated, processed and scrutinised by our teams.
A significant focus this year has been the uplift of ACER’s data-related work through contemporary data science tools and scripting languages. We’ve embedded a new, future-oriented research program – Data Science and Automation – to accelerate cross-cutting data-driven capabilities across the organisation. This includes using machine learning and Natural Language Processing techniques that will enhance our products and strengthen the insights provided to clients and learners.
As part of our expanded remit, we’re excited to welcome our colleagues from the Assessment Process Automation team to MAT. Their expertise in complex item authoring, test form construction, project documentation and end-to-end assessment testing is essential for delivering large, complex programs such as PAT, GAMSAT and PISA. Their integration strengthens our ability to deliver major programs with consistency, rigour and reliability.
Through the work of the Statistical Software and Analytics team, we’ve made strong strides in modernising our reporting systems. This includes developing a new app to replace legacy Microsoft Access tools, enhancing offline reporting for the International Benchmark Tests (IBT) with parallel cloud processing, and upgrading ACER Maple for PISA 2025 with automated test coverage and improved school sampling algorithms. ACER Signum continues to grow as a versatile tool for standard-setting and paired-comparison studies across different educational contexts.
Our Psychometrics team supported a wide range of assessments across ACER’s portfolio, including Student Learning and Progress programs, Tertiary and Industry Tools examinations, national and international large-scale assessments, and medical and professional accreditation tests. Their work underpins the validity, fairness and clarity of scores used by learners, educators and policymakers.
The Sampling team has also had a significant year, ensuring that ACER’s assessments accurately represent their intended populations. A major highlight was selecting school samples for all 93 countries participating in the PISA 2025 Main Study and configuring Maple for within-school sampling tailored to each country. The team also contributed to sample design and weighting for the OECD’s PISA for Schools program; multiple countries participating in SEA-PLM and AMPL; all national surveys under Australia’s National Assessment Program; and the Pacific Islands Literacy and Numeracy Assessment (PILNA) 2025. In the Pacific, the team also delivered a capacity-building workshop to strengthen local sampling and weighting expertise.
In the coming year, MAT will deepen its focus on automation, data science and modern analytics to support ACER’s next generation of assessments. We remain committed to precision, efficiency and continual improvement – quietly building the foundational capabilities that allow ACER to deliver high-quality evidence at scale. Our work will continue to evolve as assessment technologies and statistical methods advance, but our goal remains steady: to ensure that every ACER assessment is fair, defensible and grounded in world-class methodology.
In March 2022, education ministers agreed that NAPLAN would shift to Term 1 from 2023 onwards, with the goal of delivering results earlier in the school year. By moving the assessment window forward, teachers, schools and education authorities would have more time to use insights from NAPLAN to inform teaching and learning programs within the same calendar year. As of 2023, NAPLAN is now conducted in Term 1 across Australia.
This year we celebrated a major milestone in national student assessment as we made major improvements to our software. Contracted by the Australian Curriculum, Assessment and Reporting Authority (ACARA) to carry out the NAPLAN analysis, ACER drew on these improvements to help ACARA achieve its fastest ever turnaround from testing to national results.
Since NAPLAN’s inception in 2008, ACER has played a central role in analysing national results. This year our team processed data from 4.5 million online tests taken by over 1.3 million students.
This was made possible by a new high-speed estimation algorithm developed in-house and implemented in ACER’s ConQuest software. It’s a breakthrough that enhances the speed and precision of large-scale assessment cycles like NAPLAN and PISA.
Faster analysis means quicker insights for policymakers, school systems and the public – supporting timely, evidence-based decisions.
It’s a powerful example of how ACER’s innovation, expertise and collaboration are delivering real impact in Australian education.
supported
ACER projects
NAPLAN
results were delivered
faster than 2024
Established our new
Data Science & Automation
capability




