All the times the SACE Board got exam questions wrong – and other controversies
Can the SACE Board be trusted to get the exam questions right and mark your work fairly? Here’s the evidence.
With the major SACE Year 12 exam period kicking off on Monday, November 7, it’s only natural that students will be nervous about making a mistake here and there.
While striving to do your best, it’s wise to keep it all in perspective and realise that nobody’s perfect.
Case in point: the SACE Board, which is responsible for setting the exams, hasn’t always got the questions right.
It’s also been mired in controversies ranging from IT failures in electronic exams to accusations of bias against boys, marking failures and staff quitting en masse.
We take a look at all the times the heat has turned from the students sitting the exams back onto the SACE Board itself.
2021: Mass exodus of SACE Board staff sparks concerns of wrong or late grades
Year 12 students were at serious risk of being awarded “incorrect” grades or receiving results late because so many disgruntled SACE Board workers quit, staff claimed in October 2021.
More than 40 current and former staff signed a letter from the Public Service Association to then board chief executive Martin Westwell, raising “serious concerns” about “leadership and workplace culture”.
“This has led to high staff dissatisfaction and turnover and puts at risk the ability of the Board to ensure the integrity and timely delivery of SACE results this year,” the letter said.
It said 51 staff had quit within a year, with eight more due to leave by the end of that year, equating to a turnover rate of more than 50 per cent.
Earlier in 2021, the public sector I Work For SA survey found 29 per cent of SACE Board staff thought recruitment and promotion decisions were fair – 21 per cent below the average for comparable small agencies.
Prof Westwell dismissed concerns over the exam and results period, saying there was “enough expertise and experience” to run it smoothly.
He emphatically rejected accusations of favouritism and said open, competitive recruitment processes were always run for permanent positions.
Prof Westwell said many of the staff who had left had done so to take up roles with the Education Department for a major curriculum project.
2020: Almost 3000 students had their final exam interrupted
Thousands of Year 12 students were left in distress when their Psychology exam, which was being conducted electronically, had to be abandoned because of a technical problem.
The exam, being taken by 2720 students, included a video clip in one of the questions, with that data file pinpointed as the source of the problem.
A St Peter’s Girls School student said “there were some tears” among her classmates as the exam crashed.
She battled “an hour and a half of errors (error messages) … going ‘retry’ and ‘try again’”.
“Teachers started running around asking IT for help,” said the student, who did not want to be named.
“One of the teachers came in near the end and said it’s cancelled. I was just kind of shocked.”
Then SACE Board chief executive Martin Westwell apologised for the failure.
The Psychology exam was worth 30 per cent of students’ final grade.
Prof Westwell said it could not be rescheduled because students had already seen the questions and had prepared for that point in time. Instead, a “derived” score would be created from the teacher’s prediction of how each student would perform, lodged two weeks earlier, combined with a statistical process applied by the SACE Board that took all of the student’s earlier work into account.
The method had been shown to be accurate to within one grade increment – for example, a ‘B+’ instead of a ‘B’ – 97 per cent of the time.
Errors mostly resulted in a more favourable result for the student, Prof Westwell said.
2019: Economics question wasn’t on the money
The SACE Board acknowledged there was a mistake in the Year 12 Economics exam paper after a student contacted The Advertiser.
The student said the way a question was posed was “contradictory”.
“It was a five-mark question from a total of a 65-mark exam,” the student said.
“Naturally, it really throws me off and a lot of others, I am sure. (It) was not a grammar error, it was written completely wrong.”
The student expressed surprise that the SACE Board had not taken action.
The Board said the “minor error” had been identified “prior to marking” and did not warrant contacting schools, adding that it had not received any complaints.
A spokesman said assessors had “adapted their marking guidelines to ensure that no student was disadvantaged”.
2019: What’s the formula for failure?
A maths exam sat by more than 3000 students contained a question with an incorrect formula.
The SACE Board said there had been “a minor error in a three-mark question in last Tuesday’s mathematical methods exam”.
“The error, which was an incorrect formula, will be resolved during the marking process by markers not counting the question, ensuring that no student is disadvantaged,” a spokesman said.
“Schools were notified.”
2019: SACE biased towards girls, boys’ school claims
The SACE is biased against boys and must be reformed to produce more even results, elite private boys’ school Prince Alfred College claimed after commissioning research that showed a huge results gap between the genders.
PAC said the focus on large numbers of assignments, many of them long essays, favoured the strengths of girls, while the decreasing emphasis on exams also disadvantaged boys.
“We do not believe boys are fundamentally less intelligent than girls and measures need to be taken to address this (inequity),” then principal Bradley Fenner said.
“This is not about our students, but rather about the plight of boys across the state. We feel an obligation to advocate for all boys, across independent, government, single sex and coeducational schools.”
The Adelaide University research found that, for full-year subjects, the proportion of grades awarded to girls that were ‘As’ was more than 50 per cent higher than the proportion for boys.
Then SACE Board chief executive Martin Westwell said girls generally outperformed boys around the world, with girls studying more productively and achieving “mastery” of subjects, while boys excelled at “cramming for exams”, especially if they required a lot of memorisation.
Prof Westwell said a range of assessment types were needed to form a “balanced judgment” of each student.
2017: English too hard? Then just scrap the requirement to read books
Students could pass the toughest Year 12 English course by reading just one novel and one play, it was revealed, sparking calls for it to be made more rigorous by requiring the study of more long-form fiction.
Teachers said the then-new SACE English Literary Studies course, a revamp of the former English Studies subject, had reduced the number of major “texts” students had to study from seven to five. Up to two of those could be movies, and a selection of poetry counted as another.
The SA English Teachers Association backed the changes, saying they allowed for more in-depth study of each text with “far greater academic rigour”.
But university experts warned the curriculum was not comprehensive enough, especially when young people read far less of their own volition than previous generations did.
The SACE Board said it worked with school teachers and university experts to produce the new subject content, which was further refined after public consultation.
“While the reading requirements are similar, the new subject asks for higher-order thinking that better prepares students for life in our complex world,” then chief executive Neil McGoran said.
2017: Hoax letter tells students they’ll have to resit their exam
A hoax letter on Facebook suggested thousands of Year 12 Biology students would have to resit their exam.
The letter, using SACE Board letterhead, claimed there was a “significant breach in the integrity” of the exam sat by 3645 students.
It claimed the Board was “not at liberty to divulge the exact nature of the breach” but it involved a significant number of students being “unfairly advantaged”.
The Board identified the student responsible for the prank.
“The SACE Board would like to reassure students that this letter was a hoax and there was no breach, and so no student will need to resit this examination,” it said.
2014: Maths exam doesn’t add up
A bungled question left 960 Year 12 students scratching their heads in their Specialist Mathematics exam.
The SACE Board was forced to notify schools when a printing error was discovered in the first part of Question 7, which asked students to calculate a value from an equation.
A plus sign was mistakenly used in place of a colon.
That section of the question was worth five marks out of a total of 150 for the whole exam.
A parent who contacted The Advertiser said: “The kids would have been rattled and possibly lost time over it.”
Once it became aware of the mistake, the SACE Board contacted all school exam co-ordinators and exam supervisors by email so, where possible, they could correct the error during the three-hour exam.
Carol Noule, the executive officer of the Mathematical Association of SA, the state affiliate of the Australian Mathematics Teachers Association, said mistakes in exam papers “should never ever happen”.
“What they can never estimate is the impact on individual kids,” she said.
The SACE Board asked schools to advise of any impacts on students of the mistake and vowed no student would be left worse off.
“We will ensure no student is disadvantaged by this error, and we apologise for any confusion or distress caused for students,” then chief executive Dr Neil McGoran said.
2014: Teachers’ marking keeps missing the mark
Teachers were wrongly grading close to one in five pieces of school-assessed, non-exam Year 12 work, SACE Board data revealed.
They were five times more likely to give marks that were too high than too low.
Moderators assessing student work against the standards set for each subject found 18 per cent of grades were incorrect the previous year – 15 per cent had to be marked down and 3 per cent marked up.
School-assessed work counted for 70 per cent of the final mark in each subject, with samples from multiple students in each Year 12 class going through a statewide moderation process overseen by the SACE Board.
Modern Language Teachers Association of SA president Joe van Dalen, an experienced moderator, said at the time that a student’s grade could be lowered only as far as the grade of the next-ranked student in their class.
2014: SACE Board found to be a bit of a luddite
An independent review found the SACE Board had fallen far behind modern technology and was hobbled by inefficient administrative systems that were “strikingly paper-based”.
The review found it was “heavily reliant on manual processing … with little realisation of the opportunities for electronic business”, despite its sensitive and data-heavy work being “fundamentally suited to high levels of computerisation”.
“As just one example, during a given year, over 180,000 sheets are scanned (including) 109,000 examination marks sheets,” the report from consultant Executive Advisory Services said.
It also noted there had been fears in previous years that Year 12 results might not be delivered “accurately and on time” because of IT failures.
The review found IT projects had stalled because funding had been diverted elsewhere.
Then Education Minister Jennifer Rankine said $3.6m would be spent on IT upgrades and a SACE “data warehouse” would be introduced “to tailor data and information for schools, to monitor trends and help identify further strategies for improving student outcomes”.
2012: Music exam failed to sing off the page
An error in the Musicianship exam was feared to have affected the performance of up to 260 students.
The question had the stems of the notes in a six-bar musical piece pointing down rather than up.
Because students could choose to answer one of three parts of the question, the SACE Board scrambled to find out how many had attempted the part containing the error.
Then SACE Board chief executive Dr Paul Kilvert said that the question would still be counted.
“The question can still be answered in different ways,” he said.
“The examiners will reward all possible answers appropriately.
“Each examination paper will then be supervised by the chief assessor to ensure that no student is disadvantaged. The SACE Board apologises for any concern or distress.”
2010: SACE’s questionable maths methods
A question had to be pulled from the Mathematical Methods exam because some approved calculators could not be used to answer it.
Three of the 17 graphic calculators approved for use in the exam lacked the functions students needed to complete one part of the examination paper in the way intended.
Then SACE Board chief executive Dr Paul Kilvert later wrote to all schools to advise them that the question, worth 11 marks out of a total of 156, had been excluded.
But that was cold comfort to students who had spent time answering it, especially those with calculators that were of no use to them for it.
Of the 1381 students sitting the exam, 78 had used one of the affected calculators, and the Board said the question had been excluded from marking “in fairness to these students”.
The Board apologised for the bungle.