‘AI won’t replace you, but someone who uses it will’, says Thomson Reuters boss Steve Hasker
Generative AI will change the way universities train lawyers, accountants, journalists and other professionals, up-ending the ‘apprenticeship pathway’, says leading CEO.
Generative AI will change the way universities train lawyers, accountants and journalists as the technology sparks a disruption bigger than the launch of personal computers, the internet, mobile and social media.
This is according to Steve Hasker — chief executive and president of media and information conglomerate Thomson Reuters — who says AI models have become so sophisticated they can produce work on par with someone who has three or four years’ experience.
AI has up-ended the “apprenticeship pathway” — traditionally involving long hours performing “grunt work” while learning the ropes — with graduates now expected to be able to hit the ground running.
“The law schools, tax and accounting schools are going to have to prepare graduates who come in with a level of judgement and an ability to communicate with colleagues and with clients that is two or three or four years more advanced than where it perhaps has been for the last 20 or 30 years,” Mr Hasker said.
“The good news is younger generations have a mindset that means they’re going to be ready for that much earlier than I was. You talk to university graduates and they sort of expect to be in my role, you know, in a year or two or three. They’re not quite attuned to the idea of sort of grinding it out for 20 or 30 years before sort of being in a leadership position, and perhaps neither should they be.”
But Mr Hasker doesn’t believe AI is about to put scores of people out of work, though he says those who don’t embrace the technology might struggle to find a job.
“You’re not going to be replaced by AI, but you might be replaced by someone who uses it.
“I think there is real upside here. I really do. The idea of lots of headcount displacement … you know, I was working for McKinsey as a consultant when Google was invented, and I remember at the time thinking this is probably going to make some portion of that sort of research and input to the consulting profession redundant.
“Well, it didn’t. It just made us more productive. We got through more work in a shorter period of time and got to an insight versus sort of having to sort of grill your way through different, disparate sources of information that may or may not have been relevant.”
Thomson Reuters has launched a suite of AI-powered products across its legal and accounting divisions, which aim to automate mundane tasks such as document analysis, drafting advice, contract review, deposition preparation, and preparing compliance responses.
In June, the company acquired Casetext — a legal start-up with an AI-powered assistant for law professionals — for $US650m. It will also invest an additional $US100m a year in developing and integrating AI across its product portfolio, including its flagship legal research tool Westlaw Precision.
At the Reuters news agency — one of the world’s oldest and biggest — AI can already generate story ideas for its financial journalists by analysing share market data and company filings, and can even generate news alerts.
“It initiates the sorts of things that our talented financial journalists are already starting to think about, and it gives them, in a sense, a first draft for them to say, ‘okay, this is the news, and what can I add to it?’,” Mr Hasker said.
“Most of them say to me, ‘this is great’. It increases their productivity and makes the job more fun because they’re applying their brain power better rather than going all the way back to first principles every single time.”
But does this mean fundamental skills — essential to the legal, accounting and media professions — risk being lost, in a similar way to how satellite navigation systems have displaced the ability to read a map?
“There could be some atrophy around the skills. I think that’s a risk. I really do,” Mr Hasker said.
“The best case scenario is we all stop spending as much time on grunt work and uninteresting work. The worst case scenario is — and we’ve seen this in a couple of examples in the US in the legal profession — where someone just enters a question into ChatGPT and out comes an answer, and they put that in front of the judge. That has led to some very poor outcomes for litigators.
“I think the risk is going to be there, just as it has been with the use of Microsoft Excel, relative to sort of doing a sort of more manual calculation. Yeah, it does a lot more work in a much shorter period of time, but ultimately we’ve all made mistakes with that, that perhaps we wouldn’t have had we been working with just pen and paper.”
And this is why it is essential that humans remain in control of AI, using it as a tool to assist with tasks rather than to produce a finished product.
“A machine cannot take on the rights and obligations of a practising attorney or lawyer, right? There has to be a human in that loop,” Mr Hasker said.