Now hiring: Part-time AI chatbot tutors, no experience necessary
By Yiwen Lu
After her second child was born, Chelsea Becker took an unpaid year of leave from her full-time job as a flight attendant. Then, after watching a video on TikTok, she found a side hustle: training artificial intelligence models for a website called Data Annotation Tech.
For a few hours each day, Becker, 33, who lives in Schwenksville, Pennsylvania, would sit at her laptop and interact with an AI-powered chatbot. For every hour of work, she was paid $20 to $40. From December to March, she made more than $10,000.
The boom in AI technology has put a more sophisticated spin on a kind of gig work that doesn’t require leaving the house. The growth of large language models such as the technology powering OpenAI’s ChatGPT has fuelled the need for trainers like Becker, fluent English speakers who can produce quality writing.
It is not a secret that AI models learn from humans. For years, makers of AI systems such as Google and OpenAI have relied on low-paid workers, typically contractors employed through other companies, to help computers visually identify subjects. They might label vehicles and pedestrians for self-driving cars or tag objects in photos used to train AI systems.
But as AI technology has become more sophisticated, so has the job of people who must painstakingly teach it. Yesterday’s photo tagger is today’s essay writer.
There are usually two types of work for these trainers: supervised learning, where the AI learns from human-generated writing, and reinforcement learning from human feedback, where the chatbot learns from how humans rate its responses.
Companies that specialise in data curation, including the San Francisco-based startups Scale AI and Surge AI, hire contractors and sell their training data to bigger developers. Developers of AI models, such as the Toronto-based startup Cohere, also recruit in-house data annotators.
It is difficult to estimate the total number of these gig workers, researchers said. But Scale AI, which hires contractors through its subsidiaries, Remotasks and Outlier, said it was common to see tens of thousands of people working on the platform at a given time.
But as with other types of gig work, the ease of flexible hours comes with its own challenges. Some workers said they never interacted with administrators behind the recruitment sites, and others had been cut off from the work with no explanation. Researchers have also raised concerns over a lack of standards, since workers typically don’t receive training on what are considered to be appropriate chatbot answers.
To become one of these contractors, workers have to pass an assessment, which includes questions such as whether a social media post should be considered hateful, and why.
Another assessment takes a more creative approach, asking prospective contractors to write a fictional short story about a green dancing octopus, set in Sam Bankman-Fried’s FTX offices on November 8, 2022. (That was the day Binance, an FTX competitor, said it would buy Bankman-Fried’s company, before quickly backing out of the deal.)
Sometimes, companies look for subject-matter experts. Scale AI has posted jobs for contract writers who hold master’s or doctoral degrees in Hindi and Japanese. Outlier has job listings that mention requirements including academic degrees in math, chemistry and physics.
“What really makes the AI useful to its users is the human layer of data, and that really needs to be done by smart humans and skilled humans and humans with a particular degree of expertise and a creative bent,” said Willow Primack, vice president of data operations at Scale AI. “We have been focusing on contractors, particularly within North America, as a result.”
Alynzia Fenske, a self-published fiction writer, had never interacted with an AI chatbot, but she had heard plenty from fellow writers who considered AI a threat. So when she came across a video on TikTok about Data Annotation Tech, part of her motivation was to learn as much about AI as she could and see for herself whether the fears surrounding it were warranted.
“It’s giving me a whole different view of it now that I’ve been working with it,” said Fenske, 28, who lives in Oakley, Wisconsin. “It is comforting knowing that there are human beings behind it.” Since February, she has been aiming for 15 hours of data annotation work every week, so she can support herself while pursuing a writing career.
Ese Agboh, 28, a master’s student studying computer science at the University of Arkansas, took on coding tasks, which paid $40 to $45 an hour. She would ask the chatbot to design a motion-sensor program that helps gym-goers count their repetitions, then evaluate the computer code the AI wrote.
In another case, she loaded a data set of grocery items into the program and asked the chatbot to design a monthly budget. Sometimes she would also evaluate other annotators’ code, a step experts said is used to ensure data quality.
She made $2500. But her account was permanently suspended by the platform for violating its code of conduct. She did not receive an explanation, but she suspected it was because she had worked while in Nigeria, since the site wanted workers based only in certain countries.
That is the fundamental challenge of online gig work: it can disappear at any time. With no one available for help, frustrated contractors turned to social media, sharing their experiences on Reddit and TikTok. Jackie Mitchell, 26, gained a large following on TikTok because of her content on side hustles, including data annotation work.
“I get the appeal,” she said, referring to side hustles as an “unfortunate necessity” in this economy and “a hallmark of my generation and the generation above me.”
Public records show that Surge AI owns Data Annotation Tech. Neither the company nor its CEO, Edwin Chen, responded to requests for comments.
It is common for companies to hire contractors through subsidiaries. They do so to protect the identity of their customers, and it helps them avoid bad press associated with working conditions for their low-paid contract workers, said James Muldoon, a University of Essex management professor whose research focuses on AI data work.
Many of today’s data workers depend on wages from their gig work. Milagros Miceli, a sociologist and computer scientist researching labor conditions in data work, said that while “a lot of people are doing this for fun, because of the gamification that comes with it,” the bulk of the work is still “done by workers who actually really need the money and do this as a main income.”
Researchers are also concerned about the lack of safety standards in data labelling. Workers are sometimes asked to address sensitive issues such as whether certain events or acts should be considered genocide or what gender should appear in an AI-generated image of a soccer team, but they are not trained on how to make that evaluation.
“It’s fundamentally not a good idea to outsource or crowdsource concerns about safety and ethics,” Muldoon said. “You need to be guided by principles and values, and what your company actually decides as the right thing to do on a particular issue.”
This article originally appeared in The New York Times.