eSafety Commissioner Julie Inman Grant and the battle to civilise cyberspace
A torrent of online child sexual abuse and exploitation material could turn into a tsunami. Julie Inman Grant is on a mission to civilise cyberspace.
First you hear a car coming to a stop. Then you see a child’s face appear on the screen. It’s clear she’s in the back seat, recording the video on her phone. You hear the brakes engage and the handbrake pulled. Two women get out of the front seats. One says, “We’ll be back in a sec, sweetie.” The doors slam shut, and then you hear the child saying to a person on the other end of the phone, “What do you want me to do?”
Toby Dagg, head of a team of cyber investigators working in the office of Australia’s eSafety Commissioner, is describing one online video he can’t erase from his mind. Among the thousands of appalling images he has witnessed in his eight years in the job, he can’t shake this one. “The person is obviously conveying instructions to the girl in some way we can’t see, likely by text,” he says. “They direct her to an adult pornographic website on her phone, and then say something along the lines of: ‘Like that.’ She turns her phone’s camera towards her genital area, pulls aside her underpants and films herself. She quickly rearranges herself as the two adults come back to the car, get in and drive off. She would have been six or seven years old. The whole thing took no more than 30 seconds.”
Dagg’s team came across the video on a paedophile website, but not before potentially hundreds of thousands of people saw it. The footage almost certainly came from the US, he says, judging by the accents, and the girl and the perpetrator were clearly in a pre-existing relationship. “I often think about that clip. It’s chilling in a number of ways,” he says. “How young she is. How quickly it all takes place. Where it takes place. The grooming and the ongoing undiscovered relationship with the abuser. And just how low the opportunity cost is for child sex abuse offenders in today’s society. They don’t even need to be physically present to abuse a child with the technology available today.”
Like all countries, Australia is grappling with how it can deal with the avalanche of such material in the virtual free-for-all of the mobile-enabled internet. But unlike other countries that leave it to NGOs such as the US-based National Center for Missing and Exploited Children to pursue these matters, Australia has the world’s only stand-alone government-run online safety regulator, the eSafety Commissioner. At the helm is Julie Inman Grant. She arguably has one of the most important jobs in the country: leading an office charged with bringing some order to the mayhem and lawlessness of cyberspace in Australia.
In Pyrmont, on the western fringe of Sydney’s CBD, is a nondescript modern office building. Some floors are tenanted by property developers, another by a film company. But the fifth floor is a country mile from considerations of returns per square metre or weekend box office takings. Despite the workaday streetscape, what is taking place on this floor of the Pyrmont office block is unique in the world, though countries from the UK to South Korea are looking to duplicate it.
Julie Inman Grant’s office has a smog-diminished view through parkland treetops and north towards the harbour. If the trees’ gentle sway evokes a languid calm, it doesn’t permeate the window. It’s just after 9am but Inman Grant, a four-times-a-week jogger, is already running hard. She greets me outside her glass-walled office overlooking a 60-strong staff scattered across one open-plan floor. As in many modern workplaces, screens rule and desk phones gather dust. But if I think it’s just like any other office, just wait, she says. Just wait until I’ve spoken with Dagg, the head of the investigations team, or been to the secure room to talk with David, a 2m-tall former cop and one of his investigators, about the material they see in there. But for now Inman Grant holds court, explaining the functions of her office that, to the layman, seem mind-boggling in scope.
What started in 2015 as the Children’s eSafety Commissioner has quickly expanded to regulate the online safety of all Australians. Its remit sounds almost Pollyanna-ish: to safeguard people from online harm, and to promote safer, more positive online experiences for children and adults. Practically, day to day, it’s far darker work that examines complaints about serious cases of children being cyberbullied – harassed or humiliated online. It investigates complaints of child exploitation and abuse, including where children are groomed and duped into providing a sexually compromising image and then blackmailed for even more explicit material. It regulates image-based abuse (in layman’s terms, revenge porn), in which a photo or video never meant to be shared ends up splashed across the internet without the subject’s permission. And it deals with abhorrent violent content, which includes both child penetrative rape and terrorism, an example being the livestreaming of the Christchurch terrorist massacre last year in which 51 people were killed at two mosques. Despite new laws giving the eSafety Commissioner the power to order internet service providers to block websites hosting the material, the video still does the rounds on some of the web’s darkest foreign-based sites.
The eSafety Commissioner has an investigation and an education function but doesn’t prosecute criminal behaviour, leaving that to law enforcement. It works closely with police and ensures unacceptable content is taken down from online platforms including Facebook, Instagram and other more obscure sites as quickly as possible. In some but not all cases it has legal power to demand rather than persuade. A critical element is the office’s relationship with major content providers. Sometimes an approach to an online service about content their own compliance teams have missed will see material removed within 30 minutes. Sometimes there is resistance. The remit is immense, and if the federal government has its way, it will become even bigger.
Inman Grant offers the seat with the harbour glimpse at a small meeting table in her office. On her desk are small pieces of artwork done by her three children, a 13-year-old girl and twin eight-year-olds, one girl, one boy. She finishes her tea from a mug adorned with one of her children’s faces. Two similar mugs sit on the desk. No favourites, she says. “They aren’t why I got into the world of online safety,” she says of her children. “I’ve been in this space for a long time. But they are definitely part of the reason I’m so passionate about it.”
Seattle-born Inman Grant, 51, has worked at the intersection of cyber safety, public policy and technology since her early days in government relations for Microsoft. “It was 25 years ago now that I helped organise the first online safety summit with the Clinton administration. My interest in the interface between social justice and online safety has been long held.”
After the Clinton years in DC, she came to Sydney in 2000 to build Microsoft’s profile with government and industry in Australia and New Zealand. Four months later, the course of her life changed. “I was having coffee in my pyjamas in my courtyard when my future husband poked his head over the fence. He says, ‘Hi. I’m staying at your neighbour’s place and I’ve locked myself out. I broke a window to get in and just wanted to let you know I’m not a robber.’” This was the start of two decades of continent-shifting for the family, including another stint at Microsoft’s head office in Washington State in a global privacy and safety role from 2009-12. “I had my twins while we were in the US. And at that stage my husband had moved for me, so it was his turn, and we came back to Australia both because a job opportunity had opened up for him and because we made a considered decision to raise our children here.”
In 2017, Inman Grant was working as a public policy executive for Twitter in Australia and New Zealand when she was snapped up by the federal government to be the second eSafety Commissioner. It’s a small agency and she wears many hats – CEO, COO, chief strategist, ambassador – but she is determined to remain at the coalface by seeing what is coming in through the complaints channels about cyber-bullying and online child sexual abuse. “I will see something almost every day. I don’t feel I can adequately do my job and set the right strategies if I don’t know what the victims are experiencing.”
Every investigator who works in the space of online child sexual abuse will tell you there are one or two cases that stick with them, she says. “For me, it was 10 years ago when I saw some material Interpol had of a father penetrating an infant, less than one year old.” The visual sits heavily in the room. Her voice wavers. “This is still one of my biggest drivers, I always think of how we can do our best for the victims.
“Seeing that image was no doubt an inflection point. My eldest was three at the time and in daycare. Through my job I knew that 99 per cent of child sexual assault perpetrators are men. There was a male child care worker there, who I knew was kind and good at his job. But even so I will admit that when I changed my daughter’s diaper I would check her. And I would ask her to make sure she spent her time playing in a group of friends.
“Even now, regardless of who I am, I know that my kids are vulnerable and that I can’t be with them every minute of the day. All I can do is try to give them the clearest messages about online safety that I can, and to work as hard as I can in this role to ensure those prevention messages are getting to as many parents as possible.”
She offers some numbers about what her office is up against. Its work is the finger in the dyke against the online harm of children, she concedes, pointing to a global threat assessment released in December last year that predicted a coming “global tsunami” of online child sexual abuse and exploitation. The report cited an Interpol estimate that 1.8 million more men with a sexual interest in children will be using the internet in 2020 than a year earlier, and said the figure was almost certainly conservative. It noted just one publicly available website containing child sexual exploitation material had 6.5 million views in its first month of operation last year.
Feeding the increasing demand is a rapidly growing supply. The worldwide proliferation of highly sexualised images and videos created by children using mobile devices is the current trend that most disturbs Inman Grant. Children simply aren’t capable of making mature judgments about the harm, she says, and in many cases may be acting on instruction and under duress. “We are seeing the first generation of children capable of being sexually abused online. Sometimes we don’t realise when we provide our children with internet-enabled devices and leave them unattended, we may be unwittingly inviting strangers into our own homes. As just one example, our team has dealt with a case where a young girl recorded herself emulating sex acts with a foreign object in her bathroom, while her parents’ voices were clearly audible from the next room.”
Are parents hearing you? “Our research shows that parents are simply overwhelmed. They are using devices as a digital babysitter. Ninety-four per cent of Australian children have access to a digital device by age four. By age two, it is 42 per cent. While parents might think they have a handle on their child’s use of Instagram or even Snapchat, you’d be surprised how many parents don’t know that a gaming site like Fortnite pairs their child with 99 other players, and there is text and audio interaction accompanying that. Similarly with TikTok, which has a chat function. The internet has created a vast repository for predators to groom unsuspecting young people.”
Rather than “device denial”, she encourages parents to adopt an alert-not-alarmed approach and open dialogue within the family. On the premise that prevention is key, her office’s website provides education, advice and information on online safety for parents, young people, children and even seniors. It works directly with schools, clubs and sporting associations. And recently it teamed with the children’s TV show Play School, using Big Ted and Jemima to spread the online safety message.
But educating parents about online dangers is only one side of the ledger, I suggest. What about the responsibility of the tech companies to ensure they aren’t hosting and profiting from illegal, offensive or harmful content? “The default position of the tech industry is that it wants to be left alone by governments,” says Inman Grant. “There continues to be a fervent belief by industry that a completely unregulated environment is the best way to innovate and drive profits, as well as create jobs and maintain the sanctity of the First Amendment.”
Communications minister Paul Fletcher agrees more needs to be done and calls on the tech industry to take more responsibility. “Unfortunately the risks continue to evolve and some sectors of the internet industry have been slow to meet the community’s expectations when it comes to online safety,” he says. Fletcher is proposing new legislation to introduce faster mandatory takedown times by tech companies in response to a complaint (from 48 to 24 hours) and a new online safety charter that sets out government expectations of social media services, content hosts and other technology companies. One component includes giving the eSafety Commissioner power to request that internet companies report regularly on what they are doing to meet the government’s online safety expectations. If a company fails to report, it will attract a penalty.
Our interview is just a small part of Inman Grant’s day. Before lunch she will meet with Netflix about proposed content for a program it has commissioned called Clickbait, a thriller about a gruesome crime fuelled by social media. Later she will meet with Twitter executives to cement relationships; then there is a series of HR meetings as she looks to recruit a key executive. “Go and see Toby,” she says. “He sees a lot. But pop by on your way out.”
On the far side of the office, Dagg’s investigations team works behind frosted glass and security-coded doors. Other staff are denied access for their own welfare. Pot plants adorn the space, a botanical antidote to the heaviness of the atmosphere and an attempt to keep a psychological space between investigators. Screens are black while I’m here. A foosball table and a kids’ basketball hoop are nearby, but both look little used.
Each morning, investigators are greeted with between 30 and 50 complaints from the public made via the commissioner’s website, the majority involving images of child sexual abuse. Separate teams in the organisation handle claims of cyber-bullying and image-based abuse.
I ask Dagg, a former NSW police officer who has spent most of his 17-year career in the field of juvenile justice and child protection, if Inman Grant’s theory about one piece of footage staying with you applies to him. He says two spring to mind; the first was the little girl filming herself in the car. His second is more prosaic but no less disturbing. “A complaint came in about a person on Instagram live-streaming himself masturbating while holding up images of young girls. We couldn’t catch that act, it wasn’t recorded, but on his account was other footage of concern to us. It was of a child in a school uniform, about 13 years old, walking and talking into her phone. He’s clearly filming her through her phone. You could see her walk through her front gate into her house. You could easily have worked out where it was.
“While there was nothing else on that video, it obviously raised worries about the security of the child. In that case we notified Instagram and law enforcement, which was the most effective response we could identify, and it led to the account being removed. Instagram notified the international authorities, who were able to act quickly.”
In the 2019 financial year the team finalised almost 12,000 investigations into child sexual abuse material, leading to the removal of about 70,000 images and videos. Even so, the investigators admit that it feels like they are barely scratching the surface, given the amount of content they see each day. They work for the small wins, for the victims, for the girl walking into her house unknowingly being filmed by a creep. They know the research shows the longer an image of child sexual abuse remains online, the greater the emotional damage to the victim. Knowing an image may remain online forever is itself retraumatising for victims.
But what of the trauma to the investigators themselves? What does it take to sit there and see this material day in day out? David, the former policeman turned investigator who prefers to keep his surname private, is philosophical. “There is real satisfaction when a piece of content is taken down. You do think to yourself, ‘We are removing a kid from harm’s way’,” he says. “But it can be frustrating when an image you have seen before reappears in a different online location. That’s when it feels a bit like whack-a-mole.”
Staff have self-care plans. They see a psychologist every three months. “We look after each other,” says David. “We have interviews just to see how it’s going. We are encouraged to self-manage. If it gets too distressing we just walk away for a while.”
I drop in to the office of Kellie Britnell, senior education outreach officer. The education component is a key plank of the commissioner’s office and its materials include information for parents and educators, young adults and seniors on how to navigate their cyber experience more safely. For seniors the information can be as simple as how to pay for things safely online, or how to ensure passwords are protected. The commissioner also trains people to go into schools to provide e-safety information and advice to students.
“Online safety permeates so many parts of society,” Britnell says. “One example that springs to mind is advice to young teachers about navigating their online relationships with students. While it is fairly obvious advice to not ‘friend’ a student on a social media platform, we’ve had to look more recently at other areas, for instance that teachers should not ‘game’ with students as part of a professional boundary issue.”
A growing element of the commissioner’s work is a service for people under 18 to report cyberbullying. More than 500 complaints were made last financial year, a 30 per cent spike from the previous year. Inman Grant says one of the more disturbing recent trends is the falling average age of those being cyberbullied – the highest number of complaints involved 13-year-olds.
Children made the majority of approaches to the commissioner, though a significant proportion came from parents. The most common complaint was serious name-calling and nasty online comments, followed by offensive or upsetting pictures or videos, and fake accounts or impersonation. The office provides children with tools such as instructions on how to block, delete and report unwanted material, and it works directly with schools to resolve complaints in the most serious of cases. Children are then directed to mental health support services such as Kids Helpline.
In Inman Grant’s office later that afternoon we talk about the future. Where can the big gains be made in the protection of people online? What might success look like? What would failure mean? A key issue yet to be resolved is end-to-end encryption, she says. Huge volumes of direct messages on social media remain undetectable by authorities, raising the threat of unmonitored grooming and child sexual images going undetected. Facebook’s WhatsApp and Apple’s iMessage are fully encrypted and can’t be monitored; Facebook Messenger has optional encryption and must make mandatory reports of exploitation and abuse to US authorities. Facebook is proposing to move it to full encryption as well.
Inman Grant says the large online platforms are putting money ahead of online safety and the cost is “the detritus of abused children left behind”. “Facebook will say it can’t do anything. Technically it could scan for child sexual images before they are posted. Facebook says it is working on this but the technology will be another two years away. That isn’t good enough. That said, it’s not just Facebook. All platforms need to do better. Microsoft, Apple, they all have culpability.”
One of Inman Grant’s online safety mantras is “safety by design”, which she sees as a potential game-changer. She is leading the world on the issue, working with big tech companies to ensure new products, tomorrow’s Fortnite or TikTok, have online safety “baked in” to the original design rather than being tacked on afterwards. “We almost take for granted the built-in safety design features of a car, such as seatbelts and airbags,” she says. “Shouldn’t we expect that the technology companies systematically deploy virtual seatbelts to protect our children from the darkest recesses of the web?”
Inman Grant couches the future in the language of war – a crusade, a campaign. “Make no mistake,” she says. “We are in an arms race against the worst of the worst, who are adept at using technology for their own ends. We have to arm our citizens with the best possible skills and information to minimise the risk, and where there are harms, to do our best to minimise them by being as efficient as we can in our takedown efforts.
“I can imagine success. Ultimate success would be doing myself out of a job. It would be the next generation learning about online safety from the start and throughout their educational journey. It would be the tech companies embracing safety by design so their platforms became so good at ferreting out abuse that we wouldn’t need a regulator. But there’s always the human factor. I know we’ll never be totally rid of child sexual abuse or serious cyber-bullying. Humans are too predictably imperfect. So if we can harden the target and make the internet a more inhospitable place for paedophiles and other abusers, that’s practical success.”
And failure? “There’s a really high societal cost if we don’t get ahead of this,” says Inman Grant. “The worst-case scenario is losing lives, losing quality of life. The quality of lives of our kids is at stake, it is nothing short of that.”