Gretchen Whitmer kidnap plot foiled after Facebook recommended Wolverine Watchmen to future FBI informant
A secret Facebook algorithm recommended an alleged militia group, which plotted to kidnap a state governor, to the very man who would ensure its downfall.
A secret Facebook algorithm recommended an alleged militia group, which plotted to kidnap a state governor, to a man who later informed the FBI of the plan, according to reports out of the US.
The confidential informant testified on Friday at the Jackson County Courthouse in Michigan.
The video feed was cut off to protect his identity while the court heard audio of his testimony.
According to the Detroit Free Press, the informant said he came across the Facebook group for the Wolverine Watchmen after it was recommended to him on the platform. The Army veteran and self-described Libertarian believed it had been suggested to him based on his interactions with Facebook pages about the US right to bear arms and firearms training.
After answering a few questions, he was allowed into the group and invited to use an encrypted messaging app to communicate with other members.
He became concerned when he learned some members of the group wanted to target police officers, and told a friend in law enforcement.
That would soon lead to him becoming a valuable informant for the FBI as he accompanied the alleged militia members, sometimes while wearing a wire, on rural training exercises and protests outside the Michigan Capitol building.
Fourteen men are accused of plotting to kidnap Michigan Governor Gretchen Whitmer over restrictions brought in to try to stop the spread of COVID-19.
Friday’s hearing concerned three of them, with the judge to decide whether there was enough evidence to send them to trial.
The next hearing is scheduled for March 29.
The revelation that the plot to kidnap the Michigan governor may have been partially foiled by the same Facebook algorithms accused of directing people towards radicalising content is a new twist in the tale. But the informant is perhaps more worthy of credit than the computer code, which also recommended other militia groups hosted on the platform to other users, until (and even after) Facebook banned them.
Facebook head honcho Mark Zuckerberg admitted there was an “operational mistake” that prevented the page of a Kenosha, Wisconsin militia group from being removed until after two people were shot dead at a Black Lives Matter rally in the city last year. He said the company had not found any link between the militia and the 17-year-old who was later charged over the shooting.
The company received more than 450 reports about the militia page and its potential to incite violence in the hours before the shooting, but moderators cleared the page four times.
The algorithms used by Facebook and other platforms have drawn criticism in recent years for their “rabbit hole” effect: recommending content based on what users have already interacted with, in the hope of keeping them engaged for longer, and looking at ads while they’re there.
Last year, the Netflix documentary The Social Dilemma characterised Facebook’s recommendation algorithm as a “rage machine” (like many parts of the documentary, that characterisation is both an exaggeration and an oversimplification).
Facebook hit back soon after, arguing its “algorithm is not ‘mad.’ It keeps the platform relevant and useful”.
The company also said it conducts its own research and funds independent academics to do the same, to “better understand how our products might contribute to polarisation so that we can continue to manage this responsibly”.
It doesn’t always act on that understanding, however.
In 2018, the company commissioned research that warned high-level executives that its algorithms “exploit the human brain’s attraction to divisiveness” and, if nothing was done, would feed users “more and more divisive content in an effort to gain user attention and increase time on the platform”.
In May last year, the Wall Street Journal reported that nothing was done, with internal documents and insiders revealing that “Mr Zuckerberg and other senior executives largely shelved the basic research, and weakened or blocked efforts to apply its conclusions to Facebook products”.
“In essence, Facebook is under fire for making the world more divided. Many of its own experts appeared to agree — and to believe Facebook could mitigate many of the problems. The company chose not to,” the WSJ reported.