I’ve been covering Minnesota Attorney General Keith Ellison since he first ran for Congress in 2006. Most recently, I summarized some of my research on Ellison in the 2023 City Journal review/essay “The anti-cop attorney general” and in the 2024 Power Line post “Ellison denies and defames.” Suffice it to say I’ve been beating my head against the wall to expose Ellison for a long time.
Ellison now seeks to lead the left’s current crusade to suppress speech in the name of combating “misinformation.” Ellison himself presents a case study in the uses of “misinformation.” That is the theme of my work on him.
Now we can turn to “Misinformation expert cites non-existent sources in Minnesota deep fake case” by Christopher Ingraham in the Minnesota Reformer, based on the current federal lawsuit challenging Minnesota’s newly enacted “deep fake” law. Forgive the long excerpt — please take this in:
A leading misinformation expert is being accused of citing non-existent sources to defend Minnesota’s new law banning election misinformation.
Professor Jeff Hancock, founding director of the Stanford Social Media Lab, is “well-known for his research on how people use deception with technology,” according to his Stanford biography.
At the behest of Minnesota Attorney General Keith Ellison, Hancock recently submitted an affidavit supporting new legislation that bans the use of so-called “deep fake” technology to influence an election. The law is being challenged in federal court by a conservative YouTuber and Republican state Rep. Mary Franson of Alexandria for violating First Amendment free speech protections.
Hancock’s expert declaration in support of the deep fake law cites numerous academic works. But several of those sources do not appear to exist, and the lawyers challenging the law say they appear to have been made up by artificial intelligence software like ChatGPT.
For instance, the declaration cites a study titled “The Influence of Deepfake Videos on Political Attitudes and Behavior,” and says that it was published in the Journal of Information Technology & Politics in 2023. But no study by that name appears in that journal; academic databases don’t have any record of it existing; and the specific journal pages referenced contain two entirely different articles.
“The citation bears the hallmarks of being an artificial intelligence (AI) ‘hallucination,’ suggesting that at least the citation was generated by a large language model like ChatGPT,” attorneys for the plaintiffs write. “Plaintiffs do not know how this hallucination wound up in Hancock’s declaration, but it calls the entire document into question.”
Separately, libertarian law professor Eugene Volokh found that another citation in Hancock’s declaration, to a study allegedly titled “Deepfakes and the Illusion of Authenticity: Cognitive Processes Behind Misinformation Acceptance,” does not appear to exist.
If the citations were generated by artificial intelligence software, it’s possible that other parts of Hancock’s 12-page declaration were as well. It’s unclear whether the non-existent citations were inserted by Hancock, an assistant, or some other party. Neither Hancock nor the Stanford Social Media Lab replied to repeated requests for comment. Nor did Ellison’s office.
Frank Bednarz, an attorney for the plaintiffs in the case, said that proponents of the deep fake law are arguing that, “unlike other speech online, AI-generated content supposedly cannot be countered by fact-checks and education.”
However, he added, “by calling out the AI-generated fabrication to the court, we demonstrate that the best remedy for false speech remains true speech — not censorship.”
Ingraham reports above that neither Hancock nor the Stanford Social Media Lab replied to repeated requests for comment, and that Ellison’s office likewise did not respond. In the Alpha News story following Ingraham’s, Jenna Gloeb reports that she sought comment from Ellison regarding the allegations of fabricated citations in the affidavit, the process used to verify the affidavit’s accuracy, and the broader implications of the law’s restrictions on speech. Ellison did not respond to Alpha News’ request for comment either.