That’s So Meta
By Mitch Berg
“Expert on Misinformation,” in a case on “deep fakes,” uses a source that doesn’t exist…
…and might be AI generated:
At the behest of Minnesota Attorney General Keith Ellison, Hancock recently submitted an affidavit supporting new legislation that bans the use of so-called “deep fake” technology to influence an election. The law is being challenged in federal court by a conservative YouTuber and Republican state Rep. Mary Franson of Alexandria for violating First Amendment free speech protections.
Hancock’s expert declaration in support of the deep fake law cites numerous academic works. But several of those sources do not appear to exist, and the lawyers challenging the law say they appear to have been made up by artificial intelligence software like ChatGPT.
For instance, the declaration cites a study titled “The Influence of Deepfake Videos on Political Attitudes and Behavior,” and says that it was published in the Journal of Information Technology & Politics in 2023. But no study by that name appears in that journal; academic databases don’t have any record of it existing; and the specific journal pages referenced contain two entirely different articles.
“The citation bears the hallmarks of being an artificial intelligence (AI) ‘hallucination,’ suggesting that at least the citation was generated by a large language model like ChatGPT,” attorneys for the plaintiffs write. “Plaintiffs do not know how this hallucination wound up in Hancock’s declaration, but it calls the entire document into question.”
And that’s not all
Separately, libertarian law professor Eugene Volokh found that another citation in Hancock’s declaration, to a study allegedly titled “Deepfakes and the Illusion of Authenticity: Cognitive Processes Behind Misinformation Acceptance,” does not appear to exist.
If the citations were generated by artificial intelligence software, it’s possible that other parts of Hancock’s 12-page declaration were as well. It’s unclear whether the non-existent citations were inserted by Hancock, an assistant, or some other party. Neither Hancock nor the Stanford Social Media Lab replied to repeated requests for comment. Nor did Ellison’s office.
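For what it's worth, the sanity check the plaintiffs' lawyers describe — does a claimed title actually appear among the articles on the cited journal pages? — is easy to mechanize. Here's a minimal sketch; the fuzzy-match helper and the stand-in table-of-contents titles are my own illustration, not anything from the filing:

```python
# Minimal sketch: compare a claimed article title against what a database
# lookup says actually occupies the cited journal pages. The claimed title
# is from the story; the "actual_contents" titles are hypothetical
# placeholders, not the real articles.
from difflib import SequenceMatcher

def title_matches(claimed: str, candidates: list[str], threshold: float = 0.85) -> bool:
    """Return True if any candidate title is a close fuzzy match for the claimed one."""
    claimed = claimed.lower()
    return any(
        SequenceMatcher(None, claimed, c.lower()).ratio() >= threshold
        for c in candidates
    )

claimed = "The Influence of Deepfake Videos on Political Attitudes and Behavior"

# Hypothetical stand-ins for the "two entirely different articles"
# found on the cited pages.
actual_contents = [
    "Partisan Media Use and Online Political Participation",
    "Algorithmic Curation and Exposure to Counter-Attitudinal News",
]

print(title_matches(claimed, actual_contents))  # False: nothing close to the citation
```

A real check would pull the candidate titles from an academic index rather than a hard-coded list, but the principle is the same: a citation whose title matches nothing in the journal it claims to be from is a red flag.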
“Bad AI Deep Fake” would explain a lot of Kamala Harris speeches, on the other hand.

November 21st, 2024 at 2:49 pm
ONE WOULD HOPE, given that this is also in a federal court, that something along these lines would happen (from the linked article):
Clumsy use of artificial intelligence software has caused numerous embarrassments across the legal system in recent years. In 2023, for instance, two New York lawyers were sanctioned by a federal judge for submitting a brief containing citations of non-existent legal cases made up by ChatGPT.
One would also hope that the judge would rule in favor of Franson and the YouTuber posthaste if the judge agrees that those sources submitted into evidence were made up by AI. (Granted, IANAL and have no idea whether that alone is sufficient grounds for the judge to rule for Franson.)
And hopefully Hancock would get charged with perjury. Or, if he can show he wasn’t aware, it would damage his reputation and he would no longer be called upon as an expert.
November 21st, 2024 at 3:33 pm
The bright news here is that Cyberdyne’s algorithms seem to need a little more work.
The hilarious news is that a Stanford professor is apparently unaware that other people have ways of figuring out whether he did his job or not. It’s especially funny given that many, if not most, professors at even second- and third-tier colleges routinely use software to scan for AI generation and fraudulent citations; when the stakes are this high, you’re almost guaranteed someone’s BS detector will trigger and those tools will be used.
What to do with Hancock? Well, he gave that deposition under oath, knowing that he hadn’t done the work and that he’d never found the sources he’d referenced. I think perjury charges are in order here, combined with loss of his academic job. Give him the full Michael Bellesiles.
November 22nd, 2024 at 12:33 am