And let’s hit some of those negative impacts.
The privacy of several teens at a New Jersey high school was violated when AI-generated nude images of them were circulated among classmates. Girls at Westfield High School learned last month that fake nude images of them had been shared with other students, the Wall Street Journal reported. According to an email the principal sent to parents, concerns were raised to administrators on Oct. 20, but the photos had been shared over the summer. The response involved the Westfield Police Department, the school's resource officer, the counseling department, and the administration.
And we have some data about hallucinations. According to a New York Times article, chatbot software, including OpenAI's ChatGPT, Google's chatbot, and Microsoft's Bing chatbot, has been found to frequently make up information. Vectara, a start-up founded by former Google employees, estimates that chatbots invent information at least 3 percent of the time, and in some cases as often as 27 percent.
Why do we care?
The incident in New Jersey illustrates how AI can be misused to generate deepfakes, with severe personal and legal consequences. MSPs must know the tools and techniques available to safeguard digital identities and privacy. They may need to offer monitoring solutions, detect and respond to this kind of misuse, and educate their clients about the risks.
The frequency of AI hallucinations, the generation of false information, is a significant concern for any organization that relies on data integrity and accuracy. MSPs should prioritize validating and verifying AI outputs as part of their services. That could mean adding a fact-checking layer on top of chatbot responses or choosing AI solutions with demonstrably lower hallucination rates.
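To make the "fact-checking layer" idea concrete, here is a minimal sketch of what a post-hoc validation step might look like. It flags answer sentences that share too little vocabulary with the source documents the answer is supposed to be grounded in. The function names, the overlap heuristic, and the 0.5 threshold are all illustrative assumptions, not any vendor's actual implementation; a production service would use more robust grounding checks.

```python
# Minimal sketch of a validation layer for chatbot output (illustrative only).
# Idea: flag answer sentences whose vocabulary barely overlaps with the
# source documents, so a human can review them before they reach a client.
import re

def tokenize(text: str) -> set:
    """Lowercase word set, ignoring very short tokens."""
    return {w for w in re.findall(r"[a-z]+", text.lower()) if len(w) > 3}

def flag_unsupported(answer: str, sources: list, threshold: float = 0.5) -> list:
    """Return answer sentences whose word overlap with the combined
    sources falls below the threshold -- candidates for human review."""
    source_vocab = set().union(*(tokenize(s) for s in sources))
    flagged = []
    for sentence in re.split(r"(?<=[.!?])\s+", answer.strip()):
        words = tokenize(sentence)
        if words and len(words & source_vocab) / len(words) < threshold:
            flagged.append(sentence)
    return flagged
```

For example, given a source document about an annual report, a grounded sentence like "The report was published by the finance team." passes, while an unrelated fabrication like "Martians wrote it yesterday." gets flagged for review.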
The good news: all of this puts service providers squarely in the sweet spot for risk management, compliance, governance, and client education. It combines technical and ethical considerations, and that is an opportunity.