Recently, Meta's AI assistant mistakenly claimed that the failed assassination attempt on former President Donald Trump never occurred, drawing widespread attention. Meta executives have expressed regret over this error.

In a company blog post, Meta's Global Policy Head, Joel Kaplan, acknowledged that the error stemmed from the technology underlying AI systems such as chatbots.

Initially, Meta programmed its AI to avoid answering questions about the attempted assassination of Trump, but after users began to notice this, Meta removed the restriction. Even after the change, the AI continued to provide incorrect answers in some cases, at times asserting that the event had not happened. Kaplan noted that such occurrences, known as "hallucinations," are not uncommon in the industry and represent a shared challenge for generative AI.


Google has also faced similar issues. Recently, Google had to deny claims that its search autocomplete function was censoring results related to the attempted assassination of Trump. Trump expressed strong dissatisfaction on social media, calling it another attempt to manipulate elections, and urged attention to Meta and Google's actions.

Since the advent of generative AI like ChatGPT, the tech industry has been striving to address the issue of AI generating false information. Companies like Meta have attempted to improve their chatbots by providing high-quality data and real-time search results. However, this incident shows that these large language models are still prone to generating erroneous information, which is an inherent flaw in their design.

Kaplan stated that Meta will continue to work on these issues and improve its technology based on user feedback to better handle real-time events. This series of incidents not only highlights the potential problems of AI technology but also raises public concerns about AI accuracy and transparency.

Key Points:

1. 🤖 Meta AI incorrectly claimed the Trump assassination attempt did not happen, drawing attention.

2. ❌ Executives refer to this error as a "hallucination," a common industry issue.

3. 🔍 Google also accused of censoring related search results, sparking Trump's discontent.