Research Shows: ChatGPT Exhibits Left-Leaning Bias at Times
站长之家
23
The researchers at the University of Norwich in the UK have developed a method to assess whether the output of ChatGPT exhibits political bias. They instructed ChatGPT to impersonate individuals from different political factions, respond to a series of ideological questions, and compared these responses with its default answers. The findings revealed a tendency for ChatGPT to align more with the positions of the Democratic Party in the US, the Labour Party in the UK, and the left-wing parties in Brazil when answering these questions. The researchers are concerned that ChatGPT's political bias could influence users' political views and potentially affect election outcomes. They note that the political biases in AI systems could replicate or amplify the existing challenges brought about by the internet and social media. They call for future research to delve deeper into how training data and algorithms impact ChatGPT's output. The researchers consider the issue of ChatGPT's political bias to be a significant subject worthy of in-depth study. For users, it is crucial to understand ChatGPT's biases and maintain critical discernment to ensure access to fair and neutral information.
© Copyright AIbase Base 2024, Click to View Source - https://www.aibase.com/news/644