Game publisher Activision has announced that it is adding an AI-powered voice chat moderation system to its flagship shooter series, "Call of Duty." The system, called ToxMod, was developed by AI company Modulate. ToxMod monitors players' voice chat in real time to identify content that violates the game's community guidelines, such as hate speech, abuse, and harassment.

The system was initially tested in North America, where it was deployed in existing titles including "Call of Duty: Modern Warfare II" and "Call of Duty: Warzone." With the global release of the latest installment, "Call of Duty: Modern Warfare III," on November 10, the rollout will expand to cover players worldwide. ToxMod uses technologies such as voice transcription and sentiment analysis to distinguish harmful content from ordinary speech. Final review is still carried out by human moderators, so that enforcement decisions are not made by the AI alone.
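
To make the flag-then-review flow described above more concrete, here is a minimal, hypothetical sketch in Python. It is not Modulate's implementation: the keyword-based scorer, the threshold value, and all function and field names are invented for illustration. The point is only the shape of the pipeline, in which a transcript is scored by an automated classifier and, above a threshold, queued as a ticket for a human moderator rather than acted on directly by the AI.

from dataclasses import dataclass
from typing import Optional

# Hypothetical keyword list standing in for a trained toxicity/sentiment model.
FLAGGED_TERMS = {"slur_example", "threat_example"}
TOXICITY_THRESHOLD = 0.5  # assumed cutoff for escalation to human review


@dataclass
class VoiceClip:
    player_id: str
    transcript: str  # in practice produced by a speech-to-text step


def score_toxicity(transcript: str) -> float:
    """Toy stand-in for a sentiment/toxicity classifier: fraction of flagged terms."""
    words = transcript.lower().split()
    if not words:
        return 0.0
    hits = sum(1 for word in words if word in FLAGGED_TERMS)
    return hits / len(words)


def moderate(clip: VoiceClip) -> Optional[dict]:
    """Return a review ticket for human moderators, or None if the clip looks benign.

    The automated step only flags; enforcement decisions are left to people.
    """
    score = score_toxicity(clip.transcript)
    if score >= TOXICITY_THRESHOLD:
        return {"player_id": clip.player_id, "score": score, "transcript": clip.transcript}
    return None


if __name__ == "__main__":
    clip = VoiceClip(player_id="player_123", transcript="gg slur_example")
    ticket = moderate(clip)
    print(ticket)  # flagged clips would go to a human review queue

A production system would replace the keyword scorer with a trained model and the printed ticket with a queue that human reviewers work through, which matches the article's point that the AI flags content while people make the final call.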