Meta has announced early access testing for multimodal AI features coming to its Ray-Ban Meta smart glasses, which can tell users about what they are seeing and hearing through the glasses' cameras and microphones. Mark Zuckerberg demonstrated the multimodal capabilities by asking the glasses to suggest pants that would pair with a shirt he was holding, and by having the AI assistant translate text and generate image captions. Meta's chief technology officer, Andrew Bosworth, showcased other features, including asking the assistant to add captions to photos and to translate and summarize text. The testing period will be limited to a small number of people in the United States.