Meta has introduced Ego-Exo4D, a multimodal dataset designed to advance research in video learning and multimodal perception. The project is a collaboration between Meta FAIR, Meta's Project Aria, and 15 university partners. The dataset captures more than 800 skilled participants performing activities in domains such as sports, music, dance, cooking, and bike repair, recorded simultaneously from egocentric (first-person) and exocentric (third-person) viewpoints, along with additional modalities such as audio, eye gaze, camera poses, and language descriptions to support future AI applications. To encourage community efforts, the release includes four benchmark tasks: egocentric recognition, proficiency estimation, ego-exo relation, and ego pose estimation.
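To make the paired-viewpoint structure concrete, here is a minimal sketch of how a downstream pipeline might enumerate takes that each pair one ego stream with several exo streams. The manifest file name (`takes.json`), its fields, and the directory layout are assumptions for illustration, not the dataset's actual API or download tooling.

```python
import json
from pathlib import Path

# Assumed local download root -- hypothetical, not an official path.
DATA_ROOT = Path("~/egoexo_data").expanduser()

# Assumed manifest structure (illustrative only):
# [{"take_name": "...", "ego_video": "...", "exo_videos": ["...", ...]}, ...]

def load_takes(manifest: Path):
    """Yield (take_name, ego_path, exo_paths) for each recorded take."""
    with manifest.open() as f:
        takes = json.load(f)
    for take in takes:
        ego = DATA_ROOT / take["ego_video"]
        exos = [DATA_ROOT / p for p in take["exo_videos"]]
        yield take["take_name"], ego, exos

if __name__ == "__main__":
    for name, ego, exos in load_takes(DATA_ROOT / "takes.json"):
        # Each take pairs one first-person (ego) stream with several
        # time-synchronized third-person (exo) streams.
        print(f"{name}: 1 ego view + {len(exos)} exo views")
```

A real loader would follow whatever manifest format and tooling the official release documents; the point here is only that each sample couples synchronized first- and third-person views of the same activity.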