EgoLife is an egocentric AI-assistant project centered on long-term, multi-modal, multi-view recordings of daily life. The project collected roughly 50 hours of video by recording six volunteers who shared a house for a week, capturing everyday activities and social interactions. Its multi-modal streams (video, gaze, and IMU data) and multi-view camera system provide rich contextual information for AI research. The project also introduces EgoRAG, a retrieval-based framework for long-term context understanding, advancing AI assistants' ability to reason over extended, real-world context.
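To make the retrieval idea behind a RAG-style framework concrete, here is a minimal, hypothetical sketch: events from a long recording are stored as timestamped text captions, scored against a question by simple keyword overlap, and the best matches are returned as context for an answer. All names and data below are illustrative assumptions, not the actual EgoRAG implementation.

```python
from dataclasses import dataclass

@dataclass
class Event:
    timestamp: float   # seconds from the start of the recording
    caption: str       # short text description of what happened

def score(query: str, caption: str) -> int:
    """Count shared lowercase words between the query and a caption."""
    return len(set(query.lower().split()) & set(caption.lower().split()))

def retrieve(events: list[Event], query: str, k: int = 2) -> list[Event]:
    """Return the k events whose captions best match the query."""
    ranked = sorted(events, key=lambda e: score(query, e.caption), reverse=True)
    return ranked[:k]

# Illustrative event log spanning a long recording.
events = [
    Event(120.0, "Alice makes coffee in the kitchen"),
    Event(5400.0, "the group plays a board game in the living room"),
    Event(86400.0, "Bob waters the plants on the balcony"),
]

hits = retrieve(events, "who was in the kitchen making coffee")
```

A real system would replace the keyword scorer with learned embeddings and feed the retrieved events to a language model, but the retrieve-then-reason structure is the same.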