The University of Hong Kong has released OpenGraph, a graph foundation model that tackles three core challenges in the field and achieves zero-shot learning on unseen graphs. OpenGraph builds a general-purpose graph model from three components: a unified graph tokenizer, a scalable graph transformer, and knowledge distillation from large language models. Experiments show that OpenGraph performs strongly on cross-dataset prediction, validate the graph tokenizer design, and confirm the effectiveness of the LLM-based knowledge distillation. OpenGraph thus fills a gap in the field of graph foundation models.
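The core idea of a unified graph tokenizer is to map graphs of any size and node-ID vocabulary into one shared token space, so a single model can consume them. The sketch below is not the paper's implementation; it is a minimal illustrative version assuming a common recipe (symmetric adjacency normalization followed by a truncated SVD projection), with the function name `tokenize_graph` and the dimension choice being our own.

```python
import numpy as np

def tokenize_graph(adj, dim=2):
    """Hypothetical sketch: map an arbitrary adjacency matrix to
    fixed-dimension node tokens so graphs of different sizes share
    one token space (not the actual OpenGraph tokenizer)."""
    n = adj.shape[0]
    adj = adj + np.eye(n)                      # add self-loops
    d_inv_sqrt = np.diag(1.0 / np.sqrt(adj.sum(axis=1)))
    a_norm = d_inv_sqrt @ adj @ d_inv_sqrt     # D^{-1/2} (A + I) D^{-1/2}
    u, s, _ = np.linalg.svd(a_norm)            # spectral decomposition
    return u[:, :dim] * s[:dim]                # node tokens, shape (n, dim)

# Two graphs with different node counts land in the same 2-D token space.
a1 = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
a2 = np.array([[0, 1, 1, 0], [1, 0, 0, 1],
               [1, 0, 0, 1], [0, 1, 1, 0]], dtype=float)
t1, t2 = tokenize_graph(a1), tokenize_graph(a2)
print(t1.shape, t2.shape)  # (3, 2) (4, 2)
```

Because every graph is reduced to tokens of the same width, a downstream transformer can be trained on many datasets and applied zero-shot to new ones without retraining an ID embedding table.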