["Amazon is training a large language model codenamed Olympus, with an astonishing 2 trillion parameters.", "In contrast, OpenAI's GPT-4 has 1 trillion parameters, making Olympus's parameters twice as many.", "Amazon has strong cloud computing resources and is expected to make waves in the AI field with Olympus.", "The number of parameters is not the only criterion for determining the strength of an AI model; the model's design and training data are equally important.", "It is still unclear when Amazon will publicly release Olympus."]