Amazon is training a large language model called Olympus with twice the parameters of GPT-4
Zhanzhangzhijia (站长之家)
According to a November 9th report from Zhanzhangzhijia, Amazon is quietly developing a large language model codenamed "Olympus" with roughly 2 trillion parameters, twice the reported parameter count of OpenAI's GPT-4. That would make Olympus one of the largest models announced to date, and combined with Amazon's strong position in cloud computing, it could make waves in the AI sector. However, parameter count alone does not determine a model's quality; what matters is the design of the model architecture and the quality of the training data. The specifics of Olympus and Amazon's release timeline remain unclear, and the industry will continue to watch its progress.
© Copyright AIbase Base 2024. Source: https://www.aibase.com/news/3003