Beijing Zhiyuan Artificial Intelligence Research Institute Successfully Trained the 100-Billion-Parameter Large Model FLM for Only 100,000 USD

In the field of artificial intelligence (AI), the cost of training large models has always been a challenge. However, a study by the Beijing Zhiyuan Artificial Intelligence Research Institute, the Institute of Computing Technology of the Chinese Academy of Sciences, and other organizations breaks this status quo. With a budget of only $100,000, they trained a brand-new large model with 101 billion parameters: FLM.

The training cost of this model is far lower than that of other models on the market. For example, GPT-3 cost as much as $4.6 million to train, and Llama 2 is estimated to have cost a similar amount. FLM, however, achieved results comparable to GPT-3 while spending only about 2.17% as much. This result opens up new avenues of research in the field of artificial intelligence.
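As a rough back-of-the-envelope check, using only the figures cited above (a $100,000 budget for FLM and an estimated $4.6 million training cost for GPT-3), the cost ratio works out to about 2.17%:

\[
\frac{\$100{,}000}{\$4{,}600{,}000} \approx 0.0217 \approx 2.17\%
\]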

The successful development of FLM is due to the research team's novel training strategy, which not only reduced the training cost but also improved the model's performance. The result has been open-sourced and has attracted considerable attention from the developer community.

This research by the Beijing Zhiyuan Artificial Intelligence Research Institute, the Institute of Computing Technology of the Chinese Academy of Sciences, and other organizations demonstrates their deep strength and capacity for innovation in the field of artificial intelligence. Their success provides valuable experience for other research organizations and opens up new possibilities for the development of AI.
