Meta has unveiled its latest artificial intelligence models, Llama 3 8B and Llama 3 70B, as it moves to compete with OpenAI and Google in generative AI. The new models carry 8 billion and 70 billion parameters respectively, and Meta says both outperform the previous Llama 2 generation of comparable size on standard benchmarks.
According to Meta, the Llama 3 models were trained on more than 15 trillion tokens drawn from publicly available sources, roughly seven times the data used to train Llama 2. Meta credits this much larger training corpus as a key driver of the models' improved generative capabilities.
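For readers who want to try the models, the weights are distributed through channels such as Hugging Face, with access gated behind acceptance of Meta's license. Below is a minimal sketch using the Hugging Face transformers text-generation pipeline; the repository ID, precision, and hardware notes are assumptions about a typical setup rather than details from Meta's announcement.

```python
# Minimal sketch: running the 8B instruct variant via Hugging Face transformers.
# Repo ID, dtype, and hardware notes are illustrative assumptions, not details
# from Meta's announcement. Hub access is gated behind Meta's license, so log
# in first (e.g. `huggingface-cli login`).
import torch
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="meta-llama/Meta-Llama-3-8B-Instruct",  # assumed Hub repo ID
    torch_dtype=torch.bfloat16,  # ~16 GB of weights; fits on a single 24 GB GPU
    device_map="auto",
)

messages = [
    {"role": "system", "content": "You are a concise assistant."},
    {"role": "user", "content": "Summarize what Llama 3 is in one sentence."},
]

# Recent transformers versions apply the model's chat template when a list of
# role/content messages is passed; the model's reply is appended as the last message.
result = generator(messages, max_new_tokens=128, do_sample=False)
print(result[0]["generated_text"][-1]["content"])
```

The 70B model follows the same interface but needs multiple high-memory GPUs or quantization to run locally.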
The models were also trained on custom-built clusters of 24,000 GPUs each, underscoring the scale of compute Meta is putting behind its AI research. With that investment in performance and scalability, Meta aims to position Llama 3 at the forefront of generative AI.
With these specifications and Meta's reported benchmark results, the Llama 3 models are positioned to challenge rivals such as OpenAI and Google. Their release signals Meta's intent to keep pace at the leading edge of the intensifying competition in artificial intelligence.