Microsoft Research and OpenAI have collaborated to make significant strides in integrating artificial intelligence (AI) capabilities into Microsoft’s products and services. As part of this partnership, Microsoft Research recently unveiled a new AI model called Orca, which learns by imitating the reasoning processes of larger language models such as GPT-4.
What sets Orca apart is how it addresses the limitations of smaller AI models: by emulating the reasoning of massive foundation models like GPT-4, it can be optimized for specific tasks while drawing on the knowledge of those larger models. Its smaller size also means it requires far fewer computing resources, so researchers can run it on their own hardware without relying on a large data center.
According to the research paper introducing Orca, the model, powered by 13 billion parameters, imitates and learns from larger language models like GPT-4. Built on a LLaMA-13B foundation (the same base underlying Vicuna), Orca learns to comprehend explanations, reason step by step, and follow complex instructions with the assistance of GPT-4, which is rumored to have over one trillion parameters.
To facilitate progressive learning in Orca, Microsoft uses large-scale, diverse imitation data. As a result, Orca has already outperformed Vicuna by more than 100% on complex zero-shot reasoning benchmarks such as Big-Bench Hard (BBH), and by 42% on AGIEval, a benchmark built from human-centric standardized exams.
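In practice, that imitation data pairs each training instruction with a rich, step-by-step explanation trace from the teacher model rather than just a final answer. The sketch below shows one way such a dataset could be assembled; the `query_teacher` helper and the system prompts are hypothetical placeholders for illustration, not the exact pipeline described in the Orca paper.

```python
# Illustrative sketch: collecting explanation-style imitation data from a teacher model.
# query_teacher() and the system prompts are hypothetical stand-ins, not the Orca pipeline.
import json

SYSTEM_PROMPTS = [
    "You are a helpful assistant. Think step by step and justify your answer.",
    "Explain your reasoning as if teaching a student, then state the final answer.",
]

def query_teacher(system_prompt: str, instruction: str) -> str:
    """Hypothetical helper that returns the teacher model's step-by-step explanation."""
    raise NotImplementedError("Wire this up to your teacher model of choice.")

def build_imitation_dataset(instructions: list[str], out_path: str) -> None:
    """Pair every instruction with explanation traces elicited under varied system prompts."""
    with open(out_path, "w") as f:
        for instruction in instructions:
            for system_prompt in SYSTEM_PROMPTS:
                record = {
                    "system": system_prompt,
                    "instruction": instruction,
                    "response": query_teacher(system_prompt, instruction),  # reasoning trace, not just the answer
                }
                f.write(json.dumps(record) + "\n")
```

The varied system prompts are what turn a plain instruction-following dataset into the explanation-rich training signal the smaller student model learns from.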
Despite its smaller size, Orca demonstrates reasoning capabilities comparable to ChatGPT’s, as evidenced by its performance on benchmarks like BBH. Orca also shows competitive performance on academic and professional examinations such as the SAT, LSAT, GRE, and GMAT, although it still trails GPT-4.
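Zero-shot benchmarks of this kind are typically scored as exact-match accuracy: the model sees only the question, and its answer is compared against the reference. The loop below is a simplified illustration of that scoring; `generate_answer` is a hypothetical wrapper around the model under test, and real evaluation harnesses add answer extraction and normalization on top.

```python
# Simplified zero-shot exact-match scoring, in the spirit of benchmarks like BBH.
# generate_answer() is a hypothetical wrapper around the model being evaluated.
def generate_answer(model, question: str) -> str:
    """Hypothetical helper that returns the model's zero-shot answer text."""
    raise NotImplementedError

def zero_shot_accuracy(model, examples: list[dict]) -> float:
    """examples: [{"question": ..., "target": ...}]; returns exact-match accuracy."""
    correct = 0
    for example in examples:
        prediction = generate_answer(model, example["question"]).strip().lower()
        correct += prediction == example["target"].strip().lower()
    return correct / len(examples)
```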
The Microsoft research team behind Orca highlights that the model can learn from step-by-step explanations generated both by humans and by more advanced language models, allowing it to continue improving its capabilities over time.
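One plausible way to realize this kind of explanation tuning is plain supervised fine-tuning of a smaller causal language model on the collected (system prompt, instruction, explanation) records. The sketch below uses the Hugging Face transformers library for illustration; the base model name, prompt template, and hyperparameters are assumptions rather than Orca’s actual training recipe.

```python
# Illustrative fine-tuning of a smaller causal LM on explanation traces.
# Base model, prompt template, and hyperparameters are assumptions, not Orca's recipe.
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)

BASE_MODEL = "huggyllama/llama-13b"  # assumed 13B base; substitute any causal LM
tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(BASE_MODEL)

# Each JSONL record: {"system": ..., "instruction": ..., "response": ...}
dataset = load_dataset("json", data_files="imitation_data.jsonl", split="train")

def to_features(record):
    # Concatenate system prompt, instruction, and explanation into one training sequence.
    text = (f"{record['system']}\n\n### Instruction:\n{record['instruction']}"
            f"\n\n### Response:\n{record['response']}")
    return tokenizer(text, truncation=True, max_length=2048)

tokenized = dataset.map(to_features, remove_columns=dataset.column_names)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="orca-style-student", num_train_epochs=3,
                           per_device_train_batch_size=1, learning_rate=2e-5),
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```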
The collaboration between Microsoft and OpenAI, along with the introduction of Orca, exemplifies the ongoing efforts to advance the field of AI. By combining the strengths of large-scale models with more specialized counterparts, Microsoft aims to empower researchers and developers to create innovative applications that can effectively reason, understand complex instructions, and tackle a variety of tasks.
Through these advancements, Microsoft is making AI more accessible and efficient, marking a significant milestone in the evolution of AI technologies. With Orca’s smaller size and enhanced capabilities, AI researchers and developers can expect new opportunities to explore and create groundbreaking applications.