Microsoft Orca challenges ChatGPT and the future of AI as we know it
Microsoft has unleashed Orca, a new large language model that could underpin the future of AI chatbots like ChatGPT. It is considerably smaller (13 billion parameters) than its popular counterparts, yet it has already outperformed OpenAI's ChatGPT and even given the GPT-4 model (reportedly around 1 trillion parameters) a run for its money on some benchmarks. Speaking of money, it costs far less to train than these giants. Also, it's open-source, which is arguably the most special thing about it, and, honestly, we should have started with that.
Anyway, let's look at how its design and workings matter to both contemporary and future AI research and development.
How Orca works and how it’s better than GPT
Before we get to Orca, let’s understand the pain point it is trying to solve.
Training any really large language model (LLM), like OpenAI's GPT-4 or Google's LaMDA or PaLM 2, runs into the billions of dollars: millions each for collecting good data, training, refining, and reinforcing the learning with human feedback.
Not all companies, let alone small research groups, have this kind of money. Plus, GPT-4 and the like are almost too powerful and generalist for their own good. Clients of these LLMs may not need all of that knowledge and capability. So what these clients need is something cheaper and, at the same time, more specialized.
Orca not only fulfils these two requirements but is also smarter in some respects.
This is thanks to the way it is designed to learn things.
It learns from, or should we say imitates, GPT-4 and similar LLMs. But it grasps not only what GPT-4 does but also how it does it: the thought process, so to speak, behind its answers.
Microsoft has also introduced a new learning method for Orca:
Orca learns in a two-step process. It first learns from ChatGPT's responses to simpler queries, and then it uses that experiential knowledge to learn from GPT-4 how to handle more complex queries. A rough sketch of what that curriculum might look like follows below.
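To make that concrete, here is a minimal, purely illustrative Python sketch of such a two-stage curriculum. The data fields, the fine_tune_on() helper, and the "orca-13b" name are our own placeholders, not Microsoft's actual code; the point is simply the ordering: a pass over ChatGPT-style explanation traces first, then a pass over harder GPT-4-style ones.

# Illustrative sketch of Orca-style two-stage learning from teacher explanations.
# The helper names and data fields are hypothetical placeholders, not Microsoft's code.

from dataclasses import dataclass

@dataclass
class ExplanationExample:
    system_instruction: str   # e.g. "Explain step by step."
    user_query: str           # the task or question
    teacher_response: str     # the teacher model's full reasoning plus answer

def fine_tune_on(model_name: str, examples: list, stage: str) -> None:
    """Placeholder for a supervised fine-tuning pass on teacher explanation traces."""
    print(f"[{stage}] fine-tuning {model_name} on {len(examples)} explanation traces")

# Stage 1: learn from the "intermediate" teacher (ChatGPT) on simpler queries.
stage1_data = [
    ExplanationExample(
        system_instruction="You are a helpful assistant. Think step by step.",
        user_query="If a train travels 60 km in 1.5 hours, what is its average speed?",
        teacher_response="Speed = distance / time = 60 / 1.5 = 40 km/h. Answer: 40 km/h.",
    ),
    # ... many more ChatGPT-generated examples in the real setup
]

# Stage 2: build on that foundation with richer reasoning from GPT-4.
stage2_data = [
    ExplanationExample(
        system_instruction="You are a careful reasoner. Justify every step.",
        user_query="Which is larger: 2^10 or 10^3? Explain.",
        teacher_response="2^10 = 1024 and 10^3 = 1000, so 2^10 is larger by 24.",
    ),
    # ... fewer, harder GPT-4-generated examples in the real setup
]

student = "orca-13b"  # hypothetical label for the 13-billion-parameter student model
fine_tune_on(student, stage1_data, stage="stage 1: ChatGPT explanations")
fine_tune_on(student, stage2_data, stage="stage 2: GPT-4 explanations")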
So you see, it tries to emulate human reasoning by imitating not just what the bigger model answers but how it thinks its way there. If it's as effective as described, then we have an AI model capable of disrupting the future of AI.
Let’s see if Orca’s performance matches its promising theory. We look forward to this with some apprehension, of course.