The AI cold war between Google and Microsoft is over
I wonder how Microsoft incorporated an LLM into search.
Did they build a new model from the ground up that can do both traditional search and the things ChatGPT can?
Or do they perform a traditional search and then use the content of the search results to generate an answer, ChatGPT-style?
Or some other way?
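The second option resembles what's often called retrieval-augmented generation: run a search, stuff the retrieved snippets into the model's prompt, and have it answer from those sources. A minimal toy sketch of that flow (all names here, `TINY_INDEX`, `search`, `answer_with_context`, are hypothetical illustrations, not Bing's actual pipeline, and the LLM call is stubbed out):

```python
# Toy sketch of "search first, then answer from the results".
# Nothing here reflects Microsoft's real implementation.

TINY_INDEX = {
    "python release": "Python 3.12 was released in October 2023.",
    "rust borrow checker": "The borrow checker enforces ownership rules at compile time.",
}

def search(query: str) -> list[str]:
    """Toy keyword search standing in for a real web-search backend."""
    terms = query.lower().split()
    return [doc for key, doc in TINY_INDEX.items()
            if any(t in key for t in terms)]

def answer_with_context(query: str) -> str:
    """Build a grounded prompt from retrieved snippets (LLM call stubbed)."""
    snippets = search(query)
    prompt = (
        "Answer the question using only these sources:\n"
        + "\n".join(f"- {s}" for s in snippets)
        + f"\nQuestion: {query}"
    )
    # A real system would send `prompt` to a model; here we just return it.
    return prompt

print(answer_with_context("when was the python release?"))
```

The appeal of this design is that the LLM never needs up-to-date knowledge baked into its weights; freshness comes from the search index, and the model only has to summarize what it is handed.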
Ultimately, the function of search, I would believe, is to find all relevant, reliable information, not to process or analyse it. ChatGPT seems to lean more towards the latter. So perhaps it's better to use these two separately, especially when one does not have enough subject-matter knowledge.
When Sam came on stage he said that the new model was faster, more accurate, and more capable than ChatGPT, but based on GPT-3.5 and the lessons learned from the ChatGPT research preview.
I look at this more like a "Distilled ChatGPT" connected to the internet than the "sparse ultra-large LLM" that GPT-4 is supposed to be.
That would make sense because if they really want to scale this tech they need to reduce the compute necessary.
If the next model is so compute-intensive that it costs dollars per query rather than cents, I believe they will keep their flagship model to themselves and make it available via ChatGPT Pro or ChatGPT Plus to make up for the cost.
Having access to superhuman AI at all would be such a game changer that I still have a hard time truly grasping what that would mean. I guess we will find out soon enough.
There’s a new decentralized chatbot that will be coming out VERY soon. It’s been tested privately, and because it’s decentralized, it’s not biased like other chatbots. Are you willing to try it out once it’s available?