I agree that reasoning is quite important, but I think GPT-4 is already very capable and unlocks capabilities or levels that real use-cases can leverage today.
But I also think that increasing performance is just as much about curating data, model feedback, and better architecture as it is about giant datasets. Open source looks to be catching up and will likely soon reach a performance and efficiency level that rivals the current GPT-4.
Open source doesn't have to be as good as the latest closed models to be useful. Once it gets a little smarter and more convenient, we won't need OpenAI etc. to handle many complex tasks.
I'd agree if the scale of money being poured into closed source models was what it was only a year ago, but now it's 10x greater. Open source can't compete with hundreds of millions of dollars and the talent vacuum it creates.