ChatGPT went all in on scaling up training data to boost its capabilities, but just like every other tech sector, it seems pretty clear that we hit diminishing returns with each doubling of the amount of data.
We have now reached the point where only trillion-dollar companies can keep building better and better AI quickly.
Even now, general AI can regurgitate the best existing answers, but it hasn't shown the capacity to create something truly new, the kind of novelty that comes from the randomness of biological evolution.
General AI is limited by the total knowledge of humanity being fed into it, beyond which it cannot grow no matter what. It is the world's smartest answering machine.
Will AI be able to replace humanity? No.
Will AI be able to make robotic equivalents to human labor? Yes.
Today, the most advanced AI in the world is Gemini by Google.
It takes the efforts of the 4th most valuable company in the world to keep developing and upgrading it.
The current AI boom came from a chance discovery, and it will slow down as quickly as it rose.
General AI will very likely peak at being 2-3 times smarter than the smartest human, then go no further.
This is because it cannot create anything new; it can only pattern-match against the data that already exists out there.
Until and unless we give AI the ability to "mutate" like human DNA does, it will reach an upper limit and stagnate at that point.
Conclusion:
AI won't take over the world. It will be another tool that helps humans create more things faster and clear the current backlog of research projects.
You are fricking wrong. Emergent properties are what make LLM-based AI so fricking great. Larger models can solve tasks that models with fewer parameters couldn't, without any extra programming.
Read this: https://arxiv.org/abs/2206.07682
Nothing I said disagrees with what you said.
The issue is that they are going to run out of data to feed the machine at some point. GPT-4 had 10x the parameters of GPT-3. GPT-5 will probably need another 10x increase in parameters. This is clearly not sustainable.
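To make the "diminishing returns per doubling" point concrete, here is a rough sketch using a Chinchilla-style power-law loss curve (Hoffmann et al., 2022). The constants are made up for illustration, not measured values, and the functional form is an assumption about how loss scales with training data:

```python
# Illustrative sketch only: a Chinchilla-style power law L(D) = E + B / D**beta
# with invented constants, showing how the absolute loss improvement shrinks
# with each doubling of training tokens.

E = 1.69      # irreducible loss (assumed, not a measured value)
B = 410.0     # data-scaling coefficient (assumed)
beta = 0.28   # data-scaling exponent (assumed)

def loss(tokens: float) -> float:
    """Loss as a function of training tokens under the assumed power law."""
    return E + B / tokens ** beta

prev = loss(1e9)
for i in range(1, 8):
    tokens = 1e9 * 2 ** i
    cur = loss(tokens)
    print(f"{tokens:12.3e} tokens: loss {cur:.3f} (improvement {prev - cur:.3f})")
    prev = cur
```

Under this assumed curve, each doubling of data still lowers the loss, but by a geometrically shrinking amount, which is one way to read "diminishing returns" while still being consistent with bigger models unlocking emergent abilities.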
Now you are talking about orders of magnitude and arguing that they might run out of data. Those are things I agree with.
Your initial statement was "diminishing returns when we put in double the data", which is the opposite of what happens.
it's a technical gotcha