OpenAI's John Schulman Says GPT-4 Performance Has Challenges From 'Limited Amount Of Data' After Sam Altman Called It 'Dumbest Model'

Zinger Key Points
  • OpenAI cofounder John Schulman, who played a key role in the launch of ChatGPT, has weighed in on the debate about GPT-4’s performance.
  • He thinks there could be some challenges with the amount of data available for training, but notes there’s a way out as well.

John Schulman, a co-founder of OpenAI, has shared his insights on the performance of GPT-4 and the potential challenges it faces due to the limited amount of training data available. Schulman played a key role in the launch of ChatGPT in November 2022.

What Happened: In a recent episode of Dwarkesh Patel's podcast, Schulman addressed the potential limitations of GPT-4 and the future of AI models.

When asked if the AI community is approaching a “data wall” — a point where memorizing a vast amount of pre-training data no longer produces models significantly smarter than GPT-4 — Schulman acknowledged that there are challenges in that regard, but said companies like OpenAI will have to change their training methods to overcome them.

"There are definitely some challenges from the limited amount of data, but I wouldn’t expect us to immediately hit the data wall. However, I would expect the nature of pre-training to somewhat change over time as we get closer to it."


Schulman also discussed the challenges of generalization across different types of pre-training data, such as code and language reasoning, and the potential limitations of the abilities unlocked by specific training data.

He also said that bigger models like GPT-4 are more "sample efficient" than predecessors like GPT-2 — meaning these larger models can reach the same level of capability while training on less data.

While he noted there is no definitive explanation for these scaling laws, he attributed the gains ultimately to increased processing capacity.
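The scaling-law idea Schulman is referring to can be sketched with a simple power-law loss model. The functional form below follows the commonly cited Chinchilla-style formulation, but the constants are made up purely for illustration — they are not OpenAI's figures, and the real coefficients for GPT-class models are not public.

```python
# Illustrative sketch of a power-law "scaling law" for sample efficiency.
# The exponents and coefficients below are toy values chosen for
# illustration only; they are NOT OpenAI's actual numbers.

def loss(params: float, tokens: float) -> float:
    """Chinchilla-style loss estimate: an irreducible term plus
    power-law penalties for limited parameters and limited data."""
    E, A, alpha, B, beta = 1.7, 400.0, 0.34, 410.0, 0.28  # toy constants
    return E + A / params**alpha + B / tokens**beta

# A model with 100x more parameters reaches a lower loss even when
# trained on roughly 3x fewer tokens -- the "sample efficiency" claim.
small = loss(params=1e9, tokens=1e11)
large = loss(params=1e11, tokens=3e10)
print(f"small model loss: {small:.3f}, large model loss: {large:.3f}")
```

Under this kind of model, scaling parameters shifts the loss curve so that fewer tokens are needed to hit a given loss target, which is one way to read Schulman's remark that bigger models partially offset a data bottleneck.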


Why It Matters: Schulman’s remarks come in the wake of statements made by OpenAI CEO Sam Altman about the future of the company’s AI models. Altman hinted at significant advancements with each new model, suggesting a rapid progression in AI capabilities.

He also called GPT-4 the "dumbest model" from OpenAI that people will ever have to use again.

Altman has also expressed a strong commitment to the development of artificial general intelligence (AGI), regardless of the financial cost. His vision for an AI-powered iPhone aligns with the company’s ambitious goals for AI development.



Disclaimer: This content was partially produced with the help of Benzinga Neuro and was reviewed and published by Benzinga editors.

Photo courtesy: Shutterstock
