OpenAI Sets Modest Expectations with a More Understated, GPT-5-Free DevDay This Autumn

by admin

In San Francisco last year, OpenAI captured headlines with a high-profile media event, unveiling several new offerings, including the GPT Store, an App Store-like marketplace that ultimately failed to gain traction.

By contrast, this year’s installment will take a more subdued approach. OpenAI said on Monday that its annual DevDay conference will shift from a major single-day event to a series of localized developer meetups. The company has also decided to hold off on debuting its highly anticipated next flagship model at DevDay, opting instead to concentrate on improvements to its API and developer tools.

An OpenAI spokesperson told TechCrunch, “Our upcoming model won’t be unveiled at DevDay. Our aim is to emphasize educating developers on the current tools and highlight stories from the developer community.”

The forthcoming DevDay sessions are scheduled for October 1 in San Francisco, October 30 in London, and November 1 in Singapore.

Lately, OpenAI has favored steady, incremental refinement over bold strides in generative AI as it develops successors to its leading models, GPT-4o and GPT-4o mini. The company has introduced techniques to boost model performance and to reduce hallucinations, instances where models produce irrelevant or fabricated output. On certain benchmarks, however, OpenAI appears to have ceded its early lead in generative AI to competitors.

A key hurdle appears to be the scarcity of high-caliber training data.

OpenAI’s generative AI models are trained on vast datasets collected from the web. But as more creators block access to their content, fearing plagiarism or a lack of compensation, procuring quality data is becoming harder. More than 35% of the world’s top 1,000 websites now block OpenAI’s web crawler, according to Originality.AI. And roughly 25% of data from the highest-quality sources has been restricted in the major datasets used to train AI models, a study by MIT’s Data Provenance Initiative found.

If the trend of restricting data access continues, the research group Epoch AI forecasts that developers could run out of data to train generative AI models sometime between 2026 and 2032.

OpenAI is reportedly working on a new reasoning technique to significantly improve its models’ responses, particularly in areas like mathematics. The company’s CTO, Mira Murati, has promised a forthcoming model with “Ph.D.-level” intelligence. Amid this ambitious undertaking, and while reportedly losing billions of dollars, OpenAI faces the immense challenge of covering the colossal costs of training its models and retaining its highly compensated staff.

Compiled by Techarena.au.
