DeepSeek's Breakthrough in AI: Navigating Costs and Consumer Impact

DeepSeek's smaller LLM variants show promise for smartphones amid stagnant consumer interest in AI. Jefferies highlights the company's efficient, cost-effective model as a positive indicator for AI's future, yet warns that heavy investments have not been recouped. Industry strategies may shift towards RoI-focused approaches, slowing demand for computing power.


Devdiscourse News Desk | Updated: 28-01-2025 15:17 IST | Created: 28-01-2025 15:17 IST
Country: India

In a recent report, Jefferies, an investment banking and capital markets firm, spotlighted the advantages of DeepSeek's smaller variants of its open-source large language model (LLM) for smartphones, as AI adoption among consumers remains lukewarm. The report also alludes to another company making headway in AI, potentially with more advanced models or more extensive datasets.

DeepSeek's progress hints at a brighter future for AI, despite the constraints faced by smaller models such as Apple's. 'If smaller models can work effectively, it could be advantageous for smartphones,' Jefferies noted, while maintaining a bearish stance on AI's traction in the smartphone sector owing to weak consumer interest. The report underlined that running larger models on phones would require significant hardware upgrades, which translates into higher costs.

The report further explored the return on investment (RoI) in AI, hailing DeepSeek's LLM as an efficient model poised to redefine AI development. Developed by DeepSeek, which is backed by High-Flyer, a successful AI-driven quant fund, the LLM rivals the performance of industry leaders such as GPT-4 with reduced computational demands. Training costs amounted to just USD 5.6 million, a stark contrast to other premium models.

Highlighting RoI concerns, the report expressed skepticism about demand for computing power. 'The market remains concerned about demand growth in computing power,' the report stated, emphasizing the limited returns on substantial GPU investments, predicted to exceed USD 200 billion for NVIDIA (NVDA) in 2024.

Jefferies pointed out that the colossal investments in AI have not yielded the anticipated returns, owing to the absence of tangible monetization. On AI investment strategies, the report outlined two possibilities: pursuing greater computing power for faster improvements, or shifting the focus to efficiency and RoI, which could reduce demand for computing power over the next decade.

As scrutiny of bottom lines intensifies, US-based AI firms in particular face mounting pressure to justify rising capital expenditures, the report concluded. (ANI)

(With inputs from agencies.)
