Discussion
sjvn
@sjvn@mastodon.social  ·  24 hours ago

Linux Foundation Leader: We're Not in an #AI Bubble: https://thenewstack.io/linux-foundation-leader-were-not-in-an-ai-bubble/ via @TheNewStack & @sjvn

Linux Foundation Executive Director Jim Zemlin argued that “Artificial intelligence may not be in a full-blown bubble, but large language models [LLMs] just might be.”

The New Stack

Linux Foundation Leader: We're Not in an AI Bubble

However, said the foundation's executive director at Open Source Summit Japan, many companies are wasting money on brand-name large language models.  
TheStrangelet(mas)
@thestrangelet@beige.party replied  ·  23 hours ago

@sjvn @TheNewStack That's a strange semantic argument for Zemlin to make, given that what most people refer to as "AI" is LLMs.

sjvn
@sjvn@mastodon.social replied  ·  23 hours ago

@thestrangelet @TheNewStack Read the article, and it makes sense. TLDR: He's talking about the Big AI companies.

TheStrangelet(mas)
@thestrangelet@beige.party replied  ·  23 hours ago

@sjvn @TheNewStack I did read the article. Economically, given that close to all of the over-investment is being made by Big Tech, it's not a meaningful distinction. LLM bubble, AI bubble, same, same.

sjvn
@sjvn@mastodon.social replied  ·  22 hours ago

@thestrangelet @TheNewStack The open AI programs, however, are not the ones having billions thrown into them. Their funding is an order of magnitude less.

Sriram "sri" Ramkrishna - 😼
@sri@mastodon.social replied  ·  24 hours ago

@sjvn @TheNewStack There is some truth to that, I think. LLMs are, by and large, what's being pushed.

Machine learning and the like have always been here, same with transformers. LLMs plus agentic workflows, etc., are what is being heavily pushed and what is sucking up data centers to do inference.
