AI Models: A Race To The Bottom

AI models are in a race to the bottom. Companies are working as hard as they can to make them each as cheap and as powerful as …

21 COMMENTS

  1. Costs went down so much that I can heat my home with AI that can reason better than o1-mini, running on my gaming GPU.
    Of course, if "Open"AI had their way, we would have to pay top dollar for this tech.
    The only reason they made the ChatGPT site with free conversations is that this tech is actually easy to make. We could have had LLMs years ago – sometimes it is enough to build something to set off an explosion. Of course the hardware had to mature – then again, it was never an issue to make GPUs with much more VRAM. Training taking longer? No issue – it would just take longer. Less training data? The LLMs would be dumber but still workable.

  2. Great analysis, but bad framing, or at least incomplete. There are capex/NRE requirements to make these things useful in the first place, and the price still needs to come down if you want computation/"intelligence" to be internet-like in terms of value and ubiquity. Consider how many people are even using it, and how often they're using it. Imagine you have no primary income and you're just going to start building a product or service: what's the one-quarter upfront cost of building a useful AI product? And you have to keep improving, because the technological ceiling hasn't been found yet. A GPU-rich kid in Cambridge or Guangzhou can wipe out billions of dollars of previous investment.

    This is good, and what we should want and expect from innovation capital markets.

  3. It's like Bitcoin – the sensible price floor is the electricity cost of the execution time.
    The only competition is over how little compute you need for a "good enough" result.
    The long-running thinking models, if run at a profit per token, are where the money is; it's a license to churn through huge amounts of tokens as fast as you can generate them.
    Running a basic Q&A chatbot is going to get cheaper and cheaper, but the level of planned activity an AI can do within a reasonable cost/time threshold will balloon.
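    The "electricity price floor" idea above can be sketched as back-of-the-envelope arithmetic. All the numbers below (GPU power draw, token throughput, electricity price) are illustrative assumptions, not measured figures:

    ```python
    def electricity_floor_per_million_tokens(
        gpu_power_watts: float,    # assumed average power draw while serving
        tokens_per_second: float,  # assumed generation throughput
        price_per_kwh: float,      # assumed electricity price in USD
    ) -> float:
        """Electricity cost (USD) of generating one million tokens."""
        seconds = 1_000_000 / tokens_per_second
        kwh = gpu_power_watts * seconds / 3_600_000  # watt-seconds -> kWh
        return kwh * price_per_kwh

    # Example: a 350 W GPU at 50 tokens/s and $0.15/kWh
    cost = electricity_floor_per_million_tokens(350, 50, 0.15)
    print(f"${cost:.3f} per million tokens")  # roughly $0.29
    ```

    Under these assumed numbers the pure-electricity floor is well under a dollar per million tokens, which is why the real competition (as the comment says) is over how little compute "good enough" requires.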

  4. The value for me is in the agentic features, the memory, the creativity – having something that has context, can be a useful assistant, remembers things, and packs it all together. OpenAI seems to be pretty good at that, which is why I haven't switched over. I use the chatbots, not the API.

    Nevertheless, the power of these models is phenomenal. When we all discovered ChatGPT, we all had our minds blown. Imagine if you'd seen that in 2019 or 2018? It's a huge leap in computing.

    They should focus less on pricing or the underlying models, and more on the value – i.e. understanding what's valuable to the user in the human world, creating use cases, and becoming an actually useful, personable assistant, rather than churning out slop or bullet-point lists.

    We really haven't seen what these models are capable of on that front. If that gets mastered, they could charge thousands of dollars a month. If they can innovate or manage projects, then they could create millions in new value.

    This is still nerd territory, and this power isn't really being used or applied yet. It's like a rocket engine that's revving up.

  5. At the start of LLMs, everyone assumed there would be a winner (like Google in the search-engine wars). But I don't think this is that kind of product. Everyone can make LLMs, and they are easily interchangeable. This isn't a winner-takes-all scenario.

  6. With the amount of GPUs, CPUs and neural-network hardware they have, they should be switching by NOW to selling "infra as a service" to their competitors, simply because they can't compete – and that is a fact.

    OpenAI is not able to compete in the long run; they are just too heavy. It's like your uncle carrying a bag of charcoal to the barbecue and wanting to race Usain Bolt: the amount of hardware they run is HELLA expensive, and there is nothing they can do to circumvent it.

    They are betting on dropping a model so significantly better, powered by that behemoth of an infrastructure, that competitors will not be able to match it without also using heavy infra, and that their edge will then be too large for some good years, during which they'll try to capitalize. I don't distrust it, but it is BOLD.


