zlacker

[return to "Sam Altman Says AI Using Too Much Energy Will Require Breakthrough Energy Source"]
1. Dirak+mk[view] [source] 2024-01-22 23:21:55
>>Dyslex+(OP)
One should be suspicious of ulterior motives when the CEO of an AI company makes a claim like this.

On one hand, LLMs do require significant amounts of compute to train. On the other hand, if you amortize training costs across all user sessions, is it really that big a deal? And that’s not even factoring in Moore’s law and incremental improvements to model-training efficiency.
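The amortization point can be made concrete with a back-of-envelope sketch. All figures below are illustrative assumptions, not measured numbers:

```python
# Amortizing a one-off training cost over the model's lifetime of queries.
# Every number here is an assumed, illustrative value.
training_energy_mwh = 1_300        # assumed one-off training energy (MWh)
queries_per_day = 10_000_000       # assumed query volume
model_lifetime_days = 365          # assumed deployment lifetime

total_queries = queries_per_day * model_lifetime_days
# Convert MWh -> Wh, then divide across all queries served.
amortized_wh_per_query = training_energy_mwh * 1_000_000 / total_queries
print(f"{amortized_wh_per_query:.3f} Wh of training energy per query")
```

Under these (made-up) inputs the training cost works out to a fraction of a watt-hour per query, which is why amortization arguments cut the other way from the headline training figure.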

2. dathin+3v[view] [source] 2024-01-23 00:32:15
>>Dirak+mk
> is it really that big a deal

yes, at least as long as you are constantly developing new AI models

and you still need to run the models; for GPT-4 alone that is already non-trivial (in energy cost and compute)

though for small LLMs, if they are not run too much, it might not be that bad
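As a rough sketch of why inference cost scales with model size: a common approximation is about 2 FLOPs per parameter per generated token, so per-token energy grows roughly linearly with parameter count. The hardware-efficiency figure below is an assumed, illustrative value, not a measurement of any real deployment:

```python
# Per-token inference energy under the ~2 FLOPs/parameter/token approximation.
# flops_per_joule is an assumed hardware-efficiency figure (illustrative only).
def joules_per_token(params, flops_per_joule=2e12):
    """Energy per generated token for a model with `params` parameters."""
    return 2 * params / flops_per_joule

large = joules_per_token(1_000e9)  # hypothetical ~1T-parameter model
small = joules_per_token(7e9)      # 7B-parameter model
print(large / small)               # energy ratio tracks the parameter ratio
```

The linear scaling is the point: a model with ~140x more parameters costs ~140x more energy per token, which is why "just use smaller LLMs" is one of the candidate answers here.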

---

Generally I would always look for ulterior motives in any "relevant" public statement Sam Altman makes. As history has shown, there often seem to be some (though in this usage "ulterior" carries a bit too much of a "bad"/"evil" undertone).

To cut it short: he seems to be invested in a nuclear fusion company, which is one of the potential ways to "solve" that problem. Another potential way is to use smaller LLMs, but smaller LLMs could also be how OpenAI loses its dominant position, since the barrier to training them is much lower.
