zlacker

1. wddkcs (OP) 2023-11-20 06:22:38
We're still in the early days of LLM training. Saying that datasets are tapped out is like saying we hit peak oil in 1880. It's not yet clear how efficient new training methods might become, or how small the smallest viable training set can be. There's more scrutiny now, sure, but Altman was one of the people pushing for that scrutiny, and he's likely capable of navigating the pitfalls that would trip up competitors.