Wasn't a key enabler of early transistor work that the required capital investment was modest?
SotA AI research seems to be well past that point.
They were simple in principle but expensive at scale. Sounds like LLMs.
My understanding was that practical results indicated a model has to be pretty large before you start getting "magic."