zlacker

1. ogogma+(OP) 2024-01-23 02:16:25
Isn't ML moving towards low-precision floating point anyway?
replies(1): >>jacque+16
2. jacque+16 2024-01-23 03:01:30
>>ogogma+(OP)
Yes, but that only works because a chain of such devices is still stable: each digital stage snaps the signal back to a clean level, so noise doesn't accumulate from stage to stage. A chain of analog components has no such restoration, so a long enough chain will just output random noise. That's the main reason digital won out over analog: repeatability even with long chains of computation.
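The point about accumulation can be seen in a toy simulation (not from the thread; `run_chain` and the noise/stage parameters are illustrative choices): each stage adds a little Gaussian noise, and the "digital" variant re-quantizes the signal to a clean level after every stage while the "analog" variant lets the noise pile up.

```python
import random

def run_chain(stages, noise_sigma, digitize):
    """Push a logic-1 signal through a chain of noisy stages."""
    x = 1.0
    for _ in range(stages):
        x += random.gauss(0.0, noise_sigma)  # every stage injects analog noise
        if digitize:
            # A digital stage thresholds the signal back to 0 or 1,
            # discarding the accumulated noise.
            x = 1.0 if x >= 0.5 else 0.0
    return x

random.seed(0)
trials, stages, noise_sigma = 1000, 200, 0.05

analog_err = sum(abs(run_chain(stages, noise_sigma, False) - 1.0)
                 for _ in range(trials)) / trials
digital_err = sum(abs(run_chain(stages, noise_sigma, True) - 1.0)
                  for _ in range(trials)) / trials

print(f"analog  mean error after {stages} stages: {analog_err:.3f}")
print(f"digital mean error after {stages} stages: {digital_err:.3f}")
```

With these numbers the analog error grows roughly like sqrt(stages) times the per-stage noise, while the digital chain stays essentially exact, since a single-stage flip would require a 10-sigma noise excursion.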