zlacker

1. helloh (OP) 2026-02-04 19:03:27
Well, for multiplication, complexity is defined directly in terms of the number of digits/bits. For attention, complexity is defined in terms of the number of input vectors, which are all at a fixed precision. I don't understand what happens to the method proposed in the paper at higher precision (since I don't understand the paper), but in practice it doesn't matter, since there is no value in anything over float16 for machine learning.
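To make the distinction concrete, here is a minimal sketch (mine, not from the paper or the thread) of naive single-head attention in NumPy; the function name `naive_attention` and the sizes are illustrative. The point is only that the cost is counted in n, the number of fixed-precision input vectors, because of the (n, n) score matrix, whereas integer-multiplication complexity is counted in the bit length of the operands themselves.

    import numpy as np

    def naive_attention(Q, K, V):
        """Single-head attention over n fixed-precision input vectors.

        Q, K, V: (n, d) float16 arrays. The score matrix Q @ K.T is (n, n),
        so time and memory grow quadratically in n, the number of input
        vectors, independent of how many bits each entry carries.
        """
        scores = Q.astype(np.float32) @ K.astype(np.float32).T  # (n, n)
        scores /= np.sqrt(Q.shape[1])
        # Numerically stable softmax over each row of scores
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights /= weights.sum(axis=-1, keepdims=True)
        return (weights @ V.astype(np.float32)).astype(np.float16)

    n, d = 1024, 64  # n input vectors of dimension d, all stored as float16
    rng = np.random.default_rng(0)
    Q, K, V = (rng.standard_normal((n, d)).astype(np.float16) for _ in range(3))
    out = naive_attention(Q, K, V)
    print(out.shape)  # (1024, 64); the (n, n) score matrix is the quadratic bottleneck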