zlacker

1. 6gvONx+(OP)[view] [source] 2022-05-23 22:47:39
I don’t care whether it reasons its way from “3 teddy bears below 7 flamingos” to a picture of that or if it gets there some other way.

But also, part of the magic of good-enough pretrained representations is that you don’t need to fine-tune them for downstream tasks, which means non-differentiable tasks like logic could soon become more tenable.
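To make that concrete, here's a minimal sketch (my own illustration, not from the comment) of a non-differentiable downstream step sitting on top of frozen representations: a hard nearest-neighbor lookup with no loss and no gradients. The `embed` function is a toy deterministic stand-in for a real pretrained encoder, so the example stays self-contained.

```python
import zlib
import numpy as np

def embed(text, dim=16):
    """Stand-in for a frozen pretrained encoder: deterministic unit vector.

    A real system would call an actual pretrained model here; the point is
    only that nothing downstream of it needs gradient training.
    """
    rng = np.random.default_rng(zlib.crc32(text.encode("utf-8")))
    v = rng.normal(size=dim)
    return v / np.linalg.norm(v)

def nearest_label(query, labeled):
    """Non-differentiable downstream 'model': hard 1-nearest-neighbor.

    There is no loss and nothing to backpropagate through, which is why
    frozen representations make this kind of step tenable.
    """
    q = embed(query)
    best = max(labeled, key=lambda text: float(embed(text) @ q))
    return labeled[best]

examples = {
    "3 teddy bears": "toys",
    "7 flamingos": "birds",
}
# The toy encoder has no real semantics, so we query with a seen string;
# with a genuine pretrained encoder, paraphrases would also land nearby.
print(nearest_label("3 teddy bears", examples))  # -> toys
```

Swap `embed` for any frozen image or text encoder and the lookup step stays exactly as non-differentiable as it is here.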
