zlacker

7 comments
1. throwa+(OP) 2022-05-23 22:10:51
https://github.com/lucidrains/imagen-pytorch
replies(2): >>Cobras+t8 >>atty+wj
2. Cobras+t8 2022-05-23 23:09:27
>>throwa+(OP)
Is this a joke?
replies(1): >>throwa+Od
3. throwa+Od 2022-05-23 23:53:11
>>Cobras+t8
No
replies(1): >>w1nk+Bk1
4. atty+wj 2022-05-24 00:40:50
>>throwa+(OP)
Lucidrains is a champ. If they're on HN, bravo and thanks for all the reference implementations!
5. w1nk+Bk1 2022-05-24 11:19:10
>>throwa+Od
To expand a bit for the grandparent: if you check out this author's other repos, you'll notice they have a thing for implementing these papers (multiple DALL-E 2 implementations, for instance). I'd expect a working implementation to land here pretty quickly, something like the sketch below.
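Purely as a guess at what that might look like (the repo was a day old and nearly empty when this was posted, so every name here is hypothetical, modeled on the author's DALLE2-pytorch, not an actual API):

    import torch
    from imagen_pytorch import Unet, Imagen  # hypothetical import at time of writing

    # base 64x64 U-Net of the cascade
    unet = Unet(
        dim = 128,
        cond_dim = 512,
        dim_mults = (1, 2, 4, 8),
    )

    # cascading-diffusion wrapper, conditioned on text embeddings
    imagen = Imagen(
        unets = (unet,),
        image_sizes = (64,),
        timesteps = 1000,
        cond_drop_prob = 0.1,  # classifier-free guidance dropout
    )

    # toy batch: random pixels plus raw caption strings
    images = torch.randn(4, 3, 64, 64)
    texts = ['a photo of a dog'] * 4

    loss = imagen(images, texts = texts, unet_number = 1)
    loss.backward()

Sampling after training would then presumably be a single call along the lines of imagen.sample(texts = [...], cond_scale = 3.).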
replies(2): >>xtreme+gO1 >>Cobras+Mr3
6. xtreme+gO1 2022-05-24 14:16:20
>>w1nk+Bk1
Not to diminish their contribution, but implementing the model is only one third of the battle; the rest is building the training dataset and training the model on a big computer.
replies(1): >>w1nk+TE2
7. w1nk+TE2 2022-05-24 18:20:15
>>xtreme+gO1
You're not wrong that the dataset and compute are important, and if you browse the author's previous work, you'll see there are datasets available. The reproduction of DALL-E 2 required a dataset of similar size to the one imagen was trained on (see: https://arxiv.org/abs/2111.02114).

The harder part here will be getting access to the required compute, but again, the folks involved in this project have lots of resources at hand (they've already trained models of this size). We'll likely see trained checkpoints as soon as the models finish converging.
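For the curious, that link is the LAION-400M paper. Open image-text sets like it are distributed as tar shards of (image, caption) pairs and are typically streamed with webdataset rather than downloaded whole. A minimal sketch (the shard URL pattern is a placeholder, not a real endpoint):

    import webdataset as wds

    # LAION-style shards: tar files holding jpg/txt pairs; this URL is made up
    shards = "https://example.org/laion400m/{00000..00099}.tar"

    dataset = (
        wds.WebDataset(shards)
        .decode("pil")               # decode image entries to PIL images
        .to_tuple("jpg;png", "txt")  # pair each image with its caption
    )

    for image, caption in dataset:
        ...  # convert to tensors and feed the training loop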

8. Cobras+Mr3 2022-05-24 23:21:37
>>w1nk+Bk1
Thank you. I just saw a GitHub repo, empty except for a citation and a claim that it was an implementation of Imagen, and thought it was perhaps some satirical statement about open source or something. With the context it makes a lot more sense.