zlacker

1. joshcr+ (OP) 2022-05-23 21:48:04
They're withholding the API, code, and training data because they don't want misuse to affect their corporate image. The good thing is they released their paper, which should allow easy reproduction.

T5-XXL looks on par with CLIP, so we may not see an open-source version of T5 for a bit (LAION is working on reproducing CLIP), but this is all progress.

replies(1): >>minima+wa
2. minima+wa 2022-05-23 22:50:59
>>joshcr+(OP)
T5 was open-sourced on release (up to 11B params): https://github.com/google-research/text-to-text-transfer-tra...

It is also available via Hugging Face transformers.

However, the paper says T5-XXL is 4.6B parameters, which doesn't match any of the checkpoints above, so I'm confused.
