
[return to "Show HN: This Word Does Not Exist"]
1. crazyg+Hh 2020-05-13 19:54:47
>>turtle+(OP)
Wow. GPT-2 is so, so, so much better than Markov chains. I'm reading these definitions, and the fact that the last few words of each sentence stay on the same subject as the first few is pretty amazing. Just some random ones:

> denoting or relating to a word (e.g., al-Qadri), the first letter of which is preceded or followed by another letter

> a synthetic compound used in perfumery and cosmetic surgery to improve the appearance of skin tone and irritation

> a type of cookie made with dough, jelly, butter, or chocolate, often filled with extra flour

Pretty impressive. I've never seen fake text so real. (I mean none of these seem to quite make 100% logical sense, but if you were just skimming the sentence nothing would stand out as a red flag.)

2. mycall+Js 2020-05-13 20:56:25
>>crazyg+Hh
If someone could make a billion-word dictionary of these words, you could get excellent sentence compression rates out of standard English.
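
A rough back-of-envelope of what that would buy, sketched in Python (the dictionary size comes from the idea above; the example phrase and the 8-bit ASCII baseline are my own assumptions):

    import math

    # One coined word per dictionary entry; picking one of a billion entries
    # costs about log2(1e9) ~= 29.9 bits.
    dictionary_size = 1_000_000_000
    bits_per_coined_word = math.log2(dictionary_size)

    # If an entry stands in for a common multi-word phrase, compare that cost
    # against spelling the phrase out in plain 8-bit ASCII.
    phrase = "as a matter of fact"        # hypothetical codebook entry
    bits_spelled_out = 8 * len(phrase)    # 152 bits

    print(f"{bits_per_coined_word:.1f} bits vs {bits_spelled_out} bits")

Ignoring the cost of writing down the coined word itself, a ~30-bit index beats plain ASCII for any phrase longer than about four characters, which is basically the same trade-off an ordinary dictionary coder makes.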
3. minima+Rs 2020-05-13 20:57:53
>>mycall+Js
GPT-2 itself already builds on a form of text compression (byte-pair encoding for both its input and output).
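
For anyone who hasn't seen it, BPE is just repeated merging of the most frequent adjacent pair of symbols. A minimal character-level sketch in Python (the toy corpus and merge count are made up; GPT-2's real tokenizer works on bytes with a pretrained merge list, this is only the idea):

    from collections import Counter

    def pair_counts(vocab):
        # Count adjacent symbol pairs across all words, weighted by word frequency.
        counts = Counter()
        for word, freq in vocab.items():
            symbols = word.split()
            for pair in zip(symbols, symbols[1:]):
                counts[pair] += freq
        return counts

    def merge(pair, vocab):
        # Fuse the chosen pair into a single new symbol everywhere it occurs.
        old, new = " ".join(pair), "".join(pair)
        return {word.replace(old, new): freq for word, freq in vocab.items()}

    # Toy corpus: space-separated symbols (characters) with word frequencies.
    vocab = {"l o w": 5, "l o w e r": 2, "n e w e s t": 6, "w i d e s t": 3}

    for _ in range(10):
        counts = pair_counts(vocab)
        if not counts:
            break
        best = max(counts, key=counts.get)
        vocab = merge(best, vocab)
        print("merged", best, "->", "".join(best))

The learned merges become the token vocabulary, so frequent character sequences end up as single tokens, which is where the "compression" on the input and output side comes from.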