1. cosbyn (OP) 2020-05-13 18:48:09
For example, if it consumes lots of data that uses the pronoun "he" for doctors, it's much more likely to spit out "he" in a medical context.
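
A quick way to see this in action is to probe a pretrained masked language model for the pronoun it prefers in that slot. Here's a minimal sketch, assuming the Hugging Face transformers library and the bert-base-uncased checkpoint (both are just my illustrative choices, not anything the original claim depends on):

    from transformers import pipeline

    # Load a fill-mask pipeline over a pretrained masked language model.
    fill = pipeline("fill-mask", model="bert-base-uncased")

    # Ask the model to fill the pronoun slot in a medical context and
    # print its top candidates with their probabilities.
    for result in fill("The doctor said [MASK] would see the patient soon."):
        print(f"{result['token_str']:>6}  p={result['score']:.3f}")

Typically "he" comes out well ahead of "she" here, which reflects pronoun statistics in the training corpus rather than any actual knowledge about doctors.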

Since the model lacks world knowledge for new words, it also sometimes ascribes a word to a specific cultural origin or group that is completely wrong.
