zlacker

1. xxbond+(OP) 2020-05-13 18:39:28
This is awesome! What kind of bias would we expect to see in the training set?
replies(1): >>cosbyn+H1
2. cosbyn+H1 2020-05-13 18:48:09
>>xxbond+(OP)
For example, if it consumes lots of data that uses the pronoun "he" for doctors, it's much more likely to spit out "he" in a medical context.
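
As a rough illustration (my own sketch, not a fixed benchmark), you can probe that kind of skew with an off-the-shelf masked language model; the model and prompt here are arbitrary choices for demonstration:

    # Probe pronoun skew in a medical context with a masked LM.
    # Model and prompt are illustrative assumptions.
    from transformers import pipeline

    fill = pipeline("fill-mask", model="bert-base-uncased")

    # Restrict predictions to the two pronouns we want to compare.
    results = fill(
        "The doctor said [MASK] would see the patient soon.",
        targets=["he", "she"],
    )
    for r in results:
        print(f"{r['token_str']}: {r['score']:.4f}")

If the training data skews toward "he" for doctors, its score typically comes out noticeably higher; swapping in "nurse" tends to reverse which pronoun wins.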

Since the model lacks world knowledge for new words, it sometimes also ascribes a word to a specific cultural origin or group that is completely wrong.
