>>xxbond+93
For example, if it consumes lots of training data that uses the pronoun "he" for doctors, it's much more likely to spit out "he" in a medical context.
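If you want to poke at this yourself, here's a rough sketch using the Hugging Face fill-mask pipeline with bert-base-uncased (just an illustrative setup I'm assuming, not whatever model is being discussed): the pronoun the model prefers in a medical sentence falls straight out of the co-occurrence statistics it absorbed.

```python
# Minimal sketch of probing pronoun bias, assuming the Hugging Face
# "transformers" library and the bert-base-uncased checkpoint.
from transformers import pipeline

fill = pipeline("fill-mask", model="bert-base-uncased")

# Ask the model to fill in the pronoun in a medical context.
results = fill("The doctor finished the surgery, then [MASK] spoke to the family.")

for r in results:
    # Each result carries the predicted token and its score;
    # "he" typically outranks "she" here, mirroring the training data.
    print(f"{r['token_str']:>6}  {r['score']:.3f}")
```

Swapping "doctor" for "nurse" and comparing the scores makes the skew pretty obvious.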
Since the model lacks world knowledge for new words, it sometimes also ascribes a word to a specific cultural origin or group that is completely wrong.