LLMs are great at predicting and navigating human culture, at least the subset that can be captured in their training sets.
The ways in which we interact with other people are culturally mediated. LLMs are not people, but they can simulate that culturally mediated communication well enough that we find it easy to anthropomorphise them.