>> gt_ (OP)
Gender bias in Word2Vec embeddings is a well-known problem; the article provides references. Word2Vec is an unsupervised algorithm, and the Google pretrained vectors are trained on news coverage, so the bias isn't really something the researchers introduced (unless they deliberately selected an unusual dataset or something like that).
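If you want to see it firsthand, here's a quick probe of the pretrained GoogleNews vectors. This is just a sketch using gensim's downloader API; the probe words are the standard examples from the bias literature, not taken from the article:

    import gensim.downloader as api

    # Pretrained GoogleNews vectors (large download on first use).
    wv = api.load("word2vec-google-news-300")

    # Classic analogy probe: "man is to doctor as woman is to ...?"
    # In the published analyses, probes like this surface stereotyped completions.
    print(wv.most_similar(positive=["woman", "doctor"], negative=["man"], topn=3))

    # Direct similarity checks on occupation words expose the same skew.
    print(wv.similarity("man", "programmer"))
    print(wv.similarity("woman", "programmer"))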
Edit: to clarify, your claim is that the Word2Vec data isn't biased even though there's a link right there showing that it is? Why do you think that?
If you use that data in a downstream system, you reinforce that bias: the system's outputs inherit the skewed associations baked into the vectors.
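A toy illustration of the reinforcement point: a similarity-based ranker built on those same vectors. The query, the bios, and the averaged-vector scoring scheme are all made up for illustration, not any real system:

    import numpy as np
    import gensim.downloader as api

    wv = api.load("word2vec-google-news-300")

    def embed(text):
        # Average the vectors of in-vocabulary words (crude but common).
        vecs = [wv[w] for w in text.split() if w in wv]
        return np.mean(vecs, axis=0)

    def cosine(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    query = embed("software engineer")
    for bio in ("he is an experienced engineer",
                "she is an experienced engineer"):
        print(bio, "->", round(cosine(query, embed(bio)), 3))

    # Two bios identical except for the pronoun can score differently,
    # and that difference flows straight into whatever ranks on the score.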