Your daily vibe coding challenge: Get GPT-4o to output functional code which uses Google Vertex AI to generate a text embedding. If they can solve that one by July, then maybe we're on track for "curing all disease and aging, brain uploading, and colonizing the solar system" by 2030.
You may consider using search to be cheating, but we do it, so why shouldn't LLMs?
Search is totally reasonable, but in this case even Google's own documentation on these libraries is exceedingly bad. Nearly all of the examples it gives are for accessing the generative language models, not the text embedding models, so GPT will also sometimes generate code that is perfectly correct for one of the generative models but with, e.g., the "model: gemini-2.0" parameter swapped out for "model: text-embedding-005", which also does not work.
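For reference, the route that actually works through Google's own Node client goes through the generic PredictionServiceClient rather than the generative-model helpers, which is exactly the distinction the generated code misses. A minimal sketch, assuming the @google-cloud/aiplatform package and placeholder project/region values (not anything GPT-4o produced):

```ts
import { v1, helpers } from "@google-cloud/aiplatform";

// Placeholders: substitute your own project ID and region.
const project = "your-gcp-project";
const location = "us-central1";

const client = new v1.PredictionServiceClient({
  apiEndpoint: `${location}-aiplatform.googleapis.com`,
});

async function embed(text: string): Promise<number[]> {
  // Embedding models are addressed as publisher models through the generic
  // predict() call, not through the generative-model (Gemini) client.
  const endpoint = `projects/${project}/locations/${location}/publishers/google/models/text-embedding-005`;
  const [response] = await client.predict({
    endpoint,
    instances: [helpers.toValue({ content: text, task_type: "RETRIEVAL_DOCUMENT" })!],
    parameters: helpers.toValue({})!,
  });
  // The vector comes back wrapped in protobuf Value structs and has to be unwrapped by hand.
  const values =
    response.predictions?.[0]?.structValue?.fields?.embeddings?.structValue
      ?.fields?.values?.listValue?.values ?? [];
  return values.map((v) => v.numberValue ?? 0);
}
```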
o3-mini-high's output might work, but it isn't ideal: it immediately recommends avoiding the Google Cloud libraries altogether and issuing the request to their API directly with fetch.
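To be fair, that route does sidestep all the protobuf unwrapping above, at the cost of handling auth yourself. The raw REST call it lands on looks roughly like this; again a sketch, with the project, region, and access token as placeholders (the token could come from `gcloud auth print-access-token` or google-auth-library):

```ts
const project = "your-gcp-project"; // placeholder
const location = "us-central1";
const model = "text-embedding-005";

async function embedViaRest(text: string, accessToken: string): Promise<number[]> {
  const url =
    `https://${location}-aiplatform.googleapis.com/v1/projects/${project}` +
    `/locations/${location}/publishers/google/models/${model}:predict`;

  const res = await fetch(url, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${accessToken}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ instances: [{ content: text }] }),
  });
  if (!res.ok) {
    throw new Error(`Vertex AI returned ${res.status}: ${await res.text()}`);
  }

  const data: any = await res.json();
  // The embedding vector sits at predictions[0].embeddings.values.
  return data.predictions[0].embeddings.values as number[];
}
```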