1. mvkel+(OP) 2025-05-07 04:40:34
That's a bit like saying having access to Google is as good as being Google.

All they really see as a model provider is little fragments of the picture, like trying to reconstruct the Mona Lisa by knowing which paint swatches Leonardo used.

In other words, they only saw whatever Windsurf sent as context with a "fix the bugs" prompt stapled to it.

By owning Windsurf, they see the entire source code of what's being built, all the time, plus how the model is interacting with it.

There's a massive amount of value in what happens client-side and behind the scenes: the "director's cut" of context.

Huge difference.

replies(1): >>pqtyw+FA
2. pqtyw+FA 2025-05-07 12:05:04
>>mvkel+(OP)
So just put together a comparable VS Code-based AI IDE in a couple of months and bundle it with the ChatGPT subscription? They'd get loads of users very fast.
replies(1): >>sanxiy+QD
3. sanxiy+QD 2025-05-07 12:30:12
>>pqtyw+FA
I think it is exactly this. There is no doubt whatsoever that OpenAI could do this, but they decided not to. The reason, I think, is that they don't want to be a couple of months late. In other words, they spent $3B to save a couple of months.