zlacker

[parent] [thread] 5 comments
1. librar+(OP)[view] [source] 2025-05-06 22:54:17
But is there a secret sauce in any of the coding agents (Copilot Agent, Windsurf, Claude Code, Cursor, Cline, Aider, etc)? Sure, some have better user experience than others, but what if anything makes one "better at coding" than another?

As this great blog post lays bare ("The Emperor Has No Clothes", https://ampcode.com/how-to-build-an-agent), the core tech of a coding agent isn't anything magic - it's a set of LLM prompts plus a main loop that calls the LLM and executes whatever tool calls the LLM asks for. The tools are pretty standard: search, read file, edit file, execute a bash command, etc. Really, all the power and complexity and "coding ability" is in the LLM itself. Sure, it's a lot of work to make something polished that devs want to use - but is there any more to it than that?
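To make the "no magic" point concrete, the whole agent core fits in a few dozen lines. This is a minimal sketch, not any product's actual implementation: the `llm` callable, the message format, and the tool names are all hypothetical stand-ins for a real LLM API with tool use.

```python
# Minimal agent loop: the LLM either requests a tool call or gives a final
# answer; the loop executes tools and feeds results back until done.
import subprocess


def read_file(path: str) -> str:
    with open(path) as f:
        return f.read()


def run_bash(cmd: str) -> str:
    return subprocess.run(cmd, shell=True, capture_output=True, text=True).stdout


TOOLS = {"read_file": read_file, "bash": run_bash}


def agent_loop(llm, user_prompt: str, max_steps: int = 10) -> str:
    messages = [{"role": "user", "content": user_prompt}]
    for _ in range(max_steps):
        # llm returns either {"tool": name, "args": {...}} or {"answer": text}
        reply = llm(messages)
        if "answer" in reply:
            return reply["answer"]
        result = TOOLS[reply["tool"]](**reply["args"])
        messages.append({"role": "tool", "content": result})
    return "step limit reached"
```

Everything product-specific - system prompts, diff formats, context management, UX - sits around this loop, not inside it.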

So what is the differentiator here, other than user experience (for which I prefer the CLI tools, but to each their own)? $3B is a lot for something that sure doesn't seem to have any secret sauce tech or moat that I can see.

replies(3): >>gregsc+N >>hello_+0f >>asdev+Vs
2. gregsc+N[view] [source] 2025-05-06 23:00:29
>>librar+(OP)
But one could have said the same thing of Whatsapp when they got acquired by Facebook, no? Just a messaging app, anyone can replicate.
replies(1): >>librar+h3
3. librar+h3[view] [source] [discussion] 2025-05-06 23:25:15
>>gregsc+N
Yes and no. A messaging app with 450m users has very strong network effects. Users are sticky in a way they aren’t going to be with a VS Code fork which will be increasingly incompatible with the VS Code ecosystem. There are a lot of equally good alternatives to Windsurf and you don’t have to persuade all your friends and relatives to switch too.
4. hello_+0f[view] [source] 2025-05-07 01:40:24
>>librar+(OP)
The moat is Windsurf’s custom LLM and the ops around it (training pipelines, fine-tuning, infra).

Codeium (Windsurf's parent) started as a GPU optimization company, so they have deep expertise there. Unlike most agents, which just wrap OpenAI/Claude/etc., Windsurf's own model powers its code edits, not external API calls.

That's where the defensibility is: better in-house models + efficient infra = stronger long-term moat.

replies(1): >>rhubar+Bk2
5. asdev+Vs[view] [source] 2025-05-07 04:32:09
>>librar+(OP)
Cursor's apply model is really good and fast for multi-line edits within files. Not sure if others have caught up.
6. rhubar+Bk2[view] [source] [discussion] 2025-05-07 19:03:51
>>hello_+0f
I suspect it's also about handling large code bases: building out a prompt that is maximally useful via more conventional processing before passing it to the LLM.
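The "conventional processing" step could be sketched like this: rank candidate files by a cheap relevance heuristic, then pack the best ones into the prompt under a size budget. A hypothetical illustration only - the scoring function and prompt layout are made up, and real systems use embeddings, ASTs, and token-level budgets rather than keyword overlap and character counts.

```python
# Toy context packing: keyword-overlap scoring + greedy fill under a budget.
def score(task: str, text: str) -> int:
    words = set(task.lower().split())
    return sum(1 for w in text.lower().split() if w in words)


def build_prompt(task: str, files: dict, budget: int = 2000) -> str:
    # Most relevant files first, then greedily add until the budget is hit.
    ranked = sorted(files.items(), key=lambda kv: score(task, kv[1]), reverse=True)
    parts, used = [f"Task: {task}"], 0
    for path, text in ranked:
        snippet = f"\n--- {path} ---\n{text}"
        if used + len(snippet) > budget:
            break
        parts.append(snippet)
        used += len(snippet)
    return "".join(parts)
```

The point being: this pre-LLM plumbing is where a lot of the engineering effort (and possibly the differentiation) actually lives.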