zlacker

[return to "Scaling long-running autonomous coding"]
1. tehsau+yO[view] [source] 2026-01-15 03:32:16
>>samwil+(OP)
I was excited to try it out, so I downloaded the repo and ran the build. However, there were 100+ compilation errors. So I checked the commit history on GitHub and saw that all recent commits, going back at least several pages, had failed CI. It wasn't clear which commit I should pick to get the semi-working version advertised.

I started looking in the Cargo.toml to at least get an idea of how the project was constructed. I saw there that, rather than being built from scratch as the post seemed to imply, almost every core component was simply pulled in from an open source library: the QuickJS engine, wgpu graphics, winit windowing & input, egui for UI, HTML parsing, the list goes on. On Twitter their CEO explicitly stated that it uses a "custom js vm", which seemed particularly misleading / untrue to me.

Integrating all of these existing components is still super impressive for these models to do autonomously, so I'm just at a loss for how to feel when they do something impressive but then feel the need to misrepresent so much. I guess I just have a lot less respect and trust for the Cursor leadership, but maybe a little relief knowing that soon I may just generate my own custom Cursor!

2. wilson+JS6[view] [source] 2026-01-16 19:31:54
>>tehsau+yO
Thanks for the feedback. There were some build errors, which have now been resolved; the CI test that was failing was not a standard CI check, and it's now been updated. Let me know if you have any further issues.

> On twitter their CEO explicitly stated that it uses a "custom js vm" which seemed particularly misleading / untrue to me.

The JS engine is a custom JS VM being developed in vendor/ecma-rs as part of the browser; it's a copy of my personal JS parser project, vendored to make it easier to commit to.

I agree that some core engine components should not simply be pulled in as dependencies. I've begun the process of removing many of these and co-developing them within the repo alongside the browser. A reasonable goal for "from scratch" may be: "if other major browsers use a dependency, it's fine to do so too". For example: OpenSSL, libpng, HarfBuzz, Skia. The current project can be moved more in this direction, although I think using libraries for general infra that most software uses (e.g. windowing) can be compatible with that goal.

I'd push back on the idea that all the agents did was wire up dependencies — the JS VM, DOM, paint systems, chrome, and text pipeline are all being developed as part of this project, and there are real, complex systems being engineered towards the goal of a browser engine, even if it's not there yet.

3. fwip+r47[view] [source] 2026-01-16 20:24:08
>>wilson+JS6
When you say "have now been resolved" — did the AI agent resolve it autonomously, did you direct it to, or did a human do it?
4. neuron+zf7[view] [source] 2026-01-16 21:25:20
>>fwip+r47
Looks like Cursor Agent was at least somewhat involved: https://github.com/wilsonzlin/fastrender/commit/4cc2cb3cf0bd...
5. embedd+Ai7[view] [source] 2026-01-16 21:40:50
>>neuron+zf7
Looks like a bunch of different users have been contributing to the codebase (Google's Jules even made one commit), and the recent "fixes" include switching between various git users. https://gist.github.com/embedding-shapes/d09225180ea3236f180...

This to me seems to raise more questions than it answers.

6. mjmas+VE7[view] [source] 2026-01-17 00:18:56
>>embedd+Ai7
The ones at *.ec2.internal generally mean that the git config was never set up and it defaults to $(id -un)@$(hostname).
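To illustrate the fallback described above: when user.email isn't configured, git synthesizes an address from the login name and hostname, which on a stock EC2 box produces something like ubuntu@ip-10-0-0-1.ec2.internal (the exact address depends on the machine; this is a sketch, not git's full guessing logic, which also consults things like the EMAIL env var):

```shell
# Reproduce the shape of git's fallback committer email:
# login name joined to the machine's hostname.
fallback_email="$(id -un)@$(hostname)"
echo "$fallback_email"

# Setting an explicit identity prevents the fallback
# (name and address here are placeholders):
#   git config --global user.name  "Your Name"
#   git config --global user.email "you@example.com"
```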