zlacker

[parent] [thread] 6 comments
1. wilson+(OP)[view] [source] 2026-01-16 19:31:54
Thanks for the feedback. There were some build errors, which have now been resolved; the CI test that was failing was not a standard check, and it has now been updated. Let me know if you run into any further issues.

> On twitter their CEO explicitly stated that it uses a "custom js vm" which seemed particularly misleading / untrue to me.

The JS engine is a custom JS VM being developed in vendor/ecma-rs as part of the browser; it's a copy of my personal JS parser project, vendored to make it easier to commit to.
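
Concretely, "vendored" just means the crate lives in-tree and the browser depends on it by path. A hypothetical sketch of the manifest entry, not the actual file:

  # Cargo.toml (illustrative only)
  [dependencies]
  ecma-rs = { path = "vendor/ecma-rs" }

so changes to the parser and to the browser can land in the same repo.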

I agree that some core engine components should not simply be pulled in as dependencies. I've begun the process of removing many of these and co-developing them within the repo alongside the browser. A reasonable bar for "from scratch" may be "if other major browsers use a dependency, it's fine to use it too": OpenSSL, libpng, HarfBuzz, and Skia, for example. The project can move further in this direction, although I think using libraries for general infrastructure that most software uses (e.g. windowing) is compatible with that goal.

I'd push back on the idea that all the agents did was wire up dependencies: the JS VM, DOM, paint systems, chrome, and text pipeline are all being developed as part of this project, and there are real complex systems being engineered towards the goal of a browser engine, even if not there yet.

replies(2): >>fwip+Ib >>polygl+B8f
2. fwip+Ib[view] [source] 2026-01-16 20:24:08
>>wilson+(OP)
When you say "have now been resolved" - did the AI agent resolve it autonomously, did you direct it to, or did a human do it?
replies(1): >>neuron+Qm
3. neuron+Qm[view] [source] [discussion] 2026-01-16 21:25:20
>>fwip+Ib
Looks like Cursor Agent was at least somewhat involved: https://github.com/wilsonzlin/fastrender/commit/4cc2cb3cf0bd...
replies(1): >>embedd+Rp
4. embedd+Rp[view] [source] [discussion] 2026-01-16 21:40:50
>>neuron+Qm
Looks like a bunch of different users have been contributing to the codebase (Google's Jules even made one commit), and the recent "fixes" include commits from several different git users. https://gist.github.com/embedding-shapes/d09225180ea3236f180...

This to me seems to raise more questions than it answers.

replies(1): >>mjmas+cM
5. mjmas+cM[view] [source] [discussion] 2026-01-17 00:18:56
>>embedd+Rp
The ones at *.ec2.internal generally mean that the git config was never set up, so it defaults to $(id -un)@$(hostname).
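
For example, committing on a fresh EC2 box with nothing configured produces something like this (hypothetical transcript; exact wording varies by git version):

  $ git commit -m "fix build"
  [main 1a2b3c4] fix build
   Committer: root <root@ip-10-0-0-1.ec2.internal>
  Your name and email address were configured automatically based
  on your username and hostname. Please check that they are accurate.
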
replies(1): >>embedd+lD1
6. embedd+lD1[view] [source] [discussion] 2026-01-17 11:47:47
>>mjmas+cM
Indeed. Extra-observant people will notice that the "Ubuntu" username was used only twice, though, compared to "root", which was used 3700+ times. And people who've dealt with infrastructure before might recognize "ubuntu" as the default username for interactive EC2 instances :)
7. polygl+B8f[view] [source] 2026-01-21 18:06:05
>>wilson+(OP)
> there are real complex systems being engineered towards the goal of a browser engine, even if not there yet.

In various comments in >>46624541 I have explained at length why your fleet of autonomous agents failed miserably at building something that could be seen as a valid POC.

One example: your rendering loop does not follow the web specs and makes no sense.

https://github.com/wilsonzlin/fastrender/blob/19bf1036105d4e...

The above design document is simply nonsense: typical AI-hallucinated BS. Detailed critique at >>46705625
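
For contrast, here is a toy sketch of the ordering the spec's event loop processing model actually prescribes. This is illustrative Rust, heavily simplified, and not taken from any real engine:

  use std::collections::VecDeque;

  fn main() {
      // Task queue: the spec runs exactly one task to completion per turn.
      let mut tasks: VecDeque<&str> =
          VecDeque::from(["parse HTML chunk", "dispatch click", "timer callback"]);
      // Microtask queue: drained completely at each checkpoint.
      let mut microtasks: VecDeque<&str> = VecDeque::new();
      let mut frame = 0;

      while let Some(task) = tasks.pop_front() {
          println!("task: {task}");
          // Pretend each task queued a promise reaction.
          microtasks.push_back("promise reaction");

          // Microtask checkpoint: drains fully before rendering may happen.
          while let Some(m) = microtasks.pop_front() {
              println!("  microtask: {m}");
          }

          // "Update the rendering": at most once per rendering opportunity
          // (roughly vsync), with sub-steps in a fixed, spec-defined order.
          frame += 1;
          println!("  frame {frame}: run resize/scroll steps");
          println!("  frame {frame}: evaluate media queries");
          println!("  frame {frame}: run rAF callbacks");
          println!("  frame {frame}: style -> layout -> paint");
      }
  }

The point is that tasks, microtasks, and rendering updates interleave in a strictly defined order; a loop that ignores that ordering cannot be retrofitted into spec conformance.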

The actual code is worse; I can only describe it as a tangle of spaghetti. As a browser expert, I can't make much, if anything, out of it. In comparison, when I look at Ladybird, a project I am not involved in, I can instantly find my way around the code because I know the web specs.

So I agree this isn't just wiring up dependencies, and neither is it copied from existing implementations: it's a uniquely bad design that could never support anything resembling a real-world web engine.

Now don't get me wrong, I do think AI could be leveraged to build a web engine, but not by unleashing autonomous agents. You need humans in the loop at all levels of abstraction; the agents should only be used to bang out features reusing patterns established or vetted by human experts.

If you want to do this the right way, get in touch: https://github.com/gterzian
