I'll admit I'm somewhat biased against Bun, but I'm honestly interested in knowing why people prefer Bun over Deno.
Even with a cold cache, `bun install` with a large-ish dependency graph is significantly faster than `npm install` in my experience.
I don't know if Deno does that, but some googling for "deno install performance vs npm install" doesn't turn up much, so I suspect not?
As a runtime, though, I have no opinion. I did test it against Node, but for my use case (build tooling for web projects) it didn't make a noticeable difference, so I decided to stick with Node.
Despite the page title being "Fullstack dev server", it's also useful in production (Ctrl-F "Production Mode").
Deno seems like the better replacement for Node, but it'd still be at risk of npm supply-chain attacks, which seem to be the greater concern for companies these days.
Why? Genuine question, sorry if it was said/implied in your original message and I missed it.
I haven't had that experience with deno (or node)
The tools the language offers for handling use-after-free are hardly any different from using Purify or Insure++ back in 2000.
Between that and the Discord, I have gotten the distinct impression that Deno is for "server javascript" first, rather than just "javascript" first. Which is understandable, but doesn't cater much to me, a frontend-first dev.
The reason you see so many GitHub issues about it is that that's where the development happens. Deno is great. Bun is great. These two things can both be great, and we don't have to choose sides. Deno has its use case. Bun has its. Deno wants to be secure and require permissions. Bun just wants to make clean, simple projects. This fight between Rust and the world is getting old. Rust isn't any "safer" when Deno can panic too.
So it seems odd to say that Bun is less dependent on the npm library ecosystem.
[1] It’s possible to use jsr.io instead: https://jsr.io/docs/using-packages
I don't know how Deno is today. I switched to Bun and porting went a lot smoother.
Philosophically, I like that Bun sees Node compatibility as an obvious top priority. Deno sees it as a grudging necessity after losing the fight to do things differently.
Telling prospective employees "if you're not ready to work 60-hour weeks, then what the fuck are you doing here?", for one.
> Zig does force some safety with ReleaseSafe IIRC
which Bun doesn't use, choosing to go with `ReleaseFast` instead.
Bun is fast, and it's worked as a drop-in replacement for npm in large legacy projects too.
I only ever encountered one issue, which was pretty dumb: Amazon's CDK has hardcoded references to various package managers' lock files, and Bun wasn't one of them:
https://github.com/aws/aws-cdk/issues/31753
This wasn't fixed until the end of 2024, and as you can see, it was only accidentally merged but tolerated. It was promptly broken by a Bun breaking change:
https://github.com/aws/aws-cdk/issues/33464
but don't let Amazon's own incompetence be the confirmation bias you were looking for about using a different package manager in production.
You can use SST to deploy cloud resources on AWS (or any cloud), and that package works with Bun.
Who knows?
Besides, how are they going to get back the money spent on the acquisition?
Many times the answer to acquisitions has nothing to do with technology.
There are degrees to this though. A panic + unwind in Rust is clean and _safe_, thus preferable to segfaults.
Java and Go are another similar example. Only in the latter can races on multi-word data structures lead to "arbitrary memory corruption" [1]. Even among those GC languages there are degrees of memory safety.
Anthropic chose to use Bun to build their tooling.
From a long-term design-philosophy perspective, Bun seems to want a sufficiently large core and standard library that you won't need to pull in much from the outside. Code written for Node will run on Bun, but code using Bun-specific features won't run on Node. It's the "embrace, extend, ..." approach.
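That one-way compatibility can be sketched concretely. The snippet below is a minimal illustration, not either runtime's recommended pattern: `readText` is a made-up helper, and the temp-file demo exists only so the code is self-contained.

```javascript
// Portable file read: use Bun's native API when running under Bun,
// otherwise fall back to Node's fs/promises. The `Bun` global only
// exists on Bun, so code that reaches for it unconditionally breaks
// on Node.
import { readFile, writeFile } from "node:fs/promises";
import { tmpdir } from "node:os";
import { join } from "node:path";

async function readText(path) {
  if (typeof Bun !== "undefined") {
    // Bun-only fast path
    return Bun.file(path).text();
  }
  return readFile(path, "utf8"); // Node fallback
}

// Demo: behaves the same under both runtimes.
const demo = join(tmpdir(), "runtime-demo.txt");
await writeFile(demo, "hello");
console.log(await readText(demo)); // prints "hello"
```

The feature-detection branch is the tell: once a codebase skips it and calls `Bun.file` (or `Bun.serve`, etc.) directly, it's Bun-only.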
Deno seems much more focused on tooling instead of expanding core JS, and seems to draw the line at integrations. The philosophy seems to be more along the lines of having the tools be better about security when pulling in libraries, instead of replacing the need for libraries. Deno also has its own standard library, but it's just a library, and that library can run on Node.
Bun did prioritize npm compatibility earlier.
Today though there seems to be a lot of parity, and I think things like JSR and strong importmaps support start to weigh in Deno's favor.
JSC is still the JS engine for WebKit-based browsers, especially Safari, and per Apple App Store regulations supposedly the only JS engine allowable in all of iOS.
It's more "mature" than V8 in terms of predating it. (V8 was not a fork of it and was started from scratch, but V8 was designed to replace it in the Blink fork from WebKit.)
It has different performance goals and performance characteristics, but "less tested" seems uncharitable and it is certainly used in plenty of "real-world tasks" daily in iOS and macOS.
For deploys, running the attached Terraform script usually takes more time. So while a speed increase is welcome, I don't feel it gives me that much of a boost.
Profit from those products has to justify now maintaining their own compiler team for a JavaScript runtime.
I used bun briefly to run the output of my compiler, because it was the only javascript runtime that did tail calls. But I eventually added a tail call transform to my compiler and switched to node, which runs 40% faster for my test case (the compiler building itself).
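A self-tail-call-to-loop transform of the kind described can be sketched like this (hypothetical function names, not the author's actual compiler output):

```javascript
// Tail-recursive sum of 1..n. In strict ES2015 semantics this should run
// in constant stack space, but V8 (Node) never shipped proper tail calls,
// so a large n overflows the stack.
function sumRec(n, acc) {
  if (n === 0) return acc;
  return sumRec(n - 1, acc + n); // call in tail position
}

// The transform: rewrite self-tail-calls into a loop that rebinds the
// parameters instead of pushing a new stack frame.
function sumLoop(n, acc) {
  while (true) {
    if (n === 0) return acc;
    [n, acc] = [n - 1, acc + n]; // the "tail call" becomes a parameter update
  }
}

console.log(sumLoop(1_000_000, 0)); // 500000500000; sumRec would overflow here
```

This only handles direct self-recursion; mutually recursive tail calls need something heavier, like merging the functions or a trampoline.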
Here are the Bun APIs:
https://bun.com/docs/runtime/bun-apis
Here are the Deno APIs:
The nice things:
1. It's fast.
2. The standard library is great. (This may be less of an advantage over Deno.)
3. There's a ton of momentum behind it.
4. It's closer to Node.js than Deno is, at least last I tried. There were a bunch of little Node <> Deno papercuts. For example, Deno wanted .ts extensions on all imports.
5. I don't have to think about JSR.
The warts:
1. The package manager has some issues that make it hard for us to use. I've forgotten why now, but this in particular bit us in the ass: https://github.com/oven-sh/bun/issues/6608. We use PNPM and are very happy with it, even if it's not as fast as Bun's package manager.
Overall, Deno felt to me like they were building a parallel ecosystem that I don't have a ton of conviction in, while Bun feels focused on meeting me where I am.
It's not clear to me why that requires creating a whole new runtime, or why they made the decisions they did, like choosing JSC instead of V8, or using a pre-1.0 language like Zig.
There's certainly an argument to be made that, like any good tool, you have to learn Deno and can't fall back on just reusing node knowledge, and I'd absolutely agree with that, but in that case I wanted to learn the package, not the package manager.
Edit: Also, it has a nice standard library. Not a huge win, because that stuff is also doable in Deno, but again, it's just a bit less painful.
Faster installs and less disk space due to hardlinks? Not really all that important to me. Npm comes with a cache too, and I have the disk space. I don't need it to be faster.
With the old-school setup I can easily manually edit something in node_modules to quickly test a change.
No more node_modules? It was a cool idea when yarn 2 initially implemented it, but at the end of the day I prefer things to just work rather than debug what is and isn't broken by the new resolver. At the time my DevOps team also wasn't too excited about me proposing to put the dependencies into git for the zero-install.
We have yet to witness a segfault. Admittedly it's a bunch of microservices and not many requests/s (around 5k avg).
What gives you this impression?
I directly created Zig to replace C++. I used C++ before I wrote Zig. I wrote Zig originally in C++. I recently ported Chromaprint from C++ to Zig, with nice performance results. I constantly talk about how batching is superior to RAII.
Everyone loves to parrot this "Zig is to C as Rust is to C++" nonsense. It's some kind of mind virus that spreads despite any factual basis.
I don't mean to disparage you in particular, this is like the 1000th time I've seen this.
Lol, yeah, this person is running a performance test on postgres, and attributing the times to JS frameworks.
Something about moral and philosophical flexibility.
A lot of stuff related to older languages is lost in the sands of time, but the same thing isn’t true for current ones.
More broadly, I think the observation tends to get repeated because C and Zig share a certain elegance and simplicity (even if C's elegance has dated). C++ is many things, but it's hardly elegant or simple.
I don't think anyone denies that Zig can be a C++ replacement, but that's hardly unusual, so can many other languages (Rust, Swift, etc). What's noteworthy here is that Zig is almost unique in having the potential to be a genuine C replacement. To its (and your) great credit, I might add.
>> At its core Zig is marketed as a competitor to C, not C++/Rust/etc, which makes me think it's harder to write working code that won't leak or crash than in other languages. Zig embraces manual memory management as well.
@GP: This is not a great take. All four languages are oriented around manual memory management. C++ inherits all of the footguns of C, whereas Zig and Rust try to sand off the rough edges.
Manual memory management is and will always remain necessary. The only reason someone writing JS scripts doesn't need to worry about managing their memory is that someone else has already done that work for them.
1. https://github.com/denoland/deno/issues?q=is%3Aissue%20state... 2. https://github.com/oven-sh/bun/issues?q=is%3Aissue%20state%3...
Node.js is a no-brainer for anyone shipping a TS/JS backend. I'd rather deal with poor DX and slightly worse performance than risk fighting runtime-related issues on deployment.
Linux needs to be a first-class citizen for any runtime/language toolchain.
johnfn@mac ~ % time deno eval 'console.log("hello world")'
hello world
deno eval 'console.log("hello world")' 0.04s user 0.02s system 87% cpu 0.074 total
johnfn@mac ~ % time bun -e 'console.log("hello world")'
hello world
bun -e 'console.log("hello world")' 0.01s user 0.00s system 84% cpu 0.013 total
That's about 5.7x faster. Yes, it's a microbenchmark. But you said "absolutely zero chance", not "a very small chance".

It feels like a very unfocused project: every few months they introduce a new feature meant to replace an entire class of tools, while the original promise of a "drop-in Node replacement" was never fulfilled (or at least that was the vibe over the past years when I was still paying attention).
We'll see how this plays out, but speed + lack of scope never works out well in the long term.
But completely agree. It's a perfect replacement for C++, and I would say the natural spiritual successor to C.
I gave up using Rust for new projects after seeing its limitations for the kind of software I like to write, and have been using Zig instead, as it gives me the freedom I need without all the overcomplication that languages like C++ and Rust bring to the table.
I think people should first experiment and see for themselves, and only then comment, as I see a lot of misinformation and beliefs based more on marketing than reality.
Thank you very much for your wonderful work!
Curious about safety here: Are kernel / cross-thread resources (ex: a mutex/futex/fd) released on unwind (assuming the stack being unwound acquired those)?
All modern OSes behave this way. When your process starts, it's assigned an address space; it can balloon, but it starts somewhere. When the process ends, that space is reclaimed.