zlacker

1. zahlma+(OP)[view] [source] 2026-02-06 09:27:34
>>dmpetr+JW1
> Instead, it lets you safely run Python code written by an LLM embedded in your agent, with startup times measured in single digit microseconds not hundreds of milliseconds.

Perhaps if the interpreter is in turn embedded in the executable and runs in-process, but even a do-nothing `uv` invocation takes ~10ms on my system.
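If you want to reproduce that number on your own machine, here's a rough sketch (plain Python 3; `uv --version` is just the example at hand, any command on `PATH` works, and best-of-N is a crude way to reduce scheduler noise):

```python
import subprocess
import time

def startup_ms(cmd, runs=5):
    """Best-of-N wall-clock time (ms) to spawn a process and wait for it to exit."""
    best = float("inf")
    for _ in range(runs):
        t0 = time.perf_counter()
        subprocess.run(cmd, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
        best = min(best, (time.perf_counter() - t0) * 1000)
    return best

# e.g. startup_ms(["uv", "--version"]) -- ~10 ms on my system
```

Note this includes fork/exec overhead from the measuring process itself, so it's an upper bound, but it's nowhere near single-digit microseconds either way.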

I like the idea of a minimal implementation like this, though. I hadn't even considered it from an AI sandboxing perspective; I just liked the idea of a stdlib-less alternative upon which better-thought-out "core" libraries could be stacked, with less disk footprint.

Have to say I didn't expect it to come out of Pydantic.

2. dmpetr+JW1[view] [source] 2026-02-06 21:16:36
3. Cyphas+Pp2[view] [source] 2026-02-07 00:32:51
>>zahlma+(OP)
uv is written in Rust, not Python.
4. zahlma+NA3[view] [source] 2026-02-07 15:24:36
>>Cyphas+Pp2
Yes. That's why I compare it (a compiled Rust executable) to Monty (a compiled Rust executable). The point is that loading large compiled executables into memory takes long enough to raise an objection to the "startup times measured in single digit microseconds not hundreds of milliseconds" claim.
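To see the gap directly, here's a minimal sketch using plain CPython (not Monty; absolute numbers will vary by machine, but the ratio is the point): an in-process call reuses the already-loaded interpreter, while a fresh process has to be loaded from disk, linked, and initialized first.

```python
import subprocess
import sys
import time

# In-process: evaluate an expression with the interpreter already in memory.
t0 = time.perf_counter()
eval("1 + 1")
in_proc_us = (time.perf_counter() - t0) * 1_000_000

# Out-of-process: spawn a fresh interpreter to run the same expression.
t0 = time.perf_counter()
subprocess.run([sys.executable, "-c", "1 + 1"], check=True)
spawn_ms = (time.perf_counter() - t0) * 1_000

print(f"in-process: ~{in_proc_us:.1f} us, fresh process: ~{spawn_ms:.1f} ms")
```

So "microseconds" is only plausible for the embedded, in-process case; any claim that involves launching an executable is back in milliseconds.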