zlacker

[parent] [thread] 1 comment
1. 16bitv+(OP)[view] [source] 2026-02-07 10:17:39
It's right there in the README.

> Monty avoids the cost, latency, complexity and general faff of using a full container-based sandbox for running LLM-generated code.

> Instead, it lets you safely run Python code written by an LLM embedded in your agent, with startup times measured in single-digit microseconds, not hundreds of milliseconds.

replies(1): >>vghais+vs
2. vghais+vs[view] [source] 2026-02-07 15:11:17
>>16bitv+(OP)
Oh, I did read the README, but the question stands: while it does save on cost, latency, and complexity, the tradeoff is that agents can't run whatever they want, as they could in a full container sandbox, which makes them less capable too.
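The tradeoff being described can be sketched in plain Python (this is an illustrative toy, not Monty's actual API, and `exec` with a trimmed `__builtins__` is not a real security boundary): an in-process runner exposes only an allowlisted environment, so pure computation succeeds instantly while anything needing import machinery, files, or the network simply fails.

```python
# Hypothetical sketch of an in-process restricted runner (NOT Monty's API,
# and NOT a secure sandbox - exec-based restriction is known to be escapable).

ALLOWED_BUILTINS = {"abs": abs, "len": len, "range": range, "sum": sum}

def run_restricted(code: str) -> dict:
    """Execute code with only allowlisted builtins; no imports, no I/O."""
    env = {"__builtins__": ALLOWED_BUILTINS}
    exec(code, env)
    # Return the variables the snippet defined.
    return {k: v for k, v in env.items() if k != "__builtins__"}

# Pure computation works, with no container startup cost:
result = run_restricted("total = sum(range(10))")
assert result["total"] == 45

# Anything outside the allowlist fails - the "less capable" side of the
# tradeoff: without __import__ in builtins, the import statement raises.
blocked = False
try:
    run_restricted("import os")
except ImportError:
    blocked = True
assert blocked
```

A container sandbox would let the agent run `import os`, shell out, or install packages; the in-process approach trades that generality for microsecond startup.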