1. abshkb+ (OP) 2026-02-02 20:33:33
We did train Codex models natively on Windows - https://openai.com/index/introducing-gpt-5-2-codex/ (and even 5.1-codex-max)
replies(1): >>hdjrud+mn
2. hdjrud+mn 2026-02-02 22:12:03
>>abshkb+(OP)
I appreciate this (as a Windows user), but I'm also curious how necessary it was.

For example, I notice that Codex in PhpStorm uses Get-Whatever-style PowerShell commands. But firstly, I have a perfectly working Git Bash install that's something like 98% compatible with Linux and Mac. Couldn't it use that instead of being retrained on Windows-centric commands?
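
To make that first point concrete, here's a rough sketch (in Python, and not anything Codex actually does) of a harness that prefers Git Bash on Windows when it's installed and only falls back to PowerShell otherwise. The install paths are just the usual Git for Windows defaults:

    # Hypothetical harness-side shim: route the agent's POSIX-style commands
    # through Git Bash if we can find one, otherwise fall back to PowerShell.
    import os
    import shutil
    import subprocess

    GIT_BASH_CANDIDATES = [
        r"C:\Program Files\Git\bin\bash.exe",
        r"C:\Program Files (x86)\Git\bin\bash.exe",
    ]

    def find_bash():
        """Return a usable bash executable, preferring one already on PATH."""
        on_path = shutil.which("bash")
        if on_path:
            return on_path
        for candidate in GIT_BASH_CANDIDATES:
            if os.path.exists(candidate):
                return candidate
        return None

    def run_portable(command):
        """Run a POSIX-style command line via bash if available, else PowerShell."""
        bash = find_bash()
        if bash:
            completed = subprocess.run([bash, "-lc", command],
                                       capture_output=True, text=True)
        else:
            completed = subprocess.run(["powershell", "-Command", command],
                                       capture_output=True, text=True)
        return completed.stdout

    print(run_portable("ls"))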

But better yet, probably 95% of the commands it actually needs to run are things like cat and ripgrep. Can't you just bundle the top 20 commands, make them OS-agnostic, and train on that?
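
And the second point, sketched equally roughly (the tool names and the vendor/ paths are made up; this isn't how any real harness is wired): give the model a small fixed set of tool names and let the harness map each name to an OS-agnostic implementation or a bundled binary:

    # Hypothetical "top N commands" allowlist: the model only ever calls these
    # names, and the harness supplies portable implementations behind them.
    import platform
    import subprocess
    from pathlib import Path

    def tool_cat(path):
        """OS-agnostic 'cat': read a file as text."""
        return Path(path).read_text(encoding="utf-8", errors="replace")

    def tool_ls(path="."):
        """OS-agnostic 'ls': list directory entries, one per line."""
        return "\n".join(sorted(p.name for p in Path(path).iterdir()))

    def tool_rg(pattern, path="."):
        """Search with a bundled ripgrep binary, chosen per platform."""
        exe = "vendor/rg.exe" if platform.system() == "Windows" else "vendor/rg"
        completed = subprocess.run([exe, "--no-heading", pattern, path],
                                   capture_output=True, text=True)
        return completed.stdout

    # The model is trained against these names only, never raw PowerShell or bash.
    TOOLS = {"cat": tool_cat, "ls": tool_ls, "rg": tool_rg}

    print(TOOLS["ls"]("."))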

The last tiny bit of the puzzle, I would think, is the stuff that actually is OS-specific, though I don't know what that would be. Maybe some differences in file systems, sandboxing, or networking.
