The script dependency metadata _is_ standardized[2], so other LSP servers could support a good experience here (at least in theory).
[1] The Ruff Language Server: https://astral.sh/blog/ruff-v0.4.5
[2] Inline script metadata: https://peps.python.org/pep-0723/
If you have some other tool manager on your system (e.g. mise) then you can likely install uv through that.
I have been working with Python for over 10 years and have standardized my workflow on pip & setuptools alone for all dependency management and packaging [1]. It works great as a vanilla setup and is 100% standards-based. I think uv and similar tools mostly shine on massive codebases.
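For illustration, a vanilla standards-based setup needs little more than a pyproject.toml declaring setuptools as the build backend (names and pins here are hypothetical):

```toml
[build-system]
requires = ["setuptools>=61"]
build-backend = "setuptools.build_meta"

[project]
name = "example-pkg"
version = "0.1.0"
dependencies = ["requests<3"]
```

`pip install -e .` then handles development installs, and `python -m build` produces the sdist and wheel.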
Anyway: software supply chain security covers Python and package build signing, and then containers and signing those too.
Conda-forge's builds are probably faster than the official CPython builds. conda-forge/python-feedstock/recipe/meta.yaml: https://github.com/conda-forge/python-feedstock/blob/main/re...
Conda-forge also has OpenBLAS, BLIS, Accelerate, netlib, and Intel MKL; conda-forge docs > switching BLAS implementation: https://conda-forge.org/docs/maintainer/knowledge_base/#swit...
From "Building optimized packages for conda-forge and PyPI" at EuroSciPy 2024: https://pretalx.com/euroscipy-2024/talk/JXB79J/ :
> Since some time, conda-forge defines multiple "cpu-levels". These are defined for sse, avx2, avx512 or ARM Neon. On the client-side the maximum CPU level is detected and the best available package is then installed. This opens the doors for highly optimized packages on conda-forge that support the latest CPU features.
> We will show how to use this in practice with `rattler-build`
> For GPUs, conda-forge has supported different CUDA levels for a long time, and we'll look at how that is used as well.
> Lastly, we also take a look at PyPI. There are ongoing discussions on how to improve support for wheels with CUDA support. We are going to discuss how the (pre-)PEP works and synergy possibilities of rattler-build and cibuildwheel
Linux distros build and sign Python and python3-* packages with GPG keys or similar, and the package manager then optionally checks the per-repo keys for each downloaded package. Packages should include a manifest of files to be installed, with per-file checksums. Package manifests and/or the package containing the manifest should be signed (so that tools like debsums and rpm --verify can detect disk-resident executable, script, data asset, and configuration file changes).
virtualenvs can be mounted as a volume at build time with -v with some container image builders, or copied into a container image with the ADD or COPY instructions in a Containerfile. What is added to the virtualenv should have a signature and a version.
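As a sketch of the COPY approach (base image and paths are illustrative): a multi-stage build sidesteps the fact that a venv records its creation prefix, by creating it at the same path it will occupy in the final image.

```dockerfile
# Containerfile (illustrative): build a venv in one stage, COPY it into the next
FROM python:3.12-slim AS build
RUN python -m venv /opt/venv && /opt/venv/bin/pip install "requests<3"

FROM python:3.12-slim
COPY --from=build /opt/venv /opt/venv
ENV PATH="/opt/venv/bin:$PATH"
CMD ["python", "-c", "import requests; print(requests.__version__)"]
```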
ostree native containers are bootable host images that can also be built and signed with a SLSA provenance attestation; https://coreos.github.io/rpm-ostree/container/ :
rpm-ostree rebase ostree-image-signed:registry:<oci image>
rpm-ostree rebase ostree-image-signed:docker://<oci image>
> Fetch a container image and verify that the container image is signed according to the policy set in /etc/containers/policy.json (see containers-policy.json(5)).

So, when you sign a container full of packages, you should check the package signatures; and verify that all package dependencies are identified by the SBOM tool you plan to use to keep dependencies upgraded when there are security upgrades.
e.g. Dependabot - if working - will regularly run and send a pull request when it detects that the version strings in e.g. a requirements.txt or environment.yml file are out of date because of security vulnerabilities reported in the ossf/osv-schema format.
Is there already a way to, as a developer, sign Python packages built with cibuildwheel with Twine and TUF or sigstore to be https://SLSA.dev/ compliant?
Fwiw I’m building a thing [1] that does this. Current docs suggest Rye but will s/rye/uv/ shortly. It’s basically just some CLI commands and a Hatch/PDM plugin that injects needed stuff at build-time.
https://docs.astral.sh/uv/concepts/python-versions/#discover...
https://rye.astral.sh/guide/toolchains/#registering-toolchai...
https://prefix.dev/blog/uv_in_pixi
Reasons for liking pixi over e.g. poetry:
- Like poetry, it keeps everything in a project-local definition file and environment
But unlike poetry, it also:
- Can install python itself, even ancient versions
- Can install conda packages
- Is extremely fast
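For context, a pixi project is driven by a project-local pixi.toml along the lines of this sketch (the name and pins are illustrative):

```toml
[project]
name = "demo"
channels = ["conda-forge"]
platforms = ["linux-64"]

[dependencies]
# pixi resolves these as conda packages, including the interpreter itself
python = "3.8.*"
numpy = "*"
```

`pixi run python` then executes inside the project-local environment, much like `poetry run` but with conda packages available.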