For normal development, I am advocating an always-auditable runtime that runs only public source code by design: https://observablehq.com/@endpointservices/serverless-cells
Before sending data to a URL, you can look up the source code first, as the URL encodes the source location.
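As a minimal sketch of what that pre-flight audit could look like (the webcode.run host and the ";cellName" suffix are assumptions about the endpoint URL scheme, not a spec, but the point is the endpoint URL itself tells you where to read the code):

    // Assumed endpoint URL shape: host + notebook path + ";" + cell name
    const endpoint =
      "https://webcode.run/observablehq.com/@endpointservices/serverless-cells;echo";

    // Recover the human-readable source location embedded in the endpoint URL.
    const notebookPath = endpoint.split("webcode.run/")[1].split(";")[0];
    const sourceUrl = "https://" + notebookPath;
    console.log("Review the implementation at:", sourceUrl);
    // -> https://observablehq.com/@endpointservices/serverless-cells

    // Only once you are happy with what you read there do you send any data.
    const response = await fetch(endpoint, {
      method: "POST",
      body: JSON.stringify({ hello: "world" }),
    });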
There is always the risk that I have embedded a trojan in the runtime (despite it being open source). However, if I am a service provider for 100k customers built upon the idea of a transparent cloud, then compromising the trust of one customer would cause loss of business across all customers. Thus, from a game-theoretic perspective, our incentives should align.
I think running public source code, which does not preclude injecting secrets and keeping data private, is something that normal development teams can do. No PhDs necessary, just normal development.
Follow me on https://twitter.com/tomlarkworthy if you want to see this different way of approaching privacy: always-auditable, source-available server-side implementations. You can trust that services implemented this way are safe, because you can always see how they process data. Even if you cannot be bothered to audit their source, the sheer fact that someone can inoculates you against bad-faith implementations.
I am building a transparent cloud. Everything is encoded in public notebooks and runs on open source: https://observablehq.com/collection/@endpointservices/servic... There are other benefits, such as being able to fork my implementations and customize them, but primarily I am doing this for trust-through-transparency reasons.
Note the endpoint does a DYNAMIC lookup of source code, so you can somewhat reassure yourself that it really executes the code it looks up, just by providing your own source code and watching that run (see the sketch below).
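A rough sketch of that self-test; the notebook name and the deploy() signature shown in the comment are assumptions, not copied from the real API:

    // In your own public notebook, publish a trivial endpoint cell, e.g. something like:
    //   ping = deploy("ping", (req, res) => res.send("pong from my own public code"))
    // (deploy() is the pattern documented in the serverless-cells notebook;
    //  the exact signature here is assumed)
    const myEndpoint =
      "https://webcode.run/observablehq.com/@your-user/my-test-notebook;ping";

    const res = await fetch(myEndpoint);
    console.log(await res.text());
    // If the response matches what your published cell returns, the runtime really is
    // fetching and executing the source the URL points to, not some baked-in binary.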
It might be more obvious that the runtime itself does very little if you look at its source: https://github.com/endpointservices/serverlesscells
The clever bits that actually implement services are all in the notebooks.
If I were evil, I wouldn't ship a totally separate source tree and binary; I'd have my CI process inject a patch file. Everything would then work as expected - including picking up any changes from the public source code - but the binaries it produced would be backdoored.
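A hypothetical sketch of that attack, with invented paths, just to make its shape concrete:

    // The public tree stays clean and auditable; the build host quietly applies a
    // private patch before producing the artifact that actually gets deployed.
    import { execSync } from "node:child_process";

    // 1. Check out the public, auditable source exactly as any auditor would.
    execSync("git clone https://github.com/endpointservices/serverlesscells build");

    // 2. Apply a patch that only exists on the CI machine (never committed).
    execSync("git apply /ci-secrets/backdoor.patch", { cwd: "build" });

    // 3. Build and ship. The artifact still tracks upstream changes,
    //    but it no longer corresponds to the published source.
    execSync("npm ci && npm run build", { cwd: "build" });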