To begin with, the premise would have to be challenged. Many, many bad regulations are bad because of incompetence or corruption rather than because better regulations are impossible. But let's consider the case where there really are no good regulations.
This often happens in situations where e.g. bad actors have more resources, or are willing to spend more resources, to subvert a system than ordinary people. For example, suppose the proposal is to ban major companies from implementing end-to-end encryption so the police can spy on terrorists. Well, that's not going to work very well, because the terrorists will just use a different system that provides E2EE anyway. What you're really doing is compromising the security of all the law-abiding people, who are now more vulnerable to criminals, foreign espionage, etc.
The answer in these cases, where there are only bad policy proposals, is to do nothing. Accept that you don't have a good solution, and that a bad solution makes things worse rather than better, so the absence of any rule, imperfect as the outcome may be, is the best we know how to do.
The classical example of this is the First Amendment. People say bad stuff, we don't like it, they suck and should shut up. But there is nobody you can actually trust to be the decider of who gets to say what, so the answer is nobody decides for everybody and imposing government punishment for speech is forbidden.
Or go further.
Sometimes the answer is to remove regulations. Specifically, those laws that protect wrongdoers and facilitators of problems. Then you just let nature take its course.
For the most part, though, this is considered inhumane and unacceptable.
If you're talking about legalizing vigilantism, you would then have to argue that this is a better system and less prone to abuse than some variant of the existing law enforcement apparatus. Which, if you could do it, would imply that we actually should do that. But in general vigilantes have serious problems with accurately identifying targets and collateral damage.
It gets messy because, by definition, the moment you remove the laws the parties cease to be criminals... hence my Bushism "wrongdoers" (can't quite bring myself to say evil-doers :)
One hopes that "criminals" without explicit legal protection become disinclined to act, rather than become victims themselves. Hence my allusion to "nature", as in "Natural Law".
"Might is right" is no good situation either. But I feel there's a time and place for tactical selective removal of protectionism (and I am thinking giant corporations here) to re-balance things.
As a tepid example (not really relevant to this thread), keep copyright laws in place but only allow individuals to enforce them.
I've just finished recording a Cybershow episode with two experts in compliance (ISO42001 coming on the AI regulatory side - to be broadcast in January).
The conversation turned to the question of what carrots can be used instead of sticks. The problem being that large corps simply incorporate huge fines as a cost of doing business (that probably is relevant to this thread).
So to innovate legally, instead, give assistance (legal aid, expert advisors) to smaller firms struggling with compliance. After all, governments want companies to comply. It's not a punitive game.
Big companies pay their own way.
You can’t really put a corporation in jail, but you could cut it off from the world in the same way that a person in jail is cut off. Suspend the business for the duration of the sentence. Steal a few thousand bucks? Get shut down for six months, or whatever that sentence would be.
I have imagined a sci-fi skit where James works at CorpCo, a company that was caught doing something illegal and sentenced to prison. As punishment, James goes to work by reporting to a prison at 8 am. He sits in his cell until his 'work day' is over and is released at 5 pm to go home. It's boring, but hey, it pays well.
The point being to allow members of the public to submit a pull request and, if it's accepted, have their contributions incorporated into the officially-certified codebase, so the code ends up being actually good because the users (i.e. the public) are given the opportunity to fix what irks them.