So basically what already happens with reddit/twitter/etc but amplified because you give them a direct financial incentive to upvote low effort crap.
Even if they boost their own post a bit so it gets the attention of others, they're still paying $2 per upvote for that. And if the post is no good, people might just cancel those upvotes out with downvotes.
I think the problem with karma/reputation systems is that the sources of karma are fungible - anyone's upvote has the same effect on reputation. That makes it gameable.
A personalized system can solve this by replacing global reputation with user-to-user trust. Now it matters who upvoted: a random bot, or a user whose past contributions have been useful to you.
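For concreteness, here's a minimal sketch of that difference (not LinkLonk's actual formula - the names and the trust update rule are made up for illustration): each viewer keeps a per-voter trust weight, and an item's score for that viewer is the sum of its upvoters' weights, so upvotes from accounts the viewer has never benefited from are worth roughly nothing.

    from collections import defaultdict

    # trust[viewer][voter] -> how much this viewer trusts this voter's upvotes,
    # based on how useful the voter's past upvotes turned out to be.
    trust = defaultdict(dict)

    def record_feedback(viewer, voter, was_useful):
        # Nudge the viewer's trust in a voter up when an item that voter
        # upvoted turned out to be useful to the viewer, decay it otherwise.
        t = trust[viewer].get(voter, 0.0)
        trust[viewer][voter] = min(t + 0.1, 1.0) if was_useful else t * 0.5

    def personalized_score(viewer, upvoters):
        # A bot farm the viewer has no trust relationship with contributes ~0,
        # while a few voters the viewer has found useful carry real weight.
        return sum(trust[viewer].get(v, 0.0) for v in upvoters)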
That's assuming the site lets 10K+ users sign up and pay with crypto, or that you have the time to track down and sign up for 10K prepaid burner cards. Then, after all that, you'd have to hope the site never detects the vote manipulation, since you'd have one account getting tons of upvotes from a specific set of users.
Really... I think this is the worst idea for laundering money I've ever heard of. You'd be better off walking into a casino and putting it all on blackjack until you win a big hand, then reporting the winnings.
In that system, how do you create a ranked list of content for a user to browse? Isn't that going to be very heavy on processing?
This is more computationally intensive than sorting by the raw number of upvotes or weighting upvotes by karma/popularity.
But I think this is a useful computation - the user can be more confident that the content they see is not astroturfed and comes from trustworthy users.
Details of how trust is calculated: https://linklonk.com/item/3292763817660940288
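The link above covers the real calculation. As a rough illustration of where the extra cost goes (an assumption-laden sketch, not the actual implementation): a global ranking is one sort shared by every user, while a trust-weighted ranking has to rescore and resort the candidates per viewer.

    def global_ranking(items):
        # items: list of (item_id, set_of_upvoter_ids).
        # One sort, and the result is identical for every user.
        return sorted(items, key=lambda it: len(it[1]), reverse=True)

    def personalized_ranking(viewer, items, trust):
        # trust: dict mapping (viewer, voter) pairs to a weight in [0, 1].
        # Scoring touches every vote on every candidate item, per viewer,
        # which is where the extra processing goes.
        def score(upvoters):
            return sum(trust.get((viewer, v), 0.0) for v in upvoters)
        return sorted(items, key=lambda it: score(it[1]), reverse=True)

In practice you'd cache the personalized list or limit the candidates to items upvoted by someone the viewer already trusts, rather than rescoring everything on every page load.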