I've never worked on a web platform like Reddit, nor with any API priced per request. Even so, Reddit's charge of $0.00024 per request looks _significantly_ higher than what it actually costs them to serve one.
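For a sense of scale, here's a quick back-of-envelope calculation of what that price means for a third-party client. The requests-per-user figure is purely my assumption, not anything Reddit or an app developer has published:

```python
# Back-of-envelope only. price_per_request is the figure quoted above
# ($0.00024/request, i.e. $0.24 per 1,000 calls); requests_per_user_per_day
# is an assumed number for an active third-party-app user.
price_per_request = 0.00024
requests_per_user_per_day = 300   # assumption
days_per_month = 30

monthly_cost_per_user = price_per_request * requests_per_user_per_day * days_per_month
print(f"${monthly_cost_per_user:.2f} per active user per month")  # -> $2.16
```

Under those assumptions a free third-party client is paying a couple of dollars per active user per month before it earns anything, which is why the pricing reads more like a shutdown notice than a revenue model.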
Wasn't Reddit's pay-for-API-access announcement originally framed as a way to claw back some of the value that LLMs have extracted from Reddit data? I don't see how per-request API pricing actually accomplishes that. (I had vaguely expected the pricing to include some sort of expensive "firehose" endpoint for OpenAI/Google/Meta/etc. to pull from.)
It looks like they're going to squeeze out all third-party apps instead. I don't think this bodes well for Reddit's future.
I can only imagine the man-hours that went into Google's AdSense bot, and it only had to verify websites rather than mobile apps.
You need highly accurate data about user behaviour around the ads, and highly optimised display and linking.
I've built a similar system, for recommendations rather than ads, driven by which items users viewed in a feed. Even with a simple model, spanning web and app, and everything developed in-house, it was hard and took a lot of care to get ML signals that were both good and genuinely understood by people.
Doing this across third-party clients may be prohibitively difficult. I'd like to see attempts, but so far I haven't seen any.
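To make "ML signals" concrete, here is a minimal sketch of the kind of per-impression event such a feed system needs every client to report faithfully. The field names and structure are my own invention, not any real Reddit or AdSense API:

```python
# Hypothetical impression event for a feed item (or ad). In a real system
# these would be batched and shipped to an ingestion endpoint the platform
# controls; the hard part is every client measuring them the same way.
import time
import uuid
from dataclasses import dataclass, asdict

@dataclass
class ImpressionEvent:
    event_id: str     # de-duplication key, since clients retry uploads
    item_id: str      # the feed item or ad that was shown
    user_id: str
    client: str       # which surface rendered it: "web", "ios", "android", ...
    visible_ms: int   # how long the item was actually on screen
    position: int     # rank in the feed when it was shown
    clicked: bool
    ts: float         # client-side timestamp of the impression

def record_view(item_id: str, user_id: str, client: str,
                visible_ms: int, position: int, clicked: bool) -> dict:
    """Build one impression event ready for batching and upload."""
    event = ImpressionEvent(
        event_id=str(uuid.uuid4()),
        item_id=item_id,
        user_id=user_id,
        client=client,
        visible_ms=visible_ms,
        position=position,
        clicked=clicked,
        ts=time.time(),
    )
    return asdict(event)
```

The schema isn't the hard part; the hard part is that visibility timing, deduplication, and feed position all have to be measured consistently in every client. That's achievable when you own all the clients, and much harder when third parties you don't control are the ones reporting the numbers.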