Also, you pass the data a job needs to run as part of the job payload. Then you don't have the "data doesn't exist" issue.
Offloading heavy work to a background job is about as old a best practice as exists in modern software engineering.
This is especially important for the kind of API and/or web development that a large number of people on this site are involved in. By offloading expensive work, you take that work out-of-band of the request that generated it, making that request faster and providing a far superior user experience.
Example: user sign-up where you want to send a verification email. Talking to a foreign API like Mailgun might be a 100 ms to multi-second (worst case) operation, so why make the user wait on that? Instead, send it to the background and give them a tight < 100 ms sign-up experience that's so fast that, for all intents and purposes, it feels instant.
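A minimal sketch of the shape of this, assuming a Redis-backed queue (RQ here); `create_user`, `send_verification_email`, and the queue name are placeholders, not anything from the thread:

```python
# Minimal sketch, assuming a Redis-backed queue (RQ). The function names are
# placeholders; in a real app the job function lives in an importable module
# so the worker process can find it.
from redis import Redis
from rq import Queue

email_queue = Queue("emails", connection=Redis())


def create_user(email: str, password: str) -> dict:
    # Stand-in for the fast, local database insert.
    return {"email": email}


def send_verification_email(address: str) -> None:
    # Runs on a worker process, out-of-band of the web request; the slow
    # Mailgun (or similar) API call would go here.
    print(f"sending verification email to {address}")


def handle_signup(email: str, password: str) -> dict:
    user = create_user(email, password)
    # Pass the data the job needs (the address) in the job payload, so the
    # worker never depends on state that might not exist yet.
    email_queue.enqueue(send_verification_email, email)
    return user
```

The request handler returns as soon as the row is written and the job is enqueued; the slow email call happens on the worker.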
Yes. I am intimately familiar with background jobs. In fact I've been using them long enough to know, without hesitation, that you don't use a relational database as your job queue.
In theory, an append-only and/or HOT-friendly strategy, leaning on Postgres just ripping through moderate-sized in-memory lists, could be incredibly fast. The design would be more complicated and perhaps use-case dependent, but I bet it could be done.
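For contrast, the way most people make Postgres behave as a queue today is the FOR UPDATE SKIP LOCKED dequeue, which is a simpler design than the append-only/HOT idea above; a rough sketch with psycopg2 and a made-up jobs(id, state, payload, started_at) table:

```python
# Rough sketch of the standard SKIP LOCKED dequeue, not the append-only/HOT
# design described above. The DSN and jobs table are made up for illustration.
import psycopg2

DSN = "dbname=app user=app"  # hypothetical connection string

CLAIM_SQL = """
    UPDATE jobs
       SET state = 'running', started_at = now()
     WHERE id = (
            SELECT id
              FROM jobs
             WHERE state = 'queued'
             ORDER BY id
             LIMIT 1
               FOR UPDATE SKIP LOCKED
           )
 RETURNING id, payload;
"""


def claim_next_job(conn):
    # Claims one queued job; SKIP LOCKED means workers never block on rows
    # another worker has already claimed, so they don't serialize on the
    # head of the queue.
    with conn.cursor() as cur:
        cur.execute(CLAIM_SQL)
        row = cur.fetchone()
    conn.commit()
    return row  # None when the queue is empty


if __name__ == "__main__":
    conn = psycopg2.connect(DSN)
    print("claimed:", claim_next_job(conn))
```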