Right now the generators aren’t effective, but they are definitely stepping stones to something better in the future.
If that future thing produces video, movies, and pictures better than anything humanity can make, at a rate faster than we can make them… how is that a waste?
It can arguably be bad for society but definitely not a waste.
Education-style infographics and videos are OK.
It also needs to be vertically integrated to make money; otherwise it's a handout to the materials science company. I can't see any of the AI companies stretching themselves that thin, so they give it away for goodwill or good PR.
I help people turn wire rolling shelf racks into the base of their home studio, and AI can now create a "how to attach something to a wire shelf rack" video from just a prompt, without me having to do all the space, rack, equipment, lighting, and video setup. It's not close to perfect yet, but it's becoming useful.
Compelling graphics take a long time to create, and for education content creators that can be too expensive as well. My high school physics teacher would hand-draw figures on transparencies for an overhead projector. If he could have produced his drawings as animations cheaply and quickly using AI, it would have really brought his teaching style (he really tried to make it humorous) to another level. I think it would have been effective for his audience.
Imagine the stylized animations for things like the rebooted Cosmos, NOVA, or even 3Blue1Brown on YouTube. There is potential for small teams to punch above their weight class with genAI graphics (rough sketch of the idea below).
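To make the "small team" point concrete, here is a minimal sketch of turning a one-sentence prompt into a stylized teaching figure. It assumes the official openai Python package with an OPENAI_API_KEY set in the environment; the model name, prompt text, and output filename are placeholder examples, not a recommendation of any particular tool.

```python
# Illustrative sketch: generate a stylized educational figure from a text prompt.
# Assumes the `openai` package is installed and OPENAI_API_KEY is set; the model,
# prompt, and filename below are placeholders chosen for the example.
from urllib.request import urlretrieve

from openai import OpenAI

client = OpenAI()

result = client.images.generate(
    model="dall-e-3",
    prompt=(
        "Minimal chalkboard-style diagram of a block sliding down a frictionless "
        "incline, with labeled force vectors for gravity and the normal force."
    ),
    size="1024x1024",
    n=1,
)

# Download the generated image so it can be dropped into slides or a video edit.
urlretrieve(result.data[0].url, "incline_diagram.png")
print("Saved incline_diagram.png")
```

That's the whole pipeline for a one-off figure; a small team would still need to curate and iterate on the outputs, but the per-image cost and turnaround drop to seconds.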
Not sure what LLMs are supposed to do there.
I have a feeling that's already happened to me.
I don't see how OpenAI or Google can profit from drug discovery. It's nearly pure consumer surplus (where the drug companies and patients are the consumers).
Stop talking about the status quo… we are talking about the projected trendline. What will AI be when it matures?
Second, you’re just another demographic: smaller than fans of Coldplay, but equally generic and thus an equally valid target for generated art.
Here’s a prompt that will one day target you: “ChatGPT, create musical art that will target counter-culture posers who think they’re better than everyone just because they like something that isn’t mainstream. Make it so different that they will worship that garbage like they worship Pearl Jam. Pretend that the art is by a human, so that when they finally figure out they fell for it hook, line, and sinker, they’ll realize their counter-culture tendencies are just another form of generic trash fandom, no different from people who love Coldplay or, dare I say it, Taylor Swift.”
What do you do, then, when this future comes to pass and all content, even for posers, is replicated in ways that are superior?
Draw the trendline into the future. What will happen when the content is indistinguishable and AI is so good it produces something that moves people to tears?
I am sure you can think of a few prominent examples.
That said, DeepMind has a spin-off making drugs: https://www.isomorphiclabs.com/
Most of it is used to fool people for engagement, scams, politics, or propaganda. It definitely is a huge waste of resources, time, brainpower, and compute. You have to be completely brainwashed by consumerism and tech-solutionism not to see it.
AI for science is not "marketed". It silently evolves under wraps and changes our lives step by step.
There are many AI systems already monitoring our ecosystem and predicting things as you read this comment.
In my opinion, the AI/AGI hype would be better described as ML-with-data-and-compute 'hype' (I don't like the word hype, as it doesn't fit very well).
We consume A LOT of entertainment every day. Our brains like that a lot.
It doesn't have to be just video; even people who don't watch TV at all entertain themselves through books, events, etc.
Life would be quite boring without it.
Take your favorite works of art, music, and cinema. Imagine if content on that level could be generated by AI in seconds. I wouldn’t classify that as a “waste” at all. You’re obviously referring to bullshit content; I’m referring to content that is meaningful to you and to most people. That is where the trendline is pointing. And my point, again, is this:
We don’t know the consequences of such a future. But I wouldn’t call such content created by AI a waste if it is objectively superior to content created by humans.
Nobody gives a fuck about what ChatGPT can currently do. It’s not interesting to talk about because it’s obvious. I don’t even understand why you’re just rehashing the obvious response. I’m talking about the future. The progression of LLMs is leading to a future where my prompt leads to a response that is superior to the same prompt given to a human.
What I work on is large and extremely technical.
And no, we are not at an infrastructure limit. That statement is insane. We are literally only a couple of years into LLMs becoming popular. Everything we see now is just the beginning. You can only make a good judgment call about whether we are at our limit in ten years.
Because the transition hit so quickly, a lot of devs and companies haven’t fully embraced AI yet. Culture is still lagging capability. What you’re saying about ChatGPT was true a year ago, and now, one year later, none of it is remotely true anymore. The pace is frightening, so I don’t blame you for not knowing. Yes, AI needs to be managed, but it’s at a point where that management no longer hinders you; instead it augments your capabilities.