zlacker

[return to "OpenAI is now everything it promised not to be: closed-source and for-profit"]
1. mellos+pe[view] [source] 2023-03-01 10:46:59
>>isaacf+(OP)
This seems like an important article, if for no other reason than that it brings the betrayal of OpenAI's foundational claim (still brazenly present in its name) out of the obscurity of years-old HN comments and into the public light and the mainstream.

They've achieved marvellous things, OpenAI, but the pivot, and the long-standing refusal to deal with it honestly, leaves an unpleasant taste and doesn't bode well for the future, especially considering the enormous ethical implications of holding the advantage in the field they are leading.

◧◩
2. ripper+yr[view] [source] 2023-03-01 12:38:21
>>mellos+pe
To quote Spaceballs, they're not doing it for money, they're doing it for a shitload of money.
◧◩◪
3. 93po+7N[view] [source] 2023-03-01 14:52:52
>>ripper+yr
OpenAI, if successful, will likely become the most valuable company in the history of the planet, both past and future.
◧◩◪◨
4. teeker+WX[view] [source] 2023-03-01 15:55:27
>>93po+7N
Really? I feel like they'll go the way of Docker, but faster: right now super hot, nice tools/API, great PR. But it's built on open and well-known foundations; soon GPTs will be a commodity, and then something easier/better and FOSS will arise. It may take some time (2-3 years?), but this scenario seems most likely to me.

Edit: Ah, didn't get the "reference". Perhaps it will indeed be the last of the tech companies ever, at least the last one started by humans ;).

◧◩◪◨⬒
5. startu+ib1[view] [source] 2023-03-01 16:48:00
>>teeker+WX
Possible. Coding as we know it might become obsolete. And it is a trillion-dollar industry.
◧◩◪◨⬒⬓
6. mckrav+yu3[view] [source] 2023-03-02 08:29:28
>>startu+ib1
I was initially impressed and blown away when it could output code and fix mistakes. But the first time I tried to use it for actual work, I fed it some simple stuff, and I even had the pseudocode as comments already - all it had to do was implement it. It made tons of mistakes, and trying to correct it felt like way more effort than just implementing it myself. Then that piece of code got much more complex, and I think there's no way this thing is even close to outputting something like that, unless it has seen it already. And that was ChatGPT; I have observed Copilot to be even worse.

Yes, I'm aware it may get better, but we actually don't know that yet. What if it's way harder to go from outputting junior-level code with tons of mistakes to error-free complex code than it is to go from no ability to write code at all to junior-level code with tons of mistakes? What if it's the difference between a word-prediction algorithm and actual human-level intelligence?

There may be a big decrease in demand, because a lot of apps are quite simple. A lot of software out there is "template apps": stuff that can theoretically be produced by a low-code app will eventually be produced by a low-code app, AI or not. When it comes to novel and complex things, I think it's not unreasonable to expect that the next 10-20 years will still see plenty of demand for good developers.

◧◩◪◨⬒⬓⬔
7. startu+iJ4[view] [source] 2023-03-02 16:47:56
>>mckrav+yu3
Considering that OpenAI started instruction-following alignment on engineering tasks a month ago, with 1k workers, coding might be solved soon.