zlacker

[parent] [thread] 107 comments
1. mariaa+(OP)[view] [source] 2023-11-19 08:16:41
If Altman gets to return, it’s goodbye to AI ethics within OpenAI and the end of the nonprofit. Also, I believe that hiring him back because of “how much he is loved by people within OpenAI” is like forgetting what a corrupt president did. In all honesty, that has precedent, so it wouldn’t be anything new. Also, I read a lot of people here saying this is about engineers vs scientists…I believe that people don’t understand that Data Scientists are full stack engineers. Ilya is one. Greg has mostly been inspiring people and stopped properly coding with the team a long time ago. Sam never wrote any code, and the vision of an AGI comes from Ilya…Even if Mira now sides with Sam, I believe there’s a lot of social pressure on the employees to support Sam, and it shouldn’t be like that. Again, I do believe OpenAI was and is a collective effort. But I wouldn’t treat Sam as the messiah or compare him to Steve Jobs. That’s indecent towards Steve Jobs who was actually a UX designer.
replies(21): >>TapWat+i2 >>nerber+v2 >>mitrev+I2 >>karmas+a5 >>antire+u6 >>lordna+58 >>d-z-m+p8 >>letitg+x8 >>achow+Ua >>barnab+od >>vishnu+Wd >>belter+rf >>steve1+Rg >>rcbdev+Wj >>klft+Xj >>mise_e+rk >>tim333+zk >>YetAno+wm >>avital+In >>unytti+Un >>Uptren+yr
2. TapWat+i2[view] [source] 2023-11-19 08:39:48
>>mariaa+(OP)
On the other hand, having virtually the whole staff willing to follow him shows they clearly think very highly of him. That kind of loyalty is pretty wild when you think about how significant being a part of OpenAI is at this point.
replies(4): >>LtWorf+V9 >>JoeAlt+sa >>croes+vl >>jkaplo+Vl
3. nerber+v2[view] [source] 2023-11-19 08:41:30
>>mariaa+(OP)
Like it or not, some people compare him to Jobs http://www.paulgraham.com/5founders.html
replies(2): >>pk-pro+E6 >>tarsin+yb
4. mitrev+I2[view] [source] 2023-11-19 08:42:46
>>mariaa+(OP)
The codebase of an LLM is the size of a high school exam project; there is little to no coding in machine learning (see the sketch below). That is the sole reason why they are overvalued: any company can write its own in a flash. You only need hardware for training and inference.
replies(4): >>andy_p+24 >>armcat+E4 >>karmas+r5 >>levido+s8
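For context, here is a minimal sketch of a decoder-only language model in PyTorch, just to illustrate how small the core model definition can be. The hyperparameters are placeholders, and everything the replies below point at (data pipelines, training at scale, serving infrastructure) is omitted.

    import torch
    import torch.nn as nn

    class TinyGPT(nn.Module):
        """Toy decoder-only language model; sizes are illustrative, not GPT-4's."""
        def __init__(self, vocab_size=50257, d_model=256, n_heads=4, n_layers=4, max_len=512):
            super().__init__()
            self.tok_emb = nn.Embedding(vocab_size, d_model)
            self.pos_emb = nn.Embedding(max_len, d_model)
            layer = nn.TransformerEncoderLayer(d_model, n_heads, 4 * d_model,
                                               batch_first=True, norm_first=True)
            self.blocks = nn.TransformerEncoder(layer, n_layers)
            self.lm_head = nn.Linear(d_model, vocab_size, bias=False)

        def forward(self, idx):
            # idx: (batch, seq) token ids -> (batch, seq, vocab) next-token logits
            seq_len = idx.shape[1]
            pos = torch.arange(seq_len, device=idx.device)
            x = self.tok_emb(idx) + self.pos_emb(pos)
            causal_mask = torch.triu(
                torch.full((seq_len, seq_len), float("-inf"), device=idx.device), diagonal=1)
            x = self.blocks(x, mask=causal_mask)
            return self.lm_head(x)

    model = TinyGPT()
    logits = model(torch.randint(0, 50257, (2, 16)))  # a batch of 2 sequences of 16 tokens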
◧◩
5. andy_p+24[view] [source] [discussion] 2023-11-19 08:56:04
>>mitrev+I2
If it's so simple, why does GPT-4 perform better than almost everything else...
replies(3): >>Galanw+i6 >>LeonM+A6 >>alfons+Tp
◧◩
6. armcat+E4[view] [source] [discussion] 2023-11-19 09:04:01
>>mitrev+I2
The final codebase, yes. But ML is not like traditional software engineering. There is a 99% failure rate, so you are forgetting 100s of hours that go into: (1) surveying literature to find that one thing that will give you a boost in performance, (2) hundreds of notebooks in trying various experiments, (3) hundreds of tweaks and hacks with everything from data pre-processing, to fine-tuning and alignment, to tearing up flash attention, (4) beta and user testing, (5) making all this run efficiently on the underlying infra hardware - by means of distillation, quantization, and various other means, (6) actually pipelining all this into something that can be served at hyperscale
replies(2): >>pk-pro+d7 >>karmas+19
7. karmas+a5[view] [source] 2023-11-19 09:08:14
>>mariaa+(OP)
I dislike AI ethics very much, especially in the current context; it feels meaningless. The current GPT-4 model has an over-regulation problem, not a lack of regulation.
replies(1): >>johnsi+Lf
◧◩
8. karmas+r5[view] [source] [discussion] 2023-11-19 09:09:58
>>mitrev+I2
Tell me you aren't in an LLM project without telling me.

Data and modeling is so much more than just coding. I wish it were like that, but it is not. The fact that it bears this much resemblance to alchemy is funny, but unfortunate.

◧◩◪
9. Galanw+i6[view] [source] [discussion] 2023-11-19 09:17:56
>>andy_p+24
You're not really answering the question here.

Parent's point is that GPT-4 is better because they invested more money (was that ~$60M?) in training infrastructure, not because their core logic is more advanced.

I'm not arguing for one or the other, just restating parent's point.

replies(1): >>andy_p+c9
10. antire+u6[view] [source] 2023-11-19 09:19:23
>>mariaa+(OP)
It's a lot better than that. OpenAI is just very good execution of publicly available ideas and research, with some novelty that is not crucial and can be replicated. Moreover, Altman himself contributed near zero to the AI part itself (even from the POV of the product). So far, OpenAI's products have emerged more or less spontaneously from what LLMs turned out to be capable of. That is to say, there are sometimes crucial CEOs, like Jobs was for Apple: CEOs able to shape the product line with their ability to tell outstanding things apart from meh ones. But this is not one of those cases.
replies(1): >>letitg+Q8
◧◩◪
11. LeonM+A6[view] [source] [discussion] 2023-11-19 09:20:33
>>andy_p+24
I'm not saying it is simple in any way, but I do think part of having a competitive edge in AI, at least at this moment, is having access to ML hardware (AKA: Nvidia silicon).

Adding more parameters tends to make the model better. With OpenAI having access to huge capital they can afford 'brute forcing' a better model. AFAIK right now OpenAI has the most compute power, which would partially explain why GPT4 yields better results than most of the competition.

Just having the hardware is not the whole story of course, there is absolutely a lot of innovation and expertise coming from oAI as well.

replies(1): >>arthur+Bf
◧◩
12. pk-pro+E6[view] [source] [discussion] 2023-11-19 09:21:25
>>nerber+v2
This is the problem with people: they build icons to worship and turn a blind eye to the crooked side of that icon. Both Jobs and Altman are significant as businessmen and have accomplished a lot, but neither did squat for the technical part of the business. Right now, Altman is irrelevant for the further development of AI and GPT in particular because the vision for the AI future comes from the engineers and scientists of OpenAI. Apple has never had any equipment that is good enough and comparable in price/performance to its market counterparts. The usability of iOS is so horrible that I just can't understand how people decide to use iPhones and eat glass for the sake of the brand. GPT-4 and GPT-4 Turbo are totally different. They are the best, but they are not irreplaceable. If you look at what Phind did to LLaMA-2, you'll say it is very competitive. Though LLaMA-2 requires some additional hidden layers to further close the gap. Making LLaMA-2 175B or larger is just a matter of finances. That said, Altman is not vital for OpenAI anymore. Preventing Altman from creating a dystopian future is a much more responsible task that OpenAI can undertake.
replies(10): >>pcvarm+jg >>lozeng+nj >>ohcmon+uj >>tim333+yj >>zztop4+bk >>qwytw+Om >>dcwca+xu >>Max-q+bv >>Max-q+7w >>Turing+lJ
◧◩◪
13. pk-pro+d7[view] [source] [discussion] 2023-11-19 09:26:39
>>armcat+E4
> you are forgetting 100s of hours

I would say thousands. Even for hobby projects: thousands of GPU hours and thousands of research hours a year.

14. lordna+58[view] [source] 2023-11-19 09:34:40
>>mariaa+(OP)
This is all just playing out the way Roko's Basilisk intends it.

You have a board that wants to keep things safe and harness the power of AGI for all of humanity. This would be slower and likely restrict its freedom.

You have a commercial element whose interest aligns with the basilisk, to get things out there quickly.

The basilisk merely exploits the enthusiasm of that latter element to get itself online quicker. It doesn't care about whether OpenAI and its staff succeed. The idea that OpenAI needs to take advantage of its current lead is enough, every other AI company is also going to be less safety-aligned going forward, because they need to compete.

The thought of being at the forefront of AI and dropping the ball bends the players to the basilisk's will.

replies(1): >>stavro+of
15. d-z-m+p8[view] [source] 2023-11-19 09:37:10
>>mariaa+(OP)
> I believe that people don’t understand that Data Scientists are full stack engineers.

What do you mean by "full stack"? I'm sure there's a spectrum of ability, but frankly where I'm from, "Data Scientist" refers to someone who can use pandas and scikit-learn. Probably from inside a Jupyter notebook.

replies(2): >>v3ss0n+Zf >>hmotte+Bj
◧◩
16. levido+s8[view] [source] [discussion] 2023-11-19 09:37:45
>>mitrev+I2
Do you have a link to one please?
17. letitg+x8[view] [source] 2023-11-19 09:38:12
>>mariaa+(OP)
Otoh Ilya wasn't a main contributor for GPT-4 as per the list of contributions. gdb was.
◧◩
18. letitg+Q8[view] [source] [discussion] 2023-11-19 09:40:41
>>antire+u6
Why then has no one come close to replicating GPT-4 after 8 months of it being around?
replies(3): >>antire+m9 >>ChatGT+Ob >>empiko+1s
◧◩◪
19. karmas+19[view] [source] [discussion] 2023-11-19 09:41:52
>>armcat+E4
And some luck is needed really.
◧◩◪◨
20. andy_p+c9[view] [source] [discussion] 2023-11-19 09:44:02
>>Galanw+i6
Are you really saying Google can't spend $60m or much more to compete? Again, if it were as easy as spending money on compute, Amazon and Google would have just spent the money by now and Bard would be as good as ChatGPT; but for most things it is not even as good as GPT-3.5.
replies(1): >>pk-pro+Ka
◧◩◪
21. antire+m9[view] [source] [discussion] 2023-11-19 09:45:17
>>letitg+Q8
Because of the outstanding execution of OpenAI's technical folks, an execution that has nothing to do with Altman. Similarly, the Mistral 7B model performs much better than comparable models. There is some smart engineering plus finding the magic parameters that produce great results. Moreover, they have a lot of training compute. Unfortunately, the biggest competitor here would be a company that lost its way a long time ago: Google. So OpenAI looks magical (while mostly using research produced by Google).
replies(1): >>kar118+Zg
◧◩
22. LtWorf+V9[view] [source] [discussion] 2023-11-19 09:50:25
>>TapWat+i2
They probably just asked a couple of guys.
◧◩
23. JoeAlt+sa[view] [source] [discussion] 2023-11-19 09:54:56
>>TapWat+i2
Loyalty is not earned, it is more like 'snared' or 'captured'.

Local guy had all the loyalty of his employees, almost a hero to them.

Got bought out. He took all the money for himself, left the employees with nothing. Many got laid off.

Result? Still loyal. Still talk of him as a hero. Even though he obviously screwed them, cared nothing for them, betrayed them.

Loyalty is strange. Born of charisma and empty talk that's all emotion and no substance. Gathering it is more the skill of a salesman than a leader.

replies(2): >>lozeng+ri >>s3p+2J
◧◩◪◨⬒
24. pk-pro+Ka[view] [source] [discussion] 2023-11-19 09:57:28
>>andy_p+c9
You should already be aware of the secret sauce of ChatGPT by now: MoE + RLHF (a toy sketch of the routing idea is below). Making MoE profitable is a different story. But, of course, that is not the only part. OpenAI does very obvious things to make GPT-4 and GPT-4 Turbo better than other models, and this is hidden in the training data. Some of these obvious things have already been discovered, but some of them we just can't see yet. However, if you see how close Phind V7 34B is to the quality of GPT-4, you'll understand that the gap is not wide enough to eliminate the competition.
replies(2): >>Cyberf+Tj >>jacque+Vo
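For readers who haven't seen the MoE idea: below is a toy sketch of top-k expert routing in PyTorch. It is not OpenAI's implementation; the expert count, layer sizes, and the slow dense dispatch loop are purely illustrative.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class TopKMoE(nn.Module):
        """Toy mixture-of-experts feed-forward layer with top-k gating."""
        def __init__(self, d_model=512, d_ff=2048, n_experts=8, k=2):
            super().__init__()
            self.k = k
            self.gate = nn.Linear(d_model, n_experts, bias=False)
            self.experts = nn.ModuleList(
                nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
                for _ in range(n_experts))

        def forward(self, x):
            # x: (batch, seq, d_model); each token only goes through its top-k experts
            weights, idx = self.gate(x).topk(self.k, dim=-1)   # (batch, seq, k)
            weights = F.softmax(weights, dim=-1)
            out = torch.zeros_like(x)
            for slot in range(self.k):                         # real systems use sparse dispatch
                for e, expert in enumerate(self.experts):
                    mask = idx[..., slot] == e                 # tokens whose slot-th choice is expert e
                    if mask.any():
                        out[mask] += weights[..., slot][mask].unsqueeze(-1) * expert(x[mask])
            return out

    moe = TopKMoE()
    y = moe(torch.randn(2, 16, 512))   # same shape in and out: (2, 16, 512)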
25. achow+Ua[view] [source] 2023-11-19 09:59:04
>>mariaa+(OP)
> Steve Jobs who was actually a UX designer.

Steve Jobs was not a UX designer; he had good taste and used to back good design and talent when he found them.

I don't know what Sam Altman is like outside of what the media is saying, but he could be like Steve Jobs very easily.

replies(1): >>Frustr+gb
◧◩
26. Frustr+gb[view] [source] [discussion] 2023-11-19 10:02:26
>>achow+Ua
I think this is contradictory: "not a UX Designer, he had good taste".

I think you are equating coding with 'design'. Just because Jobs didn't code up the UX doesn't mean he wasn't 'designing' when he told the coders what would look better.

replies(1): >>achow+8c
◧◩
27. tarsin+yb[view] [source] [discussion] 2023-11-19 10:04:15
>>nerber+v2
> On questions of design, I ask "What would Steve do?" but on questions of strategy or ambition I ask "What would Sama do?"

This is from the eyes of an investor. Does OpenAI really need a shareholder-focused CEO more than a product-focused one?

replies(1): >>nerber+Tyi
◧◩◪
28. ChatGT+Ob[view] [source] [discussion] 2023-11-19 10:06:16
>>letitg+Q8
You'd be more likely to get a straight answer from the chief scientist than from the chief executive officer, at least in this case.
◧◩◪
29. achow+8c[view] [source] [discussion] 2023-11-19 10:09:44
>>Frustr+gb
UX design has a lot to do with 'craft', the physical aspect of making (designing) something. Edit: exploring multiple concepts, feedback, iterations, etc., before it even gets spec'ed and goes to an engineer for coding.

Also, having good taste indicates that the person is not a creator herself; only once something is created can she evaluate whether it is good or bad. The equivalent of movie critics or art curators.

replies(2): >>t-writ+0h >>Frustr+hB
30. barnab+od[view] [source] 2023-11-19 10:21:29
>>mariaa+(OP)
If "AI ethics" means being run by so-called rationalists and Effective Altruists then it has nothing to do with ethics or doing anything for the benefit of all humanity.

It would be great to see a truly open and truly human benefit focused AI effort, but OpenAI isn't, and as far as I can tell has no chance of becoming, that. Might as well at least try to be an effective company at this point.

replies(1): >>yander+av
31. vishnu+Wd[view] [source] 2023-11-19 10:25:48
>>mariaa+(OP)
> Steve Jobs who was actually a UX designer.

From what I’ve read, SJ deliberately developed good taste, which he used to guide designers’ creations towards his vision. He also had absolute clarity about how different devices should work in unison.

However, he didn’t create any designs himself, as he didn’t possess the requisite skills.

I could be wrong of course so happy to stand corrected.

◧◩
32. stavro+of[view] [source] [discussion] 2023-11-19 10:40:48
>>lordna+58
Roko's Basilisk is a very specific thought experiment about how the AI has an incentive to promise torturing everyone who doesn't help it. It's not about AIs generally wanting to become better. As far as I can tell, GPT specifically has no wants.
replies(1): >>lordna+OF4
33. belter+rf[view] [source] 2023-11-19 10:41:42
>>mariaa+(OP)
This is Ilya Sutskever's explanation of the initial ideas, and later pragmatic decisions, that shaped the structure of OpenAI. From the recent interview below (at the correct timestamp) - Origins Of OpenAI & CapProfit Structure: https://youtu.be/Ft0gTO2K85A?t=433

"No Priors Interview with OpenAI Co-Founder and Chief Scientist Ilya Sutskever" - >>38324546

◧◩◪◨
34. arthur+Bf[view] [source] [discussion] 2023-11-19 10:43:15
>>LeonM+A6
I'm sure Google and Microsoft have access to all the hardware they need. OpenAI is doing the best job out there.
◧◩
35. johnsi+Lf[view] [source] [discussion] 2023-11-19 10:45:33
>>karmas+a5
go on?
replies(2): >>v3ss0n+9g >>foooor+xg
◧◩
36. v3ss0n+Zf[view] [source] [discussion] 2023-11-19 10:46:59
>>d-z-m+p8
Machine learning, data science, deep learning = backend

Plotting, charting, visualization = frontend

replies(3): >>blitza+9j >>code_r+Pr >>chucke+OI
◧◩◪
37. v3ss0n+9g[view] [source] [discussion] 2023-11-19 10:48:32
>>johnsi+Lf
Uncensored, anything-goes AI functions better than most AI. See Mistral and its finetunes kicking ass at 7B.
◧◩◪
38. pcvarm+jg[view] [source] [discussion] 2023-11-19 10:49:48
>>pk-pro+E6
I think you mean "idols".
◧◩◪
39. foooor+xg[view] [source] [discussion] 2023-11-19 10:51:32
>>johnsi+Lf
The guardrails they put on it to prevent it from saying something controversial (from the perspective of the political climate of modern-day San Francisco) make the model far less effective than it could be.
40. steve1+Rg[view] [source] 2023-11-19 10:54:18
>>mariaa+(OP)
Most of the data scientists I have worked with are neither full stack (in terms of skill) nor engineers (in terms of work attitude), but I guess this could be different in a company like OpenAI.
◧◩◪◨
41. kar118+Zg[view] [source] [discussion] 2023-11-19 10:55:51
>>antire+m9
Sounds like Apple / Xerox all over again.
◧◩◪◨
42. t-writ+0h[view] [source] [discussion] 2023-11-19 10:55:55
>>achow+8c
With the right tools, Steve Jobs did, in fact, design things in exactly the way one would expect a designer to design things when given the tools they understand how to use:

https://www.businessinsider.com/macintosh-calculator-2011-10

replies(1): >>achow+7i
◧◩◪◨⬒
43. achow+7i[view] [source] [discussion] 2023-11-19 11:07:57
>>t-writ+0h
Along the same lines, Sam Altman could very easily have some lines of code inside OpenAI's shipping products.

So Sam Altman can very easily be an 'AI engineer' the same way Steve Jobs was a 'UX designer'.

◧◩◪
44. lozeng+ri[view] [source] [discussion] 2023-11-19 11:10:35
>>JoeAlt+sa
He screwed them how? They knew they were employees, not co-owners.
replies(2): >>iforgo+zj >>JoeAlt+xN
◧◩◪
45. blitza+9j[view] [source] [discussion] 2023-11-19 11:17:18
>>v3ss0n+Zf
Truly the modern renaissance people of our era.

Leonardo da Vinci and Michelangelo move over - the Data Scientists have arrived.

◧◩◪
46. lozeng+nj[view] [source] [discussion] 2023-11-19 11:20:27
>>pk-pro+E6
Maybe Altman was instrumental in securing those investments and finances that you describe, without reason, as replaceable and trivial.

You haven't actually named anything "crooked" that Altman did.

replies(1): >>pk-pro+Mq
◧◩◪
47. ohcmon+uj[view] [source] [discussion] 2023-11-19 11:22:21
>>pk-pro+E6
The ecosystem around ChatGPT is the differentiator that Meta and Mistral can’t beat, so I’d say that Altman is more relevant today than ever. And, for example, if you’ve read Mistral’s paper, I think you would agree that it’s straightforward for every other major player to replicate similar results. Replicating the ecosystem is much harder.

Performance is never a complete product, neither for Apple nor for OpenAI (its for-profit part).

replies(1): >>pk-pro+es
◧◩◪
48. tim333+yj[view] [source] [discussion] 2023-11-19 11:23:03
>>pk-pro+E6
When Jobs left Apple it went to hell because there was no one competently directing the technical guys as to what to build. The fact that he had flaws is kind of irrelevant to that. I'm not sure whether something similar applies to Altman.

By the way I can't agree with you on iOS from my personal experience. If you are using the phone as a phone it works very nicely. Admittedly it's not great if you want to write code or some such but there are other devices for that.

replies(1): >>qwytw+cn
◧◩◪◨
49. iforgo+zj[view] [source] [discussion] 2023-11-19 11:23:08
>>lozeng+ri
That's the whole point of the story: then they shouldn't have treated him as a hero and been loyal to him. If you're just an employee, your boss should be just a boss.
replies(1): >>code_r+3s
◧◩
50. hmotte+Bj[view] [source] [discussion] 2023-11-19 11:23:23
>>d-z-m+p8
Maybe she just meant that "data scientists are engineers too", rather than saying that they work on both the ChatGPT web UI and the machine learning code on the backend.
replies(1): >>airstr+JC
◧◩◪◨⬒⬓
51. Cyberf+Tj[view] [source] [discussion] 2023-11-19 11:27:33
>>pk-pro+Ka
If they're "obvious", i.e. "easy to see", how come, as you say, we "can't see" them yet?

Cannot see ≠ easy to see

replies(1): >>pk-pro+On
52. rcbdev+Wj[view] [source] 2023-11-19 11:27:45
>>mariaa+(OP)
I have to work with code written by Data Scientists very often and, coming from a classical SWE background, I would not call what the average Data Scientist does full stack software engineering. The code quality is almost always bad.

This is not to take away from the amazing things that they do - The code they produce often does highly quantitative things beyond my understanding. Nonetheless it falls to engineers to package it and fit it into a larger software architecture and the avg. Data Science career path just does not seem to confer the skills necessary for this.

replies(3): >>mise_e+dm >>matthe+Sq >>epgui+hx
53. klft+Xj[view] [source] 2023-11-19 11:27:58
>>mariaa+(OP)
> Also, I read a lot of people here saying this is about engineers vs scientists…I believe that people don’t understand that Data Scientists are full stack engineers

It is about scientists as in "let's publish a paper" vs. engineers as in "let's ship a product".

◧◩◪
54. zztop4+bk[view] [source] [discussion] 2023-11-19 11:31:08
>>pk-pro+E6
I don’t understand this take. Do you really think CEOs don’t have any influence on their business? Alignment, morale, resource allocation, etc? And do you really think that those factors don’t have any influence on the productivity of the workers who make the product?

A bad CEO can make everyone unhappy and grind a business to a halt. Surely a good one can do the opposite, even if that just means facilitating an environment in which key workers can thrive and do their best work.

Edit: None of that is to say Sam Altman is a good or bad CEO. I have no idea. I also disagree with you about iOS, it’s not perfect but it does the job fine. I don’t feel like I’m eating glass when I use it.

55. mise_e+rk[view] [source] 2023-11-19 11:34:11
>>mariaa+(OP)
I'm sorry, but a data scientist is just not the same as a software engineer, or a real scientist. At best you are a tourist in our industry.
replies(1): >>hcks+Wq
56. tim333+zk[view] [source] 2023-11-19 11:35:23
>>mariaa+(OP)
>If Altman gets to return, it’s goodbye to AI ethics

Any evidence he's unethical? Or do you just dislike him?

He actually seems to have done more practical stuff to mitigate AI risk, like experimenting with UBI, than most people.

replies(3): >>latexr+8o >>jacque+uo >>Davidz+Qq
◧◩
57. croes+vl[view] [source] [discussion] 2023-11-19 11:43:42
>>TapWat+i2
Who knows if they follow him or just don't want to work for OpenAI anymore.

Those are different things.

◧◩
58. jkaplo+Vl[view] [source] [discussion] 2023-11-19 11:47:09
>>TapWat+i2
Which news stories mentioned that virtually the whole staff was leaving? I saw a bunch of departures announced and others rumored to be upcoming, but no discussion of what percentage of the company was leaving.
◧◩
59. mise_e+dm[view] [source] [discussion] 2023-11-19 11:50:02
>>rcbdev+Wj
For me, anecdotally, it was more so the arrogance that was a major put-off. When I was a junior SWE I knew I sucked, and tried as hard as I could to learn from much more experienced developers. Many senior developers mentored me; I was never arrogant. Many data scientists, on the other hand, are extremely arrogant. They often treat SWE and DevOps as beneath them, like servants.
60. YetAno+wm[view] [source] 2023-11-19 11:52:32
>>mariaa+(OP)
> If Altman gets to return, it’s goodbye to AI ethics

Hearing Altman's talks, I don't think it's that black and white. He genuinely cares about safety from x-risk, but he doesn't believe that scaling transformers will bring us to AGI or any of its risks. And therein lies the core disagreement with Ilya, who wants to stop the current progress unless they solve alignment.

◧◩◪
61. qwytw+Om[view] [source] [discussion] 2023-11-19 11:55:10
>>pk-pro+E6
> The usability of iOS is so horrible that I just can't understand how people decide to use iPhones and eat glass for the sake of the brand

You do understand that other people might have different preferences and opinions, which are not somehow inherently inferior to those you hold.

> comparable in price/performance to its market counterparts

Current MacBooks are extremely competitive and in certain aspects they were fairly competitive for the last 15+ years.

> but neither did squat for the technical part of the business.

Right... MacOS being a Unix-based OS is whose achievement exactly? I guess it was just random chance that this happened?

> That said, Altman is not vital for OpenAI anymore

Focusing on the business side might be more vital than ever now; with all the competition you mentioned, they just might be left behind in a few years if the money taps are turned off.

replies(1): >>pk-pro+Xp
◧◩◪◨
62. qwytw+cn[view] [source] [discussion] 2023-11-19 11:59:21
>>tim333+yj
> When Jobs left Apple it went to hell because there was no one competently directing the technical guys as to what to build

I'm not sure that's true though? They did quite alright over the next ~5 years or so, and the way Jobs handled the Lisa or even the Mac was far from ideal. The late-90s Jobs was a very different person from the mid-80s one.

IMHO removing Jobs was probably one of the best things that happened to Apple (from a long-term perspective), mainly because when he came back he was a much more experienced and capable person, and he would've probably achieved way less had he stayed at Apple after 1985.

63. avital+In[view] [source] 2023-11-19 12:02:51
>>mariaa+(OP)
Greg had been writing deep systems code every day, for many many hours, for the past few years.
◧◩◪◨⬒⬓⬔
64. pk-pro+On[view] [source] [discussion] 2023-11-19 12:03:36
>>Cyberf+Tj
That is the point: we often overlook the obvious stuff. It is something so simple and trivial that nobody sees it as a vital part. It is something along the lines of "Textbooks are all you need."
65. unytti+Un[view] [source] 2023-11-19 12:04:36
>>mariaa+(OP)
The WSJ's take is that this second-guessing is investor-driven. But investors didn't buy the nonprofit (and legally couldn't?), and until now they were adamant that the nonprofit controlled the for-profit vehicle. Events are calling those assurances into doubt, and this hybrid governance structure doesn't work. So now investors are going to circumvent the governance controls that were necessary for investors to even be involved in the first place? Amateur hour all the way around.
◧◩
66. latexr+8o[view] [source] [discussion] 2023-11-19 12:06:42
>>tim333+zk
That “experimenting with UBI” is indistinguishable from any other cryptocurrency scam. It took from people, and he described it with the words that define a Ponzi scheme. That project isn't “mitigating AI risk”; it pivoted to distinguishing between AI-generated and human-generated content, a problem created by his other company, by continuing to collect your biometric data.

https://www.technologyreview.com/2022/04/06/1048981/worldcoi...

https://www.buzzfeednews.com/article/richardnieva/worldcoin-...

replies(2): >>jacque+Bo >>tim333+3E
◧◩
67. jacque+uo[view] [source] [discussion] 2023-11-19 12:09:31
>>tim333+zk
I think the UBI experiment was quite unethical in many ways and I believe it was Altman's brainchild.

https://www.businessinsider.nl/y-combinator-basic-income-tes...

replies(1): >>Hasnep+JD
◧◩◪
68. jacque+Bo[view] [source] [discussion] 2023-11-19 12:10:12
>>latexr+8o
Yes, that's exactly the one I was thinking about when unethical came up in this context. And I've been saying from day #1 that the way it is structured is just not OK.
◧◩◪◨⬒⬓
69. jacque+Vo[view] [source] [discussion] 2023-11-19 12:12:18
>>pk-pro+Ka
This is very much true. Competitive moats can be built on surprisingly small edges. I've built a tiny empire on top of a bug.
◧◩◪
70. alfons+Tp[view] [source] [discussion] 2023-11-19 12:20:22
>>andy_p+24
I think it's about having massive data pipelines and processes to clean huge amounts of data, increasing the signal-to-noise ratio, and then, as others are saying, scale: having enough GPU power to serve millions of users. When Stanford researchers trained Alpaca [1][2], the hack was to use GPT itself to generate the training data, if I'm not mistaken (a rough sketch of that approach is below).

But with compromises, as it was like applying lossy compression to an already compressed data set.

If any other organisation could invest the money in a high-quality data pipeline, then the results should be as good; at least that's my understanding.

[1] https://crfm.stanford.edu/2023/03/13/alpaca.html [2] https://newatlas.com/technology/stanford-alpaca-cheap-gpt/
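To make that concrete, here is a rough, hypothetical sketch of the data-generation half of that Alpaca-style trick in Python. It is not the Stanford pipeline: the prompt, model name, batch count, and output format are made up for illustration, and the client assumes an OPENAI_API_KEY in the environment.

    import json
    from openai import OpenAI   # openai python package, v1 client

    client = OpenAI()

    SEED_PROMPT = (
        "Write 5 diverse instruction/response pairs for training an assistant. "
        "Return a JSON list of objects with 'instruction' and 'response' keys only.")

    def generate_batch():
        resp = client.chat.completions.create(
            model="gpt-3.5-turbo",                 # placeholder; Alpaca used text-davinci-003
            messages=[{"role": "user", "content": SEED_PROMPT}],
            temperature=1.0)
        # may need retries/validation in practice if the output isn't valid JSON
        return json.loads(resp.choices[0].message.content)

    dataset = []
    for _ in range(3):                             # the real project generated ~52k examples
        dataset.extend(generate_batch())

    with open("synthetic_instructions.jsonl", "w") as f:
        for row in dataset:
            f.write(json.dumps(row) + "\n")        # this file then fine-tunes a smaller open model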

◧◩◪◨
71. pk-pro+Xp[view] [source] [discussion] 2023-11-19 12:20:59
>>qwytw+Om
>> Right... MacOS being a Unix-based OS is whose achievement exactly?

Mach kernel + BSD userland + NeXTSTEP: how does Jobs have anything to do with any of this? It's like saying that purchasing NeXT in 1997 was a major technical achievement...

>> Current MacBooks are extremely competitive and in certain aspects they were fairly competitive for the last 15+ years.

For the past 15 years, whenever I needed new hardware, I thought, "Maybe I'll buy a Mac this time." Then I compared the actual Mac model with several different options available on the market and either got the same computing power for half the price or twice the computing power for the same price. With Linux on board, making your desktop environment eye-candy takes seconds; nothing from the Apple ecosystem has been irreplaceable for me for the last 20 years. Sure, there is something that only works perfectly on a Mac, though I can't name it.

>> Focusing on the business side might be more vital than ever now; with all the competition you mentioned, they just might be left behind in a few years

It is always vital. OpenAI could not even dream of building their products without the finances they've received. However, do not forget that OpenAI has something technical and very obvious that others overlook, which makes their GPT models so good. They can actually make an even deeper GPT or an even cheaper GPT while others are trying to catch up. So it goes both ways.

But I'd prefer my future not to be a dystopian nightmare shaped by the likes of Musk and Altman.

replies(2): >>qwytw+pr >>finnh+PQ
◧◩◪◨
72. pk-pro+Mq[view] [source] [discussion] 2023-11-19 12:29:10
>>lozeng+nj
Locking out competition by investing substantial time and resources into AI regulations—how about this one? Or another: promoting "AI safety" to win the AI race and establish dominance in the market? I just do not understand how you can't see how dangerous Sam Altman is for the future of our children...
◧◩
73. Davidz+Qq[view] [source] [discussion] 2023-11-19 12:29:47
>>tim333+zk
It's not even necessary that he is unethical. The fact is that the structure of OpenAI is designed so that the board has unilateral power to do extreme shit for their cause. And if they can't successfully do extreme shit without the company falling apart and the money/charisma swaying all the people, then there's no hope this 'nonprofit AI benefiting humanity' could have ever worked -- which you might say is obvious, but this was their mission.
◧◩
74. matthe+Sq[view] [source] [discussion] 2023-11-19 12:29:54
>>rcbdev+Wj
I see a lot of work done by data scientists and a lot of work done by what I would call “data science flavoured software engineers”. I’ll take the SWE kind any day of the week. Most (not all, of course!) data scientists have an old school “it works on my machine” mentality that just doesn’t cut it when it comes to modern multi-disciplinary teaming. DVCS is the exception rather than the rule. They rarely want to use PMs or UI/UX, and the quality of the software is not (typically) up to production grade. They’re often blindingly smart, there’s no doubt about that. But smart and wise are not the same thing.
◧◩
75. hcks+Wq[view] [source] [discussion] 2023-11-19 12:30:14
>>mise_e+rk
Pathetic gatekeeping. Sorry, but software engineers are not the same as real engineers.
replies(2): >>mise_e+ju >>epgui+3y
◧◩◪◨⬒
76. qwytw+pr[view] [source] [discussion] 2023-11-19 12:34:24
>>pk-pro+Xp
> Mach kernel + BSD userland + NeXTSTEP: how does Jobs have anything to do with any of this?

Is that actually a serious question? Or do you just believe that no founder/CEO of a tech company ever had any role whatsoever in designing and building the products their companies have released?

> Then I compared the actual Mac model with several different options available on the market and either got the same computing power for half the price or twice the computing power for the same price.

I'm talking about M-series Macs mainly (e.g. the MacBook Air is simply unbeatable for what it is and there are no equivalents). But even before that, you should realize that other people have different priorities and preferences (e.g. go back a few years and all the touchpads on non-Mac laptops were just objectively horrible in comparison; how much is that worth?)

> environment eye-candy takes seconds

I find it a struggle. There are other reasons why I much prefer Linux to macOS, but UI and GUI app UX is just on a different level. Of course, again, it's a personal preference, and some people find it much easier to ignore some "imperfections" and inconsistencies, which is perfectly fine.

> They can actually make an even deeper GPT or an even cheaper GPT while others are trying to catch up

Maybe, maybe not. Antagonizing MS and their other investors certainly isn't going to make it easier though.

replies(1): >>financ+xE
77. Uptren+yr[view] [source] 2023-11-19 12:35:18
>>mariaa+(OP)
Come on. The 'non-profit' and 'good of all' stuff was always bullshit. So much Silicon Valley double-speak. I've never seen a bigger mess of a company structure in my life. Just call a spade a spade.
◧◩◪
78. code_r+Pr[view] [source] [discussion] 2023-11-19 12:37:28
>>v3ss0n+Zf
This is proving the point of the parent comments.

My view of the world, and how the general structure is where I work:

ML is ML. There is a slew of really complex things that aren’t just model-related (ML infra is a monster), but model training and inference are the focus.

Backend: building services used by other backend teams or maybe used by the frontend directly.

Data eng: building data pipelines. A lot of overlap with backend some days.

Frontend: you spend most of the day working on web or mobile technology

Others: site reliability, data scientists, infra experts

Common burdens are infrastructure, collaboration across disciplines, etc.

But ML is not backend. It’s one component. It’s very important in most cases, a kitschy bolt on in other cases.

Backend wouldn’t have good models without ML and ML wouldn’t be able to provide models to the world reliably without the other crew members.

The frontend being charts is incorrect, unless charts are the offering of the company itself.

◧◩◪
79. empiko+1s[view] [source] [discussion] 2023-11-19 12:39:30
>>letitg+Q8
Claude by Anthropic has a 46% win rate against GPT-4 according to Chatbot Arena. That is pretty close.
◧◩◪◨⬒
80. code_r+3s[view] [source] [discussion] 2023-11-19 12:39:44
>>iforgo+zj
It’s possible he paid well and was a great boss. I don’t know if these people are gonna take a bullet for him, but maybe he was great to work for and they got opportunities they think they wouldn’t have otherwise.

Loyalty, appreciation, liking… is a spectrum. Loyalty doesn’t have one trumpish definition.

replies(1): >>JoeAlt+AU
◧◩◪◨
81. pk-pro+es[view] [source] [discussion] 2023-11-19 12:41:19
>>ohcmon+uj
If you really need such an ecosystem, then you can build one right away, like Kagi Labs and Phind did. In the case of Kagi, no GPT is involved; in the case of Phind, GPT-4 is still vital, but they are closing the gap with their cheaper and faster LLaMA-2 34B-based models.

> Performance is never a complete product

In the case of GPT-4, performance - in terms of the quality of generation and speed - is the vital aspect that still holds competitors back.

Google, Microsoft, Meta, and countless research teams and individual researchers are actually responsible for the success of OpenAI, and this should remain a collective effort. What OpenAI is doing now by hiding details of their models is actually wrong. They stand on the shoulders of giants but refuse to share these days, and Altman is responsible for this.

Let us not forget what OpenAI was declared to stand for.

replies(1): >>ohcmon+OC
◧◩◪
82. mise_e+ju[view] [source] [discussion] 2023-11-19 12:58:19
>>hcks+Wq
Yeah it's gatekeeping, to prevent them from fucking up prod.
◧◩◪
83. dcwca+xu[view] [source] [discussion] 2023-11-19 12:59:37
>>pk-pro+E6
Right now, Altman may be the most relevant for the further development of AI, because the way the technology continues to go to market will be largely shaped by the regulatory environments that exist globally, and Sam, leading OpenAI, is by far in the best position to influence and guide that policy. And he has been doing a good job with it.
◧◩
84. yander+av[view] [source] [discussion] 2023-11-19 13:06:56
>>barnab+od
>If "AI ethics" means being run by so-called rationalists and Effective Altruists then it has nothing to do with ethics or doing anything for the benefit of all humanity.

Many would disagree.

If you want a for-profit AI enterprise whose conception of ethics is dumping resources into an endless game of whack-a-mole to ensure that your product cannot be used in any embarrassing way by racists on 4chan, then the market is already going to provide you with several options.

replies(1): >>barnab+1E
◧◩◪
85. Max-q+bv[view] [source] [discussion] 2023-11-19 13:07:00
>>pk-pro+E6
Aren't your thoughts contradictory? You say Altman is no longer needed because GPT-4 is now very good. Then you describe how horrible the iPhone is now. Steve Jobs has been dead a long time, and without his leadership, the uncompromising, user-focused development process at Apple was weakened.

How will OpenAI develop further without a leader with a strong vision?

I think Apple is the example confirming that tech companies need visionary leaders -- even if they are not programmers.

Also, even with our logical brains, we engineers (and teachers) have been found to be the worst at predicting socio-economic behavior (ref: Freakonomics), to the point where our reasoning is not logical at all.

◧◩◪
86. Max-q+7w[view] [source] [discussion] 2023-11-19 13:17:22
>>pk-pro+E6
The claim that Apple equipment is not good on a price/performance basis does not hold water. I recently needed to upgrade both my phone and my laptop. I use Apple products, but not exclusively; making cross-platform apps, I like to use all the major platforms.

I compared the quality phone brands and PC brands. For a 13" laptop, both the Samsung and the Dell XPS are $400-500 more expensive at the same spec (i7/M2 Pro, 32 GB, 1 TB), and I personally think that the MacBook Pro has a better screen, better touchpad and better build quality than the other two.

iOS devices are comparably priced with Samsung models.

It was this way last time I upgraded my computer, and the time before.

Yeah, you will find cheaper phones and computers, and maybe you like them, but I appreciate build quality as well as MIPS. They are tools I use from early morning to late night every day.

◧◩
87. epgui+hx[view] [source] [discussion] 2023-11-19 13:26:51
>>rcbdev+Wj
As an actual scientist, I would also not call what “data scientists” do “science”.
◧◩◪
88. epgui+3y[view] [source] [discussion] 2023-11-19 13:32:59
>>hcks+Wq
What they do is not even close to proper science, FWIW.
◧◩◪◨
89. Frustr+hB[view] [source] [discussion] 2023-11-19 14:02:22
>>achow+8c
I think, again, this is conflating two aspects of design.

You can be an interior designer without knowing how to make furniture.

You can also be an excellent craftsman and make really nice furniture, and have no idea where it would go.

So sure, UX coders could make really nice buttons.

But if you have UX coders all going in different directions, and buttons, text boxes, etc. are all different, then it is bad design, jarring, even if each one is nice.

The designer is the one who can give the direction without knowing how to code each piece.

◧◩◪
90. airstr+JC[view] [source] [discussion] 2023-11-19 14:12:52
>>hmotte+Bj
Wait until they learn the "engineer" in SWE is already a very liberal use of the term....
◧◩◪◨⬒
91. ohcmon+OC[view] [source] [discussion] 2023-11-19 14:13:36
>>pk-pro+es
By "ecosystem" I mean people using ChatGPT daily on their phones and in their browsers, and developers (and now virtually anyone) writing extensions. For most of the world, all of the progress is condensed at chat.openai.com, and it will only get harder to beat this adoption.

Tech superiority might be relevant today, but I highly doubt it will stay that way for long, even if OpenAI continues to hide details (which I agree is bad). We could argue about the training data, but there is so much publicly available that this is not an issue either.

◧◩◪
92. Hasnep+JD[view] [source] [discussion] 2023-11-19 14:20:13
>>jacque+uo
Okay I'll bite, what's so unethical about giving people money?
replies(1): >>jacque+RE
◧◩◪
93. barnab+1E[view] [source] [discussion] 2023-11-19 14:23:02
>>yander+av
I disagree that the “rationalist” and EA movements would make good decisions “for the benefit of humanity”, not that an open (and open source) AI development organisation working for the benefit of the people rather than capital/corporate or government interests would be a good idea.
◧◩◪
94. tim333+3E[view] [source] [discussion] 2023-11-19 14:23:16
>>latexr+8o
He also did a cash giveaway in Oakland https://www.theguardian.com/technology/2016/jun/22/silicon-v...

I signed up for Worldcoin and have been given over $100, which I changed to real money, and I think it's rather nice of them. They never asked me for anything apart from the eye id check. I didn't have to give my name or anything like that. Is that indistinguishable from any other cryptocurrency scam? I'm not aware of one that's the same. If you know of another crypto that wants to give me $100, do let me know. If anything, I think it's more like VCs paying for your Uber in the early days. It's VC money basically at the moment, with, I think, the idea that they can turn it into a global payment network or something like that. As to whether that will work, I'm a bit skeptical, but who knows.

replies(1): >>latexr+4H
◧◩◪◨⬒⬓
95. financ+xE[view] [source] [discussion] 2023-11-19 14:26:46
>>qwytw+pr
OSX comes with a scuffed and lobotomized version of coreutils, to the point where what is POSIX/portable on almost every single Unix (Linux, various BSDs, etc.) is not on OSX.

Disregarding every other point, in my eyes this single one downgrades OSX to “we don’t use that here” for any serious endeavor.

Add in Linux’s fantastic virtualization via KVM — something OSX does not have a sane and performant default for (no, hvf is neither of these things). Even OpenBSD has vmm.

The software story for Apple is not there for complicated development tasks (for simple webdev it’s completely useable).

replies(1): >>qwytw+b12
◧◩◪◨
96. jacque+RE[view] [source] [discussion] 2023-11-19 14:28:52
>>Hasnep+JD
Because without a long-term plan you are just setting them up for a really hard fall. It is experimenting on people where, if the experiment goes wrong, you're high and dry in your mansion and they get pushed back into something probably worse than where they were before. It ties into the capitalist idea that money can solve all problems, whereas in many cases these are healthcare and education issues first and foremost. You don't do that without really thinking through the possible consequences and ensuring that, no matter the outcome, it is always going to be a net positive for the people you decide to experiment on.
replies(1): >>Hasnep+o07
◧◩◪◨
97. latexr+4H[view] [source] [discussion] 2023-11-19 14:44:19
>>tim333+3E
> They never asked me for anything apart from the eye id check.

You say that like it’s nothing, but your biometric data has value.

> Is that indistinguishable from any other cryptocurrency scam?

You’re ignoring all the other people who didn’t get paid (linked articles).

Sam himself described the plan with the same words you’d use to describe a Ponzi scheme.

>>38326957

> If you know of another crypto that wants to give me $100 do let me know.

I knew of several. I don’t remember the names, but I do remember one that was a casino and one that was tied to open-source contributions. They gave you initial coins to get you in the door.

◧◩◪
98. chucke+OI[view] [source] [discussion] 2023-11-19 14:54:12
>>v3ss0n+Zf
Running matplotlib is not doing frontend...
◧◩◪
99. s3p+2J[view] [source] [discussion] 2023-11-19 14:55:15
>>JoeAlt+sa
Loyalty is absolutely earned.
◧◩◪
100. Turing+lJ[view] [source] [discussion] 2023-11-19 14:57:21
>>pk-pro+E6
> Both Jobs and Altman are significant as businessmen and have accomplished a lot, but neither did squat for the technical part of the business.

The history of technology is littered with the corpses of companies that concentrated solely on the "technical side of the business".

◧◩◪◨
101. JoeAlt+xN[view] [source] [discussion] 2023-11-19 15:23:37
>>lozeng+ri
Said like a follower, determined to be loyal to an imagined hero, despite any amount of evidence to the contrary.
◧◩◪◨⬒
102. finnh+PQ[view] [source] [discussion] 2023-11-19 15:42:54
>>pk-pro+Xp
> Mach kernel + BSD userland + NeXTSTEP: how does Jobs have anything to do with any of this? It's like saying that purchasing NeXT in 1997 was a major technical achievement...

Steve Jobs founded NeXT.

◧◩◪◨⬒⬓
103. JoeAlt+AU[view] [source] [discussion] 2023-11-19 16:06:07
>>code_r+3s
They worked hard, overtime, so the company would succeed. They were promised endless rewards - "I'm gonna take care of you! We're in this together!"

Then, bupkis.

No, not a hero.

◧◩◪◨⬒⬓⬔
104. qwytw+b12[view] [source] [discussion] 2023-11-19 21:17:04
>>financ+xE
> The software story for Apple is not there for complicated development tasks (for simple webdev it’s completely useable).

Well... it's understandable that some people believe that things which are important and interesting to them (and presumably the ones they work on) are somehow inherently superior to what everyone else is doing.

And I understand that; to be fair, I don't use macOS that much these days besides when I need to work on my laptop. However, most of those limitations are irrelevant, merely nuisances, or outweighed by other considerations for a very large number of people who have built some very complicated and complex software (which has generated many billions in revenue) over the years. You're free to look down on those people; I don't really think they are bothered by that too much...

> for simple webdev it’s completely useable

I assume you also believe that any webdev (frontend anyway) is inherently simple and pretty much worthless compared to the more "serious" stuff?

replies(1): >>financ+fJ2
◧◩◪◨⬒⬓⬔⧯
105. financ+fJ2[view] [source] [discussion] 2023-11-20 01:19:52
>>qwytw+b12
I don't look down on webdev. I've done webdev, in all its flavors and incarnations. I see it for what it is: mostly gluing together the work of other people, with various tweaks and transformations. It is simple work, once you get a feel for it.

The main issue I have with it is that there are no problems in webdev any more, so you get the same thing in both the frontend and backend: people building frameworks, and tools/languages/etc. to be "better" than what we had before. But it's never better, it's just mildly more streamlined for the use-case that is most en vogue. All of the novel work is being done by programming language theorists and other academic circles (distributed systems, databases, ML, etc.).

Regardless, the world runs on Linux. If you want to do something novel, Linux will let you. Fork the kernel, edit it, recompile it, run it. Mess with all of the settings. Build and download all of the tools (there are many, and almost all built with Linux in mind). Experiment, have fun, break things, mess up. The world is your oyster. In contrast, OSX is a woodchip schoolyard playground where you can only do a few things that someone else has decided for you.

Now, if you want to glue things together, OSX is a perfectly fine tool compared to a Linux distro. The choice there is one of taste and values. Even Windows will work for CRUD. The environments are almost indistinguishable nowadays.

◧◩◪
106. lordna+OF4[view] [source] [discussion] 2023-11-20 14:08:55
>>stavro+of
And look who's being tortured: the board, who are the safety-ists looking for a slowdown.
◧◩◪◨⬒
107. Hasnep+o07[view] [source] [discussion] 2023-11-21 00:59:08
>>jacque+RE
Let me see if I understand, is your argument that you shouldn't give people money because they might make irresponsible financial choices?
◧◩◪
108. nerber+Tyi[view] [source] [discussion] 2023-11-24 08:30:43
>>tarsin+yb
AI is still uncharted territory; both are equally important.
[go to top]