zlacker

[return to "Watching AI drive Microsoft employees insane"]
1. margor+72 2025-05-21 11:23:29
>>laiysb+(OP)
The process is so stochastic that it's basically unusable for any large-scale task. What's the plan? To roll the dice until the answer pops up? That might be viable if there were a way to evaluate the output automatically with 100% reliability, but with a human required in the loop it becomes untenable.
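
To make the economics concrete, here is a minimal sketch of that dice-rolling loop in Python. generate() and verify() are hypothetical stand-ins for the model call and the automatic check; the scheme only pencils out if verify() is automated, because a human in that role pays for every retry.

    # Sketch only: generate() and verify() are hypothetical stand-ins.
    def generate(prompt: str) -> str:
        """Call out to some code-generating model (not implemented here)."""
        raise NotImplementedError

    def verify(candidate: str) -> bool:
        """Automatic acceptance check, e.g. run the test suite or a type checker."""
        raise NotImplementedError

    def solve(prompt: str, max_attempts: int = 10) -> str | None:
        """Keep sampling until a candidate passes verification, or give up."""
        for _ in range(max_attempts):
            candidate = generate(prompt)
            if verify(candidate):  # cheap only because it is automated
                return candidate
        return None                # the dice never came up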
2. eterev+J2 2025-05-21 11:33:07
>>margor+72
The plan is to improve AI agents from their current ~intern level to the level of a good engineer.
3. ehnto+8a 2025-05-21 12:31:20
>>eterev+J2
They are not intern level.

Even if it could perform at a similar level to an intern on a programming task, it lacks a great deal of the other attributes a human brings to the table, including the way a person integrates into a team of other agents (human or otherwise). I won't bother listing them, as we are all humans.

I think the hype is missing the forest for the trees, and this multi-agent dynamic might be exactly where the trees start to fall down in front of us. That, and the so-far insurmountable issues of context and coherence over long time horizons.

4. Workac+5s 2025-05-21 14:34:20
>>ehnto+8a
The real missing-the-forest-for-the-trees is thinking that software, and the way users will use computers, are going to remain static.

Software today is written to accommodate every possible need of every possible user, with a bunch of unneeded selling-point features on top of that. These massive, sprawling codebases are built to deliver one-size-fits-all utility.

I don't need the 3-million-LOC Excel 365 to keep track of who is working on the floor on what day this week. Gemini 2.5 can write an applet that does that perfectly in 10 minutes.
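
For what it's worth, here is roughly the scale of applet I mean (a hand-written sketch, not model output; the file name and commands are made up): a weekly floor schedule kept in a JSON file and edited from the command line.

    # Sketch of a throwaway floor-schedule applet; names are illustrative.
    import json
    import sys
    from pathlib import Path

    SCHEDULE_FILE = Path("floor_schedule.json")
    DAYS = ["mon", "tue", "wed", "thu", "fri", "sat", "sun"]

    def load() -> dict:
        """Read the schedule, or start an empty one for the week."""
        if SCHEDULE_FILE.exists():
            return json.loads(SCHEDULE_FILE.read_text())
        return {day: [] for day in DAYS}

    def save(schedule: dict) -> None:
        SCHEDULE_FILE.write_text(json.dumps(schedule, indent=2))

    def main() -> None:
        schedule = load()
        cmd = sys.argv[1] if len(sys.argv) > 1 else "show"
        if cmd == "add" and len(sys.argv) == 4:   # schedule.py add tue Alice
            day, name = sys.argv[2].lower(), sys.argv[3]
            schedule.setdefault(day, []).append(name)
            save(schedule)
        elif cmd == "show":                       # schedule.py show
            for day in DAYS:
                print(f"{day}: {', '.join(schedule.get(day, [])) or '-'}")
        else:
            print("usage: schedule.py [show | add <day> <name>]")

    if __name__ == "__main__":
        main()

"python schedule.py add tue Alice" followed by "python schedule.py show" covers the entire use case; that's the whole spreadsheet replaced.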

5. Bughea+P32 2025-05-22 02:04:25
>>Workac+5s
I don't know. I guess it depends on what you count as change. I don't really view software as having changed all that much since around the mid 70s, when HLLs began to become more popular. What programmers do today and what they did back then would be easily recognizable to both groups if we had time machines.

I don't see how AI really changes things all that much. It has the same scalability issues that low-code/no-code solutions have always had, and those go way back. The main difference is that you can use natural language, but I don't see that as inherently better than, say, drawing a picture with the flowcharting tools in a low-code platform. You just reintroduce the problems natural languages have always had, and the reason we didn't choose them in the first place: they are not strict enough and need lots of context.

Giving an AI very specific sentences to define my project in natural language, and making sure it has lots of context, begins to look an awful lot like pseudocode to me. So as you learn to approach the AI in a way that produces what you want, you naturally get closer and closer to just specifying the code.
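
A toy illustration of that convergence (the prompt, the data shape, and the function are all made up for the example): by the time the English is unambiguous, it is more or less the code.

    # Prompt precise enough to pin down the behaviour:
    #   "For each order in the list, skip it if its status is 'cancelled';
    #    otherwise add its 'total' to a running sum, and return the sum
    #    rounded to two decimal places."
    #
    # ...which is more or less a line-by-line restatement of:
    def sum_open_orders(orders: list[dict]) -> float:
        total = 0.0
        for order in orders:
            if order.get("status") == "cancelled":
                continue
            total += order["total"]
        return round(total, 2)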

What HAS indisputably changed is the cost of hardware, which has driven accessibility and led to far more consumer-facing software being made.
