zlacker

[parent] [thread] 3 comments
1. kerkes+(OP)[view] [source] 2022-12-15 22:56:13
Tesla makes self-driving cars that drive better than humans. The reason you have to touch the steering wheel periodically is political/social, not technical. An acquaintance of mine reads books while he commutes 90 minutes from Chattanooga to work in Atlanta once or twice a week. He's sitting in the driver's seat, but he's certainly not driving.

The political/social factors that apply to the life-and-death decisions made while driving a car don't apply to whether one of the websites I work on works perfectly.

I'm 35, and I've been paid to write code for about 15 years. To be honest, ChatGPT probably writes better code than I did at my first paid internship. In my opinion it still has a ways to go to catch up with even a junior developer, but it's only a matter of time.

And how much time? The expectation in the US is that my career will last until I'm 65ish. That's 30 years from now. Tesla has only been around 19 years and now makes self-driving cars.

So yeah, I'm not immediately worried that I'm going to lose my job to ChatGPT in the next year, but I am quite confident that my role will either cease existing or drastically change because of AI before the end of my career. The idea that we won't see AI replacing professional coders in the next 30 years strains credulity.

Luckily for me, I've already considered some career changes I'd want to make even if AI weren't forcing the issue. But folks my age who are planning to finish out their careers in this field should come up with an alternative plan. And people just entering this field are already in direct competition to stay ahead of AI.

replies(2): >>Panzer+V8 >>prioms+Jw
2. Panzer+V8[view] [source] 2022-12-15 23:51:05
>>kerkes+(OP)
I'm doubtful. There's a pretty big difference between writing a basic function and writing even a small program, and a basic function is all I've seen out of these kinds of AIs thus far. They still get those wrong regularly, because they don't really understand what they're doing - they're just mixing and matching their training set.

Roads are extremely regular, as things go, and as soon as you're off the beaten path, those AIs start having trouble too.

It seems that, in general, the long tail will be problematic for a while yet.

3. prioms+Jw[view] [source] 2022-12-16 02:39:16
>>kerkes+(OP)
I was under the impression that Tesla's self-driving is still not fully reliable. For example, a recent video shows a famous YouTuber having to take manual control 3 times in a 20-minute drive to work [0]. He also mentioned how stressful it was compared to normal driving.

[0] https://www.youtube.com/watch?v=9nF0K2nJ7N8

replies(1): >>kerkes+QgG
4. kerkes+QgG[view] [source] [discussion] 2022-12-29 04:14:33
>>prioms+Jw
If you watch the video you linked, he admits he's not taking manual control because it's unsafe--it's because he's embarrassed. It's hard to tell from the video, but it seems like the choices he makes out of embarrassment are actually more risky than what the Tesla was going to do.

It makes sense. My own experience driving a non-Tesla car, nearly always at the speed limit, is that other drivers will try to pressure you into doing dangerous things so they can get where they're going a few seconds faster. I sometimes give in to that pressure, but the AI doesn't feel that pressure at all. So if you're paying attention and see the AI not giving in, the tendency is to take manual control so you can. But that's not safer--quite the opposite. That's an example of the AI driving better than the human.

On the opposite end of the social-anxiety spectrum, there's a genre of pornography where people have sex in the driver's seats of Teslas while the AI is driving. They certainly aren't intervening 3 times in 20 minutes, and so far I don't know of any of these people getting into car accidents.
