rdtsc (OP) | 2023-11-21 07:54:58
Some hot takes there, but this one I liked:

> People building AGI unable to predict consequences of their actions 3 days in advance.

It’s a reasonable point: if these are the people building the future of humanity, it’s a little concerning that they can’t predict the immediate consequences of their own actions.

On the other hand, being able to admit a mistake in public shows some honesty, which is also something you might want from someone building the future of humanity.
