1. KIFulg+(OP) 2022-12-15 17:43:11
ChatGPT confidently gave me incorrect answers about the Schwarzschild radius of Sagittarius A*, the black hole at the center of our galaxy. I asked both for "the Schwarzschild radius of a black hole with 4 million solar masses" (a calculation) and for "the Schwarzschild radius of Sagittarius A*" (a simple lookup).

Both answers were orders of magnitude wrong, and vastly different from each other.
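For reference, the right answer is easy to sanity-check by hand with r_s = 2GM/c^2, which for 4 million solar masses comes out to roughly 1.2e10 m, about 0.08 AU. A quick sketch with rounded constants:

    // Schwarzschild radius r_s = 2GM/c^2 for a 4-million-solar-mass black hole,
    // using rounded physical constants.
    const G = 6.674e-11;        // gravitational constant, m^3 kg^-1 s^-2
    const c = 2.998e8;          // speed of light, m/s
    const solarMass = 1.989e30; // kg

    const M = 4e6 * solarMass;        // roughly the mass of Sgr A*
    const rs = (2 * G * M) / (c * c); // in metres

    console.log(rs.toExponential(2));        // ~1.18e10 m
    console.log((rs / 1.496e11).toFixed(3)); // ~0.079 AU, where 1.496e11 m = 1 AU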

The JS code it suggested for a simple database connection had glaring SQL injection vulnerabilities.
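I didn't save the exact snippet, but it was the classic pattern sketched below (the node-postgres driver and the variable names are my own choice here, not what ChatGPT produced): user input concatenated straight into the query string instead of passed as a parameter.

    import { Client } from "pg";

    const client = new Client(); // connection config omitted
    await client.connect();

    const userInput = "' OR '1'='1"; // attacker-controlled value

    // Vulnerable: the input is interpolated directly into the SQL string,
    // so the WHERE clause becomes: name = '' OR '1'='1'
    await client.query(`SELECT * FROM users WHERE name = '${userInput}'`);

    // Safer: parameterized query; the driver handles escaping.
    await client.query("SELECT * FROM users WHERE name = $1", [userInput]);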

I think it's an ok tool for discovering new libraries and getting oriented quickly to languages and coding domains you're unfamiliar with. But it's more like a forum post from a novice who read a tutorial and otherwise has little experience.

replies(1): >>mcguir+ML
2. mcguir+ML 2022-12-15 21:24:18
>>KIFulg+(OP)
My understanding is that ChatGPT and similar systems are purely language models; they don't have any kind of "understanding" of reality. Basically, they encode a complex statistical model of how words are related.
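To illustrate what I mean by a "statistical model of how words are related", here is a toy bigram predictor (real models are incomparably more sophisticated, but the point stands: it picks statistically likely words, it never checks facts):

    // Toy bigram model: count which word follows which in a tiny corpus,
    // then "predict" the most frequent follower. No notion of truth involved.
    const corpus = "the radius is four the radius is wrong the mass is four million";
    const words = corpus.split(" ");
    const counts = new Map<string, Map<string, number>>();

    for (let i = 0; i < words.length - 1; i++) {
      const followers = counts.get(words[i]) ?? new Map<string, number>();
      followers.set(words[i + 1], (followers.get(words[i + 1]) ?? 0) + 1);
      counts.set(words[i], followers);
    }

    // Return the most frequent follower of a given word, if any.
    function predict(word: string): string | undefined {
      const followers = counts.get(word);
      if (!followers) return undefined;
      return [...followers.entries()].sort((a, b) => b[1] - a[1])[0][0];
    }

    console.log(predict("is")); // "four": statistically plausible, never fact-checked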

I'm a bit surprised that it got a simple lookup wrong, but for anything beyond that, describing it as a "novice" understates the situation a lot.
