zlacker

[parent] [thread] 5 comments
1. Fridge+(OP)[view] [source] 2023-02-08 21:56:17
I do not understand the appeal of GPT-powered searches.

Most of my web searches are for looking up specific things, to find the specific link(s) that contain the information I need. These aren’t searches that are going to be made better or faster by an ML model: they’re not natural language queries, they’re just a bunch of terms.

replies(5): >>ghshep+f3 >>helf+Q7 >>zerocr+Pp >>friend+Gq >>postin+XE
2. ghshep+f3[view] [source] 2023-02-08 22:09:40
>>Fridge+(OP)
Depends on your use case. If you are coding, or doing constraint analysis, 75+% of the time it's a single request, single answer, and you are done. The other element with ChatGPT is that if you aren't happy with the initial answer for some reason, you can engage in a conversation with it, provide some guidance, and it will adjust the answer to suit specifically what you are interested in. I've found about 50% of my Google searches just go straight to ChatGPT these days. Hallucinations are the only real problem I've had, but over time you start to become cynical about the truth of anything it states as fact. Asking for unit tests helps quite a bit when coding, and double checking any math is also important.

I do agree, though, that without citations to the original source, any "facts" that ChatGPT offers are absolutely untrustworthy.

3. helf+Q7[view] [source] 2023-02-08 22:26:48
>>Fridge+(OP)
I think it's because people want to ask natural-language questions of a virtual butler (Jeeves) and get an answer back in digestible natural-language form.

I… don’t really get it either.

But I’m also a cranky person who can’t stand every damn thing being a video whether it makes sense for the content or not, etc.

4. zerocr+Pp[view] [source] 2023-02-08 23:50:16
>>Fridge+(OP)
As someone else said, I think a surprisingly (overwhelmingly?) large number of queries are just questions people want answered, or something close to that, and not really "search queries" in the traditional sense of text people are looking for on a page somewhere.

Even before this, you've seen search engines add features to cater to that kind of use: things like Siri handing off questions it didn't understand to a search engine (as with the other assistants that can do that), and hints of this behavior in how companies like Google depict their products being used in ads.

Of course, regardless of the true prevalence of that behavior, it's probably in Google et al.'s favor to encourage it. Regular search inherently cedes some power and control to the pages the results come from, since you're sending users away to them whenever the answer isn't right there in the snippet. But the "answer box" features, or an LLM that just tells you "the answer to your question" directly on the page, keep you there, treating the search page itself as your source of information rather than somewhere else.

5. friend+Gq[view] [source] 2023-02-08 23:54:40
>>Fridge+(OP)
When you are curious about a fact, you're looking for a useful and accurate answer. You don't care what page you read it on, as long as it's both useful and accurate. You just think you are looking for a page because that's how you're used to doing it.

Beyond that, most of the results that rank for anything are plainly worse LLM blogspam padded for SEO. All you're doing here is cutting out the middleman.

The problem with ChatGPT for factual question search is, of course, that it's not factual. Its job is to produce coherent sentences, not to actually tell you factual information. So until they manage to get that right, it's not a superior product.

There are two types of search: people looking for a specific resource (like searching for a song on YouTube) and people just looking for an answer to a question. If an LLM can be factually accurate, it's a superior search product for that second use, which is probably the majority of search.

6. postin+XE[view] [source] 2023-02-09 01:23:44
>>Fridge+(OP)
I tend to agree with this take.

That said, ChatGPT does a very good job of understanding what I am looking for. If Microsoft could insert that understanding into its regular search algorithms, it would seem likely to significantly improve the quality of search results.

[go to top]