zlacker

[parent] [thread] 2 comments
1. aantix+(OP)[view] [source] 2023-12-27 17:31:42
"Now displaying 3 citations out of ~150,000,000.."

[1] http://web.archive.org/web/20120608192927/http://www.google....

[2] https://steemit.com/online/@jaroli/how-google-search-result-...

[3] https://www.smashingmagazine.com/2009/09/search-results-desi...

[4] Next page

:)

replies(1): >>pama+66
2. pama+66[view] [source] 2023-12-27 18:04:44
>>aantix+(OP)
This does not answer the GP question and does not count as a satisfactory ranked citation list. The first one is particularly dubious. Also, you didn't clarify which statement was based on which citation. I didn't see "dog" in your text.

To get a sense of the complexity of an LLM, consider that these models typically hold about 10,000 times fewer parameters than the total number of characters in their training data. If you instruct an LLM to search the web and find relevant citations, it may obey the command, but those citations will not be the source of the opinions it formed during training in order to produce its output.
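As a back-of-the-envelope sketch of that ratio (all numbers here are illustrative assumptions, not figures from any particular model):

```python
# Rough ratio of training-data characters to model parameters.
# Every number below is a hypothetical chosen to illustrate the ~10,000x claim.
params = 1_000_000_000            # assumed 1B-parameter model
train_tokens = 2_500_000_000_000  # assumed 2.5T training tokens
chars_per_token = 4               # rough average for English text

train_chars = train_tokens * chars_per_token  # ~10T characters
ratio = train_chars / params
print(f"{ratio:,.0f} characters of training data per parameter")  # → 10,000
```

Whatever the exact numbers for a real model, the point stands: the weights are a heavily compressed distillation of the training text, not a lookup table you can trace citations back through.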

replies(1): >>jquery+7W9
3. jquery+7W9[view] [source] [discussion] 2023-12-31 07:33:06
>>pama+66
You mean 10,000x fewer parameters? In other words, only 1 parameter for every 10,000 characters of training data?

Yeah, good luck embedding citations into that. Everyone here saying it's easy needs to go earn their 7 figure comp at an AI company instead of wasting their time educating us dummies.
