1. thesz+(OP) 2024-04-18 07:06:20
Usually, an LLM's output gets passed through beam search [1], which is about as symbolic as one can get.

[1] https://www.width.ai/post/what-is-beam-search

It is even possible to get a 3-gram model to output better text predictions if you combine it with beam search.
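
A rough, hypothetical sketch of beam search over a 3-gram model (the trigram table, vocabulary, and probabilities below are invented purely for illustration; this is not from the linked article):

    import math

    # Hypothetical trigram table: (w_{t-2}, w_{t-1}) -> {next word: probability}.
    TRIGRAMS = {
        ("<s>", "<s>"): {"the": 0.6, "a": 0.4},
        ("<s>", "the"): {"cat": 0.5, "dog": 0.5},
        ("<s>", "a"):   {"cat": 0.7, "dog": 0.3},
        ("the", "cat"): {"sat": 0.8, "ran": 0.2},
        ("the", "dog"): {"sat": 0.3, "ran": 0.7},
        ("a", "cat"):   {"sat": 0.6, "ran": 0.4},
        ("a", "dog"):   {"ran": 1.0},
        ("cat", "sat"): {"</s>": 1.0},
        ("cat", "ran"): {"</s>": 1.0},
        ("dog", "sat"): {"</s>": 1.0},
        ("dog", "ran"): {"</s>": 1.0},
    }

    def beam_search(model, beam_width=3, max_len=6):
        # Each hypothesis is (cumulative log-probability, token list).
        beams = [(0.0, ["<s>", "<s>"])]
        for _ in range(max_len):
            candidates = []
            for log_p, tokens in beams:
                if tokens[-1] == "</s>":      # finished hypothesis, carry it over
                    candidates.append((log_p, tokens))
                    continue
                context = tuple(tokens[-2:])
                for word, p in model.get(context, {}).items():
                    candidates.append((log_p + math.log(p), tokens + [word]))
            if not candidates:
                break
            # Prune: keep only the beam_width highest-scoring hypotheses.
            beams = sorted(candidates, key=lambda c: c[0], reverse=True)[:beam_width]
        return beams

    for score, tokens in beam_search(TRIGRAMS):
        print(round(score, 3), " ".join(tokens[2:]))

The point of the pruning step is that the search stays symbolic: hypotheses are explicit token sequences ranked by cumulative log-probability, whatever model supplies the per-step probabilities.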

replies(1): >>eru+7b
2. eru+7b 2024-04-18 09:27:32
>>thesz+(OP)
See >>40073039 for a discussion.