zlacker

[parent] [thread] 4 comments
1. nostra+(OP)[view] [source] 2024-02-02 04:20:56
You don't need an LLM for that (except maybe to get funding); a plain old classifier would work fine at a fraction of the training and inference costs (something like the sketch below).
replies(2): >>vidarh+0o >>bryanr+su
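
For illustration, a minimal sketch of the kind of plain classifier the comment has in mind, assuming a generic labeled-text setup; the CSV file, column names, and labels are placeholders, not anything from the thread:

    # Hypothetical sketch: TF-IDF features plus logistic regression,
    # which is cheap to train and cheap to serve.
    import pandas as pd
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline

    df = pd.read_csv("labeled_examples.csv")  # placeholder dataset with "text" and "label" columns
    X_train, X_test, y_train, y_test = train_test_split(
        df["text"], df["label"], test_size=0.2, random_state=42
    )

    clf = make_pipeline(
        TfidfVectorizer(ngram_range=(1, 2)),
        LogisticRegression(max_iter=1000),
    )
    clf.fit(X_train, y_train)
    print("held-out accuracy:", clf.score(X_test, y_test))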
2. vidarh+0o[view] [source] 2024-02-02 08:32:52
>>nostra+(OP)
Yeah, but why go to that effort when all you'd need is the thinnest veneer over ChatGPT, which, given the proposed pricing, would still leave plenty of margin?
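
For contrast, a sketch of what such a veneer could look like, assuming the current OpenAI Python client; the model name, labels, and prompt are made up for illustration:

    # Hypothetical "thin veneer": forward the input to the API with a fixed
    # prompt and return whatever label comes back.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    def classify(text: str) -> str:
        resp = client.chat.completions.create(
            model="gpt-4o-mini",  # placeholder model name
            messages=[
                {"role": "system", "content": "Answer with exactly one word: spam or ham."},
                {"role": "user", "content": text},
            ],
        )
        return resp.choices[0].message.content.strip().lower()

    print(classify("Congratulations, you have won a free cruise!"))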
3. bryanr+su[view] [source] 2024-02-02 09:39:04
>>nostra+(OP)
Well, you're not going to be able to charge them a lot of money without claiming it's an LLM, and you won't get funding either, so you've gotta think about the big picture.
replies(1): >>dns_sn+pA
4. dns_sn+pA[view] [source] [discussion] 2024-02-02 10:46:07
>>bryanr+su
Don't worry, you can say something vague like "it's powered by an LLM", which could reasonably mean the LLM was only used during the training phase of your own classifier (sketch after this comment).
replies(1): >>nostra+pz1
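
A sketch of that reading, where the LLM only labels the training data and the thing you actually serve is your own classifier; again the model name, labels, and toy corpus are assumptions for illustration:

    # Hypothetical "powered by an LLM" in the loosest sense: the LLM appears
    # only at training time, as a labeler; inference never touches it.
    from openai import OpenAI
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    client = OpenAI()

    def llm_label(text: str) -> str:
        resp = client.chat.completions.create(
            model="gpt-4o-mini",  # placeholder model name
            messages=[{"role": "user", "content": f"Label as spam or ham, one word only:\n{text}"}],
        )
        return resp.choices[0].message.content.strip().lower()

    raw_texts = [  # toy unlabeled corpus
        "win a free phone now",
        "meeting moved to 3pm",
        "claim your prize before midnight",
        "can you review my pull request",
    ]
    labels = [llm_label(t) for t in raw_texts]  # the only place the LLM is used

    clf = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
    clf.fit(raw_texts, labels)  # serve this classifier, not the LLM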
5. nostra+pz1[view] [source] [discussion] 2024-02-02 16:56:09
>>dns_sn+pA
Or that you asked ChatGPT how to write scikit-learn code.