1. Arctic+(OP) 2023-09-12 19:22:37
Currently, OpenPipe allows you to capture input/output from a powerful model and use it to fine-tune a much smaller one, then gives you the option to host it through OpenPipe or download it and host it elsewhere. Models hosted on OpenPipe get a few extra benefits through the SDK, like data drift detection and automatic reformatting of output to match the original model you trained against (think extracting "function call" responses from a purely textual Llama 2 response).
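To make that reformatting idea concrete, here's a minimal sketch, assuming the fine-tuned model emits the function name on its first line followed by a JSON blob of arguments. The helper name and the assumed text format are illustrative only, not OpenPipe's actual SDK:

import json
import re
from typing import Optional


def extract_function_call(raw_completion: str) -> Optional[dict]:
    """Pull a JSON object out of a textual completion and wrap it in an
    OpenAI-style function-call message. Returns None if no JSON is found."""
    match = re.search(r"\{.*\}", raw_completion, re.DOTALL)
    if match is None:
        return None
    try:
        arguments = json.loads(match.group(0))
    except json.JSONDecodeError:
        return None
    return {
        "role": "assistant",
        "content": None,
        "function_call": {
            # Assumed convention: function name on the first line,
            # JSON arguments afterwards.
            "name": raw_completion.split("\n", 1)[0].strip(),
            "arguments": json.dumps(arguments),
        },
    }


if __name__ == "__main__":
    llama_output = 'get_weather\n{"city": "Berlin", "unit": "celsius"}'
    print(extract_function_call(llama_output))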

Longer-term, we'd love to expand the selection of base models to include specialized LLMs that are particularly good at a certain task, e.g. language translation, and let you train off of those as well. Providing a ton of specialized starting models will decrease the amount of training data you need, and increase the number of tasks at which fine-tuned models can excel.

replies(2): >>idosh+pn >>throw0+HS
2. idosh+pn 2023-09-12 20:41:48
>>Arctic+(OP)
Thanks! I need to dive into the project and learn more. Sounds exciting.
3. throw0+HS 2023-09-12 23:10:08
>>Arctic+(OP)
Any compliance certifications yet? HIPAA, etc.?