zlacker

[return to "Pushing ChatGPT's Structured Data Support to Its Limits"]
1. JoshMa+Dg 2023-12-27 16:42:33
>>goranm+(OP)
FWIW, I've seen stronger performance from gpt-4-1106-preview when I use `response_format: { type: "json_object" }` (providing a target TypeScript interface in context) than with the "tools" API.

It's more flexible, and (evaluating non-scientifically!) gives qualitatively better answers and instruction following -- particularly for deeply nested or complex schemas, which TypeScript expresses very clearly and succinctly.
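
To make that concrete, here's a minimal sketch of the pattern using the `openai` Node SDK -- the interface and field names are made up for illustration, not taken from the linked repos:

```typescript
import OpenAI from "openai";

const openai = new OpenAI(); // picks up OPENAI_API_KEY from the environment

// Target shape, expressed as a TypeScript interface and pasted straight
// into the prompt. (Made-up example; the real project used FHIR-derived types.)
const schema = `
interface Medication {
  name: string;
  dose?: string;       // e.g. "500 mg"
  frequency?: string;  // e.g. "twice daily"
}

interface Extraction {
  patientAge?: number;
  medications: Medication[];
}
`;

async function extract(note: string): Promise<unknown> {
  const completion = await openai.chat.completions.create({
    model: "gpt-4-1106-preview",
    response_format: { type: "json_object" }, // JSON mode
    messages: [
      {
        role: "system",
        content:
          "Extract structured data from the clinical note. " +
          "Respond with a single JSON object that matches this TypeScript interface:\n" +
          schema,
      },
      { role: "user", content: note },
    ],
  });

  // JSON mode guarantees syntactically valid JSON, not schema conformance,
  // so parse and then validate the shape yourself.
  return JSON.parse(completion.choices[0].message.content ?? "{}");
}
```

Note that JSON mode only guarantees the output parses; conformance to the pasted interface still depends on the model following instructions.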

Example from a hack week project earlier this month (using a TS-ish schema description that's copy/pasted from healthcare's FHIR standard): https://github.com/microsoft-healthcare-madison/hackweek-202...

Or a more complex example with one model call to invent a TS schema on-the-fly and another call to abstract clinical data into it: https://github.com/microsoft-healthcare-madison/hackweek-202...
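
The two-call pattern looks roughly like this -- ask the model to design a schema first, then feed that schema back as the extraction target. (Hypothetical sketch; the prompts and function names here are invented, not the repo's.)

```typescript
import OpenAI from "openai";

const openai = new OpenAI();

// Call 1: have the model invent a TypeScript schema for the task.
// Call 2: abstract the clinical note into that schema via JSON mode.
async function extractWithGeneratedSchema(task: string, note: string) {
  const schemaResp = await openai.chat.completions.create({
    model: "gpt-4-1106-preview",
    messages: [
      {
        role: "user",
        content: `Write a concise TypeScript interface that captures: ${task}. Reply with only the interface.`,
      },
    ],
  });
  const schema = schemaResp.choices[0].message.content ?? "";

  const dataResp = await openai.chat.completions.create({
    model: "gpt-4-1106-preview",
    response_format: { type: "json_object" },
    messages: [
      {
        role: "system",
        content: `Abstract the clinical note into a single JSON object matching:\n${schema}`,
      },
      { role: "user", content: note },
    ],
  });

  return {
    schema,
    data: JSON.parse(dataResp.choices[0].message.content ?? "{}"),
  };
}
```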

2. minima+Wh 2023-12-27 16:48:30
>>JoshMa+Dg
For posterity, this is the "JSON mode" mentioned at the bottom of the post.

The docs say it's enabled automatically whenever you use function calling: https://platform.openai.com/docs/guides/text-generation/json...

> Note that JSON mode is always enabled when the model is generating arguments as part of function calling.
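
So a plain tools-API call already returns valid JSON in the arguments, without setting `response_format` yourself. Rough sketch (the `record_vitals` function and its fields are just an illustration):

```typescript
import OpenAI from "openai";

const openai = new OpenAI();

const completion = await openai.chat.completions.create({
  model: "gpt-4-1106-preview",
  messages: [{ role: "user", content: "BP 120/80, HR 72, afebrile." }],
  tools: [
    {
      type: "function",
      function: {
        name: "record_vitals", // hypothetical function, for illustration only
        description: "Record vital signs extracted from a clinical note",
        parameters: {
          type: "object",
          properties: {
            systolic: { type: "number" },
            diastolic: { type: "number" },
            heartRate: { type: "number" },
          },
          required: ["systolic", "diastolic"],
        },
      },
    },
  ],
  tool_choice: { type: "function", function: { name: "record_vitals" } },
});

// The arguments string is valid JSON because JSON mode is implicitly active
// during function calling; no response_format needed here.
const call = completion.choices[0].message.tool_calls?.[0];
const args = call ? JSON.parse(call.function.arguments) : null;
console.log(args); // e.g. { systolic: 120, diastolic: 80, heartRate: 72 }
```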
