How to use a fine-tuned OpenAI model

Hi,

I’d like to know how to use a fine-tuned OpenAI model in the AI Agent node. I checked, and my fine-tuned model does not appear in the model list of the OpenAI Chat Model node.

Information on your n8n setup

  • **n8n version:** 1.55.3
  • **Database (default: SQLite):** SQLite
  • **n8n EXECUTIONS_PROCESS setting (default: own, main):** own
  • **Running n8n via (Docker, npm, n8n cloud, desktop app):** Docker
  • **Operating system:** Linux

Hi @davidjm

I don’t have a fine-tuned model to test this with, but you should be able to enter the name of the model by switching to the Expression tab:

(screenshot: the Model field with the Expression tab selected)

Can you try using the fine-tuned model in the OpenAI playground, check the name of your fine-tuned model, and then enter the name directly in n8n?
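If you want to confirm the model ID before wiring it into the node, a quick sanity check outside n8n can help. Here is a minimal sketch using the OpenAI Python SDK; the model ID shown is a made-up placeholder, so swap in the exact `ft:...` ID from your fine-tuning dashboard:

```python
# Minimal sketch: confirm a fine-tuned model ID responds before pasting it into n8n.
# Assumes the OpenAI Python SDK (v1.x) and OPENAI_API_KEY set in the environment.
from openai import OpenAI

client = OpenAI()

# Placeholder ID for illustration only; use the exact "ft:..." ID from your
# OpenAI fine-tuning dashboard.
FINE_TUNED_MODEL = "ft:gpt-4o-mini-2024-07-18:my-org::example"

response = client.chat.completions.create(
    model=FINE_TUNED_MODEL,
    messages=[{"role": "user", "content": "Quick test: reply with OK."}],
)
print(response.choices[0].message.content)
```

If this call succeeds, the same ID entered in the node's Expression tab should work.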

@davidjm, an update: there is a PR open to fix this: fix(OpenAI Chat Model Node): Prevent filtering of fine-tuned models in model selector by OlegIvaniv · Pull Request #10662 · n8n-io/n8n · GitHub

Thanks for the info @aya. Can’t wait for the fix to land.


In the meantime, David, you can choose gpt-4o and use the Structured Output Parser to define how you want the output to be structured and delivered. This should improve your reliability a lot. AI Jason has a good video showing this, and here is the OpenAI guide: https://platform.openai.com/docs/guides/structured-outputs
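For reference, the structured outputs feature from that guide can also be tried directly against the API. Here is a hedged sketch using the OpenAI Python SDK's parse helper with a Pydantic schema; the schema fields are illustrative and not from this thread:

```python
# Sketch of OpenAI structured outputs (the feature covered in the linked guide),
# using the SDK's parse helper with a Pydantic model as the response schema.
# The schema fields below are illustrative examples only.
from openai import OpenAI
from pydantic import BaseModel

class TicketSummary(BaseModel):
    category: str
    sentiment: str
    reply: str

client = OpenAI()

completion = client.beta.chat.completions.parse(
    model="gpt-4o-2024-08-06",
    messages=[{"role": "user", "content": "Summarise: 'My order arrived broken.'"}],
    response_format=TicketSummary,
)
print(completion.choices[0].message.parsed)
```

In n8n, the Structured Output Parser node achieves a similar effect by giving the model a JSON schema to follow.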


Thanks @zzkmoonflowers