Add Ollama Embedding Sub-node

The idea is:

In addition to the current cloud-based embedding options, add an embedding sub-node that can run locally. Since version 0.1.26, Ollama has supported the [nomic-embed-text](https://ollama.com/library/nomic-embed-text) embedding model.

My use case:

Run text embedding locally.

I think it would be beneficial to add this because:

This is important for workflows that require privacy, since the embeddings can be generated entirely on the local machine.

Any resources to support this?

https://ollama.com/library/nomic-embed-text

```bash
curl http://localhost:11434/api/embeddings -d '{
  "model": "nomic-embed-text",
  "prompt": "The sky is blue because of Rayleigh scattering"
}'
```
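For illustration, here is a minimal TypeScript sketch of what the sub-node's embedding call could look like if it simply wrapped the same `/api/embeddings` endpoint shown above. The function name, option names, and error handling are assumptions for this sketch, not n8n's actual node API; it assumes Ollama is running locally and the model has been pulled (`ollama pull nomic-embed-text`).

```typescript
// Sketch: call Ollama's local /api/embeddings endpoint (Node 18+, global fetch).
// All identifiers below are illustrative, not part of n8n's node interface.

interface OllamaEmbeddingResponse {
  embedding: number[];
}

async function embedText(
  prompt: string,
  model = "nomic-embed-text",
  baseUrl = "http://localhost:11434",
): Promise<number[]> {
  const res = await fetch(`${baseUrl}/api/embeddings`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model, prompt }),
  });
  if (!res.ok) {
    throw new Error(`Ollama embedding request failed: ${res.status}`);
  }
  const data = (await res.json()) as OllamaEmbeddingResponse;
  return data.embedding;
}

// Example usage
embedText("The sky is blue because of Rayleigh scattering")
  .then((vector) => console.log(`Embedding length: ${vector.length}`));
```

If the existing embedding sub-nodes are built on LangChain, the node could probably wrap an Ollama embeddings class from the LangChain community package instead of calling the HTTP endpoint directly; that is a design choice for whoever implements it.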

Are you willing to work on this?

Sure. I am happy to help test.