Daniel Nguyen
complete
I've added native support for Ollama in version 1.16.0.
Go to Settings > Models, click (+) button and choose Ollama.
PDF Pals will attempt to fetch the list of models automatically.
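For the curious, the automatic discovery presumably queries Ollama's local REST API, which exposes the installed models at `GET http://localhost:11434/api/tags`. A minimal sketch of parsing that response (the sample payload below is illustrative, not real output):

```python
import json

# Sample /api/tags response body (illustrative — real responses include
# more fields per model, such as digest and modification time).
SAMPLE_RESPONSE = json.dumps({
    "models": [
        {"name": "llama2:7b", "size": 3826793677},
        {"name": "mistral:latest", "size": 4109865159},
    ]
})

def list_model_names(raw_json: str) -> list[str]:
    """Extract installed model names from an Ollama /api/tags response."""
    return [m["name"] for m in json.loads(raw_json).get("models", [])]

print(list_model_names(SAMPLE_RESPONSE))  # ['llama2:7b', 'mistral:latest']
```

A client only needs the `name` of each entry to populate a model picker; if the endpoint is unreachable, Ollama is most likely not running locally.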
Daniel Nguyen
in progress
Serving open-source LLMs seems to be a challenging task and takes a lot of time.
After consideration, I've decided not to put my effort into model serving.
Instead, I'm going to support Ollama natively.
Beckett Dillon
Daniel Nguyen sounds great! Ollama can essentially run any open source LLM so it should be more than enough. Thanks!
Cassandre Emard
Exciting news! Meta has just launched LLaMA v2 and made it open source for commercial use.
Daniel Nguyen
planned