Ollama (ollama.ai)
Get up and running with large language models, locally.

#llm #cli #llama
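Typical usage, for quick reference: pull a model, chat with it, or expose Ollama's local API. Commands are from Ollama's CLI; the model name is an example.

```shell
# Download a model, then run it interactively (model name is an example)
ollama pull llama3.2
ollama run llama3.2 "Why is the sky blue?"

# Or run the background server (local API on localhost:11434 by default)
ollama serve
```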

llamafile (Mozilla-Ocho/llamafile on GitHub)
Distribute and run LLMs with a single file.

#llm #llama #ai #cli
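A llamafile bundles model weights and the runtime into one self-contained executable, so running one is just download, chmod, execute. The filename below is illustrative; any published llamafile works the same way.

```shell
# Mark the downloaded llamafile executable, then run it
# (filename is an example, not a specific release)
chmod +x llava-v1.5-7b-q4.llamafile
./llava-v1.5-7b-q4.llamafile
# Launches a local chat web UI and an OpenAI-compatible API server
```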

dalai (cocktailpeanut/dalai on GitHub)
The simplest way to run LLaMA on your local machine.

#llama #ai #self-hosted #cli