LM Studio – Discover, download, and run local LLMs
With LM Studio, you can …
🤖 – Run LLMs on your laptop, entirely offline
👾 – Use models through the in-app Chat UI or an OpenAI-compatible local server (see the sketch after this list)
📂 – Download any compatible model files from HuggingFace 🤗 repositories
🔭 – Discover new & noteworthy LLMs on the app’s home page
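As a quick illustration of the local-server mode, here is a minimal sketch using the official `openai` Python client. It assumes the server is running on LM Studio's default address (`http://localhost:1234/v1`); the model name and API key below are placeholders, since the local server accepts any key and routes requests to whichever model you have loaded in the app.

```python
# Minimal sketch: querying LM Studio's OpenAI-compatible local server.
# Assumes the server is started in the app and listening on the default
# port 1234; adjust base_url if you changed it.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:1234/v1",
    api_key="lm-studio",  # placeholder; the local server does not check the key
)

response = client.chat.completions.create(
    model="local-model",  # placeholder; the server uses the model loaded in the app
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
    temperature=0.7,
)
print(response.choices[0].message.content)
```

Because the endpoint mirrors the OpenAI API shape, existing OpenAI-based code can typically be pointed at the local server by changing only the base URL.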
LM Studio supports any ggml Llama, MPT, and StarCoder model on Hugging Face (Llama 2, Orca, Vicuna, Nous Hermes, WizardCoder, MPT, etc.).
Minimum requirements: M1/M2/M3 Mac, or a Windows PC with a processor that supports AVX2. Linux is available in beta.
Made possible thanks to the llama.cpp project.
Run open-source LLMs locally on Mac, Windows, and Linux (beta).