Table of Contents

Host Local Models on your PC with Kobold

Contributors

blitzen

Last Updated

05/06/2025

Host Local Models on your PC with Kobold

Hello hello! If you're here, you're probably asking, “What is this guy talking about?” Well, LLMs (large language models) are AI text models that you can run entirely on your own computer or hardware instead of relying on a cloud service. This guide primarily focuses on Windows 10/11, but Linux works as well.

Changelog

| Guide version | Date | Details |
| --- | --- | --- |
| 0.3 | 05/06/2025 | Split into 3 separate guides |
| 0.2 | 05/06/2025 | Added SillyTavern & Tailscale guides |
| 0.1 | 05/05/2025 | Initial draft |

Check Your Hardware

Download a Model

Install KoboldCPP

Configure KoboldCPP

Tweak Settings

Run the Model!

Janitor API Setup

SillyTavern API Setup

Okay... Can I see an example of how much speed I'd get?

My Personal Setup:

End Notes

If you have any questions, DM me at @blitzen1122 on Discord, ping me on the JAI Discord server in #AI-Models, or just Google it!

1) OR LM Studio