llmasaservice.io

Hosting and Routing to Private Models (like DeepSeek or Llama)

Want to run DeepSeek locally? It takes about 5 minutes and costs nothing. I made this video tutorial on running it locally and integrating it into our LLM as a Service product, enabling routing to different LLM vendors based on confidentiality, personal information, cost, and other criteria.

A powerful local LLM engine like DeepSeek R1 lets more people and companies overcome the privacy concerns that come with sending data to public LLMs. I would love feedback on the video, and I'm open to chatting more if you want to explore mixing public and private LLMs inside your products or enterprise.
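To make the mixing idea concrete, here is a minimal sketch of confidentiality-based routing: prompts that appear to contain personal information go to a locally hosted model (e.g. DeepSeek R1 running under Ollama), while everything else goes to a public vendor. The backend names and PII patterns are illustrative assumptions, not part of the llmasaservice.io API.

```python
import re

# Illustrative PII patterns -- a real deployment would use a proper
# PII-detection library or classifier, not three regexes.
PII_PATTERNS = [
    re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),      # US SSN-style numbers
    re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),    # email addresses
    re.compile(r"\b(?:\d[ -]?){13,16}\b"),     # credit-card-like digit runs
]

def route(prompt: str) -> str:
    """Pick a backend: keep sensitive prompts on-prem, send the rest out."""
    if any(p.search(prompt) for p in PII_PATTERNS):
        return "local-deepseek"   # hypothetical name for the private model
    return "public-vendor"        # hypothetical name for the hosted model

print(route("Summarize this contract for jane.doe@example.com"))  # local-deepseek
print(route("Explain quicksort"))                                 # public-vendor
```

The same decision function could also weigh cost or latency; the point is that the routing layer, not the caller, decides which model sees the data.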