Want to run DeepSeek locally? It takes about five minutes and costs nothing. I made this video tutorial on running it locally and integrating it into our LLM-as-a-Service product, which routes requests to different LLM vendors based on confidentiality, personal information, cost, and other criteria.
A powerful local LLM engine like DeepSeek R1 helps more people and companies overcome the privacy concerns of using public LLMs. I would love feedback on the video, and I am open to chatting more if you want to explore mixing public and private LLMs inside your products or enterprise.