
Why We DON'T Run AI Models Locally (Only In Docker or Cloud!)

Presented by

ByteGrad

About this talk

Running AI models on a personal machine might seem convenient, but is it really secure? In this video, we walk through three approaches to running AI workloads: starting with local tools like Ollama, then using Docker for added isolation, and finally spinning up fully customizable, cloud-hosted development environments with Coder.

You'll learn:
- Why we don't trust running AI models directly on our own machines
- How to use Docker to safely isolate AI workloads
- How to spin up secure, cloud-based dev environments with Coder
- How Coder templates can help teams standardize environments with tools like Ollama and LLaMA pre-installed

Whether you're an AI engineer, a curious developer, or part of a team that needs consistent, secure setups, this video shows how to run AI workloads the right way.
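The Docker isolation approach described above can be sketched with Ollama's official container image. A minimal example, assuming Docker is installed and using the public `ollama/ollama` image; the model name is illustrative, and you should check the image documentation for GPU passthrough options:

```shell
# Run Ollama in a container so the model runtime is isolated from the host;
# a named volume ("ollama") persists downloaded models across restarts
docker run -d --name ollama \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  ollama/ollama

# Pull and chat with a model entirely inside the container
docker exec -it ollama ollama run llama3
```

This keeps model weights and the inference process out of the host filesystem, while the published port (11434) still lets local tools talk to the Ollama API.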
Coder

Coder webinars on optimizing the software development experience.
Coder is the AI software development company leading the future of autonomous coding. Coder helps teams build fast, stay secure, and scale with control by combining AI coding agents and human developers in one trusted workspace. Coder’s award-winning self-hosted Cloud Development Environment (CDE) gives enterprises the power to govern, audit, and accelerate software development without trade-offs. Learn more at coder.com.