Open source MLOps on AWS

Presented by

Mark Thomas, Solutions Architect at AWS, and Andreea Munteanu, AI/ML Product Manager at Canonical

About this talk

With the rise of generative AI, enterprises are growing their AI budgets and looking for ways to quickly set up infrastructure and run the entire machine learning lifecycle. Cloud providers like AWS are often preferred for kick-starting AI/ML projects because they offer the computing power to experiment without long-term commitments. Starting on the cloud removes the burden of provisioning compute, reducing start-up time and cost and allowing teams to iterate more quickly.

Open source MLOps on AWS

Open source MLOps platforms such as Charmed Kubeflow help developers get started with AI easily, without having to focus on operations and maintenance. The software appliance available on the AWS Marketplace is a hassle-free way to get access to specialized ML tooling. The Charmed Kubeflow appliance is a perfect match for the public cloud, enabling data scientists and machine learning engineers to set up an MLOps environment in minutes and reducing time spent on the setup phase. Users can become familiar with the tool before running AI at scale.

Why join us?

During the webinar, our hosts, Mark Thomas, Solutions Architect at AWS, and Andreea Munteanu, AI/ML Product Manager at Canonical, will talk about:

- Main challenges of AI/ML
- Top 3 industries to benefit from AI/ML
- Why choose AWS for AI/ML applications
- Benefits of AWS
- Open source MLOps on AWS
- Running Charmed Kubeflow on a hybrid cloud
- The Charmed Kubeflow appliance on AWS