AI deployment and inference: reliable infrastructure and streamlined operations

Presented by Rui Vasconcelos (Canonical) and Michael Boros (Dell)

About this talk

As artificial intelligence reshapes both traditional and new industries, enterprises looking to innovate face fresh challenges. Dell and Canonical address these topics in our upcoming webinar. Join us to discuss:

- What happens after you have trained your AI model? This spans from model deployment for inference at the edge, to inference serving, to distributed training and the handling of new data, to the underlying setup and operations.
- The importance of reliable hardware and software layers in critical applications, and the benefits of using specialised data science workstations with Linux.
- How to streamline your AI operations to shorten your deployment cycles.

More from this channel

Get the most in-depth information about Ubuntu technology and services from Canonical. Learn why Ubuntu is the preferred Linux platform and how Canonical can help you make the most of your Ubuntu environment.