Hyperparameter tuning with an MLOps platform

Presented by

Andreea Munteanu, MLOps Product Manager, Michal Hucko, Charmed Kubeflow engineer

About this talk

Developing AI/ML models is no longer a novelty for most organisations, but controlling model behaviour is still challenging. Models must be tuned correctly to avoid suboptimal results, and MLOps platforms have started including dedicated solutions to address this need.

What is hyperparameter tuning? Hyperparameters are configuration values, specific to the algorithm used for modelling, that are set before training and cannot be calculated from the data; they govern how the model parameters are computed. Model parameters, by contrast, are learned or estimated by the algorithm, and their values continue to update during the training process. Hyperparameter tuning is the process of finding a set of optimal hyperparameter values for a learning algorithm. It is necessary to obtain an optimised algorithm on any data set.

Join our webinar to learn about:
- Hyperparameter tuning
- MLOps' role in hyperparameter tuning
- How you can use Kubeflow for this process

Speakers:
- Michal Hucko, Charmed Kubeflow engineer
- Andreea Munteanu, MLOps Product Manager
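To make the parameter/hyperparameter distinction concrete, here is a minimal, self-contained sketch (not from the talk; a toy illustration) of random-search tuning: the learning rate is a hyperparameter fixed before training, while the weight `w` is a model parameter updated by the training loop itself.

```python
import random

# Toy task: learn parameter w so that w * x approximates y = 3 * x.
# The learning rate is a hyperparameter: chosen before training and
# never updated by the training loop itself.
def train(learning_rate, steps=50):
    data = [(x, 3.0 * x) for x in range(1, 6)]
    w = 0.0  # model parameter, learned from the data
    for _ in range(steps):
        for x, y in data:
            grad = 2 * (w * x - y) * x  # d/dw of (w*x - y)^2
            w -= learning_rate * grad
    # Return mean squared error on the training data
    return sum((w * x - y) ** 2 for x, y in data) / len(data)

# Random search: sample candidate learning rates on a log scale
# and keep the one that yields the lowest error.
random.seed(0)
candidates = [10 ** random.uniform(-4, -2) for _ in range(20)]
best_lr = min(candidates, key=train)
print(f"best learning rate: {best_lr:.5f}, mse: {train(best_lr):.2e}")
```

Tools such as Katib, Kubeflow's hyperparameter-tuning component, automate this search-and-evaluate loop at scale, running trials in parallel with strategies like random search, grid search, or Bayesian optimisation.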
More from this channel

Get the most in-depth information about Ubuntu technology and services from Canonical. Learn why Ubuntu is the preferred Linux platform and how Canonical can help you make the most of your Ubuntu environment.