
Hyperparameter tuning with an MLOps platform

Presented by

Andreea Munteanu, MLOps Product Manager; Michal Hucko, Charmed Kubeflow Engineer

About this talk

Developing AI/ML models is no longer a novelty for most organisations, but controlling their behaviour is still challenging. Models must be tuned correctly to avoid suboptimal results, and MLOps platforms have started including dedicated solutions to address this need.

What is hyperparameter tuning? Hyperparameters are configuration values that control how a model's parameters are computed. They are specific to the algorithm used for modelling, and their values cannot be estimated from the data. This distinguishes them from model parameters, which are learned by the algorithm and updated throughout the training process. Hyperparameter tuning is the process of finding a set of optimal hyperparameter values for a learning algorithm, and it is necessary to obtain a well-optimised model on any data set.

Join our webinar to learn about:
- Hyperparameter tuning
- MLOps' role in hyperparameter tuning
- How you can use Kubeflow for this process

Speakers:
- Michal Hucko - Charmed Kubeflow Engineer
- Andreea Munteanu - MLOps Product Manager
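To make the hyperparameter/parameter distinction concrete, here is a minimal local sketch that tunes two hyperparameters of a gradient-boosting classifier with a simple grid search. It assumes scikit-learn is installed and uses a purely illustrative parameter grid; it is not the Kubeflow-based workflow covered in the webinar, which tunes models at scale on the platform itself.

# Minimal hyperparameter tuning sketch (scikit-learn assumed; values are illustrative).
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import GridSearchCV, train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Hyperparameters: chosen before training, not learned from the data.
param_grid = {
    "learning_rate": [0.01, 0.1, 0.3],  # hypothetical candidate values
    "max_depth": [2, 3, 5],
}

# Grid search trains one model per combination and cross-validates each,
# then keeps the combination with the best validation score.
search = GridSearchCV(GradientBoostingClassifier(random_state=0), param_grid, cv=3)
search.fit(X_train, y_train)

print("Best hyperparameters:", search.best_params_)
print("Test accuracy:", search.score(X_test, y_test))

The model's own weights (its parameters) are fitted inside each training run; only learning_rate and max_depth are searched over, which is the essence of hyperparameter tuning regardless of whether it runs locally or on an MLOps platform.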

Ubuntu and Canonical

181,485 subscribers · 475 talks
Make the most out of Ubuntu
Get the most in-depth information about Ubuntu technology and services from Canonical. Learn why Ubuntu is the preferred Linux platform and how Canonical can help you make the most out of your Ubuntu environment.