
AI Model Inferencing and Deployment Options

Presented by

Luis Freeman, IBM; Kevin Marks, Dell Technologies; Tim Lustig, NVIDIA

About this talk

This webinar explains the process of model inferencing and the available deployment options. We will discuss how trained models are used to make predictions and the considerations for deploying those models; a minimal code sketch of this flow follows the agenda below. Attendees will learn about the challenges and strategies for optimizing model performance, covering:

• What model inferencing is
• The model inferencing process
• How inferencing deployment differs between Gen AI and visual inspection
  - Inferencing deployment software options
  - Inferencing hardware considerations (e.g., at the edge, on-prem)
• Best practices and lessons learned for deploying AI models in production
• Strategies for maintaining model performance post-deployment
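To make the train-once / infer-many pattern referenced in the agenda concrete, here is a minimal sketch: a model is trained offline, serialized, and then loaded by a serving process to answer prediction requests. This is not material from the webinar; scikit-learn, joblib, and the file name "model.joblib" are illustrative assumptions.

```python
# Minimal sketch of the train-once / infer-many pattern.
# Assumes scikit-learn and joblib are installed; "model.joblib" is an illustrative file name.
import joblib
import numpy as np
from sklearn.linear_model import LogisticRegression

# --- Training phase (offline, before deployment) ---
X_train = np.array([[0.0], [1.0], [2.0], [3.0]])
y_train = np.array([0, 0, 1, 1])
model = LogisticRegression().fit(X_train, y_train)
joblib.dump(model, "model.joblib")  # serialize the trained artifact for deployment

# --- Inference phase (what a deployed endpoint does per request) ---
deployed_model = joblib.load("model.joblib")  # load once at service startup

def predict(features):
    """Return a class prediction for a single incoming example."""
    return int(deployed_model.predict(np.array([features]))[0])

print(predict([2.5]))  # -> 1
```

In a real deployment the load step happens once at startup and the predict call runs per request, whether the model is hosted on-prem, in the cloud, or at the edge.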
Network Infrastructure

42,614 subscribers · 117 talks
Updating the network infrastructure for the 21st century
With virtualization and cloud computing revolutionizing the data center, it's time the network had its own revolution. Join the Network Infrastructure channel for the hottest topics for network and storage professionals, such as software-defined networking, WAN optimization and more, to maintain performance and service in your infrastructure.