
AI Model Inferencing and Deployment Options

Presented by

Luis Freeman, IBM; Kevin Marks, Dell Technologies; Tim Lustig, NVIDIA

About this talk

This webinar explains the process of model inferencing and the options for deploying models. We will discuss how trained models are used to make predictions and the considerations involved in putting them into production. Attendees will learn about the challenges and strategies for optimizing model performance, covering:
• What model inferencing is
• The model inferencing process
• How inferencing deployment differs between Gen AI and visual inspection
  - Inferencing deployment software options
  - Inferencing hardware considerations (e.g., at the edge, on-prem)
• Best practices and lessons learned for deploying AI models to production
• Strategies for maintaining model performance post-deployment
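For readers unfamiliar with the term, the snippet below is a minimal sketch of the train-once, infer-many pattern the talk refers to. It uses a scikit-learn logistic regression as a stand-in for any trained model; the array names and values are illustrative placeholders, not material from the webinar.

```python
# Minimal sketch of model inferencing, assuming a scikit-learn-style workflow.
# All data and variable names here are illustrative placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Training phase (normally done offline, before deployment).
train_inputs = np.array([[0.1, 1.2], [0.8, 0.3], [0.5, 0.9], [0.9, 0.1]])
train_labels = np.array([0, 1, 0, 1])
model = LogisticRegression().fit(train_inputs, train_labels)

# Inference phase: the trained model is loaded at the deployment target
# (edge device, on-prem server, etc.) and used to score new, unseen data.
new_samples = np.array([[0.2, 1.0], [0.7, 0.2]])
predictions = model.predict(new_samples)        # predicted class labels
confidences = model.predict_proba(new_samples)  # per-class probabilities
print(predictions, confidences)
```

In a real deployment the training and inference phases are separated: the fitted model is serialized once, then loaded by whatever serving software and hardware the talk's deployment options cover.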

Data Center Management

Best practices for achieving an efficient data center
With today’s pressure to lower carbon footprints and the cost constraints facing organizations, IT departments are increasingly on the front line in formulating and enacting an IT strategy that greatly improves the energy efficiency and overall performance of data centers. This channel covers the strategic issues of ‘going green’ as well as practical tips and techniques for busy IT professionals managing their data centers. Channel discussion topics include:
- Data center efficiency, monitoring and infrastructure management
- Data center design, facilities management and convergence
- Cooling technologies and thermal management
- And much more