xPU Accelerator Offload Functions

Presented by

Joseph White, Dell; John Kim, NVIDIA; Mario Baldi, Pensando; Yadong Li, Intel; David McIntyre, Samsung

About this talk

In our first webcast, “SmartNICs and xPUs: Why is the Use of Accelerators Accelerating,” we discussed the trend of deploying dedicated accelerator chips to assist or offload the main CPU. These new accelerators (xPUs) go by multiple names, such as SmartNIC, DPU, IPU, APU, and NAPU. This second webcast in the series takes a deeper dive into the accelerator offload functions of the xPU. We’ll discuss what problems xPUs are being deployed to solve, where in the system they live, and the functions they implement, focusing on:

Network Offloads
• Virtual switching and NPU
• P4 pipelines
• QoS and policy enforcement
• NIC functions
• Gateway functions (tunnel termination, load balancing, etc.)

Security
• Encryption
• Policy enforcement
• Key management and crypto
• Regular expression matching
• Firewall
• Deep Packet Inspection (DPI)

Compute
• AI calculations, model resolution
• General purpose processing (via local cores)
• Emerging use of P4 for general purpose processing

Storage
• Compression and data-at-rest encryption
• NVMe-oF offload
• Regular expression matching
• Storage stack offloads
