xPU Accelerator Offload Functions

Presented by

Joseph White, Dell; John Kim, NVIDIA; Mario Baldi, Pensando; Yadong Li, Intel; David McIntyre, Samsung

About this talk

As covered in our first webcast, "SmartNICs and xPUs: Why is the Use of Accelerators Accelerating," there is a growing trend to deploy dedicated accelerator chips that assist or offload the main CPU. These accelerators (xPUs) go by many names, such as SmartNIC, DPU, IPU, APU, and NAPU. This second webcast in the series takes a deeper dive into the offload functions of the xPU. We'll discuss the problems xPUs are designed to solve, where in the system they live, and the functions they implement, focusing on:

Network Offloads
• Virtual switching and NPU
• P4 pipelines
• QoS and policy enforcement
• NIC functions
• Gateway functions (tunnel termination, load balancing, etc.)

Security
• Encryption
• Policy enforcement
• Key management and crypto
• Regular expression matching
• Firewall
• Deep Packet Inspection (DPI)

Compute
• AI calculations, model resolution
• General-purpose processing (via local cores)
• Emerging use of P4 for general purpose

Storage
• Compression and data-at-rest encryption
• NVMe-oF offload
• Regular expression matching
• Storage stack offloads
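Several of the network offloads above (virtual switching, P4 pipelines, firewalling) share one underlying abstraction: a pipeline of match-action tables, where each stage matches on a header field and applies a bound action. The sketch below illustrates that model in Python; the table, field names, and actions are illustrative only, not a real P4 program or any vendor's API.

```python
# Illustrative sketch of the match-action pipeline abstraction that P4
# programs describe. All names here are hypothetical examples.

class Packet:
    def __init__(self, dst_ip):
        self.dst_ip = dst_ip
        self.egress_port = None   # None means the packet is dropped

class MatchActionTable:
    """One pipeline stage: match a header field, apply the bound action."""
    def __init__(self, field, default_action):
        self.field = field
        self.entries = {}         # match key -> (action, params)
        self.default_action = default_action

    def add_entry(self, key, action, **params):
        self.entries[key] = (action, params)

    def apply(self, pkt):
        key = getattr(pkt, self.field)
        action, params = self.entries.get(key, (self.default_action, {}))
        action(pkt, **params)

# Example actions
def forward(pkt, port=0):
    pkt.egress_port = port

def drop(pkt):
    pkt.egress_port = None

# A single routing stage; unmatched destinations are dropped
routing = MatchActionTable("dst_ip", default_action=drop)
routing.add_entry("10.0.0.1", forward, port=1)
routing.add_entry("10.0.0.2", forward, port=2)

pkt = Packet(dst_ip="10.0.0.1")
routing.apply(pkt)
print(pkt.egress_port)  # -> 1
```

On an xPU, stages like this run in dedicated hardware or programmable pipeline engines rather than on the host CPU, which is what makes these offloads attractive.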

The Storage Networking Industry Association (SNIA) is a non-profit organization made up of member companies spanning information technology. A globally recognized and trusted authority, SNIA’s mission is to lead the storage industry in developing and promoting vendor-neutral architectures, standards and educational services that facilitate the efficient management, movement and security of information.