Today’s data centers must be architected to support cloud-native accelerated computing workloads. This enables customers to model, design, and deliver advanced AI services and to develop more efficient factories, hospitals, offices, and even the data centers themselves. As AI becomes more prevalent and applications grow to span multiple servers and locations, it becomes critical to implement software-defined infrastructure and accelerate it with domain-specific processors such as the data processing unit (DPU). The DPU accelerates data center infrastructure services: networking, security, storage, and data streaming. It enables businesses and service providers to scale efficiently to support millions of online users, and the entry point for all of these users is the Domain Name System (DNS).
This session will feature a leading Internet services company, which will present how it uses the DPU to offload, accelerate, and secure DNS service over HTTPS. GPU and DPU acceleration ensures that data centers are ready to support new solutions, including AI, edge computing, and Omniverse.
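For context on the protocol being offloaded: DNS over HTTPS (RFC 8484) carries an ordinary RFC 1035 DNS message as the body of an HTTPS request. The following is a minimal sketch, in Python, of encoding such a query payload; the hostname and function name are illustrative only and are not drawn from the session itself.

```python
import struct

def encode_dns_query(hostname: str, qtype: int = 1, txid: int = 0) -> bytes:
    """Encode a DNS query in RFC 1035 wire format -- the payload a
    DNS-over-HTTPS client POSTs with Content-Type application/dns-message."""
    # Header: ID, flags (recursion desired), QDCOUNT=1, AN/NS/ARCOUNT=0
    header = struct.pack("!HHHHHH", txid, 0x0100, 1, 0, 0, 0)
    # QNAME: each label is length-prefixed; the name ends with a zero byte
    qname = b"".join(
        bytes([len(label)]) + label.encode("ascii")
        for label in hostname.rstrip(".").split(".")
    ) + b"\x00"
    # Question section: QNAME, QTYPE (1 = A record), QCLASS (1 = IN)
    question = qname + struct.pack("!HH", qtype, 1)
    return header + question

payload = encode_dns_query("example.com")
```

In practice a DoH client would send this payload over HTTPS to a resolver endpoint; the point of the sketch is simply that the offloaded workload is structured binary parsing and crypto, which is what makes it a natural fit for DPU acceleration.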
Presented by:
Kevin Deierling, SVP of Networking, NVIDIA
Thomas Jacob, Engineer 4, Software Development and Engineering, Comcast