Improve Kubernetes Uptime and Resilience with a Canary Deployment

Presented by

Jenn Gile, Senior Manager of Product Marketing, NGINX

About this talk

Your organization is successfully delivering apps in Kubernetes, and now the team is ready to roll out v2 of a backend service. But there are concerns about traffic interruptions (a.k.a. downtime) and the possibility that v2 might be unstable. As the Kubernetes engineer, you need to ensure v2 can be tested and rolled out with little to no impact on customers. You need a gradual, controlled migration, and what better way than with the traffic-splitting technique known as a "canary deployment"! Canary deployments provide a safe and agile way to test the stability of a new feature or version. Because your use case involves traffic moving between two Kubernetes services, you know a service mesh will yield the easiest and most reliable results. You use NGINX Service Mesh to send 10% of your traffic to v2, while the remaining 90% goes to v1. Then you gradually shift larger percentages of traffic to v2 until you reach 100%. Problem solved!

In This Lab You Will:
◆ Deploy minikube and NGINX Service Mesh
◆ Deploy two apps and use NGINX Service Mesh to observe traffic
◆ Use NGINX Service Mesh to implement a canary deployment

Technologies Used:
◆ NGINX Service Mesh
◆ Helm
◆ Jaeger
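NGINX Service Mesh implements the SMI (Service Mesh Interface) TrafficSplit API, so the 90/10 split described above can be expressed declaratively. The following is a minimal sketch; the resource name and the service names (backend-svc, backend-v1, backend-v2) are hypothetical placeholders for your own root and versioned services:

```yaml
apiVersion: split.smi-spec.io/v1alpha3
kind: TrafficSplit
metadata:
  name: backend-canary
spec:
  # Root service that clients send requests to
  service: backend-svc
  backends:
  # Keep 90% of traffic on the stable version
  - service: backend-v1
    weight: 90
  # Send 10% of traffic to the canary
  - service: backend-v2
    weight: 10
```

To shift more traffic to v2, you re-apply the resource with adjusted weights (e.g. 50/50, then 0/100) while observing the canary's behavior, for example via the mesh's Jaeger tracing.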
NGINX helps companies deliver their sites and applications with performance, reliability, security, and scale. NGINX offers an award-winning, comprehensive application delivery platform in use on more than 300 million sites worldwide. Watch these webinars to learn how to ensure flawless digital experiences through features such as advanced load balancing, web and mobile acceleration, security controls, application monitoring, and management.