How to Use Data Masking to Protect Sensitive Data in Your DevOps Pipelines

Presented by

Tali Einhorn, Product Manager & Hod Rotem, VP Solution Engineering at K2View

About this talk

Data masking is critical for data-driven businesses that must comply with a growing number of data protection laws and safeguard against ever-increasing cyber threats. Yet the growing velocity, variety, veracity, and volume of data make data masking a complex operational obstacle that slows businesses down. Join us in this webinar to learn how to implement a comprehensive, automated, enterprise-grade data masking program that addresses the complex challenges most enterprises face. We'll cover:

* The core components of a data masking framework
* Static, dynamic, and unstructured data masking
* Common implementation challenges
* A novel approach to data masking, based on business entities
* How to mask unstructured data (checks, documents, chat scripts, and more), including a demo
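To make the idea of static masking concrete, here is a minimal sketch of deterministic (consistent) masking, a common building block of static masking programs. This is an illustrative example only, not K2View's implementation; the key, field names, and token format are all assumptions. Using a keyed hash keeps the same input mapped to the same token across tables and runs, which preserves referential integrity in masked test data.

```python
import hashlib
import hmac

# Illustrative secret; in practice this would come from a secrets vault.
SECRET_KEY = b"masking-key"

def mask_value(value: str, prefix: str = "masked") -> str:
    """Replace a sensitive value with a stable, irreversible token.

    HMAC-SHA256 makes the mapping deterministic (same input, same
    token) without allowing the original value to be recovered.
    """
    digest = hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()
    return f"{prefix}_{digest[:12]}"

def mask_record(record: dict, sensitive_fields: set) -> dict:
    """Return a copy of the record with the sensitive fields masked."""
    return {
        key: mask_value(val) if key in sensitive_fields else val
        for key, val in record.items()
    }

customer = {"id": "42", "name": "Jane Doe", "email": "jane@example.com"}
masked = mask_record(customer, {"name", "email"})
```

Because the masking is deterministic, a customer's name masked in one table matches the same name masked in another, so joins in the masked dataset still work. Dynamic masking differs in that tokens are substituted at query time rather than persisted.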
About K2View
At K2View, we believe that every company should be able to liberate and elevate its data to deliver the most personalized and profitable customer experience in its industry, while being innovative and radically agile. With K2View, companies manage data in a whole new way, using a business lens: they create data products that continually sync, transform, and serve data from siloed source systems, delivering a real-time, holistic view of any business entity to any data consumer.

Our Data Product Platform fuels operational and analytical workloads at enterprise scale. It deploys as a data fabric, data mesh, or data hub, in an on-premises, cloud, or hybrid architecture, in a matter of weeks, and adapts to change on the fly. The most data-intensive companies in the world, including AT&T, Verizon, American Express, Vodafone, and Hertz, trust the K2View Data Product Platform for their operational use cases, spanning Customer 360, cloud migration, test data management, data tokenization, and legacy application modernization.