Welcome to the big data and data management community on BrightTALK. Join thousands of data quality engineers, data scientists, database administrators and other professionals to find more information about the hottest topics affecting your data. Subscribe now to learn about efficiently storing, optimizing a complex infrastructure, developing governing policies, ensuring data quality and analyzing data to make better informed decisions. Join the conversation by watching live and on-demand webinars and take the opportunity to interact with top experts and thought leaders in the field.
Joshua Robinson - Founding Engineer, FlashBlade | Recorded: Jan 17, 2019 | 40 mins
Learn how Pure Storage engineering manages streaming 190B log events per day and makes use of that deluge of data in our continuous integration (CI) pipeline. Our test infrastructure runs over 70,000 tests per day, creating a triage problem so large it would require at least 20 triage engineers. Instead, Spark’s flexible computing platform allows us to write a single application for both streaming and batch jobs to understand the state of our CI pipeline with a team of just 3 triage engineers. Using encoded patterns, Spark indexes log data for real-time reporting (streaming), uses machine learning for performance modeling and prediction (batch), and finds previous matches for newly encoded patterns (batch).
Resource allocation in this mixed environment can be challenging; a containerized Spark cluster deployment and disaggregated compute and storage layers allow us to programmatically shift compute resources between the streaming and batch applications. This talk will go over design decisions made to meet streaming and batch SLAs in hardware, data layout, access patterns, and container strategy. We will also go over the challenges, lessons learned, and best practices for this kind of setup.
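The pattern-matching core of such a triage pipeline can be sketched in plain Python; the pattern names and log lines below are invented for illustration, and the production system would run this kind of logic at scale inside Spark rather than in a single process:

```python
import re

# Hypothetical encoded failure patterns (names and regexes are invented for
# this sketch). In the described pipeline, a table like this drives both the
# real-time streaming index and the batch back-matching jobs.
PATTERNS = {
    "disk_timeout": re.compile(r"timeout waiting for device"),
    "oom_kill": re.compile(r"Out of memory: Killed process"),
}

def classify(line):
    """Return the first pattern name that matches a log line, else None."""
    for name, pattern in PATTERNS.items():
        if pattern.search(line):
            return name
    return None

logs = [
    "ERROR timeout waiting for device /dev/sdb",
    "INFO test passed in 3.2s",
    "Out of memory: Killed process 4242 (java)",
]
matches = [classify(l) for l in logs]
```

Keeping each failure signature as a named, compiled pattern is what lets one table serve both jobs: the streaming application classifies new log lines as they arrive, while a batch job can re-scan historical logs whenever a new pattern is encoded.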
Nick Dearden, Confluent; John Thuma, Arcadia Data; Thomas Clarke, RCG Global Services | Recorded: Jan 16, 2019 | 56 mins
Digital transformation is more than just a buzzword; it’s become a necessity in order to compete in the modern era. At the heart of digital transformation is real-time data. Your organization must respond in real time to every customer experience, transaction, sale, and market movement in order to stay competitive.
Streaming data technologies like Apache Kafka® and Confluent KSQL, the streaming SQL engine for Apache Kafka, are being used to detect and react to events as they occur. Combining this technology with the analytics insights from RCG and visualizations from Arcadia Data delivers a powerful foundation for driving real time business decisions. Use cases span across industries and include retail transaction cost analysis, automotive maintenance and loyalty program management, and credit card fraud detection.
Join experts from Confluent, RCG and Arcadia Data for a discussion and demo on how companies are integrating streaming data technologies to transform their business.
Watch now to learn:
-Why Apache Kafka is widely used for real-time event monitoring and decisioning
-How to integrate real-time analytics and visualizations to drive business processes
-How KSQL, streaming SQL for Kafka, can easily transform and filter streams of data in real time
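As a rough illustration of the transform-and-filter step a KSQL statement expresses (something like `SELECT user, amount FROM payments WHERE amount >= 1000`), here is a minimal Python sketch; the field names and threshold are assumptions for illustration, not part of the webinar:

```python
# Minimal sketch of a streaming filter-and-project step: keep only
# high-value transactions and emit a subset of fields. The event shape and
# the 1000 threshold are invented for this example.
def filter_transform(events, min_amount=1000):
    for event in events:
        if event["amount"] >= min_amount:
            yield {"user": event["user"], "amount": event["amount"]}

stream = [
    {"user": "alice", "amount": 2500, "region": "EU"},
    {"user": "bob", "amount": 40, "region": "US"},
]
flagged = list(filter_transform(stream))
```

The generator mirrors the continuous nature of a streaming query: each event is evaluated as it arrives, rather than after a batch has accumulated.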
Clark Bradley, Solutions Engineer at Zaloni | Recorded: Jan 16, 2019 | 50 mins
How does your organization collaborate with data? Aligning data management tasks across any size organization can be a challenge. This can be attributed to a lack of transparent data access, lack of big data skills, or antiquated toolsets that do not enable shared metadata for clear lineage of the data. Regardless of the reason, the results are slow, rigid decision-making processes.
While modernizing your data architecture for more agility can seem overwhelming, with an integrated platform that enhances collaboration, organizations can reap the benefits of quality data that is well understood. The data platform should give users the ability to fully understand all aspects of the data through a simple, unified user interface where the business and IT can define, transform and provision the data, all while providing right-sized governance for access, security and auditability.
Join Clark Bradley, Solutions Engineer with Zaloni, as he tackles modernizing your data platform and explains how your organization can expand collaborative practices with the Zaloni Data Platform.
By the end of the presentation, you’ll be able to answer these questions:
- Why is a data catalog important?
- What do I need to know about data quality?
- How does self-service play a role in the data strategy?
John Armstrong, Head of Product Marketing, Pepperdata | Recorded: Jan 16, 2019 | 40 mins
Leveraging APM to Overcome Big Data Development and Infrastructure Performance Challenges
While businesses are deriving tremendous insights from ever-growing big data sets, development teams are challenged with increasingly resource-hungry workloads and overwhelming bottlenecks that impact productivity. This makes big data application performance management (APM) a must-have in today’s ecosystem. Join us to learn how APM can help enterprises overcome development and performance challenges associated with growing big data stores.
Attendees will learn:
- What is driving the demand for big data in application development
- Challenges application developers face when working with increasingly larger workloads
- How APM can mitigate these and other challenges, improve workflow productivity, and optimize resource effectiveness
Kapil Chhabra, Principal Product Manager at Rubrik, and Aaron Delp, Director of Technical Solutions at Rubrik | Recorded: Jan 16, 2019 | 30 mins
Native replication in NoSQL databases such as MongoDB, Apache Cassandra, and DataStax provides high scale and availability. But does it protect critical data against logical errors and data corruption? Is native replication a good substitute for disaster recovery and other enterprise backup needs? Is your database protected for compliance and governance use cases?
Watch this 30 minute webinar to learn about the top fallacies of using only native NoSQL database replication and the consequences of this approach to enterprise backup and recovery.
Brendan Peterson, Sr. Technical Evangelist | Recorded: Jan 16, 2019 | 13 mins
Learn how TIBCO Cloud™ Integration with Scribe and other TIBCO products are better together than standalone. We'll show you how to aggregate data and manage changes to make data import in TIBCO Spotfire® more efficient. See how TIBCO Mashery® can manage access to the TIBCO Cloud Integration API (which also helps your customers and partners manage their integrations), control versions of individual API endpoints, and more.
Host: Eric Kavanagh, CEO, The Bloor Group; Analyst: Wayne Eckerson, Principal Consultant, Eckerson Group; Guest: Gene Arno | Recorded: Jan 15, 2019 | 60 mins
We know that the evaluation process for choosing the right embedded BI product for your use case can be overwhelming. So here’s an easy webinar from a 3rd party that discusses:
-Benefits and challenges of adopting embedded BI
-Key issues with evaluating vendors
-Top 12 evaluation criteria you need to consider
-Live demo of one embedded BI product against the evaluation criteria
You’ll get to hear from business intelligence expert Wayne Eckerson from the Eckerson Group. Wayne will discuss key insights from his popular report, “Ultimate Guide to Embedded Analytics.” He’ll cover criteria such as embedding, customization, extensibility, multi-tenancy, and vendors.
Dave Montgomery, Director, Storage Platforms Marketing | Recorded: Jan 15, 2019 | 33 mins
The ‘general-purpose’ architectures that have served so well in the past are reaching their limits of scalability, performance and efficiency, typically using a uniform ratio of resources to address all compute, storage and network bandwidth requirements. As a result, the ‘one size fits all’ approach is no longer effective for data-intensive workloads. Today’s data-centric architectures, which address applications as diverse as big data, fast data, data analytics, artificial intelligence (AI) and machine learning (ML), require capabilities that give more control over the blend of resources each application needs, so that processing, storage and network bandwidth can be optimized and scaled independently of one another, enabling both flexibility and composability.
Composable Disaggregated Infrastructures (CDIs) are becoming a popular solution – delivering greatly improved TCO and addressing the inflexible nature of many current IT architectures. With a total market CAGR of 58.2% (forecasted from 2017 to 2022), CDIs treat physical compute, storage and network fabric resources as services and use an API to create a virtual application environment that provides whatever resources the application needs in real-time to meet workload demands.
This webinar will cover the current economic pain-points of today’s IT infrastructures and introduce Western Digital’s OpenFlex™ line of composable infrastructure, fabric-attached storage products.
Sheila FitzPatrick, President, FitzPatrick & Associates, and Patrick McGrath, Digital Transformation Leader, Commvault | Recorded: Jan 15, 2019 | 52 mins
Companies raced to meet the May 2018 deadline for Europe’s General Data Protection Regulation with varying degrees of success, but if they have learned anything since, it is this: GDPR isn’t a one-and-done exercise. The GDPR might be the most eye-catching data privacy law on the books because of its outsized potential fine (up to 4% of revenue), but it is simply one of the first and most far-reaching such laws. Many more are emerging around the world and, indeed, in many US states. Some of the regulations overlap, while others conflict, meaning organizations face the ugly reality of dealing with a patchwork quilt of laws.
In this webinar we’ll examine:
- Lessons learned to date
- The key differences between the new regulations
- Pain points companies are having (simply shutting off customers in the EU is not a sustainable practice)
- Best approaches moving forward
The typical anti-money laundering (AML) process can be complicated, difficult, lengthy, highly manual, and prone to false positives. The data required for AML investigations is often complex, highly connected, and can come from multiple data sources within an organisation as well as from external providers.
Learn how to use Neo4j and the power of connected data along with graph analytics techniques to improve anti-money laundering investigations and compliance. We’ll demonstrate how Neo4j enables faster and more accurate AML investigations, even across diverse and siloed data landscapes.
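In Neo4j this kind of investigation would be a Cypher path query; as a language-neutral sketch of the underlying connected-data idea, here is a stdlib Python breadth-first search for a chain of transfers between two accounts (the accounts and edges below are invented for illustration):

```python
from collections import deque

# Toy transfer graph: account -> accounts it has sent funds to.
# Accounts and edges are invented to illustrate connected-data AML analysis.
transfers = {
    "acct_A": ["acct_B"],
    "acct_B": ["acct_C", "acct_D"],
    "acct_C": ["acct_E"],
}

def find_path(graph, source, target):
    """Breadth-first search for a chain of transfers from source to target."""
    queue = deque([[source]])
    seen = {source}
    while queue:
        path = queue.popleft()
        for nxt in graph.get(path[-1], []):
            if nxt == target:
                return path + [nxt]
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

chain = find_path(transfers, "acct_A", "acct_E")
```

A graph database makes this kind of multi-hop traversal a first-class query instead of a chain of relational joins, which is why connected data suits AML investigation workloads.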
Learn how DFLabs’ Security Orchestration, Automation and Response solution, IncMan SOAR, integrates and performs seamlessly with Cisco’s security suite, including its latest integration with Cisco AMP for Endpoints.
As organizations are exposed to more advanced and frequent attacks, speed of detection and response is critical in reducing financial and reputational damage.
Cisco AMP for Endpoints leverages cloud-based analytics to detect and respond to advanced threats in real-time. Used with Cisco’s security suite, including Threat Grid, Umbrella and Umbrella Investigate, threats can be assessed, and assessments of the network performed; but this consumes valuable analyst time.
IncMan SOAR allows security teams to automate repeatable tasks, including enriching initial threat indicators, allowing more time to focus on tasks which require human intervention.
By combining these solutions, security teams can automate and orchestrate the process from initial alert, to containment and remediation, reducing actionable detection and response times from hours to seconds.
Andy El Maghraby, FlashBlade Systems Engineer at Pure Storage | Recorded: Jan 11, 2019 | 37 mins
With the Data Hub, Pure Storage has defined a new class of storage architecture, designed around four principles for modern analytics. It is a data-centric architecture for backup and data warehouse, data lake, streaming data analytics, and AI.
For customers who want to unify and share their data across applications, Data Hub takes the key strengths of each of these four data silos, the unique capabilities that make each suited to its own tasks, and integrates them into a single unified platform.
FlashBlade combines all the qualities of these data silos: high throughput, native scale-out, multidimensional performance, and a massively parallel architecture.
Clarke Patterson, Head of Product Marketing, StreamSets; Kirit Basu, Head of Product Management, StreamSets | Recorded: Jan 10, 2019 | 46 mins
With streaming platforms like Kafka, data arguably never rests. As data flows through and across data sources and destinations, it’s possible that sensitive data goes unnoticed and potentially gets into the hands of the wrong people or lands in the wrong applications. In-stream data protection helps ensure that any data flowing through Kafka is protected from unwanted use and exposure.
In this session you'll learn:
-How to implement global data protection policies for all streaming data
-Detecting and protecting sensitive data within individual Kafka pipelines
-Implementing multiple data security policies to augment data at rest solutions
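A minimal sketch of the in-stream protection idea, assuming a hypothetical record shape and a single masking policy for 16-digit card numbers (the webinar describes configurable, multiple policies; this shows only the core mechanism):

```python
import re

# Mask anything that looks like a 16-digit card number before a record is
# forwarded downstream. The record shape and the single policy here are
# assumptions for illustration only.
CARD = re.compile(r"\b(?:\d{4}[- ]?){3}\d{4}\b")

def protect(record):
    """Return a copy of the record with card-like strings masked."""
    return {k: CARD.sub("****-****-****-****", v) if isinstance(v, str) else v
            for k, v in record.items()}

record = {"user": "alice", "note": "paid with 4111-1111-1111-1111"}
safe = protect(record)
```

Applying the policy per record, in flight, is what distinguishes in-stream protection from data-at-rest controls: sensitive values never reach a downstream topic or application in the clear.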
Chintan Udeshi, Security Product Marketing, Infoblox; Brandon Dunlap, Moderator | Recorded: Jan 10, 2019 | 54 mins
Most organizations have multiple products and services, from multiple vendors and suppliers, to address their cybersecurity needs. The lack of integration and inability to share critical information results in silos of technology that cause inefficiency, lack of agility, limited visibility and a poor security posture. How can an organization streamline and improve its cybersecurity operations? Join Infoblox and (ISC)2 on January 10, 2019 at 1:00PM Eastern for an examination of how Infoblox and Fortinet have joined together to assist organizations in improving their security operations and reducing time to containment.
Gerd Danner from SAP & Frank Schuler from BackOffice Associates | Recorded: Jan 10, 2019 | 60 mins
Anyone planning a move to SAP S/4HANA will want to optimise their investment to achieve full business benefits from day one. This means getting it right from the outset, starting with the migration. Yet data migration is often not highlighted as a key workstream in S/4HANA projects. So, considering that so many data migration projects run over time and/or over budget, the question to ask is: how is your business planning to do this?
This informative webinar will discuss the various migration options available and share best practice customer cases.
Gerd Danner Vice President EMEA Centre of Excellence for Information Management at SAP
Frank Schuler SAP Mentor and Vice President SAP Technical Architecture with BackOffice Associates
Rachel Pedreschi, Senior Director, Solutions Engineering, Imply.io; Josh Treichel, Partner Solutions Architect, Confluent | Recorded: Jan 10, 2019 | 54 mins
Analytic pipelines running purely on batch processing systems can suffer from hours of data lag, resulting in accuracy issues with analysis and overall decision-making. Join us for a demo to learn how easy it is to integrate your Apache Kafka® streams in Apache Druid (incubating) to provide real-time insights into the data.
In this online talk, you’ll hear about ingesting your Kafka streams into Imply’s scalable analytic engine and gaining real-time insights via a modern user interface.
Register now to learn about:
-The benefits of combining a real-time streaming platform with a comprehensive analytics stack
-Building an analytics pipeline by integrating Confluent Platform and Imply
-How KSQL, streaming SQL for Kafka, can easily transform and filter streams of data in real time
-Querying and visualizing streaming data in Imply
-Practical ways to implement Confluent Platform and Imply to address common use cases such as analyzing network flows, collecting and monitoring IoT data and visualizing clickstream data
Confluent Platform, developed by the creators of Kafka, enables the ingest and processing of massive amounts of real-time event data. Imply, the complete analytics stack built on Druid, can ingest, store, query and visualize streaming data from Confluent Platform, enabling end-to-end real-time analytics. Together, Confluent and Imply can provide low latency data delivery, data transform, and data querying capabilities to power a range of use cases.
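The kind of real-time rollup an analytics engine like Druid performs on an incoming stream can be illustrated with a toy tumbling-window aggregation; the events and window size below are invented for illustration:

```python
from collections import defaultdict

# Sketch of a streaming rollup: bucket events into fixed windows and count
# per key. Timestamps (seconds) and event keys are invented for this example.
def rollup(events, window_secs=60):
    """Aggregate (timestamp, key) events into per-window counts."""
    counts = defaultdict(int)
    for ts, key in events:
        bucket = ts - (ts % window_secs)  # start of the tumbling window
        counts[(bucket, key)] += 1
    return dict(counts)

events = [(0, "click"), (30, "click"), (65, "click"), (70, "view")]
agg = rollup(events)
```

Pre-aggregating by time window is what keeps query latency low at high ingest rates: dashboards read compact per-window summaries rather than scanning raw events.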
Jatin Hansoty, Director of Solutions Architecture | Recorded: Jan 9, 2019 | 58 mins
A majority of the data collected by organizations today is wasted, whether through poor analytics, a lack of resources, or simply having too much of it. So how can organizations turn this around and actually start utilizing their data for powerful results?
More and more companies are taking their customer, product, patient, or other data and providing a 360-degree view using a governed and actionable data lake. By breaking down the silos associated with traditional data located in disjointed systems and databases, companies are finding new ways to improve loyalty programs, product development, marketing campaigns, and even find a new source of revenue from their data.
Join Jatin Hansoty, Director of Solutions Architecture at Zaloni, as he dives into real-world use cases from several of the world’s top companies. Learn from their architecture and the results they achieved.
Topics covered include:
- Best practices
- Common pitfalls to avoid
- Real-world use cases
- Future-proof architecture
Brian Bulkowski, CTO & Co-founder, Aerospike, and Alper Ilkbahar, Vice President & General Manager, Memory and Storage, Intel | Recorded: Jan 9, 2019 | 59 mins
Intel’s Optane DC persistent memory with 3D XPoint technology will fundamentally change the database market. Machine learning-based applications such as fraud prevention, digital payments, real-time bidding and recommendation engines ingest hundreds of terabytes or more of data continuously.
Effectively storing and accessing this data to allow the applications to make the right decisions in real-time is critical. Intel Optane DC persistent memory is a ground breaking innovation that will enable databases to scale and meet the demands of modern machine learning based applications.
Intel and Aerospike have been working for years to test and tune how Optane DC persistent memory works for production level environments.
Aerospike Enterprise Edition 4.5 is the first commercially available open database that takes full advantage of Intel Optane DC persistent memory (PM) technology. The combination of Aerospike 4.5 and Optane DC promises to deliver massive, orders-of-magnitude scale improvements at low cost, with unsurpassed persistence and reliability.
Join this Special Customer Briefing featuring Brian Bulkowski, CTO & Co-founder of Aerospike, along with Alper Ilkbahar, Vice President & General Manager Memory and Storage Solutions Group of Intel. Each will share perspective on the significance of these advancements.
You’ll hear details on Intel Optane DC Persistent Memory directly from one of Intel’s most influential executives and Brian’s expectations for what 4.5 will mean for Aerospike customers.
Andy Sheldon, VP Marketing, Unifi Software | Recorded: Jan 9, 2019 | 59 mins
Whichever media company or industry analyst you favor, everyone is talking about data catalogs. Gartner now places Data Catalogs on the declining slope of its Hype Cycle, a sure sign that this is no longer a fad technology and has reached the commercial critical mass to justify an actual product category, even if that does not yet extend to a Magic Quadrant. There are now enough data catalogs in daily use at both large and small organizations to underscore their value across a variety of use cases, including analytics, big data projects, data management and business intelligence initiatives. In this webinar, Andy Sheldon, VP of Marketing at Unifi, will describe use cases and the value proposition for a number of Unifi Data Catalog customers, along with some of the early “aha” moments when customers implement and start using the Data Catalog.
Miguel Torres - Principal Architect, TIBCO Software Inc. | Recorded: Jan 9, 2019 | 52 mins
Blockchain, the tech underneath the exploding cryptocurrency Bitcoin, is turning the supply chain world upside down – in a good way! Blockchain is shaping up to be the answer to many supply chain problems, like lack of trust, transparency, and traceability. Want to know how you can take advantage of this emerging technology and become a digital supply chain leader?
Join this session to learn how blockchain is transforming the supply chain world, the pros and cons of creating your own global supply chain blockchain, and solution offerings to make blockchain a reality for you. The session will discuss the benefits, challenges, and requirements of integrating into a global blockchain, with a live demo demonstrating a real-world example.
What you will learn:
Benefits of blockchain in the supply chain world
-Tamper proof system
-Minimize courier costs
-Improve inventory management
Public versus permissioned blockchains
How to integrate into a permissioned supply chain blockchain
This webinar gently introduces the H2O Driverless AI tool to data scientists at all levels; BI analysts on the path to becoming data scientists will also find it very useful. Discussion of a business problem will be followed by a quick demo. Without writing a single line of code, we will build a production-deployable AI model. Learn how to choose a target variable and a scorer, and how to adjust the Accuracy, Time and Interpretability settings to build a model. The webinar will also explore how to interpret complex non-linear models with simple visuals that can be used to communicate easily with the business or regulators.
About Karthik Guruswamy:
Karthik is a “business first” data scientist. His expertise and passion have always been around building game-changing solutions using an eclectic combination of algorithms drawn from different domains. He has published 50+ blogs on “all things data science” on LinkedIn, Forbes and Medium over the years for a business audience, and speaks at vendor data science conferences. He also holds multiple patents around desktop virtualization and ad networks, and was a co-founding member of two startups in Silicon Valley.
Kara Gillis, Director of Product Marketing, Splunk | Recorded: Jan 8, 2019 | 34 mins
You’ve heard about the top trends in IT - but how do you use this information to take meaningful action? How do you encourage collaboration between the business, developers, and operations? How do you up-level IT as a service provider that reduces manual processes and context-free troubleshooting to focus on strategic initiatives that impact important KPIs? By avoiding these 8 mistakes IT practitioners make!
In this webinar, we’ll talk through avoiding:
-Friction between IT and the business
-Cumbersome and difficult root cause analysis
-Not preparing for Incident response
-And many others!
You’ll walk away with an understanding of how to apply these lessons to your own organization and operate at maximum scale!
Scott Andersen, Solutions Consultant, TIBCO Software | Recorded: Jan 8, 2019 | 60 mins
Create a beautiful report-building oasis for your users.
Providing your users and customers with helpful reports and dashboards is one thing. But you can’t expect to predict every question they will have.
Ad hoc or self-service reporting puts the power of report-making into your users’ hands. Provided a drag-and-drop interface, users of any skill level can build reports and get answers to custom questions—all on their own.
With a few simple steps, you can turn your complex source data into easy-to-understand fields & measures and create a beautiful report-building oasis for your users.
In this webinar, you will learn how to:
Prepare data for your users by creating a metadata layer that makes it easy for them to understand
Build an ad hoc report from scratch using an intuitive, web-based design environment
Make customizations to reports, save them for later use, and share reports with others
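The metadata layer described above can be pictured as a simple mapping from raw source columns to friendly fields and measures; the column names below are invented for illustration:

```python
# Toy metadata layer: map cryptic source columns to friendly display fields
# so report builders never see the raw schema. Column names are invented.
METADATA = {
    "cust_nm_txt": {"label": "Customer Name", "kind": "dimension"},
    "ord_amt_usd": {"label": "Order Amount (USD)", "kind": "measure"},
}

def friendly_row(raw_row):
    """Project a raw source row onto the curated fields only."""
    return {METADATA[col]["label"]: val
            for col, val in raw_row.items() if col in METADATA}

row = {"cust_nm_txt": "Acme Corp", "ord_amt_usd": 129.5, "internal_flag": 1}
display = friendly_row(row)
```

Because the mapping also drops columns it doesn’t know about (like `internal_flag` here), the same layer that renames fields can double as a simple curation boundary between source data and self-service users.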
Frank Liberio, Former Global CIO at McDonald’s, and Cat Huegler, Senior Director, Customer Success at MuleSoft | Recorded: Jan 8, 2019 | 39 mins
Hear from Frank Liberio, former CIO of McDonald's, as he reveals the success behind the global super chain’s digital transformation initiatives. Learn how his team utilized MuleSoft’s Anypoint Platform to open up new digital channels at the speed required by the increasingly competitive marketplace. From mobile applications and kiosks to partnerships with on-demand delivery apps, McDonald’s created a truly omnichannel experience for their customers.
Mohit Dhawan, VP Engineering | Jan 17, 2019, 6:00 pm UTC | 45 mins
Migrating NAS file data can be a nightmare. Join VP of Engineering, Mohit Dhawan, as he walks through how Komprise eliminates the errors and the guesswork by automating the migration with a reliable solution that is resilient and handles network and storage glitches.
Data virtualization started out as the most agile, real-time approach to enterprise data integration; it is now proving to go beyond that initial promise and is becoming one of the most important enterprise big data fabrics.
Attend this session to learn:
* What data virtualization really is
* How it differs from other enterprise data integration technologies
* Why data virtualization is finding enterprise wide deployment inside some of the largest organizations
Dmitry Litsov, Senior Systems Engineer | Jan 22, 2019, 8:00 am UTC | 60 mins
As business-critical data becomes increasingly distributed, from the core to the network edge and the clouds, and as new regulations (GDPR) and new threats to business operations (such as cyberattacks) emerge, data protection becomes essential and a priority in IT transformation. Join us to learn how the Dell EMC data protection portfolio can help you effectively transform your data protection strategies with a comprehensive approach, regardless of whether your data resides on premises or across multiple clouds, and however you use it. You will also learn what is new this year in data security and protection in Data Domain, the Integrated Data Protection Appliance (IDPA) and the Data Protection Suite (DPS) software, which further simplify, automate and modernize your current data protection environment, delivering a transformative data protection infrastructure to meet future needs.
Jon Gery, International IT Manager, GameStop, and Seyi Verma, Product Marketing Director, Druva | Jan 22, 2019, 10:00 am UTC | 52 mins
GameStop, a global Fortune 500 company (NYSE: GME), needed an efficient cloud backup and recovery solution to meet business continuity requirements across its 7,200 retail stores in 14 different countries. The company had a myriad of virtualized environments with over 250TB of data, but its on-premises backup infrastructure and procedures were so complex and inefficient that GameStop was unable to meet business continuity SLAs.
GameStop turned to Druva, and secured its data in enterprise server workloads and virtual environments through Druva Phoenix. Register for this webinar with special guest speaker Jon Gery, International IT Manager at GameStop, to hear how the company was able to:
Unify global data protection requirements within a single system
Increase the backup footprint of virtual machines by 20%
Easily achieve their business continuity SLAs
Reduce backup and recovery operating expenses by 70%
Don't miss this informative webinar — Register today!
Sagi Grimberg, Lightbits; J Metz, Cisco; Tom Reu, Chelsio | Jan 22, 2019, 6:00 pm UTC | 75 mins
In the storage world, NVMe™ is arguably the hottest thing going right now. Go to any storage conference, vendor-specific or vendor-neutral, and you’ll see NVMe touted as the latest and greatest innovation. It stands to reason, then, that when you want to run NVMe over a network, you need to understand NVMe over Fabrics (NVMe-oF).
TCP – the long-standing mainstay of networking – is the newest transport technology to be approved by the NVM Express organization. This can mean really good things for storage and storage networking – but what are the tradeoffs?
In this webinar, the lead author of the NVMe/TCP specification, Sagi Grimberg, and J Metz, member of the SNIA and NVMe Boards of Directors, will discuss:
• What is NVMe/TCP?
• How does NVMe/TCP work?
• What are the trade-offs?
• What should network administrators know?
• What kinds of expectations are realistic?
• What technologies can make NVMe/TCP work better?
John Bell, Sr. Consultant, and Brian Guetzlaff, Engineering Manager | Jan 22, 2019, 7:00 pm UTC | 60 mins
In this webinar, John Bell and Brian Guetzlaff will explore the advanced capabilities of Caringo Swarm’s Content Management API, including managing user access, usage and content (using collections and metadata). They will provide a live demonstration and show you how to use the Swarm Object Storage Content Portal User Interface (UI). Attendees will also have the opportunity to ask questions throughout the webinar.
Tony Liau, Veritas, and Matt Sirbu, Softchoice | Jan 22, 2019, 8:00 pm UTC | 75 mins
Digital transformation is causing infrastructure to evolve faster than ever, and managing data must be the top priority for any modern enterprise. Most organizations are working to transform their IT infrastructure and its associated capabilities, but traditional environments are being disrupted by cloud, virtualization, open source and hyperconverged infrastructure. The need for holistic, modern data management remains as crucial as ever to navigate this complexity and leverage information for strategic advantage.
Join us on this webinar as Tony Liau, Director, Product & Solutions Marketing at Veritas Technologies, and Matt Sirbu, Director of Data Management & Data Center Infrastructure at Softchoice, discuss why an intuitive data management solution is key to overcoming the most important barriers to adopting hybrid IT.
Attendees of this webinar will learn how you navigate through these IT obstacles to build an agile solution that extends across on-prem and off-prem architectures.
About the speakers:
Tony brings over 15 years of enterprise experience to Veritas, taking innovative data management solutions to market. In his current role as Director and Head of Product & Solutions Marketing for Data Protection and management, he leads product launch, go-to-market and global campaign strategy. Before joining Veritas, Tony held senior roles at other data protection and security companies such as Barracuda Networks, Symantec and Cisco.
Matt Sirbu has spent over two decades in information management, focusing on data center optimization, protection, availability and data intelligence solutions. Having covered most market segments across North America, he has demonstrated an ability to enter new industries, learn their unique requirements, and develop comprehensive strategic direction with exceptional results. He joined Softchoice in July 2015, coming from Dell, where he was Director of Storage Strategy for North America.
Evgeny Papantoniou, Windows Server Specialist, Microsoft, and Nikita Stepanov, Systems Engineer, Dell EMC | Jan 23, 2019, 8:00 am UTC | 60 mins
More than 15,000 server clusters use Storage Spaces Direct (S2D) to implement hyper-converged infrastructure (HCI) on Windows Server 2016/2019 and pre-validated hardware. Attend this webinar to learn why companies around the world use Storage Spaces Direct, and how Dell EMC, together with Microsoft, can help you accelerate IT transformation with ready-made Dell EMC S2D Ready Nodes. This webinar will also cover news about Windows Admin Center and the new storage functionality in Windows Server 2019.
Erik Archer Smith, Marketing Director of ABM, Henry Li, Director of Business Development | Jan 23 2019, 6:00 pm UTC, 60 mins
In the last 10 years, the digital advertising space has exploded with the growth of DMPs and DSPs, giving digital marketers new ways to find and attract audiences. In the last few years, the market has shifted again with the introduction of CDPs, giving marketers a new and powerful tool to augment and supercharge existing services. This webinar will walk you through what each service does, when you should consider using it, and best practices for combining them.
This Webinar Will Cover:
- Definitions and use cases for DMPs, DSPs and CDPs
- How to leverage/enrich 1st party data inside existing martech and adtech
- Security and ensuring you maintain customer privacy and ethical use of data
Presentation Leaders: Doug Austin and Tom O'Connor | Jan 23 2019, 6:00 pm UTC, 90 mins
2018 was another notable year for eDiscovery case law with several significant rulings that stand to impact eDiscovery practices and the admissibility of evidence. How can these key case law decisions affect discovery within your organization? This CLE-approved* webcast session will cover key 2018 case law decisions covered by the eDiscovery Daily blog and what the legal profession can learn from those rulings. Topics include:
+ Technology Assisted Review best practices and trends
+ The use of sampling to settle disputes
+ Admissibility vs. proportionality and privacy disputes
+ Form of production disputes and the issues involved
+ Key case rulings on discoverability of mobile device data
+ Privilege disputes and clawback requests
+ Impact of rules changes on boilerplate objections
+ The state of sanctions three years after the 2015 Fed Rules updates
Doug Austin is the Vice President of Products and Services for CloudNine. Doug has over 30 years of experience providing legal technology consulting, technical project management and software development services to numerous commercial and government clients. Doug is also the editor of the CloudNine sponsored eDiscovery Daily blog, which is a trusted resource for eDiscovery news and analysis, and has received a JD Supra Readers Choice Award as the Top eDiscovery Author for 2017 and 2018.
Tom O’Connor is a nationally known consultant, speaker, and writer in the field of computerized litigation support systems. Tom’s consulting experience is primarily in complex litigation matters.
Jeff Ready, CEO and Co-Founder of Scale Computing | Jan 23 2019, 6:00 pm UTC, 60 mins
In today’s global economy, businesses and governments are instrumenting the edge with ever greater intelligence, so the desire for reliable, scalable and agile edge infrastructure solutions continues to grow. Implementing a solution ideal for highly distributed, on-premises environments such as retail stores, bank branches and manufacturing sites, with multiple locations managed by the enterprise from a central location, has become critical and will remain so.
Learn how providers are addressing this market by providing edge infrastructure that has the capacity to run various IT and OT workloads, is space conscious and can be managed at each individual location by generalists. This in turn reduces the time and budget spent managing technology and allows companies to focus more on growing their business and serving their customers.
Attendees will learn:
- How to maximize uptime at the edge
- How ease of use reduces the burden on IT generalists at each location
- How a converged platform simplifies deployment and management
- How automation and intelligence can cut total management costs by 60%-80%
Jeff Ready, CEO of Scale Computing, is a high tech entrepreneur and executive with deep roots in technology and industry trends. Jeff has a keen interest in current and future IT infrastructure technologies that help reduce the management burden and overall cost of IT.
Brian Bulkowski, Founder and CTO, Aerospike; Matt Bushell, Director of Product Marketing, Aerospike | Jan 23 2019, 9:00 pm UTC, 60 mins
Artificial Intelligence / Machine Learning is being implemented to make better decisions to avert fraud and lower risk for digital payments. These systems thrive on data: the more you can feed them, the better they perform. We call this “Hungry AI.” Furthermore, fraud and risk decisions need to be conducted in real-time (milliseconds or less).
Unfortunately, most data infrastructures aren’t built to handle real-time data analysis, ingesting and acting on massive datasets at the global scale needed to mitigate these problems.
Register now for this webinar to learn:
- The key differences between offline and online AI/ML
- Why more data is better, and the measurable impacts to your business
- Real-world, real-time AI/ML use cases for fraud and risk prevention deployed today
Patrick Smith, Pure Storage | Jan 24 2019, 10:00 am UTC, 45 mins
The increasingly competitive climate within Financial Services means that enhancing business outcomes by leveraging AI is essential. Data has never been more important to business success, and a key way to optimise its value is to pair it with AI.
In this session we will discuss the areas in which Financial Services are looking to leverage their data together with AI and some of the considerations for successful implementation of an AI infrastructure that accelerates time to value for these projects.
Florent Voignier, Founder and Chief Technical Officer, Indexima | Jan 24 2019, 3:00 pm UTC, 22 mins
Every day, your business is producing new data, regardless of how many employees you have or what industry you work in. That data is full of valuable information that can help you better serve your customers, discover new opportunities, and take your business to the next level.
But to realize that business value, you need the proper tools to transform that data into actionable insight.
Best practices you will learn:
1. How to reduce query cost and increase your BI performance.
2. Key factors for succeeding with BI in the cloud.
3. How to get more from your existing data cluster.
Mike Harding, Product Manager - Microsoft Storage Solutions, HPE | Jan 24 2019, 4:00 pm UTC, 60 mins
Learn how leading companies are improving application performance, reducing IT cost and simplifying their IT administration with Microsoft solutions from HPE. This free webinar introduces HPE Microsoft Storage Solutions that ensure business-critical performance and availability for Microsoft SQL Server, Exchange, SharePoint, Windows cloud and Azure Stack infrastructure. Customer case studies and solution offering details will show how to get enhanced SQL Server performance, higher-density and larger-scale Exchange environments, and simplified administration across your Windows environment with free plug-ins and automation tools.
Join us as we continue this series of webinars specifically designed for the community by the community with the goal to share knowledge, spark innovation, and further build and link the relationships within our HPCC Systems community.
Featured speakers and topics include:
•Rob Mansfield, Senior Data Scientist, Proagrica - Dapper - A bundle to make your ECL neater
Have you ever written a long PROJECT for a simple column rename and thought, "this should be easier"? What about nicely named output statements? Yeah, they bother me too. Oh, and DEDUP(SORT(DISTINCT()))? There is a better way! Learn how Dapper can help!
•Bob Foreman, Senior Software Engineer, HPCC Systems, LexisNexis Risk Solutions - ECL Tip: The Seven Faces (Forms) of Dr. LOOP (Function)
The LOOP function has always been a powerful, yet tough ECL function to understand and use. Bob will review and examine the upcoming major changes to this documentation and showcase new examples.
•Lorraine Chapman, Consulting Business Analyst, LexisNexis Risk Solutions - Update on Academic Collaboration
Lorraine will share an update on recent collaboration, upcoming academic events and the 2019 HPCC Systems Internship Program.
John Withers, Product Marketing Manager, MuleSoft, and Jimil Patel, Product Marketing Manager, MuleSoft | Jan 24 2019, 5:30 pm UTC, 63 mins
Business processes are complicated and often manual due to an underlying technology landscape that is not connected, constantly changing, and increasingly complex. With MuleSoft’s Anypoint Platform™, unlock data from your systems, orchestrate business processes that automate and optimize workflows, and package them as reusable services to transform your organization into a digital platform.
Across industries from healthcare to banking to retail, MuleSoft is powering business automation, and helping companies deliver business processes as reusable services, resulting in faster innovation, improved customer satisfaction, and increased revenue.
Watch this webinar, which includes a demo of the Anypoint Platform, to learn how to:
- Unlock systems 64% faster: Easily connect to core systems of record and unlock data with out-of-the-box connectors and graphical data mapping
- Accelerate process automation: Orchestrate processes using a drag-and-drop development environment supported by a library of pre-built implementation templates
- Reuse business capabilities: Package processes into reusable services and publish them for internal and external use
Krishna Subramanian, COO, Komprise, and Jon Toor, CMO, Cloudian | Jan 24 2019, 6:00 pm UTC, 45 mins
The sheer volume of data produced today is staggering, doubling approximately every two years. But this is nothing compared to how data is set to grow over the next decade. Traditional approaches to managing your organization's data struggle to keep up with today's scale and will prove to be entirely inadequate over the next few years.
Can your current environment support this surging data? How will you manage this growth – while keeping cost-effective and meeting your business SLAs? Are you prepared?
Join Komprise and Cloudian for this informative webinar. We will discuss data management and protection strategies that you can implement today to enable your organization to scale with tomorrow's exponential data growth.
Some of the topics to be covered:
- Data tiering and migration strategies
- Achieving cost-effective data management
- Strategies for hybrid and multi-cloud environments
- Flexible data protection options
- Architectural considerations across your storage environments
David Clark, Host, VentureBeat | Jan 24 2019, 6:00 pm UTC, 60 mins
By 2020, 90 percent of businesses will have moved to a hybrid cloud infrastructure, seizing both the competitive advantages of digital evolution and the economic and strategic benefits of maintaining legacy applications on premises. But once your data moves outside your private architecture, you're facing a new world of potential attacks and security breaches that your current security practices just can't match.
Just for starters, not only will you need to implement effective RBAC to lower the risk of unauthorized access and enforce mandatory multi-factor authentication, you'll also have to tap into the power of automation to validate security compliance baselines, detect unauthorized cloud config changes, and power self-healing infrastructure. Additionally, application-centric security is a must, with real-time visibility into application dependencies and performance metrics, plus automatic micro-segmented security policy enforcement, and more.
To learn more about the advantages of hybrid cloud architectures, where your security needs shoring up, and how to best protect your enterprise and data with automated and application-centric security practices, don't miss this VB Live event!
Register for free now.
* Why you need a single, fully tested, security-first infrastructure platform
* How to converge storage, computing, and networking
* A full understanding of security best practices
* How to protect against data breaches, unauthorized access, and other threats in a multi-cloud world
Dave Clark, Host, VentureBeat
Mike Wronski, Principal Marketing Manager, Nutanix
Niel Ashworth, Security Solutions Architect, Nutanix
Demetrius Comes, VP of Engineering, GoDaddy
Matheen Raza and Sandeep Dabade from Qubole | Jan 24 2019, 6:00 pm UTC, 60 mins
As the volume, variety, and velocity of data increases, the cloud is the most efficient and cost-effective option for machine learning and advanced analytics. Organizations looking to scale their big data projects can do so with greater ease with a cloud-native data platform.
Qubole provides a single platform for data engineers, analysts, and scientists that supports multiple use cases -- from machine learning to predictive analytics. The platform saves organizations up to 50 percent in data processing costs by leveraging multiple engines like Apache Spark, Presto, and Hive, and automatically provisions, manages, and optimizes cloud resources.
Join experts from Qubole as they demonstrate how to get the most out of your data on the cloud. In this webinar, you'll learn:
- The benefits of a single platform and centralized access to data
- How to pick the right data processing engines and tools
- How to save money with intelligent cluster management and financial governance
- Key considerations for evaluating cloud data platforms
Tom Phelan, Chief Architect, BlueData; Nanda Vijaydev, Director, Solutions, BlueData | Jan 24 2019, 8:00 pm UTC, 60 mins
Join this webinar to learn how you can accelerate your deployment of TensorFlow and AI / ML in Financial Services.
Keeping pace with new technologies for data science, machine learning, and deep learning can be overwhelming. And it can be challenging to deploy and manage these tools – including TensorFlow and many others – for data science teams in large-scale distributed environments.
This webinar will discuss how to deploy TensorFlow and other ML / DL tools in the Banking, Insurance, and Capital Markets industries. Learn about:
-Example use cases for AI / ML / DL in Financial Services – with an enterprise case study
-Using TensorFlow and other ML / DL tools with GPUs and containers
-Overcoming deployment challenges for distributed environments – including operationalization
-How to ensure enterprise-grade security, high performance, and faster time-to-value
Brian Eichman, Director of Cloud Ecosystems & Product Development, CoreSite | Jan 24 2019, 9:00 pm UTC, 20 mins
The increasing need for agility and faster innovation has many organizations implementing cloud-first strategies. Spreading core business services and data across a hybrid cloud environment multiplies complexity.
Watch this webinar to learn how the CoreSite Open Cloud Exchange® can simplify your hybrid IT strategy with a single connection to multiple cloud providers.