Welcome to the big data and data management community on BrightTALK. Join thousands of data quality engineers, data scientists, database administrators and other professionals to find more information about the hottest topics affecting your data. Subscribe now to learn about efficiently storing, optimizing a complex infrastructure, developing governing policies, ensuring data quality and analyzing data to make better informed decisions. Join the conversation by watching live and on-demand webinars and take the opportunity to interact with top experts and thought leaders in the field.
During the next 5 years, machine learning is poised to play a pivotal and transformational role in how IT Infrastructure is managed. Two key scenarios are possible: transforming infrastructure from a set of under-utilized capital assets to a highly efficient set of operational resources through dynamic provisioning based on consumption; and the identification of configurations, dependencies and the cause/effect of usage patterns through correlation analysis.
In the world of IT infrastructure, it’s all about efficient use of resources. With on-premises infrastructure (compute, storage and network) utilization rates for most organizations in the low single digits, the cloud has sold the promise of a breakthrough. For those organizations moving to Infrastructure as a Service (IaaS), utilization in the middle to high teens is possible, and for those moving to Platform as a Service (PaaS), utilization in the mid-twenties is within reach.
Dynamic provisioning driven by demand is essentially the same operational concept as power grids and municipal water systems – capacity allocation driven by where resources are consumed, rather than where they are produced.
The second part of the breakthrough relates to right-sizing infrastructure. Whether this is network capacity or compute Virtual Machine size – machine learning will enable analysis of the patterns of behavior by users and correlate them to the consumption of infrastructure resources.
During the near term, these benefits will be much more tactical. Automated discovery combined with behavioral correlation analysis will virtually eliminate the need for manual inventory and mapping of components and configuration items in the IT ecosystem to reveal how the ecosystem is operating.
Today, IT has the opportunity to automate the mapping of components in their infrastructure to provide a more accurate and actionable picture.
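The correlation analysis described above — relating patterns of user behavior to infrastructure consumption — can be sketched with a simple Pearson correlation. This is an illustrative example with made-up hourly samples, not the method of any specific product:

```python
from statistics import mean
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical hourly samples: active user sessions vs. VM CPU utilization (%).
sessions = [12, 30, 45, 80, 95, 60, 33, 15]
cpu_pct = [8, 22, 35, 61, 74, 48, 25, 11]

r = pearson(sessions, cpu_pct)
print(f"correlation: {r:.3f}")  # a strong positive value suggests sizing the VM to demand
```

A coefficient near 1.0 would support right-sizing the VM against user demand rather than peak-provisioning it; in practice a real system would correlate many metrics across many configuration items.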
Splunk Enterprise Security (ES) is an analytics-driven SIEM that powers successful security operations teams. But did you know it is actually made up of distinct frameworks that can each be leveraged independently to meet specific security use cases?
Join us to learn the technical details behind key ES frameworks including: asset and identity correlation, notable event, threat intelligence, risk analysis, investigation and adaptive response. Splunk experts will discuss real-world examples and demo the key frameworks, which will help you to solve your security challenges.
Girish Bhat, director of security product marketing at Splunk
Chris Shobert, senior sales engineer at Splunk
10min - Overview of Splunk Enterprise Security
40min - Demonstration of key frameworks
10min - Q&A
What if you could directly ask questions of your data and the software could respond with a selection, filter, or new visualization? In this DSC webinar, the Tableau Research team explains natural language queries and how they are (already) helping you visualize your data.
In this webinar, explore:
• What natural language processing (NLP) is
• Examples of how NLP is already used in Tableau's geocoding and map search
• A demonstration of some research prototypes that you might see in the not too distant future.
Come participate in the discussion and tell us what things you would say to or ask your visualization!
DevOps vs. NoOps. Capulets vs. Montagues. The debate has reached Shakespearean proportions, but unlike the tragedy of Romeo and Juliet, peaceful coexistence is nigh. Learn the benefits of both methodologies and how to apply them within your organization harnessing the power of Microsoft Azure.
Join 10th Magnitude Cloud Automation Engineer Chris Nagel and Stackify CEO Matt Watson as they discuss the evolution of the two schools of thought, as well as the roles played by developers and sys admins in the process. Whether you choose the DevOps or NoOps route, you’ll leave our one-act play with a host of tools, use cases and best practices.
Any organization, including the United States government, is only as strong as its weakest link. As the amount of data we collect, analyze, and store proliferates, the risk of cyber attacks significantly increases. In a time when state and local technology systems are interconnected and share sensitive data with each other and with federal databases, cybersecurity along the entire information chain is essential.
This viewcast will examine how governments can collaborate to strengthen their most vulnerable points. Route Fifty and Nextgov will sit down with state and federal officials to discuss the cross-government cybersecurity resources available to technology managers at all levels of government and how each individual, department, and organization can effectively strengthen their cyber defense.
Get inspired and informed by Eric Ogren, Sr Analyst at 451 Group, and Matt Rodgers, Head of Strategy at E8 Security, as they discuss how behavioral analytics is transforming security operations with improved visibility across endpoints, users, and networks. Learn how that translates into better detection and faster investigation time for security incidents.
Behavioral analytics is about un-complicating your security environment. The information is there, but how do you make the most of it? Security analysts should understand what’s happening without having to piece an incident together from scratch.
1. Why security teams are a company’s true heroes
2. Black holes: turning “more data” into “better data”
3. Navigating the grey area between the binary poles of true positive and false positive
Located at the intersection of pharmaceutical manufacturers and healthcare providers, AmerisourceBergen Corporation (ABC) is a $147B global pharmaceutical services and distribution company. The company’s success rests on knowing its diversified customer mix and serving them with innovative programs and solutions.
Formed in 2001, AmerisourceBergen had become a group of companies under one banner. Join us to hear how the company designed a data strategy to view business interactions holistically across multiple business groups to serve customers as a single company. Pete Stormer, Director of Data Management at ABC, will talk about their experience, best practices, and lessons learned to support the journey to one ABC.
The webinar will also cover the following topics:
• The evolution of MDM as a strategic platform to deliver trusted data to business users
• Best practices for master data management, data quality, data governance and reference data
• Defining the enterprise strategy for data management in complex business environments
In the Media and Entertainment industry, organizations that can store and access all digital video at every step in the production process have a significant competitive advantage which can result in new revenue streams. In this webinar, Richie Murray, President and Founder of Bridge Digital (a solutions integrator specializing in helping M&E companies create, distribute and monetize their video assets more efficiently), and Tony Barbagallo, VP of Product for Caringo (a massively scalable storage, delivery and management platform designed specifically for content), will discuss how to leverage secure secondary storage without disruption to existing digital video workflows. Topics covered include:
- Definition of “secure secondary storage” and how current offerings differ from traditional archive solutions
- How secondary storage can help optimize pre-edit, edit and post processes
- How to seamlessly plug secure secondary storage into video workflows
- A demo on optimizing Avid shared storage with Marquis Project Parking through automated transfer to/from the Caringo platform
In this upcoming webinar, Jonah Kowall, VP of Market Development and Insights at AppDynamics, will present his thoughts and opinions on the current and future state of DevOps, specifically targeting enterprises that are undergoing or experimenting with DevOps and agility.
Jonah will explore the demands for agility in order to remain customer focused and drive digital business. He’ll then share some best practices and ideas to overcome one of the largest challenges to business utilizing DevOps — creating trust across organizations and getting your people on board with the creation of a DevOps culture.
- Understand the importance of agility to meet the demands of customers and drive digital business.
- Learn how to build and execute on cross-functional team building.
- Adopt best practices for the essential process of collecting and sharing data across all teams.
- Gain best practices for cultural change, and learn how to facilitate ownership, trust, and independent decision making.
About the presenter:
Jonah Kowall is the VP of Market Development and Insights at AppDynamics, driving product strategy, roadmap and vision, entry into new markets, and digital evangelism. With 15 years of experience as an IT practitioner at several startups and larger enterprises, Jonah has focused on infrastructure and operations, security, and performance engineering. Jonah also previously worked at Gartner, focusing on availability, performance, and IT operations management (ITOM) and headed their influential APM and NPMD Magic Quadrants as a Research VP.
It's time to modernize your disaster recovery process, making your organizational data accessible and secure.
An important aspect of enterprise file services is ensuring the uptime of users and offices around the world, with proper methods in place to recover from any scenario: fire, hardware failure, or, increasingly, a ransomware attack.
Join this webinar with CTERA and Western Digital to learn how to deploy a zero-minute disaster recovery plan, ensuring:
- Zero office downtime
- Significant cost savings and fewer headaches
- Business continuity during internet outages
- Seamless failover of file access permissions
- Improved data reliability
Are you tormented by latency or outages that lead to lost revenue? Are you frustrated with the lack of comprehensive visibility that siloed solutions create? Never fear!
Join this webinar to learn how you can solve these problems and optimize your infrastructure with Dell EMC and Splunk. You’ll also see certified apps that Dell EMC has built in Splunk to help joint customers derive Operational Intelligence from their IT assets. We’ll share best practices, tips and tricks for successful deployments and rapid time-to-value that Splunk and Dell EMC have learned from working with customers around the globe.
Join us as we continue this series of workshops specifically designed for the community by the community with the goal to share knowledge, spark innovation and further build and link the relationships within our HPCC Systems community.
Episode 4 will include 15-minute Tech Talks on HPCC Systems.
In this webinar, join Rhonda Ascierto of 451 and Aaron Peterson with RunSmart OS in a presentation on what is next for infrastructure management software. Far beyond just collecting data, monitoring and alarming, the session will address the efficiency and other added benefits automated control can bring when implemented. Case studies will be presented to highlight examples of recapturing underutilized assets and dynamic application provisioning taking advantage of the cloud.
The Cloud Standards Customer Council has published a reference architecture for securing workloads on cloud services. The aim of this new guide is to provide a practical reference to help IT architects and IT security professionals architect, install, and operate the information security components of solutions built using cloud services.
Building business solutions using cloud services requires a clear understanding of the available security services, components and options, allied to a clear architecture which provides for the complete lifecycle of the solutions, covering development, deployment and operations. This webinar will discuss specific security services and corresponding best practices for deploying a comprehensive cloud security architecture.
Special Promotion for the TIA 5G Breakfast, April 28, in Boston. Visit TIAonline.org to learn more and use code 5G-WEBCAST for 20% discount.
Special promotion for TIA Executive Connectivity Jam, June 5-8, Dallas, TX. Visit TIAConnectivityJam.org. Code: 2017CJ-WEBCAST for 10% discount.
This year, it is expected that 74% of all online traffic will be video with a combination of network broadcasters and Over The Top (OTT) content providers bringing compelling content to and through the Internet. 55% of people in the United States watch video online every day. Facebook, Snapchat, Twitter and Tumblr have launched live streaming capabilities. Twitch, the number one online gaming video platform, sees 9.7 million daily active users (DAUs), with 2 million users broadcasting live streams each month. eSports is a huge gaming draw with some companies seeing upward of 96 million monthly average users (MAUs). And we haven’t even mentioned augmented and virtual reality, yet.
Network speed, capacity and flexibility to meet the growing demand are improving - but not fast enough, and significant needs remain unaddressed. This panel will look at the challenges of getting video to and from end users, including: how to manage the need to deliver to both broadcast and OTT; how to ensure seamless customer experiences; what to expect as 4K and higher resolution cameras become ubiquitous; and what network buildout and optimization need to take place to ensure the demand for ultra high speed broadband is met.
You can continue the conversation after the webcast during the upcoming TIA Connectivity Jam, being held June 5-7 in Dallas, TX.
-- Bo Daly, President and Chief Business Officer, Super Evil Megacorp
-- Yuval Fisher, CTO MVPD, Imagine Communications
-- Kevin Gage, EVP Strategic Development and Chief Technology Officer, ONE Media
-- Rashmi Misra, Worldwide General Manager Digital Video Services, Hewlett Packard Enterprise (HPE)
Marketo helps marketers get the most relevant messages to prospects at the right time. But too many companies’ Marketo systems are full of inaccurate or outdated data—including bad email addresses, postal addresses, and phone numbers.
You need to protect these fields’ quality and integrity if you want to communicate with your customers and prospects, and obtain a single view of them across all channels. Fortunately, this is easy to achieve with the right knowledge and technology.
Attend this webinar to learn how you can:
•Quickly visualize and monitor the health of your Marketo data
•Fix the most impactful data quality problems right inside Marketo
•Enrich your data with business and/or consumer information
Splunk Enterprise is the data collection, indexing and visualization engine for Operational Intelligence. Splunk Enterprise 6.5 delivers fundamental advances in machine learning, data analysis, platform management and deployment TCO.
Join this webinar to see a demo and learn how you can:
• Apply machine learning to help detect, predict and prevent what matters most to your organization
• Use tables to prep and analyze data without using SPL
• Lower your storage costs by rolling historical data to Hadoop
• Use free dev/test licenses to explore new data sources and use cases
• Process critical data without interruption from metered license enforcement
The latest release of Splunk Enterprise 6.5 helps you maximize the value of your data and your investment in Splunk. With the new features, doing big data analysis is now more affordable than ever. See it in action by attending this webinar.
Legal Hold Notifications from CloudNine empower legal and information professionals by enabling them to quickly and comprehensively develop, send, track, and manage legal hold notices to key custodians with the goal of protecting data integrity for potential or current investigations or litigation.
Delivered as an advanced, integrated, and automated option in CloudNine’s cloud-based, SaaS-delivered eDiscovery automation software, Legal Hold Notifications improve the ability of legal and compliance teams to implement legal holds by providing an intuitive and secure tool for marshaling the data preservation process, reducing ESI-related exposure and risk while increasing data and process defensibility. To learn more about Legal Hold Notifications, visit eDiscovery.co.
Dynamic data performance, built-in intelligence and scalability bring Azure SQL to the forefront for app developers. With layers of abstraction, your traditional health metrics can be misleading when it comes to understanding how your Azure SQL workloads are truly performing, and in turn, impacting your development timelines. In this webinar with SelectStar’s Mike Kelly, learn how you can optimize your Azure SQL performance – and in turn, drive better development in the process.
The webinar will also cover:
Key trends driving Azure SQL as a leader in the app development space
Why your Azure SQL performance may fall short – and how to fix it
How to augment monitoring for key elements like DTUs and service groups for maximum performance
*Spend little or no time on data hygiene and data transformation
*Make data accessible across the enterprise through data usage and collaboration
*Quickly identify what new open data or existing data is most valuable for your risk assessments
*Leverage deep learning on the underwriting and claims processes for a positive impact to your combined ratio
The status quo of old monitoring solutions and approaches can't handle today's complex, highly distributed service-oriented architectures. We’ve all tried implementing solutions in the past, expecting that they’d offer more visibility and control. But eventually all we got was just one more tool to maintain that didn't provide much value.
Splunk IT Service Intelligence (ITSI) solves this problem. The solution delivers an advanced data-centric approach to service monitoring driven by machine learning and analytics—taking operations and service intelligence to the next level.
Join us in this webinar for a live demo to see how you can:
- Gain service context by combining event and performance data
- Get the big picture of your environment to streamline operations
- Accelerate root-cause analysis and get ahead of customer-impacting outages
- Prioritize incident investigation and reduce time-to-resolution with events analytics
- Understand how analytics and machine learning can enhance service intelligence
Priya Balakrishnan, Director, Solutions Marketing at Splunk
Alok Bhide, Director, Product Management at Splunk
Learn the tips and tricks from the field you need to ensure application success on Cassandra
With almost 10 million node hours of operational experience running Cassandra, and currently managing and supporting hundreds of clusters and thousands of nodes globally for a diverse range of clients, Instaclustr has seen it all.
Our 45 minute live webinar will provide you with vital tips and "lessons learned" to ensure your cluster runs smoothly.
What you’ll discover:
-The most common operational problems in Cassandra, and how to solve them.
-How to diagnose and resolve unexplained performance problems
-How to design your cluster to maximize availability
-How to monitor for early warning signs
-Expert tips and tricks from the field
This presentation will look at the new opportunities being opened up for the visualisation of data within both personal and collaborative 3D spaces, delivered to mobile, browser, desktop, virtual reality and social virtual world/social virtual reality environments. The pros and cons of both 3D and VR environments for the creation and sharing of data visualisations will be considered, tools which let you do dataviz in 3D and VR today presented, and future development pathways discussed.
The GDPR will apply in all EU member states from May 2018. Organizations and businesses that own data and/or handle data belonging to EU citizens have the responsibility to ensure their processing abides by the new data protection law, and processors must themselves maintain records of their processing activities. If organizations and businesses are involved in a data breach, they are far more liable under the GDPR than they were under the Data Protection Act.
Complying with the new regulations requires operating to high standards of data security and protection. If they suffer a data breach that puts the rights and freedoms of individuals at risk, organizations must notify the people affected and the data protection authority (the Information Commissioner's Office (ICO) in the UK) within 72 hours of becoming aware of it. Data breaches occur every day - and the EU has just increased the financial liability and consequences of inadequate security.
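The 72-hour notification window above is a hard deadline counted from the moment an organization becomes aware of the breach. A trivial sketch of tracking that deadline (timestamps are illustrative):

```python
from datetime import datetime, timedelta, timezone

# GDPR Article 33: notify the supervisory authority within 72 hours of awareness.
GDPR_NOTIFICATION_WINDOW = timedelta(hours=72)

def notification_deadline(became_aware: datetime) -> datetime:
    """Latest time the supervisory authority (e.g. the ICO) must be notified."""
    return became_aware + GDPR_NOTIFICATION_WINDOW

# Example: breach discovered at 09:30 UTC on 28 May 2018.
aware = datetime(2018, 5, 28, 9, 30, tzinfo=timezone.utc)
deadline = notification_deadline(aware)
print(deadline.isoformat())  # 2018-05-31T09:30:00+00:00
```

Note the clock starts at awareness, not at the time of the breach itself, which is why automated detection and incident-response tooling matter for compliance.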
Hear from Mike Fowler, author of the popular white paper: Automation as a Force Multiplier in Cyber Incident Response. Mike will provide insight into how best to leverage automation to provide incident response and reporting consistency for GDPR.
Steve Ditmore will present IncMan™ – Security Automation and Orchestration features covering:
•Installation and set-up measured in hours rather than days or weeks, without the need for expensive professional services support.
•The steps involved in mitigating and controlling a data breach.
•Automation of menial enrichment activities, so incident responders can determine/contain and mitigate breaches more efficiently and effectively.
•Enhanced visibility creating a layered approach to information gathering.
•Incident management including response prioritization.
•How incident and notification workflows are automatically assigned to an incident.
Workflows in life sciences and bioinformatics are characterized by massive volumes of machine-generated file data that is pipelined into downstream processes for analysis. With today’s sequencer technology, most experts agree that about 100 GB of data is generated for each human genome that is sequenced.
With the Earth’s population predicted to eclipse 8 billion people by 2025, some researchers believe that Life Sciences and Genomics in particular will soon become the single largest producer of new data across all media types — outpacing today’s leaders like YouTube, Twitter, and astronomical research.
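A back-of-envelope calculation shows why genomics data volumes dwarf other media. Using the figures above (roughly 100 GB per sequenced genome, 8 billion people):

```python
GB_PER_GENOME = 100              # approximate raw sequencer output per human genome
POPULATION_2025 = 8_000_000_000  # projected world population by 2025

total_gb = GB_PER_GENOME * POPULATION_2025
total_eb = total_gb / 1e9        # 1 exabyte = 1e9 GB
print(f"{total_eb:.0f} EB")      # 800 EB if every genome were sequenced once
```

Even a small fraction of the population being sequenced lands in the exabyte range, which is the scale argument behind the storage discussion that follows.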
Legacy file storage fails to provide researchers with acceptable performance and cost effectiveness at petabyte scale, especially with the wide mix of file sizes that characterizes modern research workflows.
But it’s not all bad news.
Balancing researcher, IT, and executive team concerns, watch this video case study about the Department of Embryology at the Carnegie Institution for Science, and see why they turned to Qumulo’s modern scale-out storage to deliver the performance, scalability, and simplicity needed to keep pace with evolving research data requirements.
Hortonworks’ Hadoop-Powered EDW (Enterprise Data Warehouse) Optimization Solution with Syncsort DMX-h enables organizations to liberate data from across the enterprise, quickly create and populate the data lake, and deliver actionable insights.
Customer case studies across a variety of industries will bring to life how organizations are using this solution to gain bigger insights from their enterprise data – securely and cost-effectively – with faster time to value.
Join our webcast to learn how this solution delivers:
- Lower-cost storage for EDW data
- Lower-cost processing of non-critical workloads
- Fast BI across the full data set
How do you bridge the gap between high expectations of entrenched BI users and the rapidly changing big data landscape? Diverse data types are now mainstream in every organization and analytics are a ubiquitous enterprise priority. Many companies struggle with reconciling the investment of proven BI with the power of modern big data lakes. But the solution is more obvious than you think.
Join this session to learn from Surya Mukherjee of OVUM as he shares:
● How to leverage your SQL / OLAP / BI investments together with big data for immediate value
● What to do and what to avoid for success with BI on Big Data (with real-life customer stories)
Surya Mukherjee is a senior analyst for Ovum’s Business Intelligence and Information Management team, responsible for analysis of enterprises’ business intelligence technology investment priorities, market forecast models, and product and vendor evaluations. Based in London, he is a thought leader and keynote speaker at events globally. Prior to Ovum, Surya worked with Deloitte Touche Tohmatsu and Evalueserve, earning an MBA in finance and a bachelor’s in engineering, specializing in electronics.
Dave Mariani, co-founder and CEO, started AtScale based on his many personal experiences building Business Intelligence solutions on Hadoop at companies like Yahoo! and Klout. As VP of Business Analytics at Yahoo!, he managed data pipelines and analytics ingesting 20TB of data per day across multiple 4,000-node Hadoop clusters. At Klout, he created some of the world’s largest data warehouses, including a petabyte Hive warehouse. After struggling with existing BI-on-big-data technologies, he decided it was time to build what he couldn’t buy. As a BI guy who believes in, and has been hands-on with, Big Data for over a decade, Dave has walked in his customers’ shoes. Dave started his journey with an Economics degree from UCLA.
In this 60-minute webinar, we will cover the key areas of consideration for data layer decisions in a microservices architecture, and how a caching layer satisfies these requirements. You’ll walk away from this webinar with a better understanding of the following concepts:
- How microservices are easy to scale up and down, so both the service layer and the data layer need to support this elasticity.
- Why microservices simplify and accelerate the software delivery lifecycle by splitting up effort into smaller isolated pieces that autonomous teams can work on independently. Event-driven systems promote autonomy.
- Where microservices can be distributed across availability zones and data centers for addressing performance and availability requirements. Similarly, the data layer should support this distribution of workload.
- How microservices can be part of an evolution that includes your legacy applications. Similarly, the data layer must accommodate this graceful on-ramp to microservices.
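The caching-layer idea behind the points above can be illustrated with a minimal in-process TTL cache. This is a sketch for illustration only; a real microservices deployment would typically use a shared, distributed cache (e.g. Redis) so that elastically scaled service instances see the same data:

```python
import time

class TTLCache:
    """Minimal in-process cache with per-entry expiry (illustrative only)."""

    def __init__(self, ttl_seconds: float):
        self.ttl = ttl_seconds
        self._store = {}  # key -> (value, expires_at)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if time.monotonic() >= expires_at:
            del self._store[key]  # lazily evict stale entries on read
            return None
        return value

    def set(self, key, value):
        self._store[key] = (value, time.monotonic() + self.ttl)

cache = TTLCache(ttl_seconds=0.05)
cache.set("user:42", {"name": "Ada"})
print(cache.get("user:42"))  # {'name': 'Ada'}
time.sleep(0.06)
print(cache.get("user:42"))  # None after the TTL elapses
```

The TTL bounds staleness when an underlying legacy system of record changes, which is one way a cache accommodates the gradual on-ramp from legacy applications to microservices.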
Do you think Hive means batch and latency? Do you think you always have to extract your data from your data lake to analyze it with an agile BI tool?
Then this webinar is for you!
Dive with us into the world of interactive analytics on Hadoop.
We will cover:
• The main features of Hortonworks Data Platform and Tableau.
• Where Tableau fits into the Hortonworks Data Platform as part of a modern data architecture.
• How to use Tableau with Hortonworks for data exploration and visualization, in a stunning demonstration.
ICTFOOTPRINT.eu is organising its 5th webinar on 27th April 2017, 12:00 CEST to help you manage the energy consumed by ICT and know how Life Cycle Assessment (LCA) can guide you to make your ICT more sustainable. All those who want to become more sustainable in ICT are welcome to join us in this exciting webinar.
Jean-Marc Alberola, Group Energy Strategy leader at Airbus & Vice Chairman of ETSI ISG OEU (Industry Specification Group Operational Energy Efficiency For Users). Jean-Marc will speak about the work developed by ETSI ISG-OEU, operative KPIs that enable the monitoring of the energy management performance in data centres and ICT sites. After a short description of the global KPI DCEM, the presentation will focus on the implementation of these KPI’s in an industrial area of corporate ICT sites.
Fadri Casty & Tereza Lévová, both from ecoinvent, the world's most consistent & transparent Life Cycle Inventory database. The ecoinvent database provides well-documented process data for thousands of products, helping you make truly informed decisions about their environmental impact. Fadri and Tereza will demonstrate the value of doing LCA on ICT equipment.
Berina Delalic, from multEE, will introduce the Monitoring & Verification Platform (MVP), a web-based tool developed to calculate and store data about energy and CO2 savings resulting from implemented energy efficiency measures. Given the central role of ICT, especially in the commercial sector, methods have been developed to improve ICT's energy efficiency and thereby reduce CO2 emissions. The presentation will show how MVP can be used to measure the results of activities that enable more efficient use of energy for ICT in commercial buildings.
The webinar will be moderated by Silvana Muscella (Project Coordinator of ICTFOOTPRINT.eu and CEO of Trust-IT Services) who has broad experience in stimulating topics in the ICT sector.
Reinvent your business by innovating with DataOps.
Modern, data-driven businesses are employing data-management methods to establish trust, transparency, auditability, consistency, and control. The business impacts are tangible: greater efficiency, increased revenue, and strengthened customer relationships.
In this webinar, Toph Whitmore, Blue Hill Research principal analyst for Big Data & Analytics, and Jake Freivald, Information Builders vice president of product marketing, show how innovative BI technology is enabling forward-thinking companies to reinvent traditional business models.
Join us on April 27 to see how innovators are strengthening their DataOps practices.
Are you prepared to tackle the challenges of the new digital economy? Many organizations have underinvested in backup, running legacy solutions that are unable to keep pace with business demands. An explosive growth of data, increasingly demanding SLAs, new regulations and emerging security threats such as ransomware are putting pressure on businesses to reinvent current backup systems.
The new digital economy demands an adaptive approach to backup and recovery that addresses the dynamic nature of today’s 24x7 data center.
Join IDC’s Nick Sundby and Hewlett Packard Enterprise’s Julita Kussmaul to discuss best practices for designing data protection to support the era of digital transformation.
• Learn why legacy data protection solutions and store-everything practices are no longer sustainable
• Gain key insights that will shape information protection and management in the future
• Get a step-by-step strategy for rethinking your current data protection process
• Learn how an adaptive analytics-driven approach to backup and recovery can help you meet the demands of your business
The worlds of cloud computing and on-premises computing continue to grow and mature in their own ways. Cloud has built a strong reputation for flexibility and elastic scale while on-premises systems remain the mainstay for high-performance and secure business-critical computing.
The days of "either/or" are gone. Companies today must learn to be successful with a hybrid environment comprised of on-premises and cloud systems. Companies do not need to cross the chasm, but to bridge the chasm to innovate, grow and gain competitive advantage.
Join this webinar to hear from Imad Birouty, Director of Product Marketing at Teradata, who will discuss how companies are leveraging Hybrid Cloud to build an analytics everywhere mentality.
You want to transform your business, but your data is spread across a multitude of on-premises and cloud systems. How are you going to organize and manage it all? And how do you even know what data you have and whether it’s clean and reliable? We have your answers.
Discover how Informatica simplifies the journey to cloud at the latest unveiling of the world’s #1 data integration and data quality solution, including:
- Informatica PowerCenter 10.2
- Informatica Data Quality 10.2
- Informatica Data Integration Hub 10.2
If you’ve conducted discovery for litigation, investigations or audits, you know that Murphy’s Law dictates any number of "pitfalls" and "potholes" that can derail your project. These issues can add considerable cost to your discovery effort through unexpected rework, cause you to miss important deadlines, or even incur the wrath of a judge for not following accepted rules and principles for discovery. This webcast will discuss some of the most common pitfalls and potholes you can encounter during the discovery life cycle and how to address them to keep your discovery project on track.
+ Avoiding the Mistake in Assuming that Discovery Begins When the Case is Filed
+ How to Proactively Address Inadvertent Privilege Productions
+ Up Front Planning to Reduce Review Costs
+ How to Avoid Getting Stuck with a Bad Production from Opposing Counsel
+ Understanding Your Data to Drive Discovery Decisions
+ Minimizing Potential ESI Spoliation Opportunities
+ Ways to Avoid Potential Data Breaches
+ How to Avoid Processing Mistakes that Can Slow You Down
+ Common Searching Mistakes and How to Avoid Them
+ Techniques to Increase Review Efficiency and Effectiveness
+ Checklist of Items to Ensure a Smooth and Accurate Production
Doug Austin: Doug is the VP of Operations and Professional Services for CloudNine, where he manages professional services consulting projects for clients. Doug has over 25 years of experience providing consulting, technical project management and software development services to numerous commercial and government clients.
Karen DeSouza: Karen is the Director of Review Services and a Professional Services Consultant for CloudNine. Karen is a licensed attorney in Texas and has over 15 years of legal experience. She also has a Bachelor of Science in Legal Studies - American Jurisprudence.
Data is an organization’s most valuable asset. But capturing that value is impossible if data is locked away from frontline business users. Join ThoughtSpot’s Chief Customer Evangelist and former Director of BI at Disney, Doug Bordonaro, for an in-depth discussion of the best practices BI teams are using to deliver governed self-service analytics company-wide, the top pitfalls to avoid, and how to improve your competitive advantage.
In this webinar we will cover:
* Best practices for driving analytics adoption company-wide within a governed environment
* Top analytics use cases for building your organization’s competitive advantage
* Live demo of ThoughtSpot’s search-driven analytics
Discover how to create flexible, API-focused scale-out storage that supports today’s demanding enterprise applications. Join the live webinar from Datera and Packet.net for an advanced customer case study and technical deep dive.
Viewers will get insights on how elastic block storage is evolving in on-premises clouds, on the challenges with current storage solutions scaling beyond a rack, and on data center alternatives for service providers and cloud builders. Learn about Datera’s unique solution that scales easily across the data center, and hear from Packet.net on how they are operationalizing Datera Elastic Block Storage.
This insightful webinar is designed for service providers and data center professionals chartered with providing high-performance, consistent and profitable elastic block storage.
Embedded analytics is not new, but the technology for integrating charts, reports, dashboards, and self-service exploration has evolved considerably in the past 5 years. Today, companies in every industry are strengthening relations with customers and suppliers by productizing their data to make it accessible where people need it, when people need it.
In this webinar you'll learn about:
- The evolution of embedded analytics
- Applications and use cases for embedded analytics
- Whether to build or buy analytics to embed in applications
- Technology today that is making it possible to monetize data
The seismic jolt in the threat landscape caused by the success of threats like ransomware, combined with the geometric rise of so-called zero-day malware (i.e., malware for which no AV signature defenses exist), has given rise to all manner of innovation in the cybersecurity industry. But much of what is being said and presented in the market is confusing, and that’s a problem for practitioners. Among the most frequently used phrases in security today are "machine learning," "math-based," and "artificial intelligence" ("AI"). These phrases are entering the security conversation to describe capabilities, approaches and strategies, but in reality they are confusing a great many people. That raises two questions: What does it actually mean? And how can machine learning be used in enterprise security? Join McAfee and (ISC)2 on April 27, 2017 at 1:00 PM Eastern as we clear up the confusion, explore the answers to these questions and discuss what this means for dealing with threats.
It’s a bird, it’s a plane... no, it’s just your legacy SIEM. Did you know your SIEM might be weakening your security powers? Your legacy SIEM could be:
• Limiting your ability to collect, store and use security-relevant unstructured and structured data
• Making it difficult to maintain your SIEM and requiring skilled staff to work around the clock just to keep the lights on
• Burdening your security operations team by forcing them to chase false alarms while missing critical alerts
• Failing to detect modern threats and putting your entire business at risk
But have no fear, Splunk’s security experts are here to make you a security super hero again. Put on your cape and join us for this webinar to learn how Splunk can be used as a modern SIEM to solve a range of security use cases and more.
Splunk’s security experts will share common SIEM replacement and migration scenarios and discuss how a department store, a financial services firm and a luxury retailer successfully migrated from their legacy SIEM to Splunk.
Girish Bhat, Director of Security Product Marketing, Splunk
Risi Avila, Security Professional Services Consultant
10 mins – The challenges of having a legacy SIEM
15 mins – The options to migrate
15 mins – Customer success stories
5 mins – Resources to guide and help you
10 mins – Q&A
VMware has announced the end of development of vSphere Data Protection (VDP), its solution for backup and recovery of VMware environments.
In this context, several questions arise:
•How long can I continue to use VDP?
•When does support end?
•Can I migrate my VDP environment to Avamar (the full version of VDP)?
•What are the costs?
•What does the migration look like technically?
•Are there other options?
Join us for a short, 30-minute session during which we will discuss:
•Details of VDP and the VMware announcement
•Options for continued use of VDP
•Technical and commercial details of the migration
For decades, your operations center has been trapped by “insights” from the wall of charts. Meanwhile, your operations teams are flooded with alerts that lack context and a problem is emerging. You soon find yourself sitting in war rooms, watching siloed tools, navigating event storms and running scripts to extract “relevant” logs for triage. This just does not scale.
Splunk IT Service Intelligence (ITSI) brings a unique approach to monitoring and troubleshooting with detailed swim lanes, logical drill-downs and meaningful and contextual insights into events.
Watch this webcast to learn how to:
-Speed up investigations by organizing and correlating relevant metrics and events
-Analyze real-time performance in relation to past trends
-Identify and alert on notable events by creating predefined correlation searches
-Navigate through event storms easily and quickly and make informed decisions to focus your attention on what matters
Speaker: David Millis, Staff Architect of IT Markets, Splunk
15 min - Key Concepts of Splunk ITSI
30 min - Demo: Deep Dives, Multi-KPI Alerts and Notable Events
15 min - Q&A
With their unique position at the centre of an organization, Architects can play a pivotal role in ushering in digital evolution and transformation in line with business and IT objectives, including the adoption of Agile, DevOps and other disruptive paradigms.
Join Richard Sey, Head of Development Operations (DevOps), Siemens Energy Management, and Mark Daly, Client Architect, MuleSoft, on January 18 at 10am (GMT) for a live webinar as they look at the key role of the Architect in achieving the new IT Operating model. It will cover:
- A working example from Siemens Energy Management on how they have realised the new IT Operating Model from a business and Architect's perspective to drive operational efficiency, cutting development time by half
- A view of Enterprise Architecture and DevOps from a leader in the field
- What an organization needs to support delivery of the new IT Operating Model, including the creation of a Centre for Enablement (C4E), which has reduced dependency on Central IT in the Lines of Business at Siemens by 25%