Welcome to the big data and data management community on BrightTALK. Join thousands of data quality engineers, data scientists, database administrators and other professionals to find more information about the hottest topics affecting your data. Subscribe now to learn about efficiently storing, optimizing a complex infrastructure, developing governing policies, ensuring data quality and analyzing data to make better informed decisions. Join the conversation by watching live and on-demand webinars and take the opportunity to interact with top experts and thought leaders in the field.
A complete demonstration of CloudNine's SaaS-delivered eDiscovery automation software by industry expert and author Doug Austin. Used extensively by corporations and law firms to simplify legal discovery in litigation, investigations, and audits, CloudNine provides eDiscovery practitioners with a secure platform that enables the upload, review, and production of documents from a fully integrated and automated fourth-generation eDiscovery platform. For a free trial, visit eDiscovery.co.
Yellowfin is a Business Intelligence platform that makes discovering and sharing insights easy.
You expect your Business Intelligence (BI) solution to convert data into insights. However, finding insights is only half the puzzle. Data-driven insights are only valuable when shared. Empower the right people to take the right action, at the right time, with Yellowfin.
Desktops and mobile devices are routinely the most overlooked aspects of an enterprise data management strategy. Yet the information on laptops, desktops, and mobile devices is one of an organization's most valuable assets, even though it is potentially at the greatest risk. According to IDC, there are around 1.3 billion mobile workers today, but only half of companies use any kind of endpoint backup.
The mobile workforce keeps growing. These users handle information outside the traditional domain of the IT department and rely on non-corporate collaboration tools. The potential cost and risk tied to regulatory compliance and eDiscovery, and the need to protect endpoints and provide corporate collaboration solutions, are clear and have reached a critical tipping point.
Join us for this webinar, where we will examine:
- How to regain control over what happens on workers' mobile devices
- How to mitigate the risks associated with increasingly frequent ransomware attacks
- How to boost productivity with a secure file-sharing platform
As enterprises around the world bring more of their sensitive data into Hadoop data lakes, balancing democratized access to data with strong security principles becomes paramount. In this webinar, Srikanth Venkat, director of product management for security & governance, will demonstrate two new data protection capabilities in Apache Ranger: dynamic column masking and row-level filtering of data stored in Apache Hive. These features were introduced as part of the HDP 2.5 platform release.
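For a rough sense of what these two policy types do to query results, here is a minimal Python sketch. It simulates, not reproduces, Ranger's behavior: the masking style mimics a show-last-4 transform, and the table, column, and region names are hypothetical.

```python
# Simulate Ranger-style dynamic column masking and row-level filtering
# applied to rows read from a hypothetical Hive table of customer records.

def mask_show_last4(value: str) -> str:
    """Mimic a show-last-4 masking transform: hide all but the last 4 chars."""
    return "x" * max(len(value) - 4, 0) + value[-4:]

def apply_policies(rows, user_region):
    """Return rows filtered to the user's region, with the SSN column masked."""
    visible = [r for r in rows if r["region"] == user_region]   # row-level filter
    return [{**r, "ssn": mask_show_last4(r["ssn"])} for r in visible]  # column mask

rows = [
    {"name": "Ana", "ssn": "123456789", "region": "EU"},
    {"name": "Bob", "ssn": "987654321", "region": "US"},
]
# A user scoped to EU sees only Ana's row, and the SSN comes back masked.
print(apply_policies(rows, "EU"))
```

In Ranger itself these rules are defined as policies against Hive tables and enforced at query time, so no data is rewritten at rest.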
Organizations no longer need to waste budget by expanding or upgrading Windows Storage Servers or waste resources evaluating what data is or isn’t useful. They can now maintain their existing servers and seamlessly add intelligent data transfer to a limitless tier of scale-out secondary storage at a quarter of the cost. In addition, primary storage is optimized, disk I/O is reduced and storage silos are consolidated. Join Caringo VP of Product Tony Barbagallo as he introduces the new FileFly Secondary Storage Platform--a complete hardware, software, and services solution that is ideal for remote office consolidation, enabling collaboration and significantly simplifying data protection.
Apache Spark is red hot, but without the requisite skillsets it can be a challenge to operationalize, making it difficult to build a robust production data pipeline that business users and data scientists across your company can use to unearth insights.
Smartsheet is the world’s leading SaaS platform for managing and automating collaborative work. With over 90,000 companies and millions of users, it helps teams get work done ranging from managing simple task lists to orchestrating the largest sporting events and construction projects.
In this webinar, you will learn how Smartsheet uses Databricks to overcome the complexities of Spark and build its own analysis platform, one that enables self-service insights on demand, at scale, and at speed, to better understand their customers' diverse use cases. They will share valuable patterns and lessons learned in both technical and adoption areas to show how they achieved this, including:
How to build a robust metadata-driven data pipeline that processes application and business systems data to provide a 360 view of customers and to drive smarter business systems integrations.
How to provide an intuitive and valuable “pyramid” of datasets usable by both technical and business users.
Their roll-out approach and materials used for company-wide adoption allowing users to go from zero to insights with Spark and Databricks in minutes.
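The metadata-driven pattern from the first point can be sketched in a few lines. This is a simplified, hypothetical Python illustration (the source names and transform labels are invented); Smartsheet's actual pipeline runs on Spark in Databricks.

```python
# Sketch of a metadata-driven pipeline: a list of metadata entries drives
# which sources are loaded and how each is transformed, so adding a new
# dataset means adding a metadata row rather than new pipeline code.

PIPELINE_METADATA = [  # hypothetical source definitions
    {"source": "app_events", "key": "user_id", "transform": "dedupe"},
    {"source": "crm_accounts", "key": "account_id", "transform": "latest"},
]

def dedupe(records, key):
    """Keep the first record seen for each key."""
    seen, out = set(), []
    for r in records:
        if r[key] not in seen:
            seen.add(r[key])
            out.append(r)
    return out

def latest(records, key):
    """Keep the last record per key (e.g. the most recent snapshot)."""
    return list({r[key]: r for r in records}.values())

TRANSFORMS = {"dedupe": dedupe, "latest": latest}

def run_pipeline(raw_tables):
    """Process every source listed in the metadata with its declared transform."""
    return {m["source"]: TRANSFORMS[m["transform"]](raw_tables[m["source"]], m["key"])
            for m in PIPELINE_METADATA}
```

The payoff of this style is exactly the self-service story above: the pipeline code stays fixed while the metadata grows with the business.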
Working in silos, while never a good idea, is a reality in many organizations today. Security and network operations teams have different priorities, processes, and systems. Security teams use various controls and tools to mitigate different kinds of threats, which generate thousands of alerts on a daily basis, and they often find it difficult to prioritize which threats to address first. What they may not know is that there is a whole piece of the puzzle they could be missing: core network services like DNS, DHCP, and IPAM. These can provide a wealth of information and context on threats, which can help prioritize response based on actual risk and ease compliance. Join Infoblox and (ISC)2 on February 23, 2017 at 1:00PM Eastern for a roundtable discussion on how to use ecosystem integrations between network and security tools for better security and compliance.
Infectious Media runs on data. But as an ad-tech company that records hundreds of thousands of web events per second, it has to deal with data at a scale not seen by most companies. You cannot make decisions with data when people must write SQL by hand, only for queries to take 10-20 minutes to return. Infectious Media made the switch to Google BigQuery and Looker, and now every member of every team can get the data they need in seconds.
Infectious Media will share:
- Why they chose their current stack
- Why faster data means happier customers
- Advantages and practical implications of storing and processing that much data
The world’s largest enterprises run their infrastructure on Oracle, DB2, and SQL Server, and their critical business operations on SAP applications. Organizations need this data to be available in real time to conduct the necessary analytics. However, delivering this heterogeneous data at the speed required can be a huge challenge because of complex underlying data models and structures, and legacy manual processes that are prone to errors and delays.
Unlock these silos of data and enable new advanced analytics with Hortonworks Connected Data Platforms, which let you capture perishable insights from data-in-motion while ensuring rich, historical insights from data-at-rest.
Join our webinar with Ted Orme of Attunity and Ana Gillan of Hortonworks to find out how to overcome these challenges. You will learn:
How to overcome common challenges faced by enterprises trying to access their SAP data
How to integrate SAP data in real-time with change data capture (CDC) technology
How organizations are using Attunity Replicate for SAP to stream SAP data into Kafka
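To make the last two points concrete, the records that CDC tools publish onto Kafka typically carry the operation type plus before/after row images. The following is a hedged Python sketch of such a message; Attunity Replicate's actual wire format differs, and the SAP table and fields here are just illustrative.

```python
import json

def make_cdc_event(table, op, before, after):
    """Build a minimal CDC-style change event, serialized as JSON the way
    it might be published to a Kafka topic for downstream consumers."""
    return json.dumps({
        "table": table,    # source table, e.g. an SAP application table
        "op": op,          # "insert", "update", or "delete"
        "before": before,  # row image before the change (None for inserts)
        "after": after,    # row image after the change (None for deletes)
    })

# An update captured from a hypothetical SAP sales-order table:
event = make_cdc_event(
    "VBAK", "update",
    before={"order_id": 42, "status": "OPEN"},
    after={"order_id": 42, "status": "SHIPPED"},
)
```

Because each event is self-describing, real-time consumers on the Kafka side can apply inserts, updates, and deletes without ever querying the SAP system directly.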
Digital and self-service payment channels have changed the way banks and customers interact. Less direct interaction means more reliance on “Big Data” for insights that will help you identify ways to better serve your customers, run more effective card programs and capture more wallet share.
Printec and INETCO invite you to attend a 30-45 minute live webinar, where we will showcase new technologies that will help your financial organization derive better value from customer transaction data. Learn how you can establish real-time insights into:
- Card usage patterns
- Profitability and cash flow
- Application, network or third party service performance
Can’t make the date? That’s ok. Contact Printec or INETCO if you are still interested in arranging a private discussion on how detailed transactional data can help you to predict next best product, identify customer performance problems, customize the consumer experience, competitively price card services and loans, execute targeted marketing strategies, and much more.
If there’s anything as important as making sure an app is running, it’s making sure the data underneath it is available as well. But the rapid changes in database technology and the quick adoption of cloud infrastructure have made that task extremely challenging for most database administrators and DevOps teams.
In just 30 minutes, you'll learn:
- How to monitor all database types (including SQL, NoSQL, cloud) in a single tool
- How to get up to speed quickly on new database technologies without adding experts to your team
- The fastest way to get an assessment of your organization’s overall database health
- Simple tools that will drastically reduce the time you spend on database and infrastructure administration
In recent years of Big Data and digital transformation “euphoria”, Hadoop and Spark received most of the attention as platforms for large-scale data management and analytics. Data warehouses based on relational database technology, for a variety of reasons, came under scrutiny as perhaps no longer needed.
However, if there is anything users have learned recently, it’s that the mission of data warehouses is as vital as ever. Cost and operational deficiencies can be overcome with a combination of cloud computing and open source software, and by leveraging the same economics as traditional big data projects: scale-up and scale-out at commodity pricing.
In this webinar, Neil Raden from Hired Brains Research makes the case that an evolved data warehouse implementation continues to play a vital role in the enterprise, providing unique business value that actually aids digital transformation. Attendees will learn:
- How the role of the data warehouse has evolved over time
- Why Hadoop and Spark are not replacements for the data warehouse
- How the data warehouse supports digital transformation initiatives
- Real-life examples of data warehousing in digital transformation scenarios
- Advice and best practices for evolving your own data warehouse practice
It’s an exciting time for retailers, as technology is driving a major disruption in the market. Whether you are just beginning to build a retail data analytics program or you have been gaining advanced insights from your data for quite some time, join Eric and Shish as we explore the trends, drivers, and hurdles in retail data analytics.
As we get closer to Oracle de-supporting Discoverer, many organizations are still struggling to find the right ‘replacement’ tool, one that will protect their investment and minimize the impact on both IT and end users. Join our webinar, where we will cover:
- Our approach to quickly migrating Discoverer Workbooks into dynamic SplashBI Reports
- A Migration Utility that converts Worksheets, Columns, Parameters, Conditions, Totals, Formats, and Colors, along with Folders and joins, keeping your workbook structure intact
- Compatibility with the various Oracle Security Models
- The ability to run your converted reports from Excel: a must for many Discoverer users!
- Extending your reports to provide intelligent dashboards
- Access to your converted reports on mobile and tablet devices
- An easy-to-learn, self-service tool for faster adoption by your users
- The ability to connect other data sources
Register now and learn about SplashBI - our on-demand, self-service business intelligence and reporting tool that WILL make your Discoverer conversion faster, easier, more affordable and complete!
Christian Renaud, Research Director, Internet of Things, discusses the vertical and horizontal approaches to the Internet of Things, the state of IoT deployments, 2016 IoT M&A, and a preview of what to expect at the upcoming Real World IoT event, 5-6 April at the Landmark London.
What is your first line of defense against cyberattacks? Secure endpoints! Endpoints are everywhere in the IIoT landscape. Without proper security, Industrial Internet of Things (IIoT) systems are not trustworthy, putting organizations, their missions, and the greater public at increased risk. The viability of the IIoT depends on proper implementation of security to counter the growing and ever-changing threats that are emerging.
On February 22, 2017, editors of the Industrial Internet Security Framework (IISF) and security experts from the Industrial Internet Consortium will discuss the endpoint protection/security model and policy in the IISF document and present a real-world customer use case describing an approach taken to secure an industrial system.
In 2017, more and more corporations are looking to reduce operational overheads in their enterprise data warehouse (EDW) installations. Hortonworks just launched the industry’s first turnkey EDW Optimization solution together with our partners Syncsort and AtScale. Join Hortonworks CTO Scott Gnau to learn more about this exciting solution and its three use cases.
Avito, based in Moscow, is Russia’s fastest growing e-commerce site and portal. It uses Big Data technology to monitor and improve the placement of advertisements, and to better understand how their users are adapting to online business.
In this webinar, Nikolay Golov, Data Warehouse Architect at Avito, who has over 10 years’ experience working with banks, retail, and e-commerce, will share the lessons learned while implementing and expanding Avito's HPE Vertica-based analytics infrastructure over the past four years.
This will include identifying the risks and demonstrating how those risks can be efficiently and effectively overcome with the right technology stack: HPE Vertica, which processes data from dozens of heterogeneous data sources and monetizes it through Machine Learning.
When regulation becomes an opportunity rather than a tiresome project
The May 2016 EU regulation establishing a Europe-wide granular credit database marks a paradigm shift in statistical regulatory reporting. Starting in January 2018, 95 data attributes must be reported to the ECB at the individual-loan level.
Has your financial institution already mastered the data sourcing required for the initial report? In our online seminar on February 21 from 10 to 11 a.m., we will show you how to make the necessary data available for new business processes and digitalization projects in an automated, valid, and sustainable way.
Ulrich Santelmann, Head of Financial Service Consulting
Axel Luckhardt, Expert Consultant FSC
Jürgen Jertz, Account Manager
Dirk Beerbohm, Technical Account Manager
In the latest Forrester Wave report for Big Data Hadoop Cloud Solutions, Microsoft Azure came out on top, beating some very esteemed vendors. Learn how to complete your big data solution: join Microsoft and Hortonworks as we showcase Hortonworks DataFlow (HDF) and how it complements Azure HDInsight, enabling users to easily move their data to the cloud for production, disaster recovery, or development uses.
Powered by Apache NiFi, Kafka, and Storm, HDF collects, curates, audits, and delivers real-time data through a simple end-user interface, freeing customers to analyse their big data in Microsoft Azure.
During this webinar we will show:
How to ingest a variety of data sources from a selection of source systems using HDF
How to feed the data to HDInsight cluster on Azure
How to easily display and interrogate those data sources once they have been ingested
Accident Exchange, part of the AIS group, is a provider of mobility services for people who have experienced a car accident. With large volumes of data in various formats to maintain, the company recognized the potential compliance challenges ahead. In partnership with Hitachi Data Systems, Accident Exchange implemented Hitachi Content Platform (HCP), giving it the capability to more securely store and abstract data.
Learn how Accident Exchange can now prove authenticity, automate and abstract data from different sources in multiple files and formats all within one platform to achieve greater compliance. HCP is the key to removing data silos.
Customers have been using Tableau and BigQuery to store and analyze large volumes of data for years, but BigQuery has recently released significant updates that will help Tableau customers find even more insight in their data.
Watch this webinar to learn:
- How expanded standard SQL support can help you work more efficiently
- Best practices for efficiency with Tableau and BigQuery
- How to connect to BigQuery and create a visualization with Tableau
Cloud analytics has great momentum, and for good reason: it allows real-time, live analytics without the need to prepare an environment. In this webinar you will learn how to apply SAP cloud analytics using BusinessObjects Cloud and the Digital Boardroom. Be amazed by the ease of use and the great visualization capabilities.
Iver van de Zand, SAP Analytics Leader, will provide a deep-dive session on the modeling and visualization capabilities of this stunning product.
In December, Dell EMC announced new versions of its key backup and recovery solutions:
3.Boost for Applications
Join us for a 30-minute session in which we will touch on the most important changes and the most interesting new features.
Deploying applications locally and bursting them to the cloud for compute may seem difficult, especially when working with high-performance, critical information. However, using cloud bursting to offset peaks in demand can bring big benefits, and kudos from organizational leaders always looking to do more with less.
After this short webinar, you’ll be ready to:
- Explain what cloud bursting is and what workloads it is best for
- Identify efficiencies in applying cloud bursting to high-performance applications
- Understand how cloud computing services access your data and consume it during burst cycles
- Share three real-world use cases of companies leveraging cloud bursting for measurable efficiencies
- Walk through a demonstration of how it works
Presenters will build an actionable framework in just thirty minutes and then take questions.
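The core decision behind the first bullet, when to keep work on-premises and when to burst, can be sketched in a few lines of Python. This is a toy policy with made-up capacity figures, not any vendor's scheduler.

```python
# Toy burst scheduler: run jobs on local infrastructure until capacity is
# exhausted, then dispatch the overflow to a cloud compute pool.

def schedule(jobs, local_capacity):
    """Split a job list into (local, burst) given a fixed local capacity."""
    local = jobs[:local_capacity]
    burst = jobs[local_capacity:]  # overflow is sent to cloud compute
    return local, burst

# Five jobs arrive against a local capacity of three:
local, burst = schedule(["j1", "j2", "j3", "j4", "j5"], local_capacity=3)
# j1-j3 run locally; j4 and j5 burst to the cloud for the peak.
```

Real bursting policies also weigh data locality and transfer cost, which is exactly where the data-access discussion in the third bullet comes in.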
Solace, the leading provider of open data movement technology, combines with Hazelcast, the fastest in-memory data grid, to accelerate the processing of global data workloads. Solace integrates with Hazelcast to provide a distributed publish/subscribe backbone that enables remarkably fast, efficient cluster synchronization across WANs.
This joint solution enables multi-cloud and hybrid cloud replication of Hazelcast clusters for worldwide operation with enterprise grade reliability, massive WAN scale-out and low latency cluster replication.
In this webinar, Michael and Viktor will describe example applications of this solution, such as financial institutions that synchronize real-time position books and mobile carriers that use in-memory operational data to provide a fast, seamless end-user experience.
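The publish/subscribe replication pattern at the heart of this solution can be sketched as a small in-process simulation. This Python sketch models two in-memory "clusters" kept in sync over a shared backbone; Solace and Hazelcast do this across WANs with their own APIs and guarantees, which this toy version does not attempt to reproduce.

```python
# Simulate a pub/sub backbone keeping two in-memory data grids in sync:
# every local write is published on the backbone, and every other cluster
# applies it as a remote update.

class Backbone:
    def __init__(self):
        self.subscribers = []

    def subscribe(self, callback):
        self.subscribers.append(callback)

    def publish(self, key, value, origin):
        for cb in self.subscribers:
            cb(key, value, origin)

class Cluster:
    def __init__(self, name, backbone):
        self.name, self.data, self.backbone = name, {}, backbone
        backbone.subscribe(self.on_message)

    def put(self, key, value):
        self.data[key] = value
        self.backbone.publish(key, value, origin=self.name)

    def on_message(self, key, value, origin):
        if origin != self.name:  # apply only remote updates, avoiding echo
            self.data[key] = value

bus = Backbone()
london, new_york = Cluster("london", bus), Cluster("new_york", bus)
london.put("position:ACME", 1500)  # a write in one cluster...
# ...is replicated to the other via the pub/sub backbone.
```

The origin check is the essential design detail: it prevents an update from looping back into the cluster that produced it, which is the same echo-suppression problem WAN replication products must solve.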
Every year at Tableau, we look back at the previous 12 months and evaluate the new ways technology is changing the face of business decisions. That discussion drives our list of top business intelligence trends for the year.
In this webinar, explore:
•Emerging trends in business intelligence
•Tableau experts' take on the changing BI landscape
•Considerations for your 2017 business intelligence strategy
As healthcare organizations seek to unlock Electronic Health Record (EHR) data and develop modern clinical applications, health APIs can be used to simultaneously increase IT agility and improve data security.
Furthermore, APIs (Application Programming Interfaces) and API management serve as core enablers for development on the SMART on FHIR (Fast Healthcare Interoperability Resources) platform, which promises to improve clinical care, research, and public health by supporting application interoperability.
Join MuleSoft and ICOE Group for a discussion on how APIs can serve as a key enabler for EHR connectivity initiatives, and a live demo showcasing how MuleSoft’s Anypoint Platform can be used to develop a SMART on FHIR clinical application.
Attendees will learn:
- What impact upcoming legislative changes may have on health IT, and how APIs can insulate healthcare organizations from potential disruption.
- How to build and manage APIs in a way that improves health data security and increases IT agility.
- How MuleSoft's Anypoint Platform enables the development and adoption of SMART on FHIR clinical applications.
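For a flavor of what a FHIR resource looks like to a client application, here is a minimal Python sketch that parses a hand-written, abbreviated FHIR Patient resource. A real SMART on FHIR app would retrieve this over an OAuth-protected FHIR API rather than from a local string, and real Patient resources carry many more fields.

```python
import json

# An abbreviated FHIR Patient resource, roughly as a SMART on FHIR app
# might receive it from an EHR's FHIR endpoint (fields trimmed for brevity).
patient_json = """
{
  "resourceType": "Patient",
  "id": "example",
  "name": [{"family": "Chalmers", "given": ["Peter"]}],
  "birthDate": "1974-12-25"
}
"""

def display_name(patient):
    """Pull a human-readable name out of a Patient resource's name list."""
    name = patient["name"][0]
    return " ".join(name["given"]) + " " + name["family"]

patient = json.loads(patient_json)
print(display_name(patient))  # Peter Chalmers
```

Because every conforming EHR exposes Patient in this same shape, the application code above works unchanged across vendors, which is the interoperability promise SMART on FHIR is built on.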
Cybercriminals are setting their sights on hospitality businesses across the U.S. and Europe with unprecedented malware attacks known as Carbanak, part of a precise and difficult-to-stop APT-style operation that we have code-named 'Grand Mars'.
Our Trustwave SpiderLabs team of incident responders and researchers has spent months analyzing Grand Mars and its elements, unlocking strategies that can be used to identify and mitigate this insidious campaign, which may soon spread to e-commerce and retail organizations as well.
Join our webinar, where the author of the report will share the findings, including:
•How the attackers make initial entry and force infection
•How they achieve persistence
•How they perform lateral movements
•Which malicious files they use
•Which signs indicate you’ve been compromised
•Which countermeasures you should apply immediately
Broadening privacy regulations strain data retention policies, creating confusion about everything from obtaining consent to deleting personal data. In the first of the three-part webcast series, Making Compliance Your Policy, HPE and Iron Mountain experts examine the upcoming European General Data Protection Regulation (GDPR) and the best practices that records and information managers can bring to a risk-based approach to privacy compliance.
This webinar will discuss:
• How to prepare now for the rigor of the GDPR
• Implications the GDPR has for records and data retention
• How to achieve transparency
Hortonworks SmartSense provides proactive recommendations that improve cluster performance, security and operations. And since 30% of issues are configuration related, Hortonworks SmartSense makes an immediate impact on Hadoop system performance and availability, in some cases boosting hardware performance by two times.
Join Patrick DeKnipp, SVP of BI at Sterling National Bank, as we discuss how his team is delivering 360-degree customer analytics to everyone across sales, marketing, and service with ThoughtSpot's search-driven analytics platform.
A view of the world's largest and fastest-growing data center hot spot, and why you should be there
The Northern Virginia market, which includes multiple locations in and around Washington, DC, continues to be one of the largest and fastest-growing data center hot spots in the world and a top market for cloud providers. While much has been said about the area, please join us to discuss the latest on what makes the market special, why so many firms want data center space there, how the market is evolving, and which locations to consider when planning a deployment in the area.
Please join us for a webinar with Kelly Morgan, Vice President at 451 Research and Mark Kidd, Senior Vice President and General Manager for Iron Mountain Data Centers, for a discussion on the Northern Virginia market and Iron Mountain’s approach to the market.
Times they keep a-changin'. With unrelenting growth in big data, remote workforces becoming more prevalent and data being scattered across multiple locations, companies are forced to modernize their backup solutions to provide better protection and operational efficiency.
In this webinar, experts from Veritas and Western Digital will explore the latest data protection trends and show how a unified backup and recovery solution can reduce complexity; enable you to protect, locate, and recover more information in less time; and keep you ahead of exponential data growth.
Times they keep a-changin', but that doesn't mean you have to be left behind.
With all of the hype about Technology Assisted Review, does keyword search still have a place in eDiscovery? You bet it does, if it is conducted properly. This CLE-approved* webcast session will cover goals for effective searching, what to consider before collecting ESI that will be subject to search, mechanisms for culling before searching, mechanisms for improving search recall and precision, challenges to effective searching, and recommended best practices for searching and for validating your search results.
+ Search Fundamentals
+ Considerations Before Searching
+ Mechanisms for Culling Before and After Searching
+ Mechanisms for Improving Recall and Precision
+ Searching Considerations and Challenges
+ Using Sampling for Defensible Searching
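Recall and precision, the two measures the session focuses on, are straightforward to compute from a validation sample. The following Python sketch uses made-up document IDs purely for illustration.

```python
# Recall: the fraction of truly responsive documents the search found.
# Precision: the fraction of retrieved documents that are truly responsive.

def recall_precision(retrieved, responsive):
    """Compute (recall, precision) from sets of document IDs."""
    hits = retrieved & responsive
    recall = len(hits) / len(responsive)
    precision = len(hits) / len(retrieved)
    return recall, precision

# Hypothetical validation sample: the search returned 4 documents, of
# which a reviewer judged 3 responsive; 6 documents in the sample were
# actually responsive.
retrieved = {"d1", "d2", "d3", "d4"}
responsive = {"d1", "d2", "d3", "d5", "d6", "d7"}
r, p = recall_precision(retrieved, responsive)  # recall 0.5, precision 0.75
```

A recall of 0.5 here signals that half the responsive documents were missed, which is exactly the kind of result that sampling-based validation is meant to surface before a production is certified.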
Doug Austin: Doug is the VP of Operations and Professional Services for CloudNine. At CloudNine, Doug manages professional services consulting projects for CloudNine clients. Doug has over 25 years of experience providing legal technology consulting, technical project management and software development services to numerous commercial and government clients. Doug is also the editor of the CloudNine sponsored eDiscovery Daily blog.
Karen DeSouza: Karen is the Director of Review Services, In-House Counsel, and a Professional Services Consultant for CloudNine. Karen also helps attorneys with CloudNine's software and Pre-Litigation Consulting Services. Karen is a licensed attorney in Texas and has over 15 years of legal experience. She also has a Bachelor of Science in Legal Studies - American Jurisprudence. Before CloudNine, Karen worked as an E-Discovery Director, Project Manager, and Associate at various law firms in the Houston area, where her primary focus was litigation.
* Approved in Selected States (To Be Published Prior To Webcast Delivery)
Spring Boot and Pivotal Cloud Foundry users won’t want to miss the Spring team’s Madhura Bhave and Pieter Humphrey as they tour the Spring Boot 1.5 release.
Inspired in part by cool community open source work from Ordina JWorks, one of the hottest new directions that the two teams are working on is the integration of Spring Boot Actuators with Pivotal Cloud Foundry.
Attendees will be given direct linkage to product management - this is your chance to influence future integration direction! You’ll also walk away understanding all the highlights of the Spring Boot 1.5 release, including exciting improvements in Kafka and JPA support.
How do you manage data and deliver insights when data volumes are exploding? Does this modern data challenge require more than a ‘traditional’ approach?
With more than 30 million members globally and hundreds of terabytes of data, Ebates' BI team moved to a non-traditional approach as a matter of necessity. They defined what a non-traditional approach looks like and kept their Tableau business users happy.
In this webinar, Ebates BI leader Mark Stange-Tregear will share why and how his team successfully transitioned from traditional BI on a traditional data warehouse to "all in" self-service BI on Hadoop.
Join us for this live presentation to learn:
1) Why: Why he chose to run BI on Hadoop at Ebates.
2) How: How Ebates made the transition; the plan, challenges and end-goals.
3) What: What he actually did, what he achieved, and lessons learned
Want to simplify your data center’s complexity without creating management silos? Challenged with running IT on a limited budget? Get an overview of Hitachi’s next-generation hyperconverged solution, powered by VMware vSAN. Learn how this power couple introduces advanced automation and enterprise availability for core and edge applications.
Join us for a live webcast on March 1, 2017. Hitachi Data Systems and VMware will discuss common challenges of legacy IT systems and how you can improve efficiency with a reliable hyperconverged solution.
Join Platform9 and Datera to learn how their software solutions can provide the DevOps agility and operational simplicity of the cloud for containers. Mainstream adoption of stateful applications has been foundational for containerized applications to be widely deployed.
With Datera Elastic Data Fabric and Platform9 Managed Kubernetes, customers can manage any application on Containers.
You will learn:
How to automate provisioning
How this solution simplifies operations and management
How to save costs with elasticity
Today's data-driven organizations are challenged by typical EDWs which include the added costs of proprietary technologies and the labor-intensive inflexibility of the EDW model. To summarize, EDW is expensive, rigid and inefficient. Smarter organizations are now turning to modern solutions to renovate their EDW.
Join this webinar as we share the top three ways to optimize your EDW with Hadoop. We will cover archiving, onboarding, and the enrichment of data, enabling you to kick-start your journey of moving data and processing to Hadoop.
Poor data quality can have serious financial consequences. Regulatory fines, monetary losses from bad business decisions, and legal fees resulting from errors can add up to millions of dollars. When it comes to patient or consumer safety, bad data can cost lives.
This webinar will highlight effective steps for preventing and fixing bad data, and processes to help ensure optimum integrity in your data. Best practices in data quality and real-world success stories will be featured.
Research shows that 76% of companies suffered a data breach in 2016, so it’s now almost inevitable that hackers will gain access to your company and your sensitive data.
Security professionals are now looking to deal with breaches faster, to keep their company off the front page and with heavy GDPR fines on the horizon, they’re wise to do so.
Organisations are fearful of damaging data breaches but unsure of the best course of action to protect themselves from major cyber incidents. While a large percentage of businesses focus on building up perimeter defences, not enough concentrate on monitoring their own networks to detect threats and mitigate them before significant damage is done.
Tune into this in-depth one-on-one interview to discover:
•More about the current threat landscape and the dangers to your organisation.
•How you can reduce the time to detect and respond to threats without adding staff to accomplish the job.
•More about how cutting-edge technology can be used, such as:
 - Advanced machine analytics, which are key to discovering potential threats quickly.
 - Security automation and orchestration capabilities, which increase the efficiency of the threat lifecycle management process.
•The influence that GDPR will have and steps you need to take.
Join CloudEndure’s Gonen Stein and GCP’s Andy Tzou as they discuss how companies can take full advantage of the cloud without the technical, performance, or financial challenges that normally strain IT resources. They'll dive into the technology behind the new automated migration service, which utilizes CloudEndure’s continuous, block-level replication, automated machine conversion, and application stack orchestration.
This video will provide an overview of how Google and CloudEndure have partnered to create a self-service VM migration service. This service enables you to move your existing server infrastructure to Google Cloud Platform at no cost.
Have you been waiting for just the right moment to migrate to the cloud? Have your questions answered about how companies are using the new automated migration service to simply and easily move to the cloud.
In a retail setting, a floor set is not only a way to showcase the brand, but also provides customers with a summarized view of the product landscape as seen in the retailer’s mind.
Applying graph-based analytics allows the detection of historical product affinities and identification of groups of strongly connected products that can help form a highly productive floor set.
In this webinar, we will take a look at a method of analyzing customer purchase histories using graph-based analytics. We will provide an overview of storing data as graphs and demonstrate how affinities among products can be detected to make decisions on which products are best displayed together.
Join us to learn how to optimize retail floor sets with graph-based analytics and avoid unnecessary reactive movement of product.
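As a minimal illustration of the approach described above, the sketch below builds a co-purchase graph from transaction baskets, keeps only edges seen frequently enough, and returns the connected components as candidate product groupings. The product names and the support threshold are hypothetical examples, not taken from the webinar.

```python
from collections import defaultdict
from itertools import combinations

def strong_affinity_groups(baskets, min_support=2):
    """Group products that are frequently purchased together.

    Builds a co-purchase graph (nodes = products, edge weight =
    number of baskets containing both products), keeps only edges
    seen at least `min_support` times, and returns the connected
    components of the remaining graph.
    """
    # Count co-occurrences of each product pair across baskets.
    weight = defaultdict(int)
    for basket in baskets:
        for a, b in combinations(sorted(set(basket)), 2):
            weight[(a, b)] += 1

    # Keep only "strong" edges and build an adjacency list.
    adj = defaultdict(set)
    for (a, b), w in weight.items():
        if w >= min_support:
            adj[a].add(b)
            adj[b].add(a)

    # Connected components via depth-first search.
    groups, seen = [], set()
    for node in adj:
        if node in seen:
            continue
        stack, component = [node], set()
        while stack:
            n = stack.pop()
            if n in seen:
                continue
            seen.add(n)
            component.add(n)
            stack.extend(adj[n] - seen)
        groups.append(component)
    return groups

# Hypothetical purchase histories: each inner list is one basket.
baskets = [
    ["jeans", "belt", "tee"],
    ["jeans", "belt"],
    ["dress", "heels"],
    ["dress", "heels", "clutch"],
    ["jeans", "tee"],
]
print(strong_affinity_groups(baskets))
```

In practice a real deployment would use a graph database or a library such as NetworkX and richer edge weighting, but the core idea is the same: products that co-occur strongly end up in the same component, and each component is a candidate cluster for a floor set.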
You’re using Apache Hadoop and cloud-based data platforms, but can your BI and analytics tools keep up? Can you provide fast, secure, self-service access to all the data business users want?
Analyzing big data poses multiple challenges. Highly parallel distributed data architecture is one solution, but until recently it has been mostly limited to databases, not business intelligence (BI) application servers.
Join this informative webinar with guest speaker Boris Evelson, VP and principal analyst at Forrester Research, and Priyank Patel, co-founder and chief product officer at Arcadia Data. Enterprise architects, data scientists, and application development and delivery (AD&D) pros will learn:
- What is a distributed BI platform? How is it different from existing BI tools?
- How to scale BI and visual analytics for users without moving data
- What features matter most for distributed BI platforms for Hadoop
- How to unify security natively in Hadoop without more administration
Unstructured data is like renegade data. It doesn't fit into a spreadsheet with rows and columns, and it’s not in an ERP or CRM where you know what kind of data is in each cell, or how it relates to the rest of the data. So how do you create a sense of order among all this data chaos?
In this webinar you’ll learn about an alternative to overspending on SAN and NAS solutions: object storage. You’ll leave this webinar knowing:
- What unstructured data is
- What its storage challenges are
- The pros and cons of different storage options (SAN, NAS, object) for scale, performance, durability and cost
There are many purposes for storing unstructured data: greater customer insight, exposing security threats, or archiving original media files. The challenge, from a storage perspective, is how to create order from the chaos.
You can’t talk about big data today without hearing about Hadoop. For many people, Hadoop is synonymous with big data, but it’s not necessarily the best tool for every use case and every project. Before committing to it, businesses need to ignore the hype, look at their needs, and determine if and where Hadoop fits into their big data initiatives.
Join Bill Theisinger, vice president of engineering for platform data services at YP (formerly known as YellowPages.com), to hear how his company is maximizing the value it gets from Hadoop by supplementing it with HPE Vertica, an analytical database.
In this webcast, you’ll learn:
• The benefits and drawbacks of using Hadoop
• Where Hadoop can fit into big data initiatives
• How to supplement Hadoop with an analytical database
There's an overwhelming amount of information that comes from the connected world. Information sources are endless, but their credibility can be questionable. Cyber security teams can often relate to an overload of threat data from a variety of sources. Building an effective threat intelligence capability requires drilling down through all of the information to find the data that is most relevant to you. So where do you start? To avoid information overload, an organization needs to be selective about the sources it relies on to stay ahead of the threats and exploits that can compromise it. Join (ISC)2 and our sponsor Recorded Future for a From the Trenches webcast on March 2, 2017 at 1:00 PM ET for a discussion on threat intelligence sources, what's available out there, and how to separate the signal from the noise so you can spend less time on data collection and more time on analysis.
Almost everyone is concerned with the tooling to manage the big data lifecycle. From business people engaged with self-service analytics, to data scientists, data analysts, and data professionals from BI and IT organizations, it seems that nearly everyone is both a consumer and a provider of data.
Big data management software spans the data lifecycle supporting data profiling, transformation, enrichment, cleansing, matching and other functions. It is the glue that binds a big data environment together, fostering continuous alignment of data with dynamic and changing business needs.
During this webinar, Dave Wells, Research Analyst at Eckerson Group, and Kelly Schupp, VP of Data-driven Marketing at Zaloni, will discuss these tools and how to leverage them for high-impact analytics, drawing on research from Dave’s recent industry report titled “Big Data Management Software for the Data-Driven Enterprise”. Topics addressed:
- The kinds of tools that are needed to meet the challenges of big data
- The purpose, functions, and characteristics of data preparation tools
- The purpose, functions, and characteristics of pipeline management tools
- The purpose, functions, and characteristics of data cataloging tools
- The role of big data management tools for high-impact analytics
Dave Wells is an advisory consultant, educator, and industry analyst at Eckerson Group. He is dedicated to building meaningful connections throughout the path from data to business value. He works at the intersection of information management and business management, driving business impact through analytics, business intelligence, and active data management.
Kelly Schupp is Vice President of Marketing for Zaloni. Kelly has 20 years of experience in the enterprise software and technology industry. She has held a variety of global marketing leadership roles, and previously worked at IBM, Micromuse and Porter Novelli.