Welcome to the big data and data management community on BrightTALK. Join thousands of data quality engineers, data scientists, database administrators and other professionals to find more information about the hottest topics affecting your data. Subscribe now to learn about efficiently storing, optimizing a complex infrastructure, developing governing policies, ensuring data quality and analyzing data to make better informed decisions. Join the conversation by watching live and on-demand webinars and take the opportunity to interact with top experts and thought leaders in the field.
The rapid adoption of enterprise cloud-based solutions has made data integration one of the greatest challenges. The challenge grows as SaaS applications multiply and suitable connectors that fit your business needs are hard to find. Join this webinar, led by the Primitive Logic team, to see how data integration can be simplified for your cloud. This talk introduces a generic, declarative, zero-coding approach using the Informatica Cloud REST Connector as an example and illustrates its practical features, as well as best practices and hidden gems.
Moderator: Jill Reber, CEO, Primitive Logic
Panelists: JayJay Zheng, Technical Director & Eric Greenfeder, Chief Architect, Primitive Logic
Companies that embrace DevOps in the cloud develop apps faster, reduce overhead and can decrease downtime by 60% or even more. In this webinar, Glenn Mate, Solution Architect, and Ryan Lee, Cloud Infrastructure Engineer, both of 10th Magnitude, will share several use cases around how to integrate your DevOps processes into Azure. They will also emphasize CALMS, the framework that integrates DevOps teams, functions and systems within an organization around culture, automation, lean, metrics, and sharing.
By distilling the philosophy behind DevOps in the cloud, you’ll find automation and optimization at the core.
In this webinar, we dive deeper into analytics and talk about the advantages of data exploration and predictive modeling. We’ll also show attendees how to fine-tune the accuracy and precision of those analytic models so you can make critical decisions with confidence.
Do you know what your top ten 'happy' customers look like? Would you like to find ten more just like them? Come learn how to leverage 1st & 3rd party data to map your customer journey and drive users down a path where every interaction is personalized, fun, & data-driven. No more detractors: power your customer experience with data!
In this webinar you will learn:
-When, why, and how to leverage 1st, 2nd, and 3rd party data
-Tips & Tricks for marketers to become more data driven when launching their campaigns
-Why all marketers need a 360-degree customer view
Successful analytics and transactional systems start with proper architectural principles and considerations. This webinar will be a detailed walkthrough of all the software and hardware infrastructure below the database server. It will explain how each layer affects database workloads as well as give tools and metrics for determining optimal architectural configurations.
From this webinar you will learn:
• How to design SQL Server infrastructures for optimal performance
• How the software and hardware infrastructure affect database performance
• Tools and metrics to determine appropriate hardware allocations. For example, how many CPU cores, data volumes, tempdb files or GBs of memory are needed for your workload.
• Tools and metrics to find current bottlenecks, inside and outside of the database server
• Tools and best practices for monitoring all layers
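As a taste of the sizing heuristics such tools encode, here is a minimal sketch of one widely cited rule of thumb for tempdb. The numbers are generic community guidance, not figures from this talk: start with one tempdb data file per logical core, capped at eight.

```python
def tempdb_file_count(logical_cores: int) -> int:
    """Common starting point: one tempdb data file per logical core,
    capped at 8; add more later only if contention is still observed."""
    return min(logical_cores, 8)

for cores in (4, 8, 16, 32):
    print(f"{cores} cores -> {tempdb_file_count(cores)} tempdb files")
```

Real sizing also depends on workload, memory and storage layout, which is exactly what the tools and metrics covered in the webinar are for.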
Presented by Matt Henderson, who has over two decades of experience in designing high-performance database platforms and over five years working with the latest all-flash storage technologies. He currently is the Director of Microsoft Technologies at Vexata where he helps customers optimize and scale their mission-critical databases.
A key obstacle for doing data engineering at scale is having a robust distributed infrastructure on which frameworks like Apache Spark can run efficiently. On top of building the infrastructure, having proper automatic functioning of the infrastructure is another critical piece for running production workloads.
Join this webinar to learn how Databricks’ Unified Analytics Platform can help simplify your data engineering problems by configuring your distributed infrastructure to be in autopilot mode. Learn how:
-Databricks’ automated infrastructure allows you to autoscale compute and storage independently.
-Cutting-edge cluster management features can significantly reduce cloud costs.
-Cluster management controls let you balance ease of use against manual control.
Contracting is becoming more and more popular among professionals. In this episode Travis O’Rourke, Head of Hays Talent Solutions in Canada, discusses how you can determine if contracting is the right career path for you.
For more contracting tips and careers advice, visit our blog Viewpoint: www.haysplc.com/viewpoint
Bi-directional data movement need not be feared when using HVR for real-time data integration. In this video, Glenn Goodrich, Director of Enablement, explains how bi-directional data movement can be accomplished efficiently, accurately, and in real-time with HVR.
Chapter 1: What is Bi-Directional? 1:13
Chapter 2-1: How to Implement Bi-Directional 3:32
Chapter 2-2: Bi-Directional Considerations 5:03
Chapter 3: Why Implement Bi-Directional? 11:46
Additional Areas of Interest:
DDL Operations and Truncates: 7:15
Loop Avoidance: 8:15
Conflict Detection Resolution (CDR): 10:06
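Loop avoidance, one of the chapters above, is easy to illustrate in isolation: in bi-directional replication, each change is tagged with its origin site so it is never replicated back to where it came from. A toy sketch of the general technique (not HVR's actual implementation; the site names and change format are invented):

```python
def make_applier(site_id, forward):
    """Loop avoidance by origin tagging: every change records the site
    where it originated, and a site only forwards changes that
    originated locally, so replicated changes never loop back."""
    def apply(change):
        origin = change.setdefault("origin", site_id)
        if origin != site_id:
            return "applied"          # replicated in; do not re-forward
        forward(change)               # locally originated; send to peers
        return "applied+forwarded"
    return apply

forwarded = []
apply_a = make_applier("A", forwarded.append)
print(apply_a({"row": 1}))                  # local change -> forwarded
print(apply_a({"row": 1, "origin": "B"}))   # inbound from B -> not forwarded
print(len(forwarded))
```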
Paul Bruton discusses the move to a holistic approach to next gen data management. Looking at digital transformation strategies, he explains how Hitachi Vantara’s object storage can address common challenges - from cloud complexity to data governance and compliance - with its advanced custom metadata architecture to make data more intelligent.
Jason Hardy speaks about the evolution of the Hitachi Content Platform. Focusing on the latest addition to the portfolio, Hitachi Content Intelligence (HCI), he explains how it delivers a superior enterprise search experience. Learn how HCI can process and discover information from multiple data streams and find meaningful correlations between that data to enable data-driven decision-making.
Scott Nyman reflects on the long, interesting road in object storage, from niche-oriented work in the 2000s to today’s digital enterprise, where up to 95% of data is unstructured. Learn what makes Hitachi Content Platform stand out, and how it addresses even the most complex business requirements.
With the proliferation of cloud deployment options and platforms, management of application security across platforms has become a major problem for security teams. In this webinar, we address challenges posed by cloud proliferation, and how to approach development of a consistent security posture across platforms to better manage risks.
Increasingly, companies have turned to customer data platforms (CDPs) to help them run more relevant marketing campaigns using the large volume of customer data at their fingertips. Customer-focused software company Atlassian will discuss how they made that decision and the types of complex marketing campaigns that they can now run as a result. Research analyst David Raab will also be on hand to discuss how other companies can use CDPs to harness customer data to increase sales and loyalty.
In this webinar, you will learn:
* Key considerations when deciding to build versus buy a customer data platform
* How companies are taking advantage of CDPs for more relevant communications
* How data science can improve marketing efficacy
* The trends and market factors driving the need for customer data platforms
Speakers:
* Jeff Sinclair, Product Manager, Engagement Platform, Atlassian
* David Raab, Analyst and Founder at The Customer Data Platform Institute
* Jeff Hardison, VP of Marketing at Lytics
* Stewart Rogers, Analyst-at-Large, VentureBeat
The notion of a Smart Factory today is based on utilizing modern Industrial IoT concepts. While it is not practical for existing brownfield factories to modernize and adopt all of these technologies and best practices immediately, it is possible for factories to cherry-pick individual technologies and practices as a path to gradual modernization over time. These include technologies and best practices that allow for seamless machine interoperability and easy data sharing between machines, MES, ERP, and SCADA systems.
This webinar will provide an overview of some of these technologies and will also discuss the new industry standards that make such a migration possible. We’ll also share a proof of concept, the IIC Smart Factory Web Testbed, which is a network of smart factories with flexible adaptation of production capabilities and sharing of resources and assets to improve order fulfillment.
Featured IIC Testbed: Smart Factory Web
The Smart Factory Web forms a network of smart factories with flexible adaptation of production capabilities and sharing of resources and assets to improve order fulfillment. Key challenges being addressed: How can we connect factories to the Smart Factory Web and exchange data reliably? How can we provide the information securely at the right granularity to authorized partners? How can production capabilities be adapted quickly and efficiently in response to orders?
Sari Germanos, IIC Safety Task Group Chair; Business Development Manager, B&R Industrial Automation
Dr. Kym Watson, IIC Testbed Lead, Smart Factory Web; Deputy Head, Information Management and Process Control Department, Fraunhofer IOSB
According to Pulse, over 84% of Digital Transformation efforts are failing. Companies are worried about how they balance legacy systems while keeping pace with new nimble players in the Cloud Era.
How do you overcome the traditional silos between IT and the business to achieve success in Digital Transformation? What are the challenges politically, technically, and operationally that need to be overcome to turn your cloud vision into transformation reality?
Join our host Jeanne Morain, Author & Cloud/Digital Strategist and her special guests Aditya Vasudevan, Co-Founder, appOrbit and Chris Orlando, Co-Founder & Chief Sales & Marketing Officer, ScaleMatrix, as they explore the top 5 challenges and solutions both they and/or their customers have faced to achieve success in Digital Transformation, both from an IT and a business management perspective.
Please note that this webinar has been postponed to October 18th.
SAP Education is launching a series of 30-minute live interactive sessions on Wednesdays, starting September 13th. You will join thought leaders to find out how your company can benefit from SAP® Education best practices to enable successful adoption of SAP S/4HANA in support of your transformation goals.
Join us for a fast-paced and informative 60-minute roundtable as we discuss the latest—and potentially most game-changing—technology disruptors to traditional storage architectures since flash: NVMe over fabric and Storage Class Memory.
It was just five years ago that flash technology transformed the traditional storage market forever. Modern flash-first arrays are now the new normal for traditional storage. Will a new shared storage access protocol called NVMe over Fabrics (NVMe-oF), combined with the advent of storage class memory (SCM), be as disruptive to traditional storage over the next five years as NAND flash technology was in the past?
Vendors are now announcing new products and future architectures to enable them to support these new technologies. With the help of a lively panel of experts from HPE, Pivot3, and Mellanox, we will unpack this topic and explore how their innovative approaches to leveraging Storage Class Memory and NVMe-oF can radically improve storage products and solutions.
As digitalization and the Internet of Things (IoT) become commonplace, big data has the potential to transform business processes and reshape entire industries. But antiquated and expensive data storage solutions stand in the way.
A new generation of cloud storage has arrived, bringing breakthrough pricing, performance and simplicity. Cloud Storage 2.0 delivers storage as an inexpensive and plentiful utility, so you no longer have to make difficult decisions about which data to collect, where to store it and how long to retain it. This talk takes a look into how you can cost-effectively store any type of data, for any purpose, for any length of time. Join us to learn about the next great global utility, Cloud Storage 2.0.
-The next biggest cloud storage trends and technologies that are shaping the industry
-How to embrace the era of digital transformation and IoT without breaking the bank
-Best practices for storing, analyzing and utilizing big data
Howard Marks looks ahead at the storage trends rolling into IT Operations in the new year.
Today’s data center is a much more dynamic environment than ever before, leaving IT organizations struggling to compete with, or integrate with, public cloud services using tools and hardware designed for the much more static datacenter of yesteryear. In this webinar, long-time industry observer Howard Marks will examine five of the trends that IT operators should pay attention to in the coming year.
Our discussion will focus on the promise, tools and challenges involved in moving to hybrid architectures that bridge private data centers and cloud resources to shift workloads, support data-intensive compute operations, manage unstructured data growth, build affordable archives, and prepare for disaster recovery. The discussion will consider macro-level trends as well as the operational challenges that come with the growth of cloud.
Infrastructure architects, systems engineers, storage engineers, data strategists, HPC specialists and other enterprise IT professionals will learn:
- Current environment and challenges in building hybrid infrastructures that support traditional and object storage
- Five not-to-be-ignored trends ripe for adoption in 2018
- How compute and storage can mix in the new cloud-enabled world
Join ecommerce and cybersecurity experts from BigCommerce, Coalition Technologies and Signifyd for an in-depth discussion on the opportunities and pitfalls associated with various methods of growing your business.
CFOs rejoice! CEOs take to the streets in celebration! Ok, maybe it’s not quite that exciting, but did you know that you can get the best of both worlds in storage? One of the biggest challenges in storage has been paying for it. Because they try to plan for exactly how much storage they need right now versus how much they’ll need in the future, people often just overbuy, in the expensive hope that they’ll grow into it.
You actually have a whole lot of financing options at your disposal to pay for storage, from buying to leasing to simply paying for what you use, just like the cloud. Why pay for storage that you’re never going to actually use?
And, what happens when your storage gets too old? You buy new. What if you didn’t have to? What if you could pay a bit more in maintenance on your current system in exchange for an upgrade when the time comes?
Join Rob Commins, Sr. Director of Product Marketing for Tegile Systems, as he takes a deep dive into:
- Best practices for storing your data in the cloud
- How to keep cloud storage costs to a minimum
- How to scale data growth and storage capacity
Rob Commins has been instrumental in the success of some of the storage industry's most interesting companies over the past twenty years, including HP/3PAR, Pillar Data Systems, and StorageWay. At Western Digital, he leads the Data Center Systems business unit's product marketing team.
Google and Chef have collaborated to manage the Google Cloud Platform (GCP) natively with Chef. Leveraging Chef's idempotency, you can now declare cloud resources in Chef to manage GCP. By combining Chef Automate with GCP, we will show off the ability to programmatically spin up complete cloud infrastructures without leaving Chef or needing other tools.
This webinar will show the first wave of supported products in action, deploying a complete end-to-end solution into the cloud, from the ground up, straight from Chef.
If you have Chef in your infrastructure or are interested in using Chef, your administrators, auditors, and executives will be interested in viewing this webinar.
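The declarative, idempotent model described above can be sketched in miniature: converging actual state toward desired state applies only the missing changes, so re-running the same declaration is a no-op. A toy model in Python (not Chef's actual API; the resource names are invented):

```python
def converge(desired, actual, create, delete):
    """Idempotent convergence: diff the desired resource set against
    the actual one and apply only the changes needed to close the gap."""
    to_create = desired - actual
    to_delete = actual - desired
    for name in sorted(to_create):
        create(name)        # e.g. provision a cloud resource
    for name in sorted(to_delete):
        delete(name)        # e.g. tear down a stale resource
    return (actual - to_delete) | to_create

created, deleted = [], []
state = converge({"vm-web", "vm-db"}, {"vm-db", "vm-old"},
                 created.append, deleted.append)
print(created, deleted, state)
# A second run from the converged state performs no actions at all.
state2 = converge({"vm-web", "vm-db"}, state, created.append, deleted.append)
```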
The EU General Data Protection Regulation (GDPR) is the most important new regulation involving individuals’ information to emerge over the last few decades: it provides the foundation for how multi-national organisations and government agencies must protect sensitive customer information, and also how they can derive value from enterprise data.
MAKE COMPLIANCE GOOD FOR YOUR BUSINESS
In order to prepare for the May 2018 deadline, organisations must interpret the GDPR requirements, map processes and technology to them to ensure compliance, and quickly identify the correct information to be tightly managed and protected. Information insight is the key to solving these challenges of large volumes of data and high levels of complexity. The risks are high - lost customer confidence, security breaches, fines, sanctions, and potential lawsuits.
DO YOU KNOW WHERE YOUR DATA AND STORAGE SHOULD BE?
Enterprise IT is being transformed with the maturing of public cloud providers that offer compute, storage and application services with unprecedented elasticity, scale, resiliency and availability, on a consumption based economic model. However, the choice between public cloud and on-premises infrastructure is not a binary one.
Register to attend our webinar on Tuesday 17th October at 11:00 GMT to discover how HPE Pointnext GDPR Consulting services can provide the expertise and support to de-risk your journey to compliance. Plus learn how Commvault Hyperconverged Architecture can help you to overcome the complexities of GDPR through the consolidation of all roles performed by discrete servers in the traditional data protection architecture into a single software defined stack.
Iver van de Zand will talk about and demo the latest SAP innovations for analytics in the cloud. Key themes are live connectivity and the closed loop of combined business intelligence, planning and predictive analytics, all in one environment, fully ready and prepared for big data.
Join Esther Spanjer, Director, Enterprise Business Development EMEA, for this webinar, where she will discuss how SanDisk-branded SSDs and HGST-branded HDDs and SSDs are ideal for environments that run Hadoop databases. This webinar will look at cases where SanDisk and HGST devices can dramatically reduce query response times and minimize server sprawl by tuning compute and storage separately. Optimizing a Hadoop environment that suffers from queries that take too long to return data, or in which applications are not meeting their SLAs, will enable you to reduce your server footprint while massively increasing query performance.
Great minds need systems that match their imagination. Dell Precision Workstations put ultra-powerful tools at the disposal of the most talented experts to bring their ideas to life.
In this talk we visualize geolocation data collected by our devices to find out more about ourselves.
In the course of our digital lives we share a lot of information about the things we do. We give our data freely in exchange for discount vouchers, free wifi or a scoop of ice-cream.
It's not just businesses who can access this information about us to better target their products.
In this webinar we look at geolocation data collected by our devices and bring it to life with visualisations to find out more about ourselves. We invite you to join us for this session and take what you learn to play with your own data.
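If you want to warm up before playing with your own data, here is a minimal sketch of the kind of computation such explorations start from: the great-circle (haversine) distance over a short track of coordinate fixes. The track below is invented, not data from the session:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points in km."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# A hypothetical day of location fixes from a phone's log.
track = [(51.5074, -0.1278), (51.5155, -0.0922), (51.5194, -0.1270)]
total = sum(haversine_km(*a, *b) for a, b in zip(track, track[1:]))
print(f"distance covered: {total:.1f} km")
```

From here it is a small step to plot the fixes on a map or histogram the distances per day, which is the sort of visualisation the session explores.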
Many businesses are adding, or are considering adding, iPaaS infrastructure to their business and IT infrastructure. Yet many are neglecting to add a critical element: B2B integration. This webinar covers how to add B2B as part of your next-generation iPaaS infrastructure. You should attend this webinar to see the latest advancements in cloud B2B technologies as part of an iPaaS, and to hear about some of the latest real-life implementations.
Join us for the next webinar in the Bright Talk series of Advanced Analytics where we will discuss the future of advanced analytics, and how it can be shaped for everyone, regardless of technical expertise.
In this webinar, Michal Becker, Business Analyst at QbeeQ, will give a futuristic view of:
•Advanced analytics - how far away are we?
•What steps are needed to achieve the future of advanced analytics
•How to make advanced analytics more tangible and reachable for the average business user
•How AI and machine learning will bring us closer to achieving it
This webinar will also include practical, hands-on examples from Adam Blau, Product Manager at Sisense, who will discuss:
•A use case combining machine learning, bots, and physical indicators
•Easy-to-consume data that allows advanced calculations to be sent to a wider audience
•Advanced ranking mechanisms that helped a UK health organization improve operations
Big data creation, collection and applications have influenced business decisions more than ever in 2017, and if advances in artificial intelligence, virtual reality, and enterprise applications continue to prove valuable, 2018 will be even bigger.
With this in mind, how businesses approach and advance their big data maturity and cloud strategy can make or break the year. Whether it's running Hadoop workloads on-premises, in the public cloud, or in hybrid deployments, maintaining continuity among the data, the applications, and the users must be top of mind.
This webcast panel discussion will cover:
- Considerations for a hybrid-cloud strategy
- How organizations should approach a multi-cloud strategy
- The best use cases for cloud deployments
- The future of big data in the cloud
Matt Baird, CTO and Co-Founder, AtScale
Bruno Aziza, CMO, AtScale
David Tishgart, Director of Product Marketing, Cloudera
Dan Kogan, Director of Product Marketing, Tableau
Royal Philips of the Netherlands is a leading health technology company focused on improving people’s health and enabling better outcomes across the health continuum, from healthy living and prevention to diagnosis, treatment and home care. Philips has focused on creating people-centric innovations for 120 years.
By leveraging advanced technology and deep clinical and consumer insights, Philips delivers innovative, integrated solutions to its customers. These are meaningful innovations that let Philips’ customers be healthy, live well and enjoy life.
Join us as we speak to Philips about their modernization approach, master data strategy, and architecture to support data-driven innovations. In this webinar, you will:
•Hear Philips describe how the company uses a modern data integration hub architecture with master data management
•Learn how master data management provides one place to master clinical/health, consumer, vendor, product, financial, employee and machine-readable data
•See how a data integration hub publish/subscribe architecture simplifies controlled delivery of consistent master data
•Discover the benefits Philips has realized in improving decision quality, reducing cost of manual data reconciliation and alignment, and gaining competitive advantage through improved time-to-market efficiency
•See a deep-dive into the architecture and the benefits of this approach over traditional approaches
•Engage in a question and answer with Philips and Informatica experts
The Web is the most powerful communication medium and the largest public data repository that humankind has created. Its content ranges from great reference sources such as Wikipedia to ugly fake news. Indeed, social (digital) media is just an amplifying mirror of ourselves. Hence, the main challenge of search engines and other websites that rely on web data is to assess the quality of such data. However, as all people have their own biases, web content, as well as our web interactions, are tainted with many biases.
Data bias includes redundancy and spam, while interaction bias includes activity and presentation bias. In addition, sometimes algorithms add bias, particularly in the context of search and recommendation systems. As bias generates bias, we stress the importance of de-biasing data as well as using the context and other techniques such as explore & exploit, to break the filter bubble.
The main goal of this talk is to make people aware of the different biases that affect all of us on the Web. Awareness is the first step to be able to fight and reduce the vicious cycle of bias.
Ricardo Baeza-Yates's areas of expertise are web search and data mining, information retrieval, data science, and algorithms. Since 2016 he has been CTO of NTENT, a semantic search technology company based in California, USA. Before that, he was VP of Research at Yahoo Labs, based first in Barcelona, Spain, and later in Sunnyvale, California, from January 2006 to February 2016. He is also a part-time professor at the DTIC of Universitat Pompeu Fabra in Barcelona, Spain, as well as at the DCC of Universidad de Chile in Santiago.
Still running expensive legacy backups? Caringo Product Manager Glen Olsen explains how modern storage technology incorporates advanced data protection allowing businesses to reclaim wasted backup budget and resources.
This presentation details how to leverage technologies such as Hadoop, MapReduce, Pig and Sqoop to massively scale cloud integrations to services such as Salesforce.com. Attendees will learn how to overcome API limitations, throughput latency and infrastructure scaling challenges to significantly increase integration performance.
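One building block behind overcoming API limits and throughput latency, independent of the Hadoop stack named above, is client-side batching with retry and backoff. A minimal sketch (the batch size, retry policy and send callback are illustrative assumptions, not Salesforce's actual API):

```python
import time

def batched(records, size):
    """Yield fixed-size chunks, mirroring how bulk APIs accept records."""
    for i in range(0, len(records), size):
        yield records[i:i + size]

def upload_all(records, send, batch_size=200, max_retries=3):
    """Send records in batches, retrying with exponential backoff when
    the (hypothetical) endpoint signals rate limiting via RuntimeError."""
    for batch in batched(records, batch_size):
        for attempt in range(max_retries):
            try:
                send(batch)          # e.g. one bulk-API call per batch
                break
            except RuntimeError:     # stand-in for an HTTP 429 response
                time.sleep(2 ** attempt * 0.01)
        else:
            raise RuntimeError("batch failed after retries")

sent = []
upload_all(list(range(450)), sent.append, batch_size=200)
print([len(b) for b in sent])  # -> [200, 200, 50]
```

Frameworks like MapReduce then parallelize many such batched uploads across workers, which is where the massive scaling discussed in the talk comes from.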
You’re a CIO, CISO or DPO - and you’ve been woken up in the middle of the night because personal data held by your organization has been discovered for sale on the dark web. This disclosure puts the privacy of your customers at risk. What do you do next?
Join this session to learn about the impact of GDPR and go through a breach investigation and response scenario as it would unfold after GDPR comes into effect in May 2018. You’ll hear from Splunk’s Data Privacy Officer Elizabeth Davies and Splunk’s Security Ninja Matthias Maier.
What you'll learn:
● What breach response will look like under the GDPR
● What tools and processes a data privacy officer will rely on in case of a breach
● What departments and entities will be involved beyond IT
● What activities are currently happening within organizations to prepare for the GDPR
● What the consequences of the breach could be
As an enterprise customer, you are potentially using IBM Z in a hybrid cloud implementation. Let's look at how to benefit from cloud access to mainframe data without moving it off Z, thereby improving security, reducing integration challenges and answering your GDPR auditor's needs.
The concept of data lakes evolved to address challenges and opportunities in managing big data.
Organizations are investing massive amounts of time and money to upgrade existing data infrastructures and build data lakes whether on-premises or in the cloud.
This talk will discuss architectures and design options for implementing data lakes with open source tools. Also covered are the challenges of upgrading and migrating from existing data warehouses, metadata management, supporting self-service, and managing production deployments.
The use of an emerging data fabric offers enterprises a number of benefits and advantages, including the ability to break through the gravitational pull of legacy data architectures and capture the full potential of all your data.
This webinar will detail how the deployment of a data fabric can enable enterprises to more quickly and easily scale across data volumes, data types and locations. The session will also provide an overview on how a data fabric reduces storage costs and increases application agility and reliability – with the underpinning to support the successful pursuit of:
* IoT through a data fabric’s capability of handling data flows from the edge to the cloud, centralizing learning, and distributing intelligence back to the edge for real-time responsiveness.
* Machine Learning/AI with the fabric able to handle the complex data flows and logistics to support the rapid deployment and coordination across machine learning models, algorithms and analytic tools
* Microservices and containers with the underlying data fabric able to support intelligent streams and support the mobility and flexibility for elastic stateful applications and analytic processes relying on shared data.
Many organisations aspire to become digital, data driven enterprises. In these organisations data is viewed as a critical asset, both to generate new digitally based products and services, and to guide and improve business operations and decision making. But many companies are failing to live up to this aspiration. They struggle to develop and implement data strategies that align with, and help to deliver, new business strategies.
This webinar will explore what becoming ‘data driven’ really means, examine some of the reasons why many organisations are failing to realise their ambitions, and propose ways of overcoming the challenges. Key to these is a strong emphasis on the increasingly critical importance of established data management disciplines, especially Data Governance, Data Quality and MDM, which all have a critical role to play in the digital business of the future.
This session will explore:
•What is a data driven organisation and how does it differ from a traditional company?
•The main challenges of creating a data driven organisation
•Building a data driven capability - the role of business and IT
•The central importance of a business aligned Data Strategy and how to achieve it
•Why a successful data strategy needs an integrated focus on Data Governance, Data Quality and MDM
Discover the newly launched features in Qubole, powered by Data Intelligence, that automate mundane data model performance appraisal and simplify DataOps. This session will provide a detailed walkthrough of Qubole’s latest offering in Data Intelligence, which includes data model insights and recommendations covering partitioning, formatting, and sorting to help optimize data models for improved performance and computing resources. In addition, learn about Qubole’s latest offering in self-service analytics and how it can improve analysts’ productivity by making data discovery easy through column and table name auto-suggestion and completion, and insights preview.
Against the ever-changing and complex hybrid IT landscape, 451 Research surveyed more than 450 colocation customers globally – to understand the changing facets driving colocation demand and how colocation providers can be positioned for success today, and in the future. This research shows that the role of the colocation provider has never been more important – but the threats and challenges will intensify, and the opportunities will be uneven. Led by 451 Research’s Rhonda Ascierto, Research Director – Data Centers, and hosted by Greg Jones, VP Strategy – Cloud & Service Provider segment, Schneider Electric, this webinar will reveal the findings of this research along with actionable guidance colocation providers can use to plan for the future.
Public cloud deployments have become irresistible in terms of flexibility, low barriers to entry, security, and developer friendliness. But the sheer inertia of traditional data lakes makes them difficult to transition to the cloud. In this talk we'll look at examples of how leading companies have made the transition using open source technologies and hybrid strategies.
Instead of following a "lift and shift" strategy for moving data lake workloads to the cloud, there are new considerations unique to the cloud that should be weighed alongside traditional approaches related to compute (e.g., GPU, FPGA), storage (object store vs. file store), integrations, and security.
Viewers will take away techniques they can immediately apply to their own projects.