Welcome to the big data and data management community on BrightTALK. Join thousands of data quality engineers, data scientists, database administrators and other professionals to find more information about the hottest topics affecting your data. Subscribe now to learn about efficiently storing, optimizing a complex infrastructure, developing governing policies, ensuring data quality and analyzing data to make better informed decisions. Join the conversation by watching live and on-demand webinars and take the opportunity to interact with top experts and thought leaders in the field.
The new GDPR regulation goes into effect on May 25th. While a majority of conversations have revolved around the security and IT aspects of the law, marketing teams will play a crucial role in helping organizations meet GDPR standards, and will take on a more strategic role across the organization.
Join us to learn more, engage with your peers and get prepared.
This webinar will cover:
- How complying with the GDPR will drive better marketing and raise the quality of your customer engagement
- The GDPR elements marketers must know about
- The elements of PII that will be affected and what marketers need to do about it
- A deep dive on how GDPR regulations will affect your marketing channels - email, programmatic advertising, cold calls, etc.
- Tactical marketing updates needed to meet GDPR guidelines
Business intelligence (BI) has been at the forefront of business decision-making for more than two decades. Then along came Big Data and it was thought that traditional BI technologies could never handle the volumes and performance issues associated with this unusual source of data.
So what do you do? Cast aside this critical form of analysis? Hardly a good answer. The better answer is to look for BI technologies that can keep up with Big Data, provide the same level of performance regardless of the volume or velocity of the data being analyzed, yet give the BI-savvy business users the familiar interface and multi-dimensionality they have come to know and love.
This webinar will present the findings from a recent survey of Big Data and the challenges and value many organizations have received from their implementations. In addition, the survey will supply a fascinating look into what Big Data technologies are most commonly used, the types of workloads supported, the most important capabilities for these platforms, the value and operational insights derived from the analytics performed in the environment, and the common use cases.
Attendees will also learn about a new BI technology built to handle Big Data queries with superior levels of scalability, performance and support for concurrent users. BI on Big Data platforms enables organizations to provide self-service, interactive analytics on big data for all of their users across the enterprise.
If you are not yet ready for the General Data Protection Regulation (GDPR), the time is now to make it a key business initiative.
In this webinar we will discuss a best practice strategy that:
- Helps you achieve compliance with the Regulation
- Aligns your data, policies and people with your business strategy.
About the presenters:
Alistair Cole, Partner at Ixium Group, has 25 years of management and consulting experience with a wide range of international blue-chip clients across the Retail, Transport, Logistics, Finance and Telecoms sectors. From a transformational consultancy perspective, he has engaged with GDPR at all levels, and it has been his main focus for the last four years. Now working with clients across a number of different sectors, he has been responsible for establishing and driving GDPR compliance roadmaps and defining monitoring solutions.
Rex Ahlstrom, Chief Strategy Officer at BackOffice Associates, has over 28 years of technology industry leadership experience. He specializes in enterprise software within the data integration and information management space. Ahlstrom earned a Master of Science in Electrical Engineering from Johns Hopkins University and a Bachelor of Science in Electrical Engineering from Drexel University. He is an experienced global lecturer, regularly addressing investors, business professionals and large customer communities.
The GDPR deadline is just around the corner, and many companies are trembling at the starting line. But this new regulation need not be a burden; it can be viewed as an opportunity for profit and productivity.
Of course you are concerned about the risk of fines and the cost of compliance. But the same tools that help you master the challenges of GDPR can also help you gain important insights and real business value from your data. And by focusing on better data management, you can also cut costs while increasing efficiency and productivity!
Register for this live webinar with experts from Commvault and see how they take a different view of GDPR, showing you how to approach it from a positive, profitable and productive standpoint.
Even though the GDPR enforcement date is here, it’s never too late to get on board and start assessing your data processes to begin aligning with the requirements.
And even if your company is already prepared, May 25 is not a finish line.
Not only will the regulation itself continue to be clarified through interpretation, but certain triggers in your organization down the road will require a re-evaluation of your compliance status.
Join Jill Reber and Kevin Moos of Primitive Logic to learn:
- How to assess your GDPR readiness and identify gaps that need to be filled
- Why GDPR compliance is not a one-and-done endeavor
- The internal and external triggers that may demand a re-evaluation of your GDPR compliance
- How to build GDPR checks into your ongoing data management strategy
The world’s most expansive data privacy regulation -- the General Data Protection Regulation (GDPR) -- takes effect on May 25, 2018, and organizations of all sizes must be prepared to respond.
This session will provide powerful insights from recent Veritas research regarding consumer sentiment toward data privacy and the implications for accelerating your compliance journey. Additionally, we will cover a brief overview of the regulation’s most important tenets and key technology areas that can reinforce your compliance posture, including a helpful new framework for evaluating GDPR readiness.
Privacy is about people. If you don't know whose data you have, you can't protect their privacy. GDPR data subject access rights require companies to put that principle into practice. Yet existing tools can't tell you what personal information belongs to which data subject in order to respond to data subject requests.
A person-centric approach to locating and understanding data enables companies to effectively tackle the most challenging GDPR requirement: fulfilling Data Subject Access Requests at scale, across data sources and in minutes, not days.
In this webinar, you will learn:
- Why data subject access rights are a cornerstone of GDPR requirements
- The elements of the data subject rights lifecycle
- How a person-centric approach addresses GDPR requirements centered on data subject rights
- How a person-centric approach can bridge automation of request management with fulfillment at scale by IT teams
- What automation and fulfillment of a data subject access request looks like with a person-centric approach
Data is at the center of digital transformation; using data to drive action is how transformation happens. But data is messy, and it’s everywhere. It’s in the cloud and on-premises. It’s in different types and formats. By the time all this data is moved, consolidated, and cleansed, it can take weeks to build a predictive model.
Even with data lakes, efficiently integrating multi-structured data from different data sources and streams is a major challenge. Enterprises struggle with a stew of data integration tools, application integration middleware, and various data quality and master data management software. How can we simplify this complexity to accelerate and de-risk analytic projects?
The data warehouse—once seen as only for traditional business intelligence applications — has learned new tricks. Join James Curtis from 451 Research and Pivotal’s Bob Glithero for an interactive discussion about the modern analytic data warehouse. In this webinar, we’ll share insights such as:
- Why, after much experimentation with other architectures such as data lakes, the data warehouse has reemerged as the platform for integrated operational analytics
- How consolidating structured and unstructured data in one environment—including text, graph, and geospatial data—makes in-database, highly parallel, analytics practical
- How bringing open-source machine learning, graph, and statistical methods to data accelerates analytical projects
- How open-source contributions from a vibrant community of Postgres developers reduce adoption risk and accelerate innovation
With growing numbers of new vulnerabilities disclosed every year, increasing attacker sophistication, and a myriad of tools and teams that have to be synchronized for effective response, most organizations struggle with designing and implementing an effective vulnerability management program. In this webinar we discuss 3 key components that all modern vulnerability programs must address:
- Knowledge: How to create actionable intelligence from business context, threat intelligence, and any other relevant data source
- Automation: How to implement automation to streamline significant parts of the VM process
- Analytics: How to effectively engage and inform all stakeholders
An interactive webinar featuring Actian Vector analytics database. Sandy will provide a product overview and demo. Submit your question during the event for the live Q & A. Download the Evaluation Edition at https://www.actian.com/lp/try-vector/
Machine learning and deep learning architectures are driving efficiencies and automation across enterprise systems, and creating challenges for the infrastructure architects who must deliver the desired outcomes. These challenges manifest as infrastructure I/O bottlenecks that cause poor ingest throughput and slow cognitive-system training times due to excessive latencies. In this webinar, Toshiba and Vexata join forces to provide actionable information about the latest generation of NVMe-based solid state drive (SSD) media and how to architect systems that deliver low-latency, high-bandwidth I/O services to these advanced neural nets and cognitive systems.
Join this webinar and learn about the latest technologies, architectures and best practices purpose built for these AI ecosystems.
This webinar will cover:
1. Latest generation of SSD technologies that deliver low latency performance
2. Advances in storage architectures that maximize the SSD media
3. Use cases and customer examples that have deployed these architectures for machine learning environments.
Speakers: Joe Beda, Co-founder and CTO, Heptio + Gwen Shapira, Principal Data Architect, Confluent
With the rapid adoption of microservices, there is a growing need for solutions to manage deployment, resources and data for fleets of microservices. Kubernetes is a resource management framework for containers that is rapidly growing in popularity. Apache Kafka is a streaming platform that makes data accessible to the edges of an organization. It's no wonder the question of running Kafka on Kubernetes keeps coming up!
In this online talk, Joe Beda, CTO of Heptio and co-creator of Kubernetes, and Gwen Shapira, principal data architect at Confluent and Kafka PMC member, will help you navigate through the hype, address frequently asked questions and deliver critical information to help you decide if running Kafka on Kubernetes is the right approach for your organization.
-Get an introduction to the basic concepts you need to know as you plan to deploy services on Kubernetes.
-Learn which parts of the Kafka ecosystem fit Kubernetes like a glove, and which require special attention.
-Pick up useful tips for getting started.
-See why Confluent Platform for Kubernetes is the simplest solution to deploying and orchestrating Kafka on Kubernetes, using container images and a Kubernetes operator.
GDPR enforcement begins May 25th – a day after this webinar. Is your organization ready? What if you are just getting budget for GDPR compliance efforts? What if you just learned about the GDPR and your organization is subject to this law? If any of these scenarios apply, this is the webinar for you!
Lisa Berry-Tayman, a long-time privacy and information governance professional, will discuss:
• Benefits of being fashionably late to the GDPR party
• Getting your GDPR efforts back on track (or getting on the track for some folks)
• Rethinking the timeline and action items (or creating a timeline and action items for some folks)
Crawl out from underneath your desk, uncurl from the fetal position, and join us for an Executive Women’s Forum webinar on May 24th.
GDPR requires organizations to identify, classify, and protect personal information, but how do you prepare and protect against a possible breach if you don't know what data you have, where it lives, or how it's classified?
In this informative webinar we'll discuss:
• GDPR data classification requirements
• How to incorporate GDPR data analysis into your breach prevention and reaction plan
• How to classify and protect information across multiple data stores
• Solutions for automating classification and information protection
We look forward to sharing this information with you!
In this webinar, Bill Franks will cut through the hype surrounding artificial intelligence (AI) to help the audience better understand what they need to know about the state of AI today. He will share examples of AI in action and the value it can drive, what AI can and can't do today, how AI differs from (and is similar to) other analytics approaches, tips on getting started, and perspectives on the future of AI. The aim of this webinar is to be both educational and thought-provoking.
With the General Data Protection Regulation (GDPR) becoming enforceable in the EU on May 25, 2018, many data scientists are worried about the impact that this regulation and similar initiatives in other countries that give consumers a "right to explanation" of decisions made by algorithms will have on the field of predictive and prescriptive analytics.
In this session, Beau will discuss the role of interpretable algorithms in data science as well as explore tools and methods for explaining high-performing algorithms.
Beau Walker holds a Juris Doctor (law degree) along with BS and MS degrees in Biology and Ecology and Evolution. Beau has worked in many domains including academia, pharma, healthcare, life sciences, insurance, legal, financial services, marketing, and IoT.
Managing Massive Data Growth. It’s true – it is all about you. Your data, that is. Several trends have converged to accelerate the creation of valuable data, presenting new challenges to enterprise storage management teams. Many customers, along with some analysts, point to object-based storage as a key element in managing the growth of data. What is it that they see in object storage? Join us as we discuss: recent trends driving data growth, the benefits of software-defined object storage, features to consider, and real-world examples where object solutions are managing data growth issues.
Producing and managing the most valuable live sports broadcasts is being transformed by AI-enabled tools that automate critical parts of the production process.
From AI cameras focusing in on the action and automated multi-channel sound mixes, to powerful media management tools that link live action with real-time metadata to speed production and automate the creation of match highlights.
In this webinar we'll explore cutting-edge tools for live sports production and hear from leading broadcasters that are using them.
Daniel McDonnell, Managing Director, Timeline TV
Dr Rob Oldfield, Co-Founder, Salsa Sound
Jérôme Wauthoz, Vice President Products, Tedial
Dr Ignasi Rius Ferrer, Product Owner Automatic TV, MediaPro
Robert Ambrose, Managing Consultant, High Green Media
Get a deeper understanding of the underlying technical architecture behind a Data Makeover.
Does data preparation and analysis currently consume 80 percent of your time, with only 20 percent of your time spent actually creating insights? Stop spending all of your time on rear-view reporting and start seeing the data you really want to see.
Who Should Attend
Technical leaders who are interested in leveraging Microsoft Azure for enterprise data and analytics, including Big Data, Modern Visualization, IoT or Machine Learning.
Why You Should Attend
This educational session enables teams to understand the impact a Capax Global Data Makeover has had on other companies in Financial Services and Retail, and how we approach these projects in a way that gets progress into their hands quickly.
Approaching a Big Data project with feeding the data lake as your only concern is both restrictive and dangerous for keeping to budget and delivery deadlines. Taking data security, data quality and data governance into account from the outset will ensure the project runs smoothly and will establish these new technologies durably within the information system.
The three V’s of big data (velocity, volume, variety) continue to grow. There are more data types than ever, arriving faster, in sizes that traditional storage can barely keep up with. This is where transitioning to the cloud makes sense.
With its on-demand processing, storage scalability, and potential financial savings, the cloud is now a data-oriented organization’s dream. But what model is right for you? What challenges should you look out for? How do you migrate effectively?
Join Zaloni’s Director of Professional Services and Support, Raj Nadipalli, as he answers these questions - diving into cloud-based data lake use cases, a cloud-based data lake architecture, and more.
Topics covered include:
- Benefits of a cloud-based data lake (including hybrid and multi-cloud)
- Concerns with moving your data lake to the cloud
- Why metadata matters
- Cloud use cases
- A reference architecture
Platform companies are taking over, and changing the way companies do business. And with Open Banking now on the horizon, banks cannot afford to sit back and ignore the opportunities that collaboration offers.
Discussion points include:
- Forecasting the impact of increased collaboration in banking
- What are the partnership features to look for in an over-crowded market?
- Creating a mutually beneficial relationship: what should each party bring to the table?
- Ensuring you have sufficient interoperability to deploy new technology in open banking
- Understanding the impact of Open Banking: what will the future role of the bank be?
- What strategies will be key to retaining the customer interface?
PowerMax is the world’s fastest storage array, delivering up to 10 million IOPS with true end-to-end NVMe and built-in machine learning. Designed for performance, scale, and availability, PowerMax can handle even the toughest workloads including block, file, mainframe, open, and IBM i.
With rich data services such as SRDF, D@RE, deduplication and compression, and with features such as non-disruptive upgrades and migrations, data is always protected and available.
• Future-proof: end-to-end NVMe delivers up to 50% better response times, while inline dedupe and compression minimize footprint and maximize investment
• The world’s fastest storage array: up to 10M IOPS and 150GB per second of bandwidth
• PowerMaxOS: the only storage operating system designed to take advantage of next-generation media, with a built-in machine learning/AI engine leveraging predictive analytics and pattern recognition
• New inline deduplication and enhanced compression: no performance impact, works with all data services, and can be turned on/off by application
These new capabilities come to Dell EMC customers without compromise, adding to the full set of data services previously available on VMAX All Flash arrays. PowerMax provides fantastic investment protection through the Dell EMC Future Proof Storage Loyalty Program.
Wondering how to maximize your workforce’s productivity? It all starts with their first day and your company’s onboarding process. Effective onboarding can mean 50% greater new-hire productivity, and with talent acquisition costs at one-third of an employee’s compensation, that’s a fact that can’t be ignored. Yet many organizations struggle to develop a cost-effective and streamlined approach to technology enablement, though it’s critical to their onboarding efforts.
Join ASG Technologies on May 29th to hear from digital workspace expert, Michael Collins, and head of HR, Licia Williams, on how you can quickly onboard new staff, whether employees, contractors, or consultants, with the tools and information that they need, right from the start. All while reducing your onboarding costs substantially.
You’ll learn how digital workspaces:
• Support any device, anywhere – so you can engage the best talent regardless of location.
• Embed IT policies regarding security, governance, and regulatory compliance.
• Are tailored to an employee’s role, but support personalization to suit individual work styles.
• Offer a single point of control to facilitate a secure offboarding process.
Successful onboarding is NOT just a matter of completing paperwork, selecting benefits, and getting a tour of the office. Getting your workforce up and running quickly and effectively will accelerate productivity. With digital workspaces, IT and HR can partner to ensure every new employee hits the ground running on Day 1.
Data is increasingly available in open API format, and this is particularly applicable to SMEs and businesses - from accounting APIs to Open Banking.
We see a future of collaboration in business banking that will ultimately benefit SMEs/corporate clients. Businesses do not work in the same linear manner as consumers - companies have complex needs with multiple maturity periods. This requires algorithm driven technology utilising multiple sources for data innovation.
Our platform utilises private and public data including data via Bureau van Dijk for company and sector data, ownership and PEPs/sanctions; Fitch Solutions Financial Implied Ratings (FIRs) for global coverage of more than 23,000 banks including 20,000 previously unrated; and Defaqto for full coverage of business banking products, as well as OECD and Customs and Excise trade data. Add to this private data sources such as banking transactions via Open Banking, accounting via open APIs and tailored products via our panel banks.
Business banking will undergo a transformation over the next decade, driven by a combination of corporate demand and regulatory pressure on the banks. In the business banking space, banks and other financial institutions will revert to a focus on conventional banking, with increasing partnerships providing services to companies across the spectrum of financial products, along with more useful analytics tools linked to applicable products that change as the business life cycle and stage develop. Banks are considered digital identity guardians and have access to a wide range of data. When this data is used with an appropriate data analytics platform, along with simple product prompts and product execution, companies will benefit significantly from the changes underway in financial services.
The Cloud Native Computing Foundation builds sustainable ecosystems and fosters a community around a constellation of projects that orchestrate containers as part of a microservices architecture. CNCF serves as the vendor-neutral home for many of the fastest-growing projects on GitHub, including Kubernetes, Prometheus and Envoy, fostering collaboration between the industry’s top developers, end users, and vendors. In this webinar, Dan Kohn, CNCF Executive Director, will present:
• A brief overview of CNCF
• Evolving monolithic applications to microservices on Kubernetes
• Why Continuous Integration is the most important part of the cloud native architecture
In this data-driven age, the most successful companies are those who have insight into their data and can share this information across the business. Join us to hear how you can stay ahead of your competition and provide easy access to data, dashboards, sophisticated analysis and reporting.
Dr. Shiva Kintali will take you into the intriguing world of indispensable concepts of algorithms and data structures, cryptographic hash functions, digital signatures, hash pointers, Merkle trees and many other technologies that are used in making the blockchain a reality.
Join us for this technical deep dive and get all of your blockchain questions answered!
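To give a taste of the concepts on the agenda, here is a minimal sketch (illustrative only, not the speaker's material) of how a Merkle tree commits to a whole batch of transactions with a single hash, built from the cryptographic hash functions and hash pointers the session covers:

```python
import hashlib

def sha256(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves: list[bytes]) -> bytes:
    """Compute the Merkle root of a list of leaf values.

    Each leaf is hashed, then adjacent pairs of hashes are
    concatenated and hashed again until a single root remains.
    """
    if not leaves:
        raise ValueError("need at least one leaf")
    level = [sha256(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2 == 1:        # duplicate the last hash on odd levels
            level.append(level[-1])
        level = [sha256(level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0]

# One 32-byte root commits to the whole batch: changing any
# transaction (here just illustrative byte strings) changes the root.
txs = [b"alice->bob:5", b"bob->carol:2", b"carol->dave:1"]
root = merkle_root(txs)
assert merkle_root([b"alice->bob:6", b"bob->carol:2", b"carol->dave:1"]) != root
```

This tamper-evidence property is why each blockchain block header stores only the Merkle root of its transactions rather than the transactions themselves.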
Data lakes are centralized data repositories. Data needed by data scientists is physically copied to a data lake, which serves as a single storage environment. This way, data scientists can access all the data from only one entry point – a one-stop shop to get the right data. However, such an approach is not always feasible for all the data, and it limits the lake's use to data scientists alone, making it a single-purpose system.
So, what’s the solution?
A multi-purpose data lake allows a broader and deeper use of the data lake without minimizing the potential value for data science and without making it an inflexible environment.
Attend this session to learn:
• Disadvantages and limitations that are weakening or even killing the potential benefits of a data lake.
• Why a multi-purpose data lake is essential in building a universal data delivery system.
• How to build a logical multi-purpose data lake using data virtualization.
Do not miss this opportunity to make your data lake project successful and beneficial.
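As a loose illustration of the data virtualization idea behind a logical data lake (all names and sources here are hypothetical, not a vendor API), a logical layer can expose one unified view over physically separate stores without copying the data:

```python
# Minimal sketch of data virtualization: a logical layer that answers
# queries by federating two physically separate "sources" at read time,
# instead of copying everything into one central store first.
CRM_DB = {"c1": {"name": "Acme", "segment": "retail"}}            # source 1
BILLING_DB = {"c1": {"balance": 1200.0}, "c2": {"balance": 5.0}}  # source 2

def customer_view(customer_id):
    """Unified, read-time view over both sources; no data is moved."""
    merged = {}
    merged.update(CRM_DB.get(customer_id, {}))
    merged.update(BILLING_DB.get(customer_id, {}))
    return merged or None

# Consumers see one logical record regardless of where fields live.
print(customer_view("c1"))  # {'name': 'Acme', 'segment': 'retail', 'balance': 1200.0}
```

In a real deployment the sources would be databases, files or APIs behind a virtualization engine, but the principle is the same: many consumers, one logical access point, no physical consolidation.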
In this live webcast, Sean Hughes will provide an overview of Actian's data integration solutions including use cases and customer examples. Sean will also show a live demo before taking chat questions.
Data is the fuel and key differentiator for Jones Lang LaSalle (JLL), a global financial and professional services firm that specializes in commercial real estate services and investment management. Delivering valuable, actionable insights based on its voluminous data - at its clients’ fingertips - is their mantra.
Managing multiple integration environments that combine data from back-end building management systems, Internet of Things sensors and public data, spread across on-premises, big data and cloud environments, is not a trivial task.
In this webinar, Pankit Patel, Senior Architect at JLL, will share how JLL is leveraging machine-learning-based operational monitoring and analytics to get predictive insights and recommendations across all their deployments and assets, thus maximizing resource utilization with better capacity planning and optimizing costs, while improving business SLAs.
In this session, you will hear about:
•How JLL is unleashing the power of their operational data and analytics to drive IT productivity and business success
•Lessons learned by JLL in managing multiple integration environments to deliver data into the hands of their clients
•Optimally planning for the future to drive better business outcomes
The Executive Corner: Strategies to accelerate your transformation to world-class Employee Services.
- Find out how leading organizations are centralizing employee services the way they do customer service, with a one-stop place to get all services from HR, finance, facilities, and more
- Learn to recognize if your firm is ripe for leveraging ServiceNow to power your shared service initiative
- Delight employees and free professional staff for more value-add activities
- Drive Digital Transformation with Lean and Six Sigma underpinnings with ServiceNow
Does it seem like eDiscovery technology today is only for the “mega-firms” and “mega-cases”? What about for the “rest of us”? Are there solutions for the small firms and cases too? What does the average lawyer need to know about eDiscovery today and how to select a solution that’s right for them? This CLE-approved* webcast will discuss what lawyers need to know about eDiscovery, the various sources of data to consider, and the types of technology solutions to consider to make an informed decision and get started using technology to simplify the discovery process. Topics include:
+ How Automation is Affecting All Industries, including eDiscovery
+ Drivers for eDiscovery Automation Today
+ Challenges from Various Sources of ESI Data
+ Ethical Duties and Rules for Managing Discovery
+ Getting Data Through the Process Efficiently
+ Small Case Examples: Ernie and EDna
+ Key Components of an eDiscovery Solution
+ Types of Tools to Consider
+ Recommendations for Getting Started
* MCLE Approved in Selected States
Presentation Leader: Doug Austin
Doug is the VP of Products and Professional Services for CloudNine. At CloudNine, Doug manages professional services consulting projects for CloudNine clients. Doug has over 25 years of experience providing legal technology consulting, technical project management and software development services to numerous commercial and government clients.
Special Consultant to CloudNine: Tom O'Connor
Tom O’Connor is a nationally known consultant, speaker, and writer in the field of computerized litigation support systems. Tom’s consulting experience is primarily in complex litigation matters.
Go behind the scenes of the new “Machine Learning with TensorFlow on Google Cloud Platform” specialization on Coursera, which will teach you how to build production-ready machine learning models. During this webinar, Lak Lakshmanan, Google Cloud Machine Learning expert, will preview some of the Machine Learning techniques you will learn, walk through a demo of how to debug TensorFlow programs, and share insights on the exciting field of Machine Learning. Register now and get free access to a Machine Learning lab to jumpstart your learning. All webinar attendees will also receive a free voucher to take the first course of the Coursera specialization for free.
“Data is the new oil.” Just as we have to drill to get oil, we also need to mine data to get information out of it. Google, Facebook, Netflix and other titans of the digital era use data to build great products that touch every part of human life.
Regardless of scale, building a managed data lake on AWS requires a robust and scalable technical architecture, and teams often use microservices during the build process. A microservice architecture is centered around building a suite of small services, each focused on a business capability and independently deployable. Each service uses lightweight protocols and runs in its own process, which makes a microservice architecture ideal for building decoupled, agile, and automatable data lake applications on AWS.
Join this session with Sabyasachi Gupta, Software Architect at Zaloni, to learn more about:
- The what and why of a microservices architecture
- The different layers of a data lake stack
- Why metadata is important and how to capture it in AWS
- The relationship between Serverless and Microservices and available options on AWS
- How to build a data lake using microservice architecture on AWS
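To make the microservice idea concrete, here is a minimal sketch of one such service (every name here is illustrative, not Zaloni's or AWS's API): a small, independently deployable process that owns a single capability, dataset metadata lookup, and speaks a lightweight protocol (HTTP plus JSON):

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical "catalog" microservice for a data lake. It owns exactly
# one business capability (dataset metadata) and nothing else.
CATALOG = {"sales_2018": {"format": "parquet", "zone": "raw"}}

def lookup(name):
    """Resolve a dataset name to an HTTP status code and a JSON body."""
    entry = CATALOG.get(name)
    status = 200 if entry else 404
    return status, json.dumps(entry if entry else {"error": "not found"})

class CatalogHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # GET /sales_2018 -> 200 with the dataset's metadata as JSON
        status, body = lookup(self.path.strip("/"))
        self.send_response(status)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body.encode())

def serve(port=8080):
    # Each microservice runs in its own process, so it can be deployed,
    # scaled and replaced independently of its peers.
    HTTPServer(("", port), CatalogHandler).serve_forever()
```

A data lake stack built this way composes many such services (ingestion, catalog, governance), each deployable on its own, which is what makes the architecture decoupled and automatable.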
Post-Cambridge Analytica/Facebook, the use and misuse of personal data is high in the public’s mind. But what does the story mean for data and our perception of how it should be controlled and owned?
At In:Confidence 2018, the Open Data Institute’s Jeni Tennison made the argument that digital privacy rights require individual consumers to have ownership of data about them. Yet personal data is often about multiple people, not just one, adding to the complexity of the debate around data ownership.
Jeni questions whether a prospective future where we benefit from our decisions being informed by data while being protected from any harmful impacts is realistic. And how contributing to, developing and promoting a global rights framework for data might seem like a hard journey, but it is one we need to make if we are to use data to build a better future and better society for everyone.
The data ownership and privacy debate is more relevant than ever. Tune in and explore as Jeni delves into the key talking points.
Information Governance helps organizations manage the proper usage of one of their most critical assets - data. Not only is it vital to set clear policies that govern the use of data, but equally important is the enforcement and measurement of those policies with feature-rich tools and technologies.
Emerging applications are business-user focused and contain a breadth of capabilities that assist in the day-to-day tasks of Data Stewards as they keep watch over critical information assets.
In this webinar, Tyler Warden, Vice President of Product Management at BackOffice Associates, weighs in on:
• Information Governance best practices and key concepts
• Where the market came from and how it evolved
• The requirements of the practice and the technology supporting it for effective governance
• How these practices and technologies can drive innovation and reduce risk for your business
As VP of Product Management, Tyler is responsible for the product strategy and roadmap of BackOffice Associates' software products, which include a platform for Data Stewardship, Data Migration, Enterprise Data Quality, Master Data Management, and Information Governance.
Alan Hays is currently a Vice President in the Americas and leads the Infor Services Account Management team for Manufacturing and Distribution. He has more than twenty years’ experience consulting with manufacturing organizations and is knowledgeable in many areas, including management of production and operations.
Learn how to use the new HPE 3PAR plugins for vRealize Orchestrator (vRO), Ansible, and Chef, and the benefits of automating HPE Storage within your company's private cloud and DevOps initiatives. This session will cover use cases and demos for the HPE 3PAR plugins.
Predictive analytics provide SMBs with a competitive advantage by allowing them to make business decisions proactively, using predictive data, rather than reactively, using historical data. Whether building an online recommendation engine, rapid targeting for marketing, anomaly detection for security associated with credit card transactions, or automated workflows for service desks, predictive analytics can process more data faster, generate better quality results, and modify and repeat analytics tasks more efficiently than a team of human analysts.
Key takeaways from this webinar include:
- Understand the business advantages gained by utilizing machine learning
- See how machine learning is currently being utilized
- Gain insights into how machine learning can be tailored for your business
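One of the use cases above, anomaly detection on credit card transactions, can be illustrated with a tiny sketch. This is a deliberately simplified example (plain z-scores over transaction amounts, with a made-up history and a threshold chosen for illustration), not a production fraud model, but it shows the basic idea of flagging records that deviate sharply from learned historical behavior.

```python
from statistics import mean, stdev

def flag_anomalies(amounts, threshold=2.5):
    """Flag amounts whose z-score against the history exceeds the threshold.

    A real system would use richer features and a trained model;
    a z-score over a single feature is the simplest possible baseline.
    """
    mu, sigma = mean(amounts), stdev(amounts)
    return [x for x in amounts if abs(x - mu) / sigma > threshold]

# Hypothetical transaction history: nine routine purchases and one outlier
history = [12.5, 9.99, 14.2, 11.0, 13.75, 10.5, 12.0, 9.5, 11.25, 950.0]
print(flag_anomalies(history))  # the 950.0 transaction stands out
```

The advantage machine learning adds over this baseline is exactly what the webinar describes: the ability to process far more data, with far more features, and to retrain and repeat the analysis faster than a team of human analysts.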
In today's data-fueled world, organizations are increasingly looking to the cloud for analytics to help accelerate time-to-value, spin up resources at will, and reduce financial risk while enabling in-house talent to focus on value-add.
The path to cloud is rarely straight and wide, however, and a number of factors affect each company's journey:
- Data gravity and migration
- Network connectivity and latency
- Use cases, culture, skills, and expectations
Tune in to this exciting webcast to hear three cloud experts – an architect, an advocate, and an evangelist – engage in a lively, free-flowing conversation about the many considerations when moving analytics into the cloud. One thing's for sure: you'll pick up new knowledge and get a fun, fresh perspective on one of the hottest topics of our era.
- Kevin Bogusch, Cloud Solution Architect, Teradata
- Marc Clark, Global Field Enablement, Teradata
- Brian Wood, Cloud Marketing, Teradata
In this webinar you will learn what makes Actian's Vector the fastest analytic database available, along with use cases and customer examples. Pradeep will walk through how easy it is to evaluate the product so you can see for yourself.
Search and AI-driven analytics are vastly expanding the reach of data-driven insights by bringing ad hoc data access to the masses. Anyone can easily find answers to their data questions, including those questions they may not have thought to ask.
Tune in to this live webinar as we show you what it’s like to be your own data analyst with advances in Search and AI-driven analytics.
Serverless offers huge potential to transform the way businesses build and architect cloud applications. There is no need to provision infrastructure or deal with maintenance, updates, scaling, or capacity planning: simply upload your apps to Amazon Web Services (AWS) Lambda and everything required to run and scale them, including high availability, is taken care of automatically.
Join AWS and TIBCO to understand what a serverless architecture is all about and the benefits of running your apps in the serverless environment.
What we’ll cover:
- Serverless overview
- Microservices to functions
- When and where to use a serverless architecture
- What TIBCO is doing to ease the transition to serverless
- Cost savings and other benefits
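To make the "no infrastructure to provision" point concrete, here is a minimal sketch of what a serverless function looks like. It uses the standard AWS Lambda handler signature (`handler(event, context)`); the event payload and greeting logic are invented for illustration, and locally we simply call the handler directly where, on AWS, a trigger such as API Gateway or an S3 event would supply the payload.

```python
import json

def handler(event, context):
    """A minimal AWS Lambda-style handler.

    Lambda invokes handler(event, context) on demand; there is no
    server to provision, and AWS scales the function automatically.
    """
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }

# Local invocation for testing; on AWS, a trigger supplies the event.
response = handler({"name": "TIBCO"}, context=None)
print(response["body"])
```

Everything outside this function, including the runtime, scaling, and availability, is the platform's responsibility, which is precisely the transition from microservices to functions that the session covers.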
Positioning Information Security within the enterprise presents its own set of challenges. Our recent survey data from hundreds of senior security and IT leaders like you uncovered a number of systemic security challenges – from skills shortages to retention strategies; not to mention responding to new challenges around cloud and IoT, and other organizational and operational issues. Join 451 Research and (ISC)2 on June 5, 2018 at 1:00PM Eastern as Research Director and former CISO Daniel Kennedy discusses this survey data and takes questions from the audience.