Welcome to the big data and data management community on BrightTALK. Join thousands of data quality engineers, data scientists, database administrators and other professionals to find more information about the hottest topics affecting your data. Subscribe now to learn about efficiently storing data, optimizing a complex infrastructure, developing governing policies, ensuring data quality and analyzing data to make better-informed decisions. Join the conversation by watching live and on-demand webinars and take the opportunity to interact with top experts and thought leaders in the field.
Pepperdata has engineered a big data APM solution that empowers operators to automatically optimize the performance and capacity of their big data infrastructure while enabling developers to improve the performance of their applications.
Unlike other APM tools that merely summarize static data and make application performance recommendations in isolation, Pepperdata delivers complete system analytics on hundreds of real-time operational metrics continuously collected from applications as well as the infrastructure — including CPU, RAM, disk I/O, and network usage metrics on every job, task, user, host, workflow, and queue.
The result is a comprehensive, intuitive dashboard that provides a holistic view of cluster resources, system alerts, and dynamic recommendations for more accurate and effective troubleshooting, capacity planning, reporting, and application performance management.
Pepperdata diagnoses problems quickly, automatically alerts about critical conditions affecting system performance, and provides recommendations for rightsizing containers, queues and other resources. Leveraging AI-driven resource management, Pepperdata tunes and optimizes infrastructure resources to recapture wasted capacity and get the most out of the infrastructure.
Welcome to the new world of real-time big data application and infrastructure performance management.
Welcome to Pepperdata.
Optimize your infrastructure, your applications, and your time — at scale.
Use Application Spotlight for free on up to 20 Nodes
Pepperdata Application Spotlight is a self-service APM solution that provides developers with a holistic and real-time view of their applications in the context of the entire big data cluster, allowing them to quickly identify and fix problems (failed Spark applications, for instance) to improve application runtime, predictability, performance and efficiency.
Lars Muller, CRM Partners AG, TIBCO MVP | Recorded: Mar 25, 2019 | 37 mins
Data migration projects are growing more complex. With companies adopting more applications, especially cloud applications, and using more data sources than ever before, migrating just one system to the cloud can be a high-stakes, risky project. CRM Partners AG has a history of implementing successful migration projects, such as a recently completed project for a global manufacturer. Their methodology for data migration projects is designed to reduce the cost and risk they may encounter.
Watch this webinar to learn:
- The advantage of using the TIBCO Cloud™ Integration cloud-based (iPaaS) integration platform for your migration project vs. custom code
- Key decisions you need to make at the outset of your project that will minimize risks
- How to prepare your data for a successful migration
- How to test and execute your migration to ensure a smooth launch
Josiah Kimani, Nicola Askham | Recorded: Mar 25, 2019 | 39 mins
The next Master Data Management (MDM) webinar will focus on the effect that MDM has in ensuring that a Data Governance (DG) strategy comes to life. Many organizations have a great Data Governance strategy but struggle with buy-in. Equally, many organizations have a great desire to resolve Data Quality issues, harmonize data across the enterprise, de-duplicate data and drive data ownership using technology. Could the two, DG and MDM, be the perfect combination to deliver the necessary solution that meets the business strategies? Josiah Kimani of Profisee will be answering this question.
Florent Voignier, Co-founder and CTO – Indexima | Recorded: Mar 25, 2019 | 21 mins
Hybrid architectures are growing as companies run more and more applications both in the cloud and in their own data centers.
This webinar aims to share best practices for business intelligence and analytics applications with hybrid data sources (cloud and on-premise).
1. How to reduce query costs and increase your BI performance
2. Key factors to succeed with BI on hybrid data sources
3. How to get more from your existing data cluster
4. A demo of Indexima on 3 data sources, allowing instant analytics on all business data for all users
Duration: 30 minutes
Speaker: Florent Voignier, Co-founder and CTO – Indexima
Rubrik Datos IO offers fast NoSQL data recovery, any-to-any data mobility, and cost savings to organizations that rely on Apache Cassandra, DataStax Enterprise or MongoDB. In this webcast we’ll do a deep dive into how Rubrik Datos IO can save time by automatically discovering and identifying your databases and tables, how to quickly build a test/dev database from production data, and how Datos IO saves 70% or more on secondary storage with semantic deduplication – a capability unique to Rubrik.
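Rubrik's semantic deduplication is proprietary, but the storage-savings arithmetic behind any deduplication scheme can be sketched with plain content hashing. The records and savings ratio below are purely illustrative, not Datos IO's algorithm:

```python
# Illustrative only: content-hash dedup shows why keeping one copy per
# unique payload shrinks secondary storage. Real semantic dedup works on
# database-aware representations, not raw strings.
import hashlib

def dedup_savings(records):
    """Fraction of storage avoided by keeping one copy per unique payload."""
    seen = {hashlib.sha256(r.encode()).hexdigest() for r in records}
    return 1 - len(seen) / len(records)

# Hypothetical backup set: heavy duplication across snapshots.
rows = ['{"id":1,"v":"a"}'] * 7 + ['{"id":2,"v":"b"}'] * 3
print(dedup_savings(rows))  # most of the rows are duplicates
```

Actual savings depend entirely on how repetitive the backed-up data is.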
Mitesh Shah, Sr. Cloud Product Manager | Recorded: Mar 21, 2019 | 59 mins
Organizations are adopting cloud at a fast pace, and migrating critical enterprise information resources can be a challenge in a complex big data landscape. Building the right data services architecture can help alleviate the pain points; Data Virtualization comes to the rescue by enabling companies to gain maximum benefit from cloud initiatives in the form of agility, cost savings and more.
In this webinar, you'll learn:
* How Denodo Platform's multi-location architecture can simplify and accelerate cloud migration
* Best practices for deploying Denodo Platform in the cloud
* How to leverage Denodo's virtual data services layer to address and augment cloud solutions such as Data Warehouse modernization, Data Science, and Data Lakes in the cloud
* A demo showcasing data virtualization and analytics in the cloud
Tolga Tarhan, CTO at Onica | Recorded: Mar 21, 2019 | 30 mins
Learn how AWS can accelerate your IoT strategy by allowing for rapid IoT prototyping in less than 6 weeks. By leveraging AWS' connectivity, storage, application, and analytics capabilities, you can build IoT-enabled apps quickly and at a fraction of the cost. We'll cover serverless IoT architecture as well as do a live demo on how to connect & configure devices on AWS IoT Core.
What we’ll cover:
* Device connectivity & security
* Ingesting data with AWS IoT
* Reacting to IoT data in real-time
* Storing data in Amazon DynamoDB and Amazon S3
* Building a simple RESTful API to consume the stored data
* Analytics: Batch processing, analytics at the edge device with AWS Greengrass
* IoTanium by Onica for quick hardware/software prototyping
* Customer success: automation and remote management with AWS IoT
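The "reacting to IoT data in real time" step above can be sketched as a plain Python handler. This is illustrative only, not Onica's implementation or the AWS SDK; the threshold and payload field names are hypothetical stand-ins for what an AWS IoT rule might hand to a function:

```python
# Hedged sketch: the kind of logic an AWS IoT rule action might trigger,
# simulated as a local handler. Field names and threshold are made up.
import json

TEMP_ALERT_THRESHOLD = 80.0  # hypothetical alert threshold

def handle_iot_message(payload: str) -> dict:
    """React to one device message: build the record to store, flag alerts."""
    reading = json.loads(payload)
    return {
        "device_id": reading["device_id"],
        "temperature": reading["temperature"],
        "alert": reading["temperature"] > TEMP_ALERT_THRESHOLD,
    }

msg = json.dumps({"device_id": "sensor-1", "temperature": 85.2})
print(handle_iot_message(msg))
```

In a real deployment the returned record would be written to DynamoDB or S3 by the rule action rather than printed.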
Seth Earley, Founder and CEO, Earley Information Science; Tom Davenport, author of The AI Advantage | Recorded: Mar 21, 2019 | 48 mins
Artificial Intelligence (AI) is becoming more practical every day for organizations of all sizes. Many successful implementations are having subtle but far-reaching implications for companies across industries. In this roundtable we will discuss:
1. Where to begin looking for practical opportunities to leverage AI in your organization
2. How market forces are driving this evolution
3. Prioritizing incremental wins over big hit projects
Kevin Bender, ASG & Brad Steinmeyer, Zia Consulting | Recorded: Mar 21, 2019 | 29 mins
As companies strive to make the Cloud a coherent part of their overall IT infrastructure, we often hear customers asking, “Are we taking the best approach to get the most from our Content?” Navigating the best path for your organization involves more than ‘just moving content to the Cloud’.
Blending traditional and modern IT models can offer cost savings as well as the agility you need to compete in the digital world. Watch this webinar to hear ASG and Zia Consulting discuss best practices developed while helping organizations construct and implement successful cloud strategies:
• How Cloud migration can enhance IT agility and business competitiveness
• What Consolidation vs. Migration really means
• How Efficiency and Security can and should ‘work together’
• How to measure immediate gains that align with your long-term vision
• Why the Cloud is important as you drive digital transformation
Chris Kissel, IDC, Research Director, Worldwide Security Products; Dawn Bedard, Senior Technologist, ELEVI Associates; B. Dunlap | Recorded: Mar 21, 2019 | 60 mins
Today’s sophisticated cybersecurity attacks often unfold in the blink of an eye. To respond quickly, your security teams need to see security incidents as they happen to ensure that attempts to hack your server environment are thwarted before entry into your machines. Whether it's detecting malware, disrupting command-and-control communication, or stopping ransomware and phishing attacks, DNS can help with this and much more. But are you leveraging it as part of your cyber strategy? Nearly all threats use the DNS system, so threats in your network can easily be seen in your DNS data. Join Infoblox and (ISC)2 on March 21, 2019 at 1:00PM Eastern as we bring in experts from IDC and ELEVI for a discussion on how leveraging DNS can help identify attacks as they happen or even prevent them before they happen, remediate attacks faster, and help detect and stop malware from spreading.
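One concrete example of "seeing threats in DNS data" is flagging algorithmically generated domains (DGAs), which tend to have higher character entropy than human-chosen names. This is a generic heuristic sketch, not Infoblox's detection logic, and the threshold is an illustrative assumption:

```python
# Hedged sketch: Shannon entropy of the leftmost DNS label as a crude
# DGA indicator. Real products combine many signals; this is one.
import math
from collections import Counter

def shannon_entropy(label: str) -> float:
    """Bits of entropy per character in the label."""
    counts = Counter(label)
    n = len(label)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def looks_generated(domain: str, threshold: float = 3.5) -> bool:
    """Flag domains whose first label has unusually high entropy."""
    label = domain.split(".")[0]
    return shannon_entropy(label) > threshold

print(looks_generated("google.com"))                # human-chosen name
print(looks_generated("xj4k9q2mzp7wv3ht8r.com"))    # random-looking label
```

In practice the threshold would be tuned against known-good traffic, since short legitimate labels can score high by chance.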
The competition is coming. By 2020, eighty percent of SaaS players will move to a subscription-based model, which means your success strategies have to evolve — and that includes the all-important fundamental: how your subscriptions are structured.
Your subscription pricing and plan structure have a profound impact on subscriber acquisition and retention, and once you nail the formula, you'll see revenue increase. Do you choose an annual or monthly plan? Offer a trial or discounts?
This is where a test plan is critical. Combined with benchmark data, you can test and iterate your way to the right pricing and plan structure. And when important decisions are backed by comprehensive benchmark data, you’re far more likely to see an increase in adoption and revenue.
Join this VB Live event to discover essential SaaS benchmarks and best practices to maximize revenue, improve acquisition, and spur adoption.
Registration is free!
Attendees will learn:
* Important SaaS benchmarks by industry segment
* How to structure your SaaS subscription plans and pricing to maximize revenue and retention
* How successful SaaS companies use a test, learn, and iterate framework to optimize revenue
* The key metrics — and reports — to monitor for success and maximum LTV
* The results of an in-depth case study on SaaS testing and pricing
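One of the "key metrics to monitor" for subscription businesses is customer lifetime value (LTV). As a hedged illustration of the concept (a standard steady-state formula, not Recurly's benchmark methodology, with made-up inputs):

```python
# Illustrative only: the classic simple LTV model.
# LTV = ARPU * gross margin / monthly churn rate.
def lifetime_value(arpu: float, gross_margin: float, monthly_churn: float) -> float:
    """Expected gross profit per subscriber over their lifetime."""
    return arpu * gross_margin / monthly_churn

# Hypothetical plan: $50/month, 80% gross margin, 2.5% monthly churn.
print(round(lifetime_value(50.0, 0.80, 0.025), 2))  # roughly $1,600
```

The formula makes the pricing trade-off concrete: halving churn doubles LTV, which is why annual plans and trial design matter so much.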
*Panelist: Emma Clark, Chief of Staff, Recurly
*Moderator/Analyst: Sean Joyce, Recurring Revenue Technologies, Navint
Keith Lockhart, VP Strategic Programs, AccessData; Richard Hickman, Manager, Digital Forensics & Incident Response, Eide Bailly | Recorded: Mar 21, 2019 | 51 mins
More and more cybersecurity experts agree: it may no longer be a matter of if, but when an organization will be breached—either by accident, employee misconduct or malicious attack. What do you do when the inevitable happens? By acting responsibly and responsively, a company can save itself significant penalties and reputational damage. During this webinar, you’ll discover best practices for effectively navigating today’s alphabet soup of evolving data privacy regulations—including the GDPR, CCPA and other US and international data privacy rules—in the event of a breach:
• Understand responsibilities and ramifications of ever-tightening data privacy rules
• Minimize damages and mitigate risk exposure with rapid response strategies
• Identify various breach threats and common network vulnerabilities
• Analyze key learnings from prominent breach events
• See how technology can help you investigate, automate and document key processes
Join us as we continue this series of webinars specifically designed for the community by the community with the goal to share knowledge, spark innovation, and further build and link the relationships within our HPCC Systems community.
Featured speakers include:
Vincent Freeh, Professor, NC State University – HPCC Systems as a Service (HaaS)
There are numerous reasons to use an IaaS for HPCC Systems instead of dedicated hardware, especially if the workload does not execute 24/7. We developed a CloudFormation Template and an AMI for HPCC Systems and a reference architecture for HPCC Systems in AWS. Significant effort was expended to determine the best set of resources for HPCC Systems clusters. Furthermore, we created a program to create and manage HPCC Systems clusters in AWS from the command line. This talk will present the tools we created and also explain the reference architecture and many of the configuration options.
David de Hilster, Consulting Software Engineer, LexisNexis Risk Solutions, New ECL IDE Features in 7.0
The ECL IDE is an integrated development environment for ECL programmers to create, edit, and execute ECL code within the HPCC Systems platform. The latest 7.0 version includes new features and enhancements such as a more comprehensive autocomplete, tooltips and F12 capabilities. In this talk, David will discuss how users can leverage these features and more.
Bob Foreman, Senior Software Engineer, HPCC Systems, LexisNexis Risk Solutions - ECL Tip: A Tiny Trove of TABLE Tidbits
This month’s ECL Tip of the Month will focus on the ECL TABLE function. Common (and some not-so-common) use cases will be discussed. The code examples demonstrated will also be available for download.
Andrew Hollister, Chief Architect and Product Manager, LogRhythm Labs | Recorded: Mar 21, 2019 | 46 mins
Effective security operations are the first line of defence when it comes to preventing cyberattacks. In order to accomplish this, organisations need a mature security program that leverages people, process, and technology to enable rapid detection and response.
Yet some organisations struggle with the overall effectiveness of their security operations because they lack a model for maturing their capabilities. A mature security operation enables threats to be detected earlier in the cyberattack lifecycle, which is critical.
Join this webinar to discover LogRhythm's Security Operations Maturity Model (SOMM) which was developed to help organisations assess their level of maturity and plan for making improvements over time.
Organisations can use this model to evaluate their current security operations and develop a roadmap to achieve the level that is appropriate in the light of their resources, budget, and risk tolerance. You will also hear about the critical measures of security operations effectiveness.
Simon Böpple - Territory Account Manager, Informatica | Sören Eickhoff - Sales Consultant, Informatica | Recorded: Mar 21, 2019 | 32 mins
In this webinar, we use a practical example to demonstrate how data can be prepared and transformed quickly and easily to create better customer experiences.
Join Informatica to learn how the challenges of integrating IoT and IT data can be overcome and a foundational view of the data established, increasing the value of all IoT projects.
Monica Mullen, Principal Solutions Marketing Manager, MDM & Prash Chandramohan, Director, Product Marketing | Recorded: Mar 20, 2019 | 69 mins
Companies of all sizes going through a digital transformation are relying on a foundation of accurate, complete, and relevant data. With trusted data, they fuel a number of powerful initiatives including customer experience, operational efficiency, and advanced analytics. But, errors in data exist. And when most companies start looking into their data, they realize the data problems are worse than they thought.
They look to improve the quality of their data, but navigating the options for trusted data solutions can be daunting. In this webinar, Prash Chandramohan and Monica Mullen of Informatica will focus on outlining the difference between Master Data Management and Data Quality.
If you’re trying to build a foundation of trusted data to fuel your digital transformation and want to better understand the tools and their purpose, register for this webinar today to learn more. We’ll share how a trusted source of clean, relevant, and complete data can help you succeed in 2019 as a data-driven company.
Erik Ottem, Senior Director, Western Digital; Stephen Hill, Senior Analyst, 451 Research | Recorded: Mar 20, 2019 | 59 mins
Data protection has always been one of the top priorities of IT; but the very nature of business data has been quietly changing behind the scenes. Today, unstructured data in the form of documents, images and other types of media files are making up the lion’s share of new data growth, and traditional storage and backup models are unable to provide the information and visibility needed to protect, manage and realize the full potential of the growing mountain of unstructured business data.
Ruben Diaz, Data Scientist at Vision Banco and Rafael Coss, Maker at H2O.ai | Recorded: Mar 20, 2019 | 60 mins
In the financial industry, data can translate to revenue if used correctly, yet financial institutions need to operate with scale, speed, and immense accuracy.
Data scientists at Visión Banco needed to improve the bank’s credit scoring process, including predicting existing customer behavior and churn, determining credit risk, and offering credit to new customers. Join our webinar to learn how the bank saved time and improved accuracy by building and deploying models using H2O Driverless AI. As a result, the Paraguayan bank has doubled its rate of customer propensity to buy.
Join our webinar to learn:
• How to automate machine learning modeling to create more models faster and scale data science efforts
• How you can use high-performance computing to solve complex data challenges such as real-time targeting of promotions or customer churn predictions
• How one financial institution now easily determines credit risks and expands offers to customers using H2O Driverless AI
• How you can optimize business processes across your financial institution, such as evaluating credit scores or credit risk, detecting fraud, or performing analysis for Know Your Customer (KYC)
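The propensity and churn predictions described above come from automatically built models; as a rough intuition for what such a model outputs, here is a tiny hand-weighted logistic scorer. This is illustrative only — not Driverless AI, not Visión Banco's model — and the features and weights are invented:

```python
# Hedged sketch: a minimal logistic "churn propensity" score with
# hypothetical, hand-picked weights (a real system learns these).
import math

WEIGHTS = {"months_inactive": 0.6, "num_products": -0.4, "late_payments": 0.8}
BIAS = -1.0

def churn_propensity(features: dict) -> float:
    """Map customer features to a probability-like score in (0, 1)."""
    z = BIAS + sum(WEIGHTS[k] * v for k, v in features.items())
    return 1.0 / (1.0 + math.exp(-z))

low_risk = churn_propensity({"months_inactive": 0, "num_products": 3, "late_payments": 0})
high_risk = churn_propensity({"months_inactive": 6, "num_products": 1, "late_payments": 2})
print(low_risk < high_risk)  # True
```

The value of automation is in learning thousands of such weights from data and validating them, rather than guessing them as done here.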
John Armstrong, Head of Product Marketing, Pepperdata | Recorded: Mar 20, 2019 | 28 mins
Moving workloads to the cloud is either a reality or a near-term goal for an overwhelming number of enterprises. For most organizations, optimizing cloud use to improve operational efficiency and achieve cost savings is the primary objective. But navigating cloud adoption is a complex process that requires careful planning and analyses to achieve desired economic goals and ensure success. It’s a technology decision that has significant impact on the business.
Economic benefits vs. costs must be accurately estimated and carefully weighed before making a move to the cloud, not just for the cluster but for every workload queue. This webinar will take the guesswork out of calculating cloud migration costs and provide you with the detailed analyses you need to make fully informed technical and business decisions before embarking on your cloud migration journey.
This webinar addresses critical questions for organizations considering or already deploying big data workloads in the cloud:
- How accurate are my cloud migration and long-term deployment cost estimates?
- What queues will be more cost-effective in the cloud, and which ones are better left on-premises?
- What AWS, Azure, Google, or IBM cloud instances will work best for each of my queues? CPU-optimized? Memory-optimized? General purpose?
- How can I help my team to make a successful transition to deploying workloads using the public cloud?
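The per-queue cost question above can be sketched as a simple comparison of estimated monthly cloud spend against on-prem cost for each workload queue. Everything here — the linear pricing model, the rates, and the queue numbers — is a hypothetical illustration, not Pepperdata's methodology or any provider's actual pricing:

```python
# Hedged sketch: compare a crude usage-based cloud estimate to a known
# on-prem cost, queue by queue. All rates and figures are made up.
def monthly_cloud_cost(vcpu_hours: float, gb_hours: float,
                       vcpu_rate: float = 0.04, gb_rate: float = 0.005) -> float:
    """Toy linear cost model: compute + memory usage times unit rates."""
    return vcpu_hours * vcpu_rate + gb_hours * gb_rate

queues = {
    "etl-nightly": {"vcpu_hours": 20_000, "gb_hours": 80_000, "onprem": 1_500},
    "ad-hoc-bi":   {"vcpu_hours": 3_000,  "gb_hours": 12_000, "onprem": 150},
}

for name, q in queues.items():
    cloud = monthly_cloud_cost(q["vcpu_hours"], q["gb_hours"])
    verdict = "cloud" if cloud < q["onprem"] else "on-prem"
    print(f"{name}: cloud=${cloud:,.0f} vs on-prem=${q['onprem']:,} -> {verdict}")
```

Even this toy model shows why the answer differs per queue: a heavy but steady queue may amortize on-prem hardware well, while a bursty one may not.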
Seyi Verma, Director of Product Marketing, Druva | Recorded: Mar 20, 2019 | 50 mins
It's no secret that managing the backup and restore of your virtual machines on-premises and in the cloud is unnecessarily complex. Managing multiple vendors and disparate solutions not only increases your costs and degrades visibility, but it also puts your organisation's critical data at risk.
Luckily, there's a better way with an all-in-one, cloud-native solution.
Join our webinar and learn how Druva can solve your VMware data protection challenges, including:
- Reducing your overall storage footprint by consolidating your data
- Streamlining data management and protection, reducing costs by up to 60%
- Meeting your RTO and RPO requirements for better business continuity
- Leveraging enterprise cloud backup for instant disaster recovery
John Grim, David Kennedy | Recorded: Mar 20, 2019 | 85 mins
Join us, the Verizon Threat Research Advisory Center, for our Monthly Intelligence Briefing (MIB) to discuss the Verizon Insider Threat Report and the current cybersecurity threat landscape:
• The time it takes to discover a breach after the first action
• 5 insider threat types identified by our caseload
• Top 3 insider threat motivations
• Top industries suffering sensitive data breaches
Our Verizon Threat Research Advisory Center presenters will be:
• John Grim, Senior Manager, Investigative Response - America
• Domingo Jesus Alvarez-Fernandez, Senior Dark Web Hunter, Threat Intelligence
• David Kennedy, Managing Principal, Open Source Intelligence
Peter Simpson, VP Panopticon Streaming Analytics, Datawatch + Tom Underhill, Partner Solutions Architect, Confluent | Recorded: Mar 20, 2019 | 53 mins
When it comes to the fast-paced nature of capital markets and IoT, the ability to analyze data in real time is critical to gaining an edge. It’s not just about the quantity of data you can analyze at once, it’s about the speed, scale, and quality of the data you have at your fingertips.
Modern streaming data technologies like Apache Kafka and the broader Confluent platform can help detect opportunities and threats in real time. They can improve profitability, yield, and performance. Combining Kafka with Panopticon visual analytics provides a powerful foundation for optimizing your operations.
Use cases in capital markets include transaction cost analysis (TCA), risk monitoring, surveillance of trading and trader activity, compliance, and optimizing profitability of electronic trading operations. Use cases in IoT include monitoring manufacturing processes, logistics, and connected vehicle telemetry and geospatial data.
This online talk will include in depth practical demonstrations of how Confluent and Panopticon together support several key applications. You will learn:
- Why Apache Kafka is widely used to improve performance of complex operational systems
- How Confluent and Panopticon open new opportunities to analyze operational data in real time
- How to identify and react immediately to fast-emerging trends, clusters, and anomalies
- How to scale data ingestion and data processing
- How to build new analytics dashboards in minutes
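The "react immediately to fast-emerging anomalies" idea can be sketched as a rolling z-score over a stream — the kind of check a Kafka consumer feeding a dashboard might apply. This is a generic illustration, not Confluent or Panopticon code, and the window and threshold are assumptions:

```python
# Hedged sketch: flag stream values far outside the recent window.
from collections import deque
import statistics

def anomalies(stream, window=20, z_threshold=3.0):
    """Yield (index, value) for points whose z-score vs. the recent
    window exceeds the threshold."""
    recent = deque(maxlen=window)
    for i, x in enumerate(stream):
        if len(recent) >= 5:  # need a few points before scoring
            mu = statistics.mean(recent)
            sigma = statistics.pstdev(recent) or 1e-9  # guard zero spread
            if abs(x - mu) / sigma > z_threshold:
                yield i, x
        recent.append(x)

ticks = [100.0, 100.1, 99.9, 100.2, 100.0, 100.1, 130.0, 100.1]
print(list(anomalies(ticks)))  # flags the 130.0 spike at index 6
```

In a capital-markets setting the same shape of check might run per instrument on trade prices; in IoT, per sensor on telemetry values.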
Paul Moxon, VP Data Architecture and Chief Evangelist, Denodo | Recorded: Mar 20, 2019 | 59 mins
Advanced data science techniques, like machine learning, have proven extremely useful for deriving valuable insights from existing data. Platforms like Spark, and complex libraries for R, Python and Scala, put advanced techniques at the fingertips of data scientists. However, these data scientists spend most of their time looking for the right data and massaging it into a usable format. Data virtualization offers a new alternative to address these issues in a more efficient and agile way.
Attend this webinar and learn:
* How data virtualization can accelerate data acquisition and massaging, providing the data scientist with a powerful tool to complement their practice
* How popular tools from the data science ecosystem (Spark, Python, Zeppelin, Jupyter, etc.) integrate with Denodo
* How you can use the Denodo Platform with large data volumes in an efficient way
* About the success McCormick has had as a result of seasoning the Machine Learning and Blockchain Landscape with data virtualization
Nelson Petrack, CTO, TIBCO Software | Mar 26, 2019, 10:00 am UTC | 36 mins
According to the newest research released by SITA, Blockchain is fast emerging among airports and airlines as the priority technology for making the travel experience more efficient. The most commonly expected use of blockchain is for passenger identification, with 40% of airlines saying it would offer a major benefit, and more use cases around passenger experience and airport operations are emerging.
Watch this webinar with TIBCO CTO Nelson Petrack presenting:
- A new business model where the traveller is highly connected: can a blockchain be used to streamline airport operations and improve the customer experience? Increase security?
- How can blockchain play a role in passenger identity management? What about issues like privacy, GDPR, and blockchain’s immutable behaviour (meaning no deletes)?
- What is the role of consensus in an enterprise/permissioned blockchain deployment? What are some examples?
Learn more about Project Dovetail™ at https://www.tibco.com/resources/community/project-dovetail.
Erik Bartholomy (LogRhythm) and Eric Parent (Sonepar) | Mar 26, 2019, 10:00 am UTC | 47 mins
Building a Cybersecurity Architecture to Combat Today’s Risks:
Foundations, Visions, Testing and the Role of SIEM
The rapid development and adoption of cloud infrastructure, mobile workforces, IoT and other susceptible environments has mandated a reappraisal of security architecture. Modern organisations are recognising their security posture may not be keeping up with the threat landscape, and this leads to frightening discoveries around the safety of their data and networks.
Join Erik Bartholomy – a Security Architect at LogRhythm – and Eric Parent – CISO at Sonepar – to discover how security architecture is developing to face the current threat landscape. Failures in past layered approaches are frequent, and they serve as valuable lessons on the importance of proactive monitoring and response.
During the webinar you will learn:
• How LogRhythm’s POC enabled Sonepar’s team to efficiently expedite threat detection, and improve their SOC and analyst efficiency
• The guiding principles and technology behind modern security frameworks and architecture, including the rise in popularity and value of the Zero Trust Framework
• How security architecture helps align IT security with business strategy and current threats
• Adapting architecture to accommodate different environments, including on-premises, cloud, and hybrid cloud
John Bell & Tony Lokko, Caringo Senior Consultants | Mar 26, 2019, 2:00 pm UTC | 60 mins
FileFly is the solution you need for complete, automated and flexible data lifecycle management of unstructured data—from creation to preservation. Working with both NetApp and Windows filers, FileFly allows you to set policies that guide the level and target of file data movement based on your organization’s requirements. Senior Consultants John Bell and Tony Lokko explain how FileFly works and can now be used with any of the Big 3 cloud providers (Azure, Google and Amazon) as well as Caringo Swarm Object Storage. They will demonstrate use of FileFly and have a live Q&A session so you can learn from their extensive experience in architecting, deploying and managing storage solutions.
Jim Davis, Fluke Networks | Mar 26, 2019, 3:00 pm UTC | 71 mins
This presentation will look at some of the requirements for field testing Single Mode fiber to make sure it will support the new applications you are planning to run on it. Starting with inspection and cleaning, it will answer many of those little questions: “Can I use that click pen from UPC multimode connectors on APC connectors? How do I inspect APC connectors?”
From there, the presentation will move on to the calculation of a loss budget: what is the correct budget, and which wavelengths should be tested? We will look at best practices in Tier I loss testing to ensure minimum uncertainty in our test results. The presentation will explain how this can be done both for single connectors such as LC and SC and for MPO connections.
Finally, we will look at troubleshooting the fiber using an OTDR and give an introduction to the identification of common events in Single Mode OTDR traces.
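The loss-budget calculation the presentation walks through is a straightforward sum of component losses. As a sketch, the defaults below follow commonly cited TIA-568 maximums (0.75 dB per mated connector pair, 0.3 dB per splice, 0.5 dB/km for outside-plant single-mode fiber); verify them against the standard governing your own installation before relying on them:

```python
# Hedged sketch of a single-mode fiber link loss budget.
# Default allowances are common TIA-568 maximums; confirm for your plant.
def loss_budget_db(length_km: float, connectors: int, splices: int,
                   fiber_db_per_km: float = 0.5,
                   connector_db: float = 0.75,
                   splice_db: float = 0.3) -> float:
    """Total allowed link loss = fiber attenuation + connector + splice loss."""
    return (length_km * fiber_db_per_km
            + connectors * connector_db
            + splices * splice_db)

# Example: 2 km link, 2 mated connector pairs, 1 splice.
print(loss_budget_db(2.0, connectors=2, splices=1))  # about 2.8 dB
```

A measured Tier I loss above this budget indicates a dirty or damaged connector, a bad splice, or a bend worth troubleshooting with the OTDR.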
Jim Silhavy, Manager of Software Engineering, Progress | Mar 26, 2019, 4:00 pm UTC | 45 mins
Get up to speed on the latest security advances for cloud, big data and relational databases. In the wake of increasingly frequent data breaches and emerging data protection laws like GDPR, enterprise security has become paramount. Spending on enterprise security is forecast to increase to $96.3 billion in 2018.
As organizations continue to invest in business intelligence, big data, IoT and cloud, IT teams are introducing increasingly complex security mechanisms to authorize and encrypt access to data. Your analytics and data management tools need to be able to securely access data spread across different systems. They must run within existing policies, and without introducing security risks.
Listen to the Progress experts as they explain what it takes to securely access data from rapidly evolving enterprise data access layers.
Watch this webinar to learn about:
- Advances in end-to-end data security
- Latest technologies for big data and cloud—Knox, Ranger, Sentry, Kerberos, OAuth
- Newest security innovations in enterprise databases such as Oracle, SQL Server and more
- How Progress configures and tests security in complex environments
Any integration project can have gotchas, such as custom fields or duplicate records that can slow you down. Proper up-front planning can help you successfully manage the gotchas, reduce the time needed and achieve better results.
In this presentation, integration experts from LyntonWeb discuss:
- How to prepare for your HubSpot-Dynamics CRM integration project to get the results you want
- How to apply this advice to other marketing automation-CRM integration projects, including Salesforce integrations
Shane Swiderek, Brett Orr, Peter Girgis, Gavin Adams | Mar 26, 2019, 5:00 pm UTC | 55 mins
Innovative IoT solutions are solving real-world problems, with data serving communities in ways you would never expect. Who would have predicted the hidden insights and cost savings waiting to be uncovered in our trash? Well, Bigmate! This IoT solutions provider turns real-time sensor data into insights.
Realizing the trash collection process was outdated and inefficient, Bigmate developed a system with IoT sensors providing location, temperature, and bin level monitoring to optimize trash pickup. Saving money, streamlining operations, and optimizing processes, IoT sensor data can help your business discover new ways to solve old problems.
Join our interactive webinar panel with Bigmate and AWS to learn:
- How city-wide IoT implementation is optimizing trash pickup
- How to translate raw data into actionable insights for users
- The ins and outs of Bigmate’s IoT platform
Speakers:
- Gavin Adams, Sr. Solutions Architect, IoT at Amazon Web Services
- Peter Girgis, Chief Technology Officer, Bigmate
- Brett Orr, General Manager, Bigmate
- Shane Swiderek, Product Marketing Manager, TIBCO Jaspersoft
Francois Dansereau, Director of Migration Services Team | Mar 26, 2019, 5:00 pm UTC | 39 mins
Have you been reluctant to change your Source Code Management (SCM) system because it’s too risky? Our migration process has helped Compuware ISPW customers make the switch, and they found the grass to be greener on the other side. In this webcast, you’ll hear their stories. We know migrating away from an entrenched mainframe SCM system is daunting. Our customers have massive amounts of code in their SCM systems and numerous active development projects that cannot be disrupted. They also lack both in-house expertise in SCM migration and headcount to spare for large migration projects.
Leveraging 24+ years of experience, our team of SCM migration experts, with their proven methodology and state-of-the-art migration tools, seamlessly migrates customers to ISPW. The Compuware ISPW SCM Migration Service is a customized solution for organizations migrating from CA Endevor, CA Panvalet, CA Librarian, Micro Focus/Serena ChangeMan, IBM RTC, as well as homegrown systems, to ISPW SCM. This service is designed to quickly and economically take you through all the steps of your migration, from project planning, data migration and integration to testing and knowledge transfer. Our methodology and infrastructure provide a single, lighter, more robust and agile SCM environment.
This proven SCM migration practice:
• Accurately finds and documents codebase characteristics, including histories and inter-application dependencies, so they can be fully preserved in the new environment
• Manages migration in targeted, right-sized stages to accelerate time-to-benefit and limit exposure to risk
• Rigorously tests code artifacts in the customer’s new SCM environment and brings them to a near-live state in parallel with the old environment to ensure zero disruption of development work
• Improves code structure and reduces code complexity by reorganizing the code base during the migration
• Involves an end-to-end methodology addressing all critical dimensions of the migration project
When streaming data meets machine learning and advanced analytics, the innovation possibilities can be endless. Operationalizing data movement in a hybrid cloud architecture is key to making your technology investments deliver on their promises. Without it come frustrated developers, failed projects, and technology disillusionment.
Join Doug Cutting, Apache Hadoop creator and Chief Architect at Cloudera, and Arvind Prabhakar, co-founder and CTO at StreamSets as they discuss how to use DataOps to avoid common pitfalls associated with adopting modern analytics platforms.
Srini Srinivasan, Co-Founder and Chief Development Officer | Mar 26 2019 | 5:00 pm UTC | 60 mins
The Aerospike Connect family of add-on modules, along with a new REST API, now makes it even easier to integrate the Aerospike Database into both new and existing enterprise infrastructure, helping customers modernize and optimize their data architecture. We are happy to announce Aerospike Connect for Kafka and Aerospike Connect for Spark as our two newest add-on modules.
Join us on March 26 at 10am PT to hear from Aerospike Founder and Chief Product Officer, Srini Srinivasan, who will cover:
- The Aerospike Connect product strategy
- How Aerospike Connect for Spark enables companies to directly integrate the Aerospike database with their existing Spark infrastructure
- How Aerospike Connect for Kafka makes it easy for enterprises to exchange data bi-directionally between the Aerospike database and enterprise transactional systems at the edge and core
- How the Aerospike REST Client can be used as a simple, standard interface that enables developers to work with the Aerospike data layer
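To make the REST idea concrete, here is a hedged sketch of addressing a single record over HTTP. The `/v1/kvs/{namespace}/{set}/{key}` path follows the REST client's key-value API as commonly documented, but treat it as an assumption and verify it against your deployed version:

```python
# Illustrative sketch of addressing an Aerospike record via the REST Client.
# The /v1/kvs/{namespace}/{set}/{key} path is an assumption; check your
# deployed REST client's API docs before relying on it.
import json
from urllib.parse import quote

def record_url(base, namespace, set_name, key):
    """Build the key-value endpoint URL for a single record."""
    return f"{base}/v1/kvs/{quote(namespace)}/{quote(set_name)}/{quote(str(key))}"

url = record_url("http://localhost:8080", "test", "users", "user42")
payload = json.dumps({"name": "Ada", "visits": 3})  # bins to write
# An actual write would be something like:
#   requests.post(url, data=payload,
#                 headers={"Content-Type": "application/json"})
```

The appeal of the REST interface is exactly this: any language with an HTTP client can read and write Aerospike records without a native driver.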
Guest speaker Mike Gualtieri, Forrester Research, and Ingrid Burton, H2O.ai | Mar 26 2019 | 6:00 pm UTC | 60 mins
Artificial Intelligence (AI) is influencing every industry, and decision makers are being asked: What is your AI strategy for 2019? Most have begun thinking about how AI can be incorporated into their business strategy, but the exponential growth of AI resources and offerings is making it difficult to find the right fit for one's organization. What is needed is a practical approach to AI that separates signal from noise when deciding on an enterprise AI strategy. In this webinar, guest speaker and Forrester Research Vice President & Principal Analyst Mike Gualtieri maps out the seven key elements of an enterprise AI strategy.
Paul Brebner, Technology Evangelist, Instaclustr | Mar 26 2019 | 9:00 pm UTC | 45 mins
As distributed applications grow more complex, dynamic, and massively scalable, “observability” becomes more critical. Observability is the practice of using metrics, monitoring and distributed tracing to understand how a system works. In this webinar we’ll explore two complementary Open Source technologies: Prometheus for monitoring application metrics; and OpenTracing and Jaeger for distributed tracing. We’ll discover how they improve the observability of an Anomaly Detection system - an application which is built around Instaclustr managed Apache Cassandra and Apache Kafka clusters, and dynamically deployed and scaled on Kubernetes (on AWS EKS).
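The metrics half of this stack is easy to picture: a Prometheus server periodically scrapes a `/metrics` endpoint that serves samples in a simple text format. In practice you would use the official `prometheus_client` library; this hand-rolled sketch just illustrates the exposition format itself (the metric name and labels are invented for the example):

```python
# Minimal sketch of the Prometheus text exposition format that an
# application's /metrics endpoint serves for scraping. Real applications
# should use the official prometheus_client library; the metric name and
# labels here are hypothetical.

def render_metric(name, labels, value):
    """Render one sample: metric_name{label="value",...} sample_value."""
    label_str = ",".join(f'{k}="{v}"' for k, v in sorted(labels.items()))
    return f"{name}{{{label_str}}} {value}"

line = render_metric(
    "anomaly_checks_total",
    {"cluster": "cassandra", "result": "ok"},
    1527,
)
```

Distributed tracing (OpenTracing/Jaeger) complements this: metrics tell you *that* latency spiked, traces tell you *which* hop in the Kafka-to-Cassandra path caused it.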
Priya Shivakumar, Director of Product, Confluent + Ryan Lippert, Product Marketing, Google Cloud | Mar 27 2019 | 10:00 am UTC | 56 mins
Most companies start their cloud journey with a new use case or a new application. Sometimes these applications can run independently in the cloud, but often they need data from the on-premises datacenter. Existing applications will migrate slowly, and will need a strategy and the technology to enable a multi-year migration.
In this session, we will share how companies around the world are using Confluent Cloud, a fully managed Apache Kafka® service, to migrate to Google Cloud Platform. By implementing a central-pipeline architecture using Apache Kafka to sync on-prem and cloud deployments, companies can accelerate migration times and reduce costs.
Register now to learn:
- How to take the first step in migrating to GCP
- How to reliably sync your on-premises applications using a persistent bridge to the cloud
- How Confluent Cloud can make this daunting task simple, reliable, and performant
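The central-pipeline pattern the session describes can be sketched conceptually: both the on-prem and cloud sides produce to and consume from one shared Kafka topic, and the topic's ordered log keeps the two deployments in sync during migration. A real implementation would use a Kafka client (e.g. `confluent-kafka`) against Confluent Cloud; below, a plain list stands in for the topic's append-only log:

```python
# Conceptual sketch of the central-pipeline pattern: a shared, ordered
# event log keeps on-prem and cloud deployments in sync. A plain list
# stands in for the Kafka topic; a real system would use a Kafka client.

topic_log = []            # stands in for the Kafka topic's append-only log
offsets = {"cloud": 0}    # per-consumer offsets, as Kafka tracks them

def produce(event):
    """On-prem application publishes a change event."""
    topic_log.append(event)

def consume(consumer):
    """Deliver all events this consumer has not yet seen, in order."""
    start = offsets[consumer]
    events = topic_log[start:]
    offsets[consumer] = len(topic_log)
    return events

produce({"table": "orders", "op": "insert", "id": 1})
produce({"table": "orders", "op": "update", "id": 1})
synced = consume("cloud")   # cloud replica applies both changes in order
```

Because every consumer replays the same ordered log, the cloud side can catch up at its own pace, which is what makes a gradual multi-year migration workable.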
James Bonamico, Senior Engineer (syslog-ng product specialist) | Mar 27 2019 | 10:00 am UTC | 32 mins
We can help eliminate your struggle to reliably collect and send your log data to Splunk.
Register for our free webinar to learn how you can easily manage and scale log ingestion across your whole enterprise environment.
In this webinar we will demonstrate:
• syslog-ng Premium Edition’s high-performance Splunk HTTP Event Collector (HEC) destination
• How to batch and send messages to multiple Splunk HEC nodes
• How to eliminate the need for forwarders and external load balancers
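For a rough feel of what batching to multiple HEC nodes involves, here is a hedged Python sketch. syslog-ng PE handles this internally; the endpoint path and payload shape follow Splunk's documented `/services/collector` API, while the node URLs are placeholders:

```python
# Hedged sketch of batching log events for the Splunk HTTP Event Collector
# (HEC) and spreading requests across HEC nodes round-robin. syslog-ng PE
# does this internally; payload/endpoint shape follows Splunk's documented
# /services/collector API, and the node URLs are placeholders.
import itertools
import json

def hec_batch(events, source="syslog"):
    """Concatenate events into one HEC request body (newline-delimited JSON)."""
    return "\n".join(json.dumps({"event": e, "source": source}) for e in events)

def round_robin(nodes):
    """Cycle through HEC nodes so load is spread evenly."""
    return itertools.cycle(nodes)

nodes = round_robin(["https://hec1:8088", "https://hec2:8088"])
body = hec_batch(["login ok", "disk full"])
target = next(nodes) + "/services/collector/event"
# A real sender would POST `body` to `target` with the header
# "Authorization: Splunk <token>".
```

Batching many events into one request and rotating across nodes is what removes the need for separate forwarders and external load balancers.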
Marco Bal, Principal Systems Engineer - Benelux & Nordics | Mar 27 2019 | 1:00 pm UTC | 45 mins
While still widely used, disk-to-disk-to-tape (D2D2T) architectures have long been optimized for backup performance, with little regard for the true goal: recovering data when and where it’s needed, within business Service Level Agreements (SLAs).
Tapes may relieve compliance concerns, but they lock data away, stranding any value residing in that data.
Learn how data protection is about much more than just backup, and how Pure Storage, with the industry’s first Flash-to-Flash-to-Cloud (F2F2C) Object Engine solution, taps the power of flash and cloud to modernize data protection. With Object Engine, organizations can deploy a cloud-native solution with scale-out rapid recovery on premises, benefit from cloud economics, and reuse data with cloud apps such as analytics and test/dev.
During this technical webcast you will find out how Object Engine can help your organization to rapidly restore mission critical data exactly where it’s needed – and within an SLA.
Jasmine Glasheen, Contributing Editor at RetailWire and Content Lead at Retail Minded | Mar 27 2019 | 4:00 pm UTC | 45 mins
What gives some retailers an edge? Maybe it’s their large amount of customer data, or their storefronts as a source for gathering data.
Retail is the industry that never sleeps, and in the past decade many legacy retailers have fallen by the wayside in favor of stores with fresh retail concepts: Claire’s has been replaced by BaubleBar, Sears has given way to Amazon, Bon-Ton has given way to HauteLook, and so on. The reality of the current market is that some of these young upstarts understand something that many legacy retailers realize too late: advanced personalization initiatives are necessary to stay in the game. It isn’t a coincidence that 45 percent of retailers will use artificial intelligence to improve their operations in 2019. However, simply investing in AI isn’t enough; you need to know how to use it.
This webinar will answer the following questions:
• What will customers expect from retailers in 2019?
• How are new retailers redefining the game?
• Which legacy retailers have successfully evolved to meet customer demand?
• What advantages do legacy retailers have, and how can they leverage those advantages to compete?
• What are retailers’ top challenges to scaling personalization initiatives?
• How will master data management (MDM) empower retailers in the years to come?
Matt Madill, Storage Systems Administrator, Duquesne University | Mar 27 2019 | 5:00 pm UTC | 45 mins
Join Matt Madill, Storage Systems Administrator at Duquesne University, as he shares his journey with Komprise and how his team:
- Identified and archived years of cold data to enable the move to an all-flash array
- Built data management policies that meet the unique needs of each department as well as the university as a whole
- Plans to offer cloud services and infrastructure
Erik Weaver, Global Director, M&E Market Development, Western Digital + Tridib Chakravarty, CEO, StorageDNA | Mar 27 2019 | 5:00 pm UTC | 60 mins
In Part 1 of this two-part series on DNAFabric, we will explore how and why legacy storage infrastructures must shift to address today’s digital content demands in media & entertainment. We’ll start the discussion by exploring what abstraction means and why it is an important consideration in asset identification, resource allocation, collaboration, and the creation of indelible metadata in a hybrid world. In Part 2 of our DNAFabric series, we’ll dig deeper into how abstraction, combined with Tableau, allows a new generation of data insights.
Presentation Leaders: Doug Austin and Tom O'Connor | Mar 27 2019 | 5:00 pm UTC | 75 mins
If you think you’re hearing more and more about blockchain and bitcoin, you’re probably right. Blockchain is even being discussed as having potential application in legal technology and electronic discovery. But, what exactly is it? How does it work? And, how do you need to be prepared to address it as a legal professional? This CLE-approved* webcast session will discuss, define and describe blockchain and how it can apply to legal technology and eDiscovery today and in the future. Topics include:
+ History of Blockchain and Bitcoin
+ Defining Key Terms
+ How Blockchain Works
+ Advantages and Challenges of Blockchain
+ Smart Contracts and Other Use Cases for Blockchain
+ Impacts of Blockchain on Legal Technology and eDiscovery
+ Is Blockchain Really as Secure as People Think?
+ Future of Blockchain
+ Resources for More Info
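The "How Blockchain Works" piece boils down to a hash chain, which a toy example makes concrete. This sketch is illustrative only (no consensus, mining, or networking, which real blockchains add on top):

```python
# Toy hash chain illustrating the core blockchain mechanism: each block
# stores the hash of its predecessor, so altering any earlier block
# invalidates every block after it. Consensus/mining are omitted.
import hashlib
import json

def block_hash(block):
    """Deterministic SHA-256 hash of a block's contents."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def add_block(chain, data):
    """Append a block linked to the previous block's hash."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"data": data, "prev_hash": prev})

def valid(chain):
    """Verify every block's stored link matches its predecessor's hash."""
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

chain = []
add_block(chain, "genesis")
add_block(chain, "Alice pays Bob 5")
assert valid(chain)
chain[0]["data"] = "tampered"   # rewrite history...
assert not valid(chain)         # ...and the chain no longer validates
```

This tamper-evidence is also why the "is blockchain really as secure as people think?" question matters: the chain detects alteration, but it cannot guarantee the data was true when written.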
Doug Austin is the Vice President of Products and Services for CloudNine. Doug has over 30 years of experience providing legal technology consulting, technical project management and software development services to numerous commercial and government clients. Doug is also the editor of the CloudNine sponsored eDiscovery Daily blog, which is a trusted resource for eDiscovery news and analysis, and has received a JD Supra Readers Choice Award as the Top eDiscovery Author for 2017 and 2018.
Tom O’Connor is a nationally known consultant, speaker, and writer in the field of computerized litigation support systems. Tom’s consulting experience is primarily in complex litigation matters.
Boris Evelson, VP - Principal Analyst at Forrester & Andrew Yeung, Sr Dir PMM | Mar 27 2019 | 5:00 pm UTC | 60 mins
The pace of innovation in analytics has hit breakneck speed. Technologies like AI, natural language processing, and voice-driven analytics make it easy for insights-driven businesses to harness insights from data to operate smarter and faster in a fast-moving competitive environment. Turning these data-driven insights into action fuels an organization’s intelligence to improve customer experience, streamline operations, and drive growth. And the payoff is immense: according to Forrester Research, insights-driven businesses grow on average 30% year over year, seven to ten times faster than the global economy.
Join ThoughtSpot and guest speaker, Forrester Research VP and Principal Analyst Boris Evelson to learn how to transform your organization into an insights-driven business by letting business users uncover actionable insights in a whole new way using search and AI.
Ryan Peterson, Global Technology Segment Lead at AWS & Scott Gidley, Vice President of Product at Zaloni | Mar 27 2019 | 6:00 pm UTC | 58 mins
Today's enterprises need a faster way to get to business insights. That means broader access to high-value analytics data to support a wide array of use cases. Moving data repositories to the cloud is a natural step. Companies need to create a modern, scalable infrastructure for that data. At the same time, controls must be in place to safeguard data privacy and comply with regulatory requirements.
In this webinar, Zaloni will share its experience and best practices for creating flexible, responsive, and cost-effective data lakes for advanced analytics that leverage Amazon Web Services (AWS). Zaloni’s reference solution architecture for a data lake on AWS is governed, scalable, and incorporates the self-service Zaloni Data Platform (ZDP).
Join our webinar to learn how to:
- Create a flexible and responsive data platform at minimal operational cost.
- Use a self-service data catalog to identify enterprise-wide actionable insights.
- Empower your users to immediately discover and provision the data they need.
Ritika Sehgal, Director, PreSales Analytics, and Ashwin Datla, Senior Solution Consultant, TIBCO Software | Mar 28 2019 | 10:00 am UTC | 29 mins
How does a modern city transform into a smart city? A city that aims to function as an integrated orchestration of people, processes, services, organizations, and technologies to enhance living conditions faces real challenges.
What’s needed is an agile AI and machine learning-enabled ecosystem that can provide intelligent, connected and predictive services in healthcare, utilities, and/or transportation.
Watch this short webinar on smart cities for real-life customer examples of how TIBCO’s Connected Intelligent solutions can help:
- Make use of data sources for informed decision-making
- Monitor and manage change through streaming analytics
- Realize new revenue streams and engagement models for modern urban environments
Alessandro Valassina, Italy & SEE MDM Specialist at Informatica Italia | Mar 28 2019 | 10:00 am UTC | 75 mins
Data is at the heart of every company, across all industries and sizes. For an organization competing in the market, a reliable 360-degree view of its data is fundamental to competing and growing: it enables a global view of the business, improves analytical capability, and helps optimize the customer experience.
Sign up for the live webinar organized by Informatica, a global leader in data management, to discover how companies including Coop Alleanza and BMW Group achieve their business goals today thanks to a data strategy built on Master Data Management (MDM).
With Alessandro Valassina, Italy & SEE MDM Specialist at Informatica Italia, you will learn:
- The importance of a single source of truth and a 360-degree view of your data;
- An overview of the features and benefits of Master Data Management;
- A step-by-step guide to introducing MDM in your company.
At the end of the online seminar, our experts will be available to answer your questions.
Informatica, the undisputed global leader in every area of data management, helps companies succeed in data-driven digital transformation with best-of-breed solutions for data governance, big data management, iPaaS, data quality, data integration, master data management, data security, and enterprise data cataloging.
Esther Spanjer, Enterprise Business Development Director; Cagatay Kilic, Business Development Manager | Mar 28 2019 | 10:00 am UTC | 45 mins
Join Cagatay Kilic and Esther Spanjer from the EMEAI Business Development team for this webinar if you are interested in understanding more about HDD technology features and how they are implemented in the Western Digital enterprise-grade HDD product line. Cagatay will discuss topics such as PMR vs. SMR vs. XMR recording, micro-actuator vs. dual-actuator designs, HAMR vs. MAMR, and why Western Digital fills its high-capacity drives with helium. Additionally, Esther will share Western Digital’s view on the HDD market and the co-existence of HDDs and SSDs in the datacenter. Those of you who want to know what’s going on “under the hood” in an enterprise-grade HDD will come away with knowledge of the following:
1. HDD technology features
2. Market trends for HDDs in the datacenter
3. Western Digital’s enterprise grade HDD offering