Welcome to the big data and data management community on BrightTALK. Join thousands of data quality engineers, data scientists, database administrators and other professionals to find more information about the hottest topics affecting your data. Subscribe now to learn about efficiently storing, optimizing a complex infrastructure, developing governing policies, ensuring data quality and analyzing data to make better informed decisions. Join the conversation by watching live and on-demand webinars and take the opportunity to interact with top experts and thought leaders in the field.
For enterprises looking to protect cloud app data, Cloud Access Security Brokers (CASBs) have quickly emerged as the go-to solution. But how have CASBs matured to encompass critical pieces of the security puzzle, from identity management to data leakage prevention? Join Bitglass and (ISC)2 on October 27, 2016 at 1:00PM Eastern for Episode 1 of the CASB Wars webinar trilogy, a discussion of the evolution of CASBs from app discovery to complete cloud security suites, and from basic API-based controls to more capable multi-protocol proxies.
There is no question: flash arrays have made a dramatic impact on high-performance enterprise applications. But what about all the rest of the data? Some use cases require that cold data be stored long term for compliance, and some companies simply have a “keep everything forever” culture. There are also bandwidth-heavy applications that need far more capacity than what's needed to accelerate a database redo log.
Come hear how flash array technologies are advancing to finally deliver on the promise of an all-flash data center. Tegile's Rob Commins will discuss these advancements and identify key points for IT leaders to consider as they develop their digital transformation strategy.
This interactive session will look at how you can leverage the concept of secure content management to support privacy and information security transparently within your information governance program. How can the need for security, business continuity and data protection be balanced with collaboration and productivity expectations to successfully deliver the desired business outcomes for your stakeholders and customers? There will be opportunities throughout the session to share your thoughts and experiences.
For sites that focus on content distribution, value is often measured by how quickly and easily digital assets are delivered to the audience. Caringo VP of Marketing Adrian Herrera and PM Ryan Meek share how object storage can be used to store and deliver massive amounts of unstructured data including video and images.
A new set of international reporting standards (IFRS 9) goes into effect for Canadian financial services institutions late next year. With those standards come new complexities and risks, new data and analytics requirements, and new business opportunities for banks that can get ahead of the curve by modernizing their accounting and risk management platforms and putting those platforms to innovative use.
This webinar will explore the challenges and opportunities related to IFRS 9 and provide insights into how financial services institutions can optimize for both.
Have you considered a move to converged infrastructure, but need to be convinced with solid return on investment (ROI) data? Respected industry analyst IDC interviewed global enterprises that use and rely on Hitachi Unified Compute Platform (UCP) to calculate its true business value. IDC found that each organization earned an average return of 360% or about $32.7 million over a five-year period.
On October 27, 2016, join IDC and Hitachi Data Systems for a live webcast and learn IDC’s methodology to determine the customer results. The comprehensive independent analysis includes a breakdown of business benefits in four key areas and shows how UCP:
•Increases business productivity with improved application performance, reduced provision time and faster time to market.
•Improves risk mitigation and user productivity with a high level of resiliency, performance and efficiency.
•Boosts IT staff productivity with decreased IT time to set up new infrastructure and launch new applications.
•Reduces IT infrastructure costs for the provision and deployment of compute, storage and networking resources.
Classify data stored within Microsoft SharePoint using the familiar and intuitive interface provided by all Classifier products. A bulk document classifying feature enables you to classify large volumes of documents as they are uploaded to SharePoint quickly and efficiently.
Find Out More: https://www.boldonjames.com/products/sharepoint-classifier/
Banking, Insurance, Asset Management and Payments have faced multiple challenges since the 2008 Great Financial Crisis – fee compression driven by technology and competition, a low-growth zero-interest-rate environment, ceaseless regulation, fintech disruption and the race to mobile platforms, among others.
Join us for this webinar, where we illustrate several of the innovative, data-driven solutions designed by your competitors to address these business challenges directly:
- Grow earnings via data-driven products and cost carve-out
- Outpace the competition
- Transform the application to attract and retain customers and talent
- Enhance employee-driven value
The maintenance bill is due. It’s hundreds of thousands or millions of dollars, and you know you will need to purchase more storage capacity:
-Do you pay your maintenance and commit to another 3-5 years?
-Do you upgrade your data center architecture?
-Do you shift to the public AWS cloud?
In this webinar and demo, we covered:
-Pros/Cons: on-premises SAN/NAS vs. hyperconverged infrastructure vs. AWS cloud storage
-Demo: “Lift and shift” on-premises storage to AWS without re-architecting applications
-How to fund a move to public cloud storage with existing budget
-Other public cloud use cases
Deduplication isn’t built into object storage, so what do you do when your data is highly deduplicable? Current appliances on the market support only CIFS and NFS shares, not the S3 API. This puts your Cloud First model in the back seat: applications continue to work in legacy mode, preventing you from taking full advantage of the S3 API.
Cloudian and StorReduce have teamed up to provide an S3-enabled deduplication gateway that can be seamlessly integrated with any existing S3-based application. Certified for Veritas NetBackup and Commvault, the StorReduce and Cloudian solution can:
Lower TCO for backup data
Enable Cloud First
Reduce network bandwidth
Tier on-premises S3 data to the AWS cloud
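At its core, deduplication is content-addressed chunk hashing: identical chunks are stored once and later writes just reference them. The following toy Python sketch (purely illustrative; it does not reflect StorReduce's proprietary internals) shows why repetitive backup data dedupes so well:

```python
import hashlib

class DedupeStore:
    """Toy content-addressed store: identical chunks are kept only once."""
    def __init__(self, chunk_size=4096):
        self.chunk_size = chunk_size
        self.chunks = {}          # sha256 digest -> chunk bytes
        self.logical_bytes = 0    # bytes written by clients

    def put(self, data: bytes) -> list:
        """Store data; return the list of chunk digests (the object's 'recipe')."""
        recipe = []
        for i in range(0, len(data), self.chunk_size):
            chunk = data[i:i + self.chunk_size]
            digest = hashlib.sha256(chunk).hexdigest()
            self.chunks.setdefault(digest, chunk)  # dedupe: keep first copy only
            recipe.append(digest)
        self.logical_bytes += len(data)
        return recipe

    def stored_bytes(self) -> int:
        return sum(len(c) for c in self.chunks.values())

store = DedupeStore()
backup = b"A" * 8192      # stand-in for a nightly full backup
store.put(backup)
store.put(backup)         # second night: identical data, no new chunks stored
print(store.logical_bytes, store.stored_bytes())  # 16384 logical, 4096 stored
```

Two full backups of identical data cost only one chunk of physical capacity – the effect a dedupe gateway delivers in front of an S3 object store.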
IT asset data overlaps with configuration management, monitoring systems, and even Fixed Asset systems. Understanding these relationships is essential to managing digital systems. This webinar will provide an in-depth discussion of the various sources for IT asset data, what they mean, and how they interact. What does it mean to have something in the IT Asset Management system but not in the CMDB? Or in the Fixed Asset system but not the Monitoring system?
Attendees will hear a detailed and specific analysis of the various cases that make managing IT data so interesting and challenging.
Could you benefit from a Big Data CoE? Whether you formally call it a CoE or not, your big data analytics initiatives should be led by a team that promotes collaboration with and between users and technologists throughout your organization. Join this webinar to hear Justin Coffey, a senior staff development lead at the performance marketing company Criteo, explain how they used a Center of Excellence model to drive business and customer value with big data analytics.
Join us for a guided walk-through of the newest features of Hortonworks DataFlow 2.0, highlighting:
- Productivity enhancements via Apache Ambari for streamlined deployment and configuration management, and Apache Ranger for centralized authorization and policy management
- Collaboration capabilities in Apache NiFi for enterprise data sharing and visibility across teams – specifically, multi-tenant flow editing similar to how Google Docs supports multiple simultaneous collaborators with differing view/edit rights
- Framework enhancements in Apache NiFi, including control-plane high availability via zero-master clustering
- Edge intelligence powered by Apache MiNiFi
Join us to learn how HDF 2.0 can reshape data flow management in your enterprise environment.
Apache Spark has become an indispensable tool for data engineering teams. Its performance and flexibility made ETL one of Spark’s most popular use cases. In this webinar, Prakash Chockalingam - seasoned data engineer and PM - will discuss how Databricks allows data engineering teams to overcome common obstacles while building production-quality data pipelines with Spark. Specifically, you will learn:
- Obstacles faced by data engineering teams while building ETL pipelines;
- How Databricks simplifies Spark development;
- A demonstration of key Databricks functionalities geared towards making data engineers more productive.
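A common obstacle the webinar alludes to is handling malformed records without failing the whole pipeline. Here is a minimal, framework-agnostic Python sketch of that pattern (in a real Spark/Databricks job the transform would be a DataFrame operation; the field names and sample feed are hypothetical):

```python
import csv
import io

# Hypothetical raw feed: messy records typical of an ETL input.
raw = """user_id,amount,currency
1, 10.50 ,usd
2,,usd
3,7.25,EUR
"""

def extract(text):
    """Parse the raw CSV text into a list of dict rows."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Validate and normalize rows; route bad records to a quarantine list."""
    good, bad = [], []
    for row in rows:
        try:
            good.append({
                "user_id": int(row["user_id"]),
                "amount": float(row["amount"]),        # raises on empty amount
                "currency": row["currency"].strip().upper(),
            })
        except ValueError:
            bad.append(row)                            # quarantine, don't crash
    return good, bad

good, bad = transform(extract(raw))
print(len(good), len(bad))  # 2 good rows, 1 quarantined
```

Quarantining instead of crashing keeps the pipeline production-grade: bad records are preserved for inspection while clean data keeps flowing.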
Industry analysts believe that “eighty-nine percent of companies plan to compete primarily on the basis of the customer experience.” To fulfill this vision in the manufacturing industry, the role of master data management (MDM) has been strategically evolving. Now, in addition to traditional uses, MDM is helping manufacturers tackle an array of business challenges like cross-sell and upsell, vendor management, or targeted marketing campaigns. Early adopters have seen significant business advantages by using MDM to build a 360-degree view of customers across products, regions and lines of business. In fact, with a next-generation view of customers and the products they own, one $60 billion manufacturing company improved cross-sell and upsell success by five percent.
In this webinar, Suresh Babu Balasubramanian from Wipro and Hamaad Chippa of Informatica will discuss how great customer data can help tackle some of the major strategic and operational business challenges manufacturers face. Our conversation will focus on:
• Industry trends impacting the customer experience and need for a 360-degree view of customers
• An overview of key Total Customer Centricity solution capabilities and value-add
• Business benefits of improved customer information
• A customer success story
How can companies manage growing data volumes and variety without increasing costs?
Is your data architecture able to handle the new challenges of Big Data?
Join us for this webinar to learn how to optimize your data architecture and reduce its costs with Hadoop.
In particular, we will cover the following topics:
- Reducing storage costs by moving data to Hadoop
- Optimizing ETL-style processing by running it on Hadoop
- Simplifying the collection and ingestion of different data sources via NiFi
Today's storage world appears to be divided into three major and mutually exclusive categories: block, file and object storage. Much of the marketing that shapes user demand suggests that these are three quite distinct animals, and many systems are sold exclusively as SAN for block, NAS for file, or object. And object is often conflated with cloud, a consumption model that can in reality be block, file or object.
But a fixed taxonomy that divides the storage world this way is very limiting, and can be confusing; for instance, when we talk about cloud. How should providers and users buy and consume their storage? Are there other classifications that might help in providing storage solutions to meet specific or more general application needs?
This webcast will explore clustered storage solutions that not only provide multiple end users access to shared storage over a network, but allow the storage itself to be distributed and managed over multiple discrete storage systems. In this webcast, we’ll discuss:
•General principles of clustered and distributed systems, and the facilities they provide on top of the underlying storage
•Better known file systems like NFS, GPFS and Lustre along with a few of the less well known
•How object-based systems like S3 have blurred the lines between object storage and traditional file-based solutions
This webcast should appeal to those interested in exploring some of the different ways of accessing and managing storage, and how that might affect how storage systems are provisioned and consumed. POSIX and other acronyms may be mentioned, but no rocket science beyond a general understanding of the principles of storage will be assumed. Contains no nuts and is suitable for vegans!
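The object-vs-file distinction can be sketched in a few lines. The toy code below (an in-memory mock, not the real S3 API) shows the essence of object storage: a flat key namespace with per-object metadata, accessed whole-object – unlike a file system, where directories are real structural objects and files support seeks and in-place edits:

```python
# Object storage mock: a flat namespace of keys with per-object metadata,
# accessed whole-object via GET/PUT -- no seek, no partial in-place writes.
bucket = {}

def put_object(key, data, metadata=None):
    """Store (or overwrite) an entire object under a key."""
    bucket[key] = {"data": data, "metadata": metadata or {}}

def get_object(key):
    """Retrieve an entire object by key."""
    return bucket[key]["data"]

# The "/" in a key is just a character; any apparent hierarchy is only a
# naming convention, whereas a file system makes directories first-class.
put_object("videos/2016/intro.mp4", b"...", {"content-type": "video/mp4"})
print(get_object("videos/2016/intro.mp4"))  # b'...'
```

Gateways that present file semantics over object stores essentially translate between these two models, which is exactly where the "blurred lines" in the bullet above come from.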
Successful data science and analytics projects should focus on the explanation and delivery of business impact – this requires strong collaboration between business analysts and data scientists. Preparing rich representative data is important, but data discovery and predictive analytics are equally valuable – they enable team members to quickly evaluate which events are drivers or inhibitors of success, and to predict future outcomes. Throughout the course of a project, how can business analysts stay focused on the most important details? How can data scientists prototype and operationalize models in a quick, productive, and easy-to-use manner?
In this webcast, we will use SAS Visual Analytics and SAS Visual Statistics to teach you how to:
• Quickly identify predictive drivers
• Discover outliers by using interactive tools
• Use drag-and-drop features to build predictive models
• Simultaneously build models and process results for each group or segment of data
• Visually explore your predictive outputs or values
• Compare your models and apply them to new data
Join us as we discuss new self-service data prep strategies that will address these challenges. We will take a deep dive into ThoughtSpot’s technology stack and show how within one analytics platform it is possible to simplify and automate the entire analytics pipeline—from data connections to data prep, and then using a simple search interface for analysis. We will also share how customers have used ThoughtSpot to streamline these operations and accelerate time to insight for their business users.
With the advent of Big Data platforms, Banking & Financial Services companies are building applications that create massive business value. However, the datasets being used often contain significant amounts of confidential, proprietary and highly sensitive data and so the potential benefits are held back by privacy concerns.
In this joint webinar, Hortonworks and Privitar will draw on their experience of delivering technology that enables data innovation whilst ensuring compliance and risk mitigation across a range of use cases within Financial Services. The session will include a review of the benefits of Hortonworks' data platforms and an introduction to Privitar's privacy-preserving technology solution.
Tired of the challenges of running a data center on traditional infrastructure?
Dell EMC ScaleIO software-defined block storage on standard server hardware delivers high performance, massive scale and first-class flexibility.
Choose from three available options:
1. Full flexibility: ScaleIO on your choice of server, switch and rack
2. Pre-validated: ScaleIO combined with powerful Dell PowerEdge servers
3. A ready-to-use, fully engineered solution from VCE
You can now meet your business goals while greatly simplifying the procurement and deployment of a server SAN.
Dashboards are the most important, fundamental tool for delivering Business Intelligence insights to your users. But data visualization expert Stephen Few has declared that most BI dashboards fail. Attend this webinar and learn how to construct best-practice dashboards, achieve high ROI and create BI success, not failure.
Companies have struggled to find their feet when it comes to combining technology, people and workflow in their mobile application development strategies. All too often, fragmented technologies have impeded progress. As technologies mature, however, and mobile (as distinct from pure web development) becomes better understood, there is light at the end of the tunnel. In particular, integration across the lifecycle is delivering significant productivity gains for developers and business stakeholders, making the move from handfuls of apps to a scaled mobile app strategy more of a reality.
We are entering the third generation of data, which we call “Data 3.0.” Informatica has grown ever stronger with each generation.
This third generation of data represents a shift in today's market and will create a wealth of opportunities both for our customers and for technology companies.
Watch our webinar and find out whether your company is ready for this new generation of data.
Creating the ultimate master record with a 360-degree view just got re-mastered. Join us as we talk about Informatica MDM – a true multi-domain MDM solution, available both on-premises and in the cloud.
Take a look “under the hood” as we share the freshest updates for our data quality, data integration, and business process management workflow integrations with Informatica MDM. Learn how to improve your customer and business profiles with our Contact Validation Data-as-a-Service powered by Dun & Bradstreet, which helps you fill in the missing information you need, for accurate records you can swear by.
Join us for a live webinar to tour what’s new in Informatica MDM:
•See the latest features and functionalities available in our flagship MDM solution
•Explore the flexible and powerful user interface with rich new page layouts that make it easier to view, add and update business-critical data and relationships
•Meet Entity 360, an intuitive UI platform for building business user focused rich interfaces such as Customer 360, Product 360, and Supplier 360
Who is this webinar for?
•Current Informatica MDM customers who want to know what's new from the industry leader
•MDM consultants, practitioners, and developers who want to stay on top of MDM advancements
•You! Just browsing or researching MDM options? Perfect! Or maybe you’re MDM-curious? Awesome! This webinar will help you understand what Informatica MDM does, how it’s used, what’s possible, and whether or not it fits your business needs
Ranked as a clear leader in top analyst reports, including Gartner MQ, Forrester Wave, and The Information Difference MDM Landscape, Informatica MDM has been awarded top scores for customer satisfaction, innovative technology, and market strength.
Can’t attend the live webinar? Sign up today and we’ll send you the on-demand recording afterward.
HCL Technologies and Dell EMC are announcing their Business Analytics Platform (BAP). The BAP for Hadoop is a scalable, ready-to-run enterprise platform that is pre-configured and optimized specifically for big data. As a purpose-built, integrated hardware and software solution for big data analytics, the appliance runs Hadoop software from Hortonworks or Cloudera, backed by a rock-solid Dell EMC infrastructure. Enterprises tasked with the implementation and ongoing operation of Hadoop often face a lack of technically skilled resources, the complexity of integrating Hadoop with other parts of the ecosystem, and demands for enterprise features like administration, monitoring, supportability and serviceability. Tune in to discover how Dell EMC and HCL can help minimize the complexity required for a successful Hadoop deployment, whether this is your first attempt at data analytics or you simply need a boost for an existing program.
Discover breakthrough performance and scale for your analytics use cases with Informatica Cloud’s new release
Join the Informatica Cloud product team to find out what's new in the Fall 2016 release and explore all the key capabilities. The Fall 2016 release includes various new capabilities and enhancements expanding the potential of Informatica Cloud data and application integration, core platform, data management and easy connectivity to provide outstanding performance, scalability and efficiency.
•New Data Quality capability to assess and fix the data quality issues in your Salesforce orgs
•Maximize scaling, speed of execution and flexibility for your data integration tasks using advanced transformations like partitioning and router
•Increased performance for cloud data warehouses like Amazon Redshift with enhanced lookup and caching support
•Advanced support for complex hierarchical data sources via XML, JSON, REST
•Out-of-the-box analytical solution templates with dynamic creation of relational targets
•New and enhanced connectors include Amazon QuickSight & Aurora, Microsoft SQL Server 2016, Azure Table, Microsoft Dynamics AX 2012, Microsoft Dynamics GP 2016, Google Cloud Storage and BigQuery, Oracle Eloqua, Qlik, Teradata and more
Please join the Informatica Cloud team to learn more about Informatica Cloud’s exciting new features and capabilities in the Fall 2016 release.
Improving the efficiency of existing processes is critical for enterprises, and one of the first proof points of many big data projects. In the long term, enterprises may look to big data to generate revenue from new projects and applications, but proving the potential benefits by improving the efficiency of existing business processes – such as optimizing supply chains or accelerating compliance – is a good place to start. Additionally, operating more efficiently at lower cost and with lower risk enables an organization to redirect budget toward driving growth.
Join Oracle and 451 Research for a webinar exploring how to make this operational efficiency possible through a combination of data management, statistical analysis and visualization.
Eva Tsai will share her experience as a woman in tech charting a journey spanning multiple disciplines and excelling as a strategic business leader, her thoughts on the challenges women are facing, both in entering and advancing in their careers, and what her recommendations are on both professional and personal fronts.
Eva has extensive experience leading go-to-market transformation and managing company telemetry to drive growth worldwide. Her innovation and leadership have been recognized with significant industry awards and patents. She was selected by Silicon Valley Business Journal in 2015 as one of Silicon Valley's 100 Most Influential Women and won the 2014 Marketer That Matters™ award, sponsored by The Wall Street Journal. At vArmour, Eva has transformed the company into a growth machine via innovative and well-executed go-to-market strategy, insightful telemetry, and process optimization. vArmour has been named a “Company to Watch” by TechCrunch and a "Cool Vendor" by Gartner. Prior to vArmour, Eva held strategic roles at Citrix, BroadVision and Oracle. Eva holds a BS and an MS in Computer Science from the Massachusetts Institute of Technology.
Spireon, an IoT company, needed a BI platform that could turn more than 3 billion records of data, with a projected growth rate of 3x per year, into operational insights both for their internal teams and for their customers.
Watch this webinar and learn:
- How Spireon used Birst to transform semi-structured data into analytic format
- Best practices in approaching analytics for data volumes with exponential growth
- How analytics is used internally by product and customer success teams
- How Spireon has disrupted the IoT market by creating a data product
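The first bullet above – turning semi-structured data into analytic format – usually means flattening nested records into rows a columnar store can consume. A minimal Python sketch (the telemetry event and field names are hypothetical, not Spireon's actual schema):

```python
import json

# Hypothetical IoT telemetry event, typical of semi-structured source data.
event = json.loads("""
{"device": "gps-42",
 "location": {"lat": 33.68, "lon": -117.83},
 "readings": [{"metric": "speed", "value": 61},
              {"metric": "fuel",  "value": 0.4}]}
""")

def flatten(ev):
    """Turn one nested event into flat rows ready for an analytic store."""
    base = {"device": ev["device"],
            "lat": ev["location"]["lat"],
            "lon": ev["location"]["lon"]}
    # One output row per reading, each carrying the shared device/location fields.
    return [dict(base, metric=r["metric"], value=r["value"])
            for r in ev["readings"]]

rows = flatten(event)
print(rows)  # two flat rows, one per reading
```

At billions of records this transform would run inside the BI platform or a distributed engine, but the shape of the operation – denormalize nested arrays into one row per measurement – stays the same.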
As a cloud platform, OpenStack has storage requirements that are challenging for traditional storage solutions and appliances. As cloud architectures based on OpenStack transition further into mainstream use cases, storage architectures will need to become more flexible and cost-efficient to meet customer needs.
Please join us on November 2 at 12 PM EST for a discussion on the storage challenges facing enterprise and service provider customers, and the potential to leverage Software-Defined Storage to make infrastructures more agile and flexible.
Your organization has its own unique IT infrastructure, business model, risk profile and tolerance. The best strategy for streamlining your annual Payment Card Industry (PCI) validation process is to make sure that your Qualified Security Assessor Company (QSA-C) employs a sound and forward-looking methodology for your assessments. A key first step is understanding the critical differences between risk acceptance and risk mitigation and the implications for your business.
Tune in to learn best practices in PCI services methodology and how they apply to your specific requirements. Michael Aminzade, VP of Global Compliance and Risk Services at Trustwave, will discuss:
-The impact on PCI assessments, including PCI Data Security Standard (DSS), Payment Applications DSS and P2PE (Point to Point Encryption).
-How sound methodology helps you build a better foundation for security and compliance - whatever your starting point.
-Top considerations for evaluating methodology.
Adoption of a modern data platform is a journey. Every step requires different levels of technology, people and process capabilities. A reliable services partner with deep expertise is key to your success at each step of the way. Hortonworks' service model is designed to provide the expertise needed at each step of your adoption journey. We have defined our offerings to address the unique needs at each level.
Hortonworks IAM Services (Implementation, Advisory, and Managed Services) are delivered by our global professional services consultants to help you succeed with the adoption of connected data platforms. Hortonworks IAM Services are based on proven methodologies developed by our experts in collaboration with product management and committers from our R&D teams.
Part four in a five-part series, this webcast will demonstrate the installation of Apache MADlib (incubating), an open-source library for scalable in-database analytics, into Hortonworks HDB. MADlib provides data-parallel implementations of mathematical, statistical and machine learning methods for structured and unstructured data. The webinar will walk through the installation procedures, as well as some basic machine learning algorithms to verify the install.
Hear directly from Hewlett Packard Enterprise senior executives about the recently announced Spin-Merge and how this news will positively impact HPE’s Information Management & Governance products and services.
On September 7, Hewlett Packard Enterprise announced plans for a spin-off and merger of our Software business unit with Micro Focus, a global software company dedicated to delivering and supporting enterprise software solutions. The combination of HPE’s software assets with Micro Focus will create one of the world’s largest pure-play enterprise software companies.
Join senior executives from HPE Software, and the Information Management and Governance business unit, to learn how this spin-merge will positively impact HPE’s technology portfolio and the ability to address your (or your customers’) needs. Hear firsthand about investments that are being made in HPE’s Adaptive Backup & Recovery, Secure Content Management, Verity, and Digital Safe suites. And have your questions answered by top executives. We look forward to you joining us.
Implementing data science and machine learning at scale is challenging for developers, data engineers, and data analysts. Methods used on a single laptop need to be redesigned for a distributed pipeline with multiple users and multi-node clusters. So how do you make it work?
In this webinar, we’ll dive into a real-world use case and discuss:
- Requirements and tools such as R, Python, Spark, H2O, and others
- Infrastructure complexity, gaps in skill sets, and other challenges
- Tips for getting data engineers, SQL developers, and data scientists to collaborate
- How to provide a user-friendly, scalable, and elastic platform for distributed data science
Join this webinar and learn how to get started with a large-scale distributed platform for data science and machine learning.
With Dun & Bradstreet data natively integrated into the latest version of Informatica Master Data Management 10.2, customers will soon be able to gain instant access to that data. Powered by Informatica Data as a Service, MDM customers will be able to use D&B's rich business data to power their business.
Whether you’re looking to extend your protection storage to inexpensive cloud storage for long-term retention, or to protect applications that have moved to the cloud or were “born in the cloud” – like Microsoft Office 365, Google Apps or Salesforce.com – Dell EMC will help you plan your safe journey to the cloud.
Join our live webinar with Mark Galpin, Dell EMC Data Protection Expert, to learn how Dell EMC’s Data Protection Solutions help organizations to:
•protect data regardless of where it resides: on premises or in the cloud
•keep the ownership and control of data in the cloud
•provide flexibility to move data to/ from the cloud
Hadoop and the Internet of Things have enabled data-driven companies to leverage new data sources and apply new analytical techniques in creative ways that provide competitive advantage. Beyond clickstream data, companies are finding transformational insights in machine data and telemetry that are radically improving operational efficiencies and yielding new, actionable customer insights.
During this webinar we will:
- Discuss real world case studies from the field across a variety of verticals
- Describe the strategies, architectures, and results achieved by Fortune 500 organisations
- Outline the best practices on how to improve your operational efficiency
New technology investments in healthcare are fundamentally transforming patient care, enabling precision medicine and contributing to better outcomes. At Dell EMC we believe that the healthcare business is about embracing IT transformation in care delivery to increase the quality of patient care, so the collection and analysis of data across the care ecosystem will become a key asset for any healthcare organization. Existing and new clinical devices and operational systems will lead to a data explosion.
Watch this 30-minute webinar to hear:
• What drives the digital transformation of healthcare
• Which clinical and non-clinical applications are responsible for the data explosion
• An introduction to the Dell EMC Core and Extended Healthcare Data Lake for current, new and future healthcare workflows
Apache MiNiFi is designed to make it practical to collect data from the second it is born – ideal for IoT scenarios with a large number of connected devices, or wherever a smaller, more streamlined footprint than Apache NiFi is needed. Join us as we walk through how Apache MiNiFi works, and how it can enable edge data collection from the likes of connected cars, log services, Raspberry Pis and more.
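The core job of an edge agent like MiNiFi is to collect events locally within a bounded footprint and forward them in batches. The toy Python sketch below illustrates that pattern only (it is not MiNiFi's implementation; the batch sizes and telemetry lines are invented for the example):

```python
from collections import deque

class EdgeAgent:
    """Toy buffer-and-forward agent, illustrative of the edge-collection
    pattern. A bounded buffer keeps the memory footprint small; if it
    overflows, the oldest events are dropped -- a common edge trade-off."""
    def __init__(self, forward, batch_size=3, max_buffer=100):
        self.buffer = deque(maxlen=max_buffer)  # bounded: small footprint
        self.forward = forward                  # e.g. send to a central NiFi flow
        self.batch_size = batch_size

    def collect(self, event):
        self.buffer.append(event)
        if len(self.buffer) >= self.batch_size:
            self.flush()

    def flush(self):
        """Ship everything buffered so far as one batch."""
        batch = list(self.buffer)
        self.buffer.clear()
        self.forward(batch)

received = []
agent = EdgeAgent(forward=received.extend)
for line in ["rpm=2100", "rpm=2300", "rpm=2250"]:  # e.g. connected-car telemetry
    agent.collect(line)
print(received)  # all three events forwarded as one batch
```

Batching at the edge is what keeps bandwidth and per-message overhead low on constrained devices, which is precisely the niche MiNiFi targets.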
Identity is the new perimeter for security in the digital enterprise. According to Forrester, 80% of security breaches involve the use or abuse of privileged credentials. At the same time, compliance mandates (such as PCI) require organizations to focus on how they manage and control privileged users in order to protect these critical resources. Privileged Access Management (PAM) provides a host of capabilities that enable organizations to address these critical challenges. Join CA Technologies and (ISC)2 on November 3, 2016 at 1:00PM Eastern as we continue examining the steps for strengthening your enterprise and increasing customer engagement, highlighting emerging requirements in privileged access management and presenting key capabilities that are important in a comprehensive PAM solution.
Join us to find out the perfect blend of creative freedom and data governance that comes from leveraging the power of SAS Visual Analytics and the familiarity of Microsoft Office. We'll load data from Office applications to SAS Visual Analytics, analyze the data to get insights in seconds, and empower you with smarter presentations to tell the best story about your business.
Gartner states that by 2020, 30% of organizations will leverage backup for more than just operational recovery (disaster recovery, test/development, DevOps, etc.). Business continuity plans are a must in today’s environment, where even a few minutes of downtime result in lost revenue. Data backup procedures are a critical part of these business continuity plans. But as the menu of backup options increases with the introduction of new technology, it can be difficult to choose which backup method is best for your business.
In this webinar we’ll cover:
-Replacing tape backups: Disk or Cloud?
-Backup challenges: Restoration, security, compliance and time
-How to backup and archive your data on AWS
-Backup and archive use cases
**The first 100 attendees to register during the live webinar will receive a FREE $100 Amazon Web Services credit!**
This live event will be held on Tuesday, November 8, at 1:00 pm ET/10:00 am PT. Register today!
Are you struggling with data piling up in primary storage for High Performance Computing (HPC) workloads? Is the storage that came with your microscope, spectrometer or other research device only enough for a few weeks of data? We will review options for using Cloud and Object Storage to offload up to 90% of the data from primary storage, cutting costs by 75% with built-in protection, eliminating the need for backups.
In this webinar you will learn:
1. The difference between cloud and object storage vs file-based storage
2. How to transparently plug a second tier of limitless, continuously protected storage into HPC workloads
3. How to easily provide storage services to internal or external parties with easy to manage access and full metering and chargeback functionality
4. How to consolidate HPC data sets and enable search
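Offloading cold data from primary storage typically comes down to a tiering policy such as "move anything not accessed in N days to the object tier." A minimal Python sketch of such a policy (the paths, ages and 30-day threshold are invented for illustration; real HPC tiering products apply far richer rules):

```python
import time

def select_for_tiering(files, min_age_days=30, now=None):
    """Return paths whose last access is older than min_age_days --
    candidates to move from primary HPC storage to an object/cloud tier."""
    now = now or time.time()
    cutoff = now - min_age_days * 86400
    return [path for path, atime in files.items() if atime < cutoff]

now = 1_700_000_000                 # fixed clock so the example is reproducible
files = {                           # path -> last-access timestamp (seconds)
    "/scratch/run-001/raw.dat": now - 90 * 86400,  # 90 days cold -> tier out
    "/scratch/run-042/raw.dat": now - 2 * 86400,   # 2 days old   -> keep hot
}
print(select_for_tiering(files, min_age_days=30, now=now))
# only the 90-day-old file is selected
```

Because instrument data is written once and rarely re-read, a policy like this can routinely move the bulk of a primary file system to the cheaper tier, which is where the "up to 90%" offload figure above comes from.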
Businesses are extracting value from more data, from more sources and at increasingly real-time rates. Spark and HANA are just the beginning. This webcast details existing and emerging in-memory computing solutions that address this market trend, and the disruptions that happen when combining big data (petabytes) with in-memory/real-time requirements. It provides an overview and the trade-offs of key solutions (Hadoop/Spark, Tachyon, HANA, in-memory NoSQL, etc.) and related infrastructure (DRAM, NAND, 3D XPoint, NV-DIMMs, high-speed networking), and discusses the disruption to infrastructure design and operations when "tiered memory" replaces "tiered storage".