Practicing business intelligence allows your company to transform raw data into actionable insights for targeted business growth. The business intelligence and analytics community on BrightTALK is made up of thousands of data scientists, database administrators, business analysts and other data professionals. Find relevant webinars and videos on business analytics, business intelligence, data analysis and more presented by recognized thought leaders. Join the conversation by participating in live webinars and round table discussions.
After building an initial prototype of the application for a limited preview, it is time for the team to consolidate the architecture, making it more robust and fault-tolerant before the official public launch.
This chapter covers AWS infrastructure concepts such as regions and Availability Zones, and explains how to use these features to increase an application's fault tolerance.
Services and features covered
•Key infrastructure concepts (regions and Availability Zones)
•Elastic Load Balancing
•Creating an AMI from a running instance
•Creating and configuring an Elastic Load Balancer
•Multi-AZ deployments with Amazon RDS
•Alarms with Amazon CloudWatch
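As a rough illustration of the last two items, here is a hedged boto3 sketch (load balancer name, threshold and periods are all invented for the example, and the classic ELB metric namespace is assumed):

```python
def build_latency_alarm(lb_name, threshold_seconds=1.0):
    """Parameters for a CloudWatch alarm on a classic ELB's Latency metric."""
    return {
        "AlarmName": f"{lb_name}-high-latency",
        "Namespace": "AWS/ELB",
        "MetricName": "Latency",
        "Dimensions": [{"Name": "LoadBalancerName", "Value": lb_name}],
        "Statistic": "Average",
        "Period": 60,            # one-minute samples
        "EvaluationPeriods": 5,  # five consecutive breaches trigger the alarm
        "Threshold": threshold_seconds,
        "ComparisonOperator": "GreaterThanThreshold",
    }

def create_alarm(lb_name):
    import boto3  # assumes boto3 is installed and AWS credentials are configured
    boto3.client("cloudwatch").put_metric_alarm(**build_latency_alarm(lb_name))
```

Separating the parameter-building from the API call keeps the alarm definition easy to review and reuse across load balancers.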
Having successfully expanded the data center's capacity into Amazon Web Services for development and test environments, the IT team faces a new capacity challenge: how to store the ever-growing volume of data generated by enterprise applications while keeping costs down. They also face the challenge of maintaining proper backups of that data.
This chapter addresses both issues with services such as Amazon S3 and Amazon Glacier.
•AWS Storage Gateway
•Moving data from Amazon S3 to Amazon Glacier
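The S3-to-Glacier piece can be sketched as a lifecycle rule that archives objects after a retention window (bucket, prefix and retention period below are placeholders, not values from the chapter):

```python
def glacier_lifecycle(prefix="backups/", days=30):
    """Lifecycle configuration moving objects under `prefix` to Glacier."""
    return {
        "Rules": [{
            "ID": "archive-to-glacier",
            "Filter": {"Prefix": prefix},
            "Status": "Enabled",
            "Transitions": [{"Days": days, "StorageClass": "GLACIER"}],
        }]
    }

def apply_lifecycle(bucket):
    import boto3  # assumes boto3 is installed and AWS credentials are configured
    boto3.client("s3").put_bucket_lifecycle_configuration(
        Bucket=bucket, LifecycleConfiguration=glacier_lifecycle()
    )
```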
Join backup and recovery experts to find out how to build your backup and recovery requirements checklist. By the end of this session, you’ll learn how you can:
-Cut storage requirements by up to 80%
-Save on storage costs and performance hits to your network.
-Leverage near-instant recovery technology for protected virtual machines or servers.
-Automate application-aware backups and testing for data corruption.
This is the first episode in a series of webinars illustrating the different ways agile development teams use AWS. Every episode follows a startup opening up a new line of business, illustrating the advantages that AWS offers. The startup may be a brand-new venture or an innovation center inside an existing company, created for example to support the launch of a new product.
This episode describes the main advantages of AWS for startups and agile IT teams, focusing on how the team rapidly built a working prototype using the various services the platform offers.
Savvy marketers spend a lot of their time analyzing big data, on the lookout for exciting new insights which can translate into action items and strategic advantage. Unfortunately, “giraffes” in their data – portions of data which dominate the rest of it – often hide important insights and lead to erroneous strategic decision making. In this webinar, we will discuss how to spot giraffes in your data and how to make sure they’re not misleading you.
A modern Hadoop-based data platform is a combination of multiple source projects brought together, tested, and integrated to create an enterprise-grade platform. In this session, we will review the Hadoop projects required in a Windows Hadoop platform and drill down into Hadoop integration and implementation with Windows, Microsoft Azure, and SQL Server.
Join AWS for this Building Scalable Web Applications webinar where we will explain the key architectural patterns used to build applications in the AWS cloud, and how to leverage cloud fundamentals to build highly available, cost effective web-scale applications.
You will also learn how to design for elasticity and availability within AWS using a common web architecture as a reference point and discuss strategies for scaling, security, application management and global reach. If you want to know how to make your applications truly scale then join this webinar to learn more.
Reasons to attend:
• Understand the architectural properties of powerful, scalable and highly available applications in the Amazon cloud
• Learn about Amazon regions and services that operate within them that enable you to leverage cloud scaling
• Discover how to manage data with services like Amazon S3, Amazon DynamoDB and Amazon Elastic MapReduce to remove constraints from your applications as you achieve web-scale data volumes
• Hear about customer case studies and real-world examples of scaling from a handful of resources to many thousands in response to customer demand
Who should attend?
• Developers, operations, engineers and IT architects who want to learn how to get the best from their applications in AWS
Impala raises the bar for SQL query performance on Apache Hadoop. With Impala, you can query Hadoop data – including SELECT, JOIN, and aggregate functions – in real time to do BI-style analysis. As a result, Impala makes a Hadoop-based enterprise data hub function like an enterprise data warehouse for native Big Data.
In this webinar featuring Impala architect Marcel Kornacker, you will explore:
* How Impala's architecture supports query speed over Hadoop data that convincingly exceeds not only Hive's, but also that of a proprietary analytic DBMS over its own native columnar format.
* The current state of, and roadmap for, Impala's analytic SQL functionality.
* An example configuration and benchmark suite that demonstrate how Impala offers a high level of performance, functionality, and ability to handle a multi-user workload, while retaining Hadoop’s traditional strengths of flexibility and ease of scaling.
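As a flavor of the BI-style analysis described above, here is a hedged sketch of a join-plus-aggregate query submitted through the impyla Python client (host, port, table and column names are all assumptions for illustration):

```python
QUERY = """
SELECT t.region, COUNT(*) AS orders, AVG(s.total) AS avg_order
FROM sales s
JOIN stores t ON s.store_id = t.store_id
GROUP BY t.region
"""

def run_query(host="impala.example.com", port=21050):
    from impala.dbapi import connect  # assumes the impyla package is installed
    cursor = connect(host=host, port=port).cursor()
    cursor.execute(QUERY)
    return cursor.fetchall()
```

The same SQL would run unchanged in the impala-shell; the DB-API client just makes the results available to Python tooling.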
Dr. Ralph Kimball describes how Apache Hadoop complements and integrates effectively with the existing enterprise data warehouse. The Hadoop environment's revolutionary architectural advantages open the door to more data and more kinds of data than are possible to analyze with conventional RDBMSs, and additionally offer a whole series of new forms of integrated analysis.
Dr. Kimball explains how Hadoop can be both:
A destination data warehouse, and also
An efficient staging and ETL source for an existing data warehouse
You will also learn how enterprise conformed dimensions can be used as the basis for integrating Hadoop and conventional data warehouses.
The C-Suite in every organization is obsessed with the buzz around Big Data, and according to industry pundits almost 9 out of 10 organizations today have included this growing trend in their IT plans for 2014. But we all know that when it comes to execution and extracting true business value out of Big Data, only a fraction of companies are successful. Believe it or not, infrastructure platforms play a key role in demonstrating the power of performance to deliver blazing-speed analytics, and it makes all the difference whether a query is answered in 3 seconds or 3 hours!
Welcome to the new style of IT and a paradigm shift towards converged infrastructure, or, as IDC calls it, the “3rd platform”, where you are no longer bound by the limitations of your traditional datacenter. Instead of plumbing or retrofitting your existing landscape, you now have proven alternatives to augment your legacy environment with leading innovative platforms that are purpose-built, seamlessly integrated and can be deployed in days rather than months. Learn from the best practices of some of our customers who have already embarked on this journey and paved the way for handling Big Data!
Google BigQuery is an analytics service offered in the cloud as part of the Google Cloud Platform portfolio. In this webinar we will explore how customers are using BigQuery to jumpstart their Big Data analytics initiatives. We'll also provide details on how you can get started with BigQuery.
Today’s consumer has unprecedented access to product and pricing information. Virtual merchandising technology allows retailers to extend the benefits of online shopping into the physical store. Interactive in-store apps blend the best of both worlds, physical and virtual, for an amazing in-store experience.
Beyond converting sales, omnichannel is about creating a consistent experience across all channels. By extending shelf space to a limitless number of products and offering access to online product information and content, virtual merchandising is a new and exciting way for customers to shop and experience your brand.
During this session, we’ll discuss how bringing virtual elements into the physical store gives retailers added control, flexibility and customizability to the customer experience.
Alen Puaca, Creative Director, iQmetrix
Perry Kramer, Vice President, Boston Retail Partners
Embracing big data allows you to improve business performance. During this webinar, we will discuss ways to use newer concepts, capabilities and technologies available to empower business users with business intelligence (BI) and analytics on big data -- all based upon field tested deployments and customer-proven success. Among the questions we will answer:
- How do you change the relationship between users and big data, giving them the means to embrace it and work with it interactively, in near-instantaneous fashion?
- How can you enable users to apply more sophisticated BI (i.e. advanced analytics), even on vast amounts of data?
- How do you reconcile and enable various levels of analytics maturity between users, from consumers of simple executive-ready dashboards all the way to sophisticated programmers and data scientists?
Join us to learn how to address volume, variety, velocity of big data, without compromising speed to results, or sophistication of analytics.
Retail Winners recognize that customer service is the single most important driver of success in today’s omnichannel marketplace, and they are racing to deploy all their human, store and technology assets in order to compete.
Recognizing the need to provide the best omnichannel experience in the most cost-efficient way, retailers are investing in a central commerce hub as the answer to this challenge. Maintaining multiple, siloed systems is expensive and puts retailers at risk of delivering a disjointed experience.
At the end of the day, omnichannel customer service is what customers expect, and it is the cost of doing business today. Where once it was a differentiator, excellent omnichannel customer service must now be provided and promoted, or retailers will not get the wallet share they’re targeting. In this webinar, we’ll explore effective strategies and technologies to improve customer service and re-engage disaffected consumers.
Cloudera's Director of Data Science Josh Wills and Senior Manager of Solutions Marketing Sandy Lii explain how advanced analytics with an enterprise data hub will allow you to use all of your data, do more with your data, and deliver insights sooner. This breaks down the barriers caused by increasingly high data storage and processing costs, siloed data sources, complex management and security, and a lack of analytics agility.
Exploding data volumes, variety, and velocity have led to a constantly growing variety of data integration technologies and approaches. There is a better and faster way with Informatica's Vibe™ Virtual Data Machine: one language, one SDK, and one engine that lets you design data integration jobs or quality rules once and deploy them anywhere.
Built on technology that Informatica has developed for over 20 years, the Vibe Virtual Data Machine future-proofs your data integration while enabling tight business/IT collaboration. Whether you are moving to cloud computing or Hadoop, focused on big data analytics, or building custom applications or Internet of Things solutions, Vibe will allow you to Map Once, Deploy Anywhere to maximize your potential to compete in today’s interconnected information age.
In this 45 minute webinar, you will have a chance to dive deeper into Vibe and hear:
•How Vibe is powering the Informatica platform
•Use cases of how Vibe is being employed today to solve customers’ challenges
•Roadmap of what’s to come
Social data is exploding and companies of all sizes are struggling to understand how to make sense of the data. Even the largest and most sophisticated brands are just scratching the surface of social business potential.
DataSift is powering the next generation of social business for leading brands, social technology companies and agencies around the globe. We are transforming how organizations convert social data into key insights that drive better decisions for brand leaders and market disrupting companies.
Join Chris Parsons, Product Marketing Manager at DataSift, for a live webinar on Thursday, April 10, where he will share how to unlock the potential of social data, including:
*Making social make sense in the real world – what the core tenets of a successful social strategy are and where you should start
*How to move from data to insight: A closer look at common use cases and brand best practices
*How to get started and get to fast impact: Actionable steps and best practices to unlock social insight and value for your business
Getting answers from data in your systems is hard enough. Getting it from social networks and other unstructured data sources is even harder.
This session is designed to answer your questions around how as a marketer you can get the right actionable information to understand your customers. The best part is that you can do it yourself without having to wait for IT to run reports for you.
Regardless of whether you call it "business intelligence", "big data", "analytics" or just plain old "math", we have many tried and true techniques for dealing with uncertainty. But ambiguity is a separate matter and, at least in my experience, is the hardest part of creating value from data.
During this talk, I will illustrate how the design process can be used to solve ambiguous problems by drawing on projects we've done at Datascope.
It’s time for retailers to throw out the idea that workforce management exists only to control labor costs. If implemented effectively, WFM solutions can help businesses meet the demands of the modern consumer and provide a new lever for driving revenue.
This webcast will explore how retailers can maximize revenues with smart schedules that improve customer service by deploying sufficient employees with the right knowledge and selling skills at the right time. Using business driver data like traffic, conversions and predicted sales, WFM and labor forecasting can highlight the peak hours for your business and help you understand where you may be missing opportunities or have a chance to reinvest your labor budget. Learn how optimized schedules and accurate labor forecasting can deliver better business results.
You know “in theory” that Big Data can give you new insights into a wide range of business operations and help you make changes that will drive better business performance. But let’s get specific: do you know exactly how Big Data can complement your business intelligence to drive positive business outcomes and improve your brand?
Here’s an opportunity to gain some insight. In this webinar, we’ll examine two real-world customer cases to illustrate how applying HP Big Data Analytics, including Predictive Analysis, can help companies identify opportunities to boost revenue, service quality, brand image, and even job creation.
Join us for this illuminating session. Learn how to mine your data and business intelligence for the golden nuggets that help you protect your brand, serve your users and customers, and grow your business.
Forty-four states, DC and four territories have adopted the Common Core State Standards (CCSS). This means that school districts across the country are planning for 100% online assessments during the 2014-2015 school year. One of the most important conditions needed for being able to administer online assessments is network infrastructure readiness.
Attend this 30-minute webinar and join Gavin Lee, Senior K-12 Business Development Manager at Juniper Networks, to discuss the critical network must-haves that all school districts should consider when looking to deploy a robust and supportable network. You will also receive practical guidance on how to get the most out of your network infrastructure and how best to prepare for the CCSS assessments:
• Consortia network infrastructure
• Wired and wireless network capabilities
• Robust network security
• Network support readiness
• Juniper Networks network infrastructure readiness resources
Fibre Channel (FC) has come a long way since its introduction as a networking solution offering the efficiency and speed necessary for storage to be networked and shared across multiple hosts. However, recently some have begun to question whether FC is still relevant. The answer is a simple and definitive yes! It is all about the right technology for the job. Come and hear Ben Woo, Managing Director at Neuralytix, describe why Fibre Channel is the right tool for your job.
Having successfully expanded data center capacity with Amazon Web Services for development and testing environments, the IT team faces a new capacity problem: how can they store the growing volumes of data generated by enterprise applications without increasing costs? And how can they ensure that data is properly backed up?
In this episode, both problems are solved with Amazon S3 and Amazon Glacier.
After building a working initial prototype of the application for the limited preview, it is time for the team to consolidate the architecture, making it more robust and fault-tolerant. Then they can move on to the final launch.
This episode describes AWS infrastructure concepts such as Regions and Availability Zones, showing how to use these features to improve an application's fault tolerance.
How many log files do you have?
• Ever use them?
• Wonder what they might contain?
• Are they lying around unexploited, cropped or deleted to save space?
• Do you ever try to exploit all the log data your systems produce?
• Do you have home grown tools to manage them, only to find it’s a nightmare to keep these tools up-to-date?
HP OLi (Operations Log Intelligence) is here to help. Based on industry-leading log file management technology from ArcSight, operations data in log files can now be collected, centralized, archived and searched. The intuitive, visually compelling dashboards provide easy, fast and powerful access to ALL your log files.
In this webinar we'll discuss the advantages of centralizing log file management like this, and show you a quick demonstration of this new and exciting product from HP Software.
Salesforce is rapidly being used for many cloud-first initiatives – from accelerating sales and marketing performance, to developing custom apps to solve a variety of business needs. Due to its wide-ranging use across the enterprise, CIOs are becoming very involved in designing the entire implementation and integration strategy for Salesforce within their organizations.
Join this fireside chat between Derald Sue, CIO of InsideTrack and one of IDG Computerworld’s 2014 Premier 100 IT Leaders, and Eric Johnson, CIO of Informatica, and learn:
•How InsideTrack achieved a 100% return on its Informatica Cloud investment in just one month
•How InsideTrack generates valuable strategic insights from millions of rows of data each day from disparate sources
•How InsideTrack eliminated costly and time-consuming manual coding and realized a 6X return on integration development productivity
In this 60-minute webinar we will kick-off our Consistent Customer Experience webinar series. We will clearly identify the elements that define the customer experience and will explore why delivering a consistent experience is critical for your brand and bottom line. We’ll outline the processes, tools and technology that enable retailers to deliver their vision for their customers.
Sessions will take attendees through the entire process of optimizing the customer experience, including strategies around: payroll, labor allocation, communication, space optimization and data analytics. Each presentation also will include an inside look at these strategies in action.
Many organisations first make use of AWS as a development and test environment. The flexible and pay as you go nature of AWS makes it perfect for compute environments that need to be spun up quickly and disposed of when not needed, and placing this power at the fingertips of developers means you can make step changes in productivity as you progress applications through the dev/test cycle.
In this webinar, we'll introduce some key mechanisms that will help you use AWS as a flexible deployment environment and achieve faster development-deployment-testing-release cycles, talk about customers who are using AWS for development and test, and provide some tips and tricks to help you be more agile while keeping your AWS infrastructure cost-effective.
Reasons to attend:
• Understand why AWS is such a great place for running high churn development and test environments
• Learn about deploying applications to AWS as part of your development cycle
• Discover mechanisms for templating environments so you can recreate carbon copies each time you deploy a new application version
• Hear about customers and the benefits they have felt since moving to a cloud model for performing their dev & test
Who should attend?
• Developers, operations, engineers and IT managers who want to learn how migrating dev & test to the cloud makes a perfect first step on a journey into the cloud.
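The "templating environments" idea above can be sketched as a minimal CloudFormation template driven from boto3 (the AMI ID, instance type and stack name are placeholders, not values from the webinar):

```python
import json

def dev_test_template(instance_type="t2.micro"):
    """A minimal CloudFormation template: one EC2 instance per environment copy."""
    return {
        "AWSTemplateFormatVersion": "2010-09-09",
        "Resources": {
            "DevInstance": {
                "Type": "AWS::EC2::Instance",
                "Properties": {
                    "InstanceType": instance_type,
                    "ImageId": "ami-12345678",  # placeholder AMI
                },
            }
        },
    }

def launch_environment(stack_name):
    import boto3  # assumes boto3 is installed and AWS credentials are configured
    boto3.client("cloudformation").create_stack(
        StackName=stack_name, TemplateBody=json.dumps(dev_test_template())
    )
```

Because the template is data, every `create_stack` call produces a carbon copy of the environment, and tearing it down is a single `delete_stack`.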
The growth of cloud applications means that your data is increasingly fragmented across your on-prem and cloud applications and repositories. And for the foreseeable future, most organizations will be dealing with hybrid architectures to manage their enterprise data. Do you have a plan to manage your data whether it is in the cloud or on-premise?
This webinar will cover specific solution patterns for dealing with common hybrid data management issues. The discussion will include the 4 solution patterns, best practices and tips & tricks.
Today, data drives discovery. And discoveries are key to creating sustained advantages. The better your critical workflows are able to create and access data, the better you’ll be able to discover new, innovative solutions to important problems, or to create entirely new products. More than ever before, data-intensive applications need the sustained performance and virtually unlimited scalability that only parallel storage software delivers.
Designed for maximum performance and scale, storage solutions powered by Lustre software deliver the performance at scale to meet today’s storage requirements. As the most widely used parallel storage system for HPC, Lustre-powered storage is the ideal storage foundation.
But scalable performance storage by itself only solves half the problem. Today’s users expect storage solutions that deliver sustained performance, scale upward to near limitless capacities, and are simple to install and manage. Intel® Enterprise Edition for Lustre* software combines the straight line speed and scale of Lustre with the bottom line need for lowered management complexity and cost.
As the recognized leaders in the development and support of the Lustre file system, Intel has the expertise to make storage solutions for data intensive applications faster, smarter and easier.
As your customers and partners become increasingly global, managing the end user experience all over the world becomes increasingly challenging as well. The reality is that sometimes even when your webservers, networks and SLAs are telling you everything is ok, they might be pulling their hair out at the other end!
With a growing number of applications being SaaS based or delivered by a managed service provider, and being accessed anywhere and anytime via mobile devices, how do you truly understand what's going on outside your own datacenter and firewall when it comes to the end user experience?
Join Simon Campbell, Senior Consultant at CA Technologies, to learn how you can precisely identify how your cloud, web or mobile applications are responding for users all over the world. With a complete 360-degree global picture, you can quickly, easily and very cost-effectively isolate end-user experience problems down to local connectivity, page or performance issues, or your own service provider, and prevent a severe case of customer experience jetlag!
In addition, this session will also cover how to:
Manage your end user experience around the world far more proactively and effectively
Manage the "3am problem" and test performance even when there is no live traffic
Extend your global performance monitoring to your own Datacenters or Secure FTP
This is part 1 of our 2-part series on Big Data Visibility with Network Packet Brokers (NPBs).
Even as network data has exploded in volume, velocity and variety, network monitoring solutions have been behind the curve in adopting new technologies and approaches to cost-effectively scale and accommodate a widening virtualization trend. Customers are demanding greater freedom in how applications are deployed and are moving to a consolidated, shared model of data using big data frameworks, such as Hadoop, which enable large-scale processing and retrieval for multiple stakeholders.
Join Andrew R. Harding, VP of Product Line Management at VSS Monitoring, as he discusses:
- Big data and its implications for network monitoring and forensics
- Why network monitoring solutions are lagging from a virtualization standpoint and why this is a problem for network owners
- How certain traditional network monitoring functions will eventually be offloaded to adjacent technologies
- How Network Packet Brokers can accelerate the adoption of virtualized probes, “open” storage, and big data technologies within network management / monitoring
- How a Big Data Visibility architecture can enable network data to become part of the “big data store,” allowing it to integrate with the rest of enterprise data
With a library of over 100 profitability reports, Trackmax Solutions INSIGHT Business Intelligence Profit Analytics and Management Software delivers the most in-depth, multi-metric visibility into what really drives profit at the item level. INSIGHT’s proprietary account penetration module, VMAP, enables the development of action-oriented sales plans that deliver the highest profit potential opportunities for both distributors and manufacturers based on real-time SKU-level data. Companies that understand the important connection between category management and profitability management increase their sales and market presence with INSIGHT.
See the big picture in Big Data. A customer service call can transform an internal process. Understanding Tweets and Likes can lead to improved product design. Previously unconnected data can reveal patterns and relationships so you can jump on more opportunities. The moment to tap the full potential of Big Data is now.
HP HAVEn is a Big Data platform that analyzes 100% of the data relevant to your organization. HAVEn delivers the analytics services & engines to exploit data from any source, and the services and solutions to reshape your IT infrastructure to realize the value of your data.
Attend this webinar to learn about HP’s Big Data platform and hear how HAVEn users are now able to know their customers better, to design better products, and to run their operations more efficiently.
When you have 100% of your data informing every decision, anything else is just an educated guess.
Today, in order to remain competitive, there is tremendous demand on lines of business to be able to analyze more aspects of business data, and to do so very quickly. There is a control shift to the business analyst so they can analyze what they need to, NOW!
Big Data requirements are also emerging, even as everyone is being asked to accomplish more with less. Join us to learn how the combination of DB2 10.5 with BLU Acceleration and Cognos BI 10.2 can help bring sanity to satisfying these ever increasing demands.
We will provide an overview and technical deep dive of DB2 10.5 with BLU Acceleration, Cognos BI 10.2 and Dynamic Cubes. Join experts from the DB2 and the Cognos BI teams who will also explain how these two offerings fit together and the benefits they have provided to clients already. See the value that DB2 with BLU Acceleration can bring to Cognos BI, and ask these experts your questions, so you can understand how this solution helps solve business challenges.
Join us to learn how investments in advanced analytics for IT Operations can lead to faster problem resolution, more effective use of IT toolsets, increased operational efficiencies, more effective optimization of the IT infrastructure in support of service delivery, and more automated insights into business alignment.
Dennis Drogseth of Enterprise Management Associates (EMA) will discuss the underlying foundations for Advanced Operational Analytics (AOA) as EMA defines it, with an eye to HP’s current, leading-edge solution set. The web event will wrap up with process and skill set considerations relevant to helping IT move to the more effective, service-centric model that advanced analytics can enable—including EMA’s maturity model for moving your IT organization forward with analytics as a catalyst.
Join the Amazon Web Services Elastic MapReduce (EMR) Masterclass webinar, where AWS Evangelist Ian Massingham will explain how to get started.
EMR enables fast processing of large structured or unstructured datasets, and in this webinar we'll show you how to set up an EMR job flow to analyse application logs and perform Hive queries against them. We'll review best practices around data file organisation on Amazon Simple Storage Service (S3), how clusters can be started from the AWS web console and command line, and how to monitor the status of a Map/Reduce job. We'll show the security configuration that allows direct access to the EMR cluster in interactive mode, and see how Hive provides a SQL-like environment while allowing you to dynamically grow and shrink the amount of compute used for powerful data processing activities.
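A job flow like the one described might look roughly like this with boto3 (release label, instance types, roles and S3 paths are illustrative guesses, not the webinar's actual configuration):

```python
def hive_step(script_uri):
    """An EMR step that runs a Hive script stored on S3."""
    return {
        "Name": "analyse-application-logs",
        "ActionOnFailure": "CONTINUE",
        "HadoopJarStep": {
            "Jar": "command-runner.jar",
            "Args": ["hive-script", "--run-hive-script",
                     "--args", "-f", script_uri],
        },
    }

def start_cluster(log_bucket):
    import boto3  # assumes boto3 is installed and AWS credentials are configured
    return boto3.client("emr").run_job_flow(
        Name="log-analysis",
        ReleaseLabel="emr-4.2.0",  # hypothetical release
        Instances={
            "MasterInstanceType": "m1.medium",
            "SlaveInstanceType": "m1.medium",
            "InstanceCount": 3,
        },
        Steps=[hive_step(f"s3://{log_bucket}/scripts/analyse_logs.q")],
        LogUri=f"s3://{log_bucket}/emr-logs/",
        JobFlowRole="EMR_EC2_DefaultRole",
        ServiceRole="EMR_DefaultRole",
    )
```

Growing or shrinking the cluster later is a matter of modifying the instance group count; the Hive step itself does not change.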
Reasons to attend:
• Understand what Amazon EMR does and how to get started
• Learn how to launch EMR job flows, configure Hadoop, and install Map/Reduce tools such as Hive
• Discover how to perform interactive and batch queries against structured and unstructured data
• Find out how to scale up data processing clusters to meet business time requirements.
Who should attend:
• Developers, engineers and architects wanting to get more hands on with Amazon EMR.
This is a continuation of our 2-part series on Big Data Visibility with Network Packet Brokers (NPBs).
Big data techniques and technologies can be powerful tools for scaling network monitoring and forensics. They can also facilitate new use cases for network data, potentially beyond the scope of Operations.
Gordon Beith, Director of Product Management at VSS Monitoring, will discuss practical considerations for migrating to a Big Data Visibility Architecture, including:
• Accommodating network volume, velocity and variety using sophisticated hardware preprocessing and APIs
• Metadata versus flow statistics versus full packet capture – considerations and use cases for each
• Open versus proprietary formats for storage
• Pros and cons of integrated capture/storage/analysis solutions versus separate capture/storage solutions coupled with virtualized analysis probes
• Addressing retrieval in an “open” forensics model
• Leveraging a distributed computing framework for processing large-scale data stores
Despite advances in project management, a surprising number of projects still fail: they miss budget and timing goals, don't deliver real business value, or fall short in other ways. Managers also still struggle to gain visibility into the entire portfolio and to communicate program status and success to stakeholders. Implementing a project and portfolio management toolset in the enterprise has proven to be one of the fastest-payback IT investments, with among the highest first-year returns on investment (ROI) available. In this session you will learn how others have implemented a toolset and improved their business results right away. You'll also hear about a proven approach that focuses on performance, adoption and simplification, and how it accelerates the benefits of a holistic approach to project and portfolio management.
Join Jay Krackeler, an HP Product Manager with extensive experience in this space and Russ King, from HP Partner ResultsPositive, for an informative session packed with real-world examples from companies across a variety of industries.
In this webinar, Anton will explain the main advantages of NoSQL and the common use cases in which migrating to NoSQL makes sense. You will learn the key questions to ask before a migration, as well as the important differences in data modeling and architectural approaches. Finally, we will take a typical application built on an RDBMS and migrate it to NoSQL step by step.
Key topics that will be covered:
* Why would you want to migrate to NoSQL
* Conceptual differences between RDBMS and NoSQL
* Data modeling and architectural best practices
* "I got it. But what exactly do I need to do?" - Practical migration steps
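One of the data-modeling differences the session covers can be sketched in a few lines: a common migration step is denormalizing a relational one-to-many join into a single document, trading joins at read time for embedded data. This is an illustrative example, not material from the webinar; the table and field names are invented.

```python
# Sketch: migrating a relational users/orders join to a document model.
# Each user's orders are embedded inside the user document, as is typical
# for document stores such as MongoDB. All names here are hypothetical.

users = [{"id": 1, "name": "Alice"}]
orders = [
    {"id": 10, "user_id": 1, "total": 25.0},
    {"id": 11, "user_id": 1, "total": 40.0},
]

def to_documents(users, orders):
    """Denormalize: embed each user's orders inside the user document."""
    by_user = {}
    for o in orders:
        by_user.setdefault(o["user_id"], []).append(
            {"order_id": o["id"], "total": o["total"]})
    return [
        {"_id": u["id"], "name": u["name"],
         "orders": by_user.get(u["id"], [])}
        for u in users
    ]

docs = to_documents(users, orders)
print(docs[0]["orders"])   # both orders now live inside Alice's document
```

The trade-off is the classic one the webinar's "conceptual differences" topic points at: reads no longer need a join, but updates to shared data may now touch many documents.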
Anton has been an active user of many NoSQL databases, including Cassandra, MongoDB, MarkLogic, Aerospike and HBase. Like many people, he learned the hard way some of the difficulties of polyglot persistence and of choosing the right NoSQL solution, and he has performed many migrations from relational to NoSQL databases. His goal with this webinar is to help others avoid common pitfalls while learning more about NoSQL solutions in general and the migration process in particular.
ABOUT THE PRESENTER
Anton Yazovskiy is a Software Engineer at Thumbtack Technology, where he focuses on high-performance enterprise architecture. He has presented at a variety of IT conferences and “DevDays” on topics such as NoSQL and MarkLogic.
Join Amazon Web Services for this Storage and Backup webinar to learn more about how you can use the AWS Cloud as a storage and backup platform.
A wide range of assets can be held cost-effectively in highly durable storage systems within the AWS Cloud, whether for global distribution, long-term storage or low-cost cold archive. Learn about a range of use cases for the Amazon Simple Storage Service (S3) beyond simple object storage, and how Amazon Glacier can revolutionise the economics and technology of long-term archiving.
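As a concrete sketch of the S3-to-Glacier archiving pattern mentioned above (an assumption on our part, not an example from the webinar): cold data is often moved to Glacier via an S3 lifecycle rule rather than uploaded directly. The dict below follows the shape accepted by boto3's `put_bucket_lifecycle_configuration()`; the prefix and day counts are hypothetical.

```python
# Sketch: an S3 lifecycle configuration that transitions objects under a
# prefix to the GLACIER storage class for cheap cold archive, then expires
# them. Prefix and retention periods are hypothetical placeholders.

def glacier_lifecycle(prefix, archive_after_days, expire_after_days):
    return {
        "Rules": [{
            "ID": f"archive-{prefix}",
            "Filter": {"Prefix": prefix},
            "Status": "Enabled",
            "Transitions": [{
                "Days": archive_after_days,
                "StorageClass": "GLACIER",   # move cold data to Glacier
            }],
            "Expiration": {"Days": expire_after_days},
        }]
    }

config = glacier_lifecycle("backups/", 30, 365)
print(config["Rules"][0]["Transitions"][0]["StorageClass"])
```

With credentials configured, this could be applied with `boto3.client("s3").put_bucket_lifecycle_configuration(Bucket="my-bucket", LifecycleConfiguration=config)`; here the configuration is only built and checked locally.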
Reasons to attend:
• Understand why AWS is a perfect platform for the storage of digital assets, data, media and backups
• Learn how S3 is a powerful platform that goes beyond simple storage
• Discover how Glacier can revolutionise your long-term archive management by removing the need for costly and fragile media types
• Hear about real customer use cases and a rich partner ecosystem of services built on AWS storage services
Who Should Attend:
• Developers, operations staff, engineers and IT managers who want to learn how AWS provides a cost-effective and highly capable environment for the storage of digital assets
Business intelligence tools were born to query and report. But now analysts and business users don't just want dashboards, they want to dive deep into ad hoc analyses, to explore dozens of hypotheses in minutes. The BI industry is responding by tacking on better visualization and calling it analysis.
But visualization and analysis are very different. If they weren't, why do most analysts prefer to query data with BI tools, then do their actual analysis in Excel (or statistical tools)? Or why do Tableau's help documents literally suggest you pull out a calculator if you'd like to run a correlation?
For many companies, this misunderstood distinction is the final barrier to reaching the promised land of data for all. We'll explore the distinction, as well as the growing divide between exploratory analysis tools and predictive analysis tools. We'll also talk about the reasons that cloud-based analysis tools will leave the rest further and further behind.
Analytics 3.0 breaks through constraints that companies have faced around enabling any analytics, on any data, at any time. For many companies, the scope of data analysis has expanded from gathering operational business intelligence to performing product offering analysis using embedded data intelligence.
Join our live webcast on April 3, 2014 at 12 PM EST and learn how analytics has evolved from business intelligence to the new era of data-enriched offerings, and how you can apply Analytics 3.0 to produce measurable business benefits and optimize internal decision processes for your organization.