Operational Efficiency – it’s not just about cost cutting
WHY automate? Automation can help cut costs, but it is only worth doing if it improves efficiency without reducing quality of service. If automation is the HOW of the solution, then WHAT outcomes and benefits are needed, and WHICH processes should be automated?
During this presentation Matthew Burrows of BSMimpact will share his experience, giving practical and pragmatic guidance to all faced with the challenge of increasing operational efficiency.
Recorded Apr 7, 2009 · 35 mins
Bill Briggs, CTO, Deloitte Consulting LLP Ross Mason, Founder and VP of Product Strategy, MuleSoft
Almost everyone is undergoing digital transformation, but not everyone knows the right way to do it. In a recent MuleSoft survey, 96 percent of respondents said they are executing on digital transformation initiatives or planning to do so in the near future. However, the results also showed that just 18 percent of IT decision makers are confident that they will succeed in meeting this year’s digital transformation goals. To do this effectively, IT teams need different skill sets, tools, and, more importantly, different mindsets. Join Bill Briggs, CTO at Deloitte, and Ross Mason, Founder and VP of Product Strategy at MuleSoft, to learn how IT can grow beyond “business as usual.”
Application-centric enterprises recognize the importance of automating infrastructure to deliver applications quickly. Software-defined networking solutions such as Cisco ACI enable programmable infrastructure for networking and operations teams. However, appliance-based load balancers require manual configurations, cannot scale horizontally, and are expensive in many cases. They represent a "last-mile" challenge for automation.
A software-defined approach to application services can extend the value delivered by SDN solutions to load balancing and L4-L7 services.
Join Avi Networks to learn how native integration with Cisco APIC/ACI:
• Simplifies deployment with one-click service insertion
• Enables zero-touch autoscaling
• Provides tenant isolation and security
• Reduces troubleshooting time to minutes with pinpoint analytics
NoSQL databases like Cassandra and Couchbase are quickly becoming key components of the modern IT infrastructure. But this modernization creates new challenges, especially for storage in the broadest sense. In-memory databases perform well when there is enough memory available; when data sets grow too large and must access storage, application performance degrades dramatically. Moreover, even when enough memory is available, persistent client requests can bring the servers to their knees.
Join Storage Switzerland and Plexistor to learn:
1. What Cassandra and Couchbase are
2. Why organizations are adopting them
3. The storage challenges they create
4. How organizations attempt to work around these challenges
5. How to design a solution to these challenges rather than a workaround
Kevin Orbaker, Technology Solution Professional, Microsoft Corporation
Employees might feel intimidated or overwhelmed with the amount and diversity of business data available to them.
Accessing and harnessing all of that data could be a game changer for them, but they're not sure how to do it. Until now, integrating, analyzing, and understanding all those data sources hasn't been easy, timely, or affordable for your team - not to mention your business users. Your main role has been telling them what they can't do. What if you could offer a comprehensive solution that business groups could use on their own terms - to answer their own questions? Just imagine how your role would change from traffic cop to strategic advisor.
In this webcast, viewers will learn more about:
• Data accessibility for business users and managers: the human element of data
• Managing the backend data infrastructure
• Internet of Things (IoT) and what that means for businesses
• The future of automation in intelligent data systems
Alex Li, Product Manager, Ahyoung An, Product Marketing Manager, and Damian Sima, Lead Software Engineer, MuleSoft
How are you validating the behavior of your Mule app before you send it to QA? Did you know that there is a testing framework that automates the testing of Mule apps? Join this webinar to learn what you may not know about testing your Mule applications within Anypoint Studio.
Learn how to use MUnit to:
- Test API implementations
- Do other types of unit and integrated testing
- Automatically generate coverage reports
- Integrate testing into your continuous integration process
In today’s environment, speed and quality are two of the biggest factors in delivering products and services to your end users. As a result, organizations and individuals have recognized the role that Performance Engineering practices play as a competitive differentiator.
We will deliver a one-hour, vendor-neutral webinar discussion focusing on “Performance Engineering as a Competitive Differentiator.” The expert panel will collaboratively discuss how the capabilities of Performance Engineering have come to be recognized as a significant competitive differentiator across geographies, industries, and markets.
You will get tangible examples and stories that you can use to share and promote how Performance Engineering practices can be applied within your organization and teams to drive competitive advantage in the market.
Nial Darbey, Principal Solutions Consultant, John D'Emic, Client Architect, Rupesh Ramachandran, Principal Solutions
Microservices is one of the hottest technology trends for 2016. Join us to learn what microservices mean for your business, as our technical experts take you on a journey through microservices best practices and implementation using Anypoint Platform.
DevOps is gaining traction in the industry. The idea economy demands faster go to market capabilities with high quality, low risk, and low costs. Achieving the benefits of DevOps requires enablement of some core attributes within the organization including visibility, collaboration, and automation. Join us as we walk through an integrated DevOps Solution and help you accelerate the implementation of Continuous Delivery.
In this webinar you will learn about:
- The importance of integration of lifecycle products
- Shift-Left optimization via integration
- Providing value to your organization through DevOps and integration
Rick Friedman, VP, Cycle Computing & Scott Jeschonek, Director, Avere Systems
While cloud computing offers virtually unlimited capacity, harnessing that capacity in an efficient, cost effective fashion can be cumbersome and difficult at the workload level. At the organizational level, it can quickly become chaos.
You must make choices around cloud deployment, and these choices can have a long-lasting impact on your organization. It is important to understand your options and avoid incomplete, complicated, or locked-in scenarios. Data management and placement challenges make the ability to automate workflows and processes across multiple clouds a requirement.
In this webinar, you will:
• Learn how to leverage cloud services as part of an overall computation approach
• Understand data management in a cloud-based world
• Hear what options you have to orchestrate HPC in the cloud
• Learn how cloud orchestration works to automate and align computing with specific goals and objectives
• See an example of an orchestrated HPC workload using on-premises data
From computational research to financial back testing, and research simulations to IoT processing frameworks, decisions made now will not only impact future manageability, but also your sanity.
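The orchestration idea described above can be illustrated with a minimal sketch: steps declare dependencies, and an orchestrator runs them in dependency order, executing independent steps in parallel. All names and the workflow shape here are illustrative assumptions, not Cycle Computing or Avere APIs.

```python
from concurrent.futures import ThreadPoolExecutor

def run_workflow(steps, deps):
    """Run steps respecting dependencies; independent steps run in parallel.

    steps: dict name -> callable; deps: dict name -> set of prerequisite names.
    Returns the order in which steps completed.
    """
    done, order = set(), []
    with ThreadPoolExecutor() as pool:
        while len(done) < len(steps):
            # A step is ready once all of its prerequisites have completed.
            ready = [n for n in steps if n not in done and deps.get(n, set()) <= done]
            if not ready:
                raise ValueError("cyclic or unsatisfiable dependencies")
            for name, _ in zip(ready, pool.map(lambda n: steps[n](), ready)):
                done.add(name)
                order.append(name)
    return order

# Hypothetical HPC workflow: stage data, run two simulations, then aggregate.
results = {}
steps = {
    "stage":     lambda: results.setdefault("stage", "data staged"),
    "sim_a":     lambda: results.setdefault("sim_a", "A done"),
    "sim_b":     lambda: results.setdefault("sim_b", "B done"),
    "aggregate": lambda: results.setdefault("agg", "merged"),
}
deps = {"sim_a": {"stage"}, "sim_b": {"stage"}, "aggregate": {"sim_a", "sim_b"}}
order = run_workflow(steps, deps)
```

Real orchestrators add scheduling, retries, and cloud provisioning on top of this core pattern.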
The massive amount of data continuously flowing through today’s IT organizations has made problem detection and remediation infinitely more challenging. In addition, the associated costs and resources required have placed the need for automation at the forefront of CIO agendas.
According to a recent survey, 69% of IT professionals said that higher customer expectations were their biggest challenge in managing IT operations today – making finding and fixing problems quickly and efficiently that much more important.
Join us and learn from the IT Operations Analytics (ITOA) experts at Hewlett Packard Enterprise how HPE Operations Analytics can harness the power of machine learning to automate troubleshooting, set up alerts to help find problems before business and customers are impacted, and eliminate the need for war rooms.
Software continues to change our world, and gaining or maintaining a competitive advantage demands that applications are delivered at a blistering pace, and with the highest quality.
In a world driven by continuous delivery, Agile and higher levels of automation, HPE would like to introduce you to the next generation platform to support how modern ALM is evolving.
HPE’s fully web-based platform will help your teams collaborate more easily, manage the software delivery pipeline, fully understand the impact of change, and make the most informed project decisions as they support the business.
DevOps, and its predecessor Agile development, provide a gateway to the best practice of “find and fix early.” However, allocating sufficient time and resources to thoroughly test your enterprise applications is a continual challenge – given the frequency and volume of change. With resource constraints and the limitations of today’s existing tools, many business-critical applications go minimally tested while the quality of these applications is fundamental to the business’ success.
In this session we’ll discuss 5 key strategies for improving QA and Delivery within a DevOps environment. Learn how to leverage advanced data-driven testing to dynamically validate your applications, while allowing you to easily and rapidly create and maintain your functional tests.
We’ll cover in-depth the features and integration of scriptless test automation to enable:
Broad Usability – reduce the time it takes to create and maintain tests, simplifying test development for the entire DevOps team
Reusability – build a growing library of test assets that can be leveraged by all users, to support multiple applications and end-to-end tests
Simplified Test Maintenance – easily detect changes with each new software iteration, and update tests automatically to ensure minimal delays in the DevOps process
Data-Driven – run multiple data sets against the same test and increase test coverage and agility
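Outside of any vendor tooling, the data-driven idea is simple to sketch: one test body, many data sets, each reported separately. The following is a generic Python illustration of the pattern, not the scriptless tool discussed in the session; the function under test is invented for the example.

```python
import unittest

def normalize_username(raw):
    """Toy function under test: trim, lowercase, collapse inner spaces."""
    return " ".join(raw.strip().lower().split())

# Each row is one data set run against the same test logic.
CASES = [
    ("  Alice ", "alice"),
    ("BOB", "bob"),
    ("Carol  Smith", "carol smith"),
]

class TestNormalizeUsername(unittest.TestCase):
    def test_cases(self):
        for raw, expected in CASES:
            # subTest makes each data set a separately-reported check,
            # so one failing row doesn't mask the rest.
            with self.subTest(raw=raw):
                self.assertEqual(normalize_username(raw), expected)
```

Adding coverage then means adding rows to `CASES`, not writing new test code.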
Hewlett Packard Enterprise recently announced the release of the latest Service Pack for the Functional Test Automation Solution.
In this latest release, HPE continues to invest in supporting a wide variety of technologies, expand the API testing and mobile testing capabilities and introduce some new integrations including HPE Network Virtualization.
In this webinar, we will cover what’s new in UFT 12.52 and BPT 12.52 and look at:
UFT’s new UI and API Testing capabilities
BPT features, including LeanFT for BPT
Enhanced and flexible reporting via the new HTML report
Randy Franklin Smith: Windows Security Subject Matter Expert. Greg Foss: Sr. Security Research Engineer, LogRhythm, Inc.
PowerShell is like nuclear fission—it’s powerful, and it can be used for good and evil. The bad guys love to exploit PowerShell for at least three reasons:
1. It’s already installed on most versions of Windows.
2. It’s powerful. You really can do just about anything in PowerShell—even call into the Win32 API if enabled.
3. There are no EXEs or DLLs to upload.
Lee Holmes (Microsoft’s PowerShell extraordinaire) will be joining me to show you how to catch intruders exploiting PowerShell to their own ends.
First, we will provide a brief overview of PowerShell security capabilities, especially the enhancements in PowerShell 5.0. There are some really good preventive steps you can take to limit your exposure to PowerShell-related risks. And PowerShell 5.0 is available on Windows 2008 R2 SP1 and Windows 7 SP1 and up, so this isn’t vaporware.
Then we will zero in on the auditing capabilities in PowerShell. We’ll show you how to enable PowerShell logging so that you get events for every script block executed. We’ll show you sample events and discuss how to interpret them, how to filter the noise and more.
I’ll also briefly point out some less powerful but easy-to-implement techniques for detecting the use of PowerShell itself using Process Tracking events. This can be useful for highly controlled endpoints where any use of PowerShell is limited, making it easy to recognize when PowerShell is being used in an unusual way.
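As a rough illustration of the detection logic (independent of any SIEM), the sketch below filters exported event records for script block logging events (event ID 4104 in the Microsoft-Windows-PowerShell/Operational log) that contain common abuse indicators. The record shape and the indicator list are assumptions for the example, not a complete detection rule.

```python
# Suspicious substrings often seen in malicious PowerShell; illustrative, not exhaustive.
INDICATORS = ("-encodedcommand", "downloadstring", "invoke-expression",
              "frombase64string", "bypass")

def flag_scriptblocks(events):
    """Return script block logging events (ID 4104) containing abuse indicators.

    `events` is a list of dicts with 'event_id' and 'script_text' keys --
    an assumed shape for exported log records.
    """
    hits = []
    for ev in events:
        if ev.get("event_id") != 4104:
            continue  # only script block logging events
        text = ev.get("script_text", "").lower()
        if any(ind in text for ind in INDICATORS):
            hits.append(ev)
    return hits

sample = [
    {"event_id": 4104,
     "script_text": "IEX (New-Object Net.WebClient).DownloadString('http://x')"},
    {"event_id": 4104, "script_text": "Get-ChildItem C:\\Logs"},
    {"event_id": 4688, "script_text": ""},  # process creation, not a script block
]
flagged = flag_scriptblocks(sample)
```

In practice this filtering runs in the log platform itself; the point is only that 4104 events carry the full script text, which makes substring and pattern matching possible at all.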
Of course, producing valuable audit data is one thing. Collecting, analyzing, and alerting on it is another. And that’s where our sponsor, LogRhythm, comes in. The security experts at LogRhythm have been following the increased exploitation of PowerShell by the bad guys and publishing their own tips on how to combat it. Greg Foss will briefly demonstrate LogRhythm’s built-in knowledge of PowerShell and its ability to correlate PowerShell events with all the other security intelligence LogRhythm collects from your enterprise.
How To Break “The Cycle” and Move To Hyperconvergence
In this webinar, Storage Switzerland's George Crump and SimpliVity's Adam Sekora compare and contrast the suitability of SANs vs. hyperconverged architectures; examine the benefits of consolidating and reducing the number of discrete IT devices in lieu of hyperconverged infrastructure; and discuss the merits of simplified IT and its impact on technology refresh initiatives.
Hubert Yoshida, CTO, Hitachi Data Systems, Greg Knieriemen, technology evangelist, Hitachi Data Systems, Adrian De Luca, CTO
Innovative technology companies that quickly capitalize on business opportunities and satisfy the demands of today’s empowered consumer have caused a wave of disruption. In 2016, businesses will turn to IT for solutions that will keep them competitive. Chief information officers will invest in faster delivery of applications and analytics, and transform IT by leveraging the third platform (social, mobile, analytics, and cloud) to reduce infrastructure expenses. Learn how to avoid distractions and remain focused on the IT trends that matter in 2016, as well as gain the knowledge to help accelerate your IT transformation and success.
Pharmaceutical companies play an ever-increasing role in the treatment and prevention of illnesses. Many leading pharma companies depend on third-party organizations, like the Almac Group, to help conduct testing, research, and trials to improve the overall drug development process.
Almac is a global leader in contract pharmaceutical development and manufacturing services, and recently deployed scriptless test automation to improve its core business application testing in support of its overall business agility goals.
This session will focus on how Almac and TurnKey Solutions partnered to successfully shorten test cycles and improve test outcomes - supporting more rapid deployment cycles for core applications and end-to-end business critical processes.
Shauna Quinn, Software Test Manager for Almac, will share recent results and 4 lessons learned in Almac’s implementation of scriptless test automation, including:
How to build a better alternative to manual testing methods, which, for Almac, previously took 20 people and 12 weeks to run
How to streamline operations and shrink testing cycles from multiple weeks to only 3-4 days
How automation helped to quickly and easily validate core business systems using a comprehensive regression suite
How this intuitive software provided a solid testing strategy moving forward
Weston Morris, Global Portfolio Solution Manager; Dan Huberty, Vice President; Andrew Harsch, Global Director
To keep pace with the increasing demands of becoming a more digital business, IT organizations need the foundational digital service management pillars in place to create, launch, secure, manage, analyze, and restore digital services through strong strategy, governance, intelligence, and automation.
Time is money, whether launching new services, fulfilling orders, improving performance and engagement or reacting to predictive intelligence. The ability to create, enable and deliver these capabilities to the rest of IT, to your internal associates and to your clients will determine your future success and survivability.
Join us for part one of a three-part series, where we will discuss how cloud impacts ITSM from both a tooling and an overall infrastructure perspective, how a mobile-first approach is key to your future, and what forces will shape our industry in 2016.
IT teams are constantly asked to do more with less. Automating network configuration and compliance tasks gave one organization a 95% increase in efficiency, allowing it to reallocate administrators from tedious manual tasks to moving the organization forward through innovation.
Join us and hear:
How to improve your ROI by elevating the implementation of network management best practices
How to use automation and compliance capabilities together with monitoring
A case study on the value of integration between HP NNMi and HP Network Automation
Increase Productivity while Reducing the Cost of IT Operations
This online channel will feature live webcast presentations by leading IT and business executives. In today’s economic environment, it is imperative that IT find ways to become more efficient while maintaining, or even increasing, the levels of service it provides. Automating IT functions is the best way for IT to achieve these conflicting objectives. The IT Automation channel speakers will look at how to increase productivity while reducing the cost of IT operations through automation and the application of industry best practices. The presentations will show you the steps to take and the tools you will need to improve the performance of your IT operations.
Rapidly evolving trends among digital financial services are driving fintech developers to create much more personalized customer experiences within their applications. By harnessing the power of consumer transaction data, these industry innovators can enhance customer engagement and drive more targeted cross-sell and upsell revenue opportunities.
Analyzing consumers’ financial data is quickly becoming the future of online banking, and those in the fintech community who can leverage this information quickly and effectively will have the edge, delivering the personalized services necessary to attract and retain the next generation of banking customers.
Join us on August 24 to hear from a panel of industry experts as we discuss the evolution of consumer transaction data analytics and how to leverage it to create a more contextualized and personalized user experience. Topics include:
* The increasing demand for fintech apps to create an individualized digital banking experience
* The rapid advancement of gathering and analyzing consumer transactional data
* How fintech developers can leverage contextual data to improve customer products
* Alex Cram, Co-Founder and Chief Technology Officer, Track Technologies
* Robin Verderosa, Senior Product Manager, Envestnet | Yodlee
* Jim Del Favero, Chief Product Officer at Personal Capital
* Mani Fazeli, VP of Product, Wave
* Evan Schuman, Moderator, VentureBeat
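To make the idea of leveraging transaction data concrete, here is a minimal Python sketch that turns raw transactions into the kind of per-category spending summary a personalized banking app might surface. The merchant names, category mapping, and data shape are all invented for illustration; real fintech pipelines involve enrichment services and far richer models.

```python
from collections import defaultdict

# Hypothetical merchant-to-category mapping a fintech app might maintain.
MERCHANT_CATEGORIES = {
    "Blue Bottle": "dining",
    "Shell": "transport",
    "Whole Foods": "groceries",
}

def spending_by_category(transactions):
    """Aggregate transaction amounts by inferred category.

    `transactions` is a list of (merchant, amount) pairs; unknown
    merchants fall into 'other'.
    """
    totals = defaultdict(float)
    for merchant, amount in transactions:
        totals[MERCHANT_CATEGORIES.get(merchant, "other")] += amount
    return dict(totals)

txns = [("Blue Bottle", 4.50), ("Shell", 40.00),
        ("Blue Bottle", 6.25), ("Acme", 12.00)]
summary = spending_by_category(txns)
```

A summary like this is the raw material for the contextual features the panel describes: budgets, alerts, and targeted product suggestions.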
As you begin evaluating hyper-converged infrastructure (HCI) options, you may run into conflicting ideas about what it is and what it can do.
In this webcast, we will debunk some common myths around HCI, discuss what’s behind these misconceptions, and clarify how the right HCI solution leads to a simpler and more efficient way to manage your resources.
Don’t let myths distract you from what you should be considering when choosing the right HCI, which includes:
• A proven hypervisor in an integrated HCI stack, with reliability evident in its track record and usage
• Broad HCI use cases that include business-critical apps, VDI, managed hosting platforms, microservices, and more
• Reduced costs through the automation of manual tasks, simple policy-based management, and a broad choice of competitive hardware platforms
• Fewer silos, by extending existing server and virtualization infrastructures and supporting existing external storage systems
Join us and learn how the modular architecture of HCI serves as the fundamental building block for growing into a private cloud and full SDDC.
A discussion of the Storage Performance Development Kit (SPDK), an extension of the Data Plane Development Kit (DPDK) into a storage context. We cover how SPDK got started, the benefits of an NVMe* polled-mode driver, how SPDK supports protocols like NVMe over Fabrics, and future areas of development for SPDK.
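The core idea behind a polled-mode driver can be sketched generically: instead of sleeping until an interrupt signals completion, the driver repeatedly checks a completion ring from user space. The Python below simulates this with a thread standing in for the device; it is a conceptual illustration only, not SPDK code, and every name in it is invented for the example.

```python
import threading

class CompletionRing:
    """Toy completion queue: the 'device' posts completions; the driver polls."""
    def __init__(self):
        self._slots = []
        self._lock = threading.Lock()

    def post(self, cid):
        # Device side: record that command `cid` has completed.
        with self._lock:
            self._slots.append(cid)

    def poll(self):
        # Driver side: non-blocking check for any finished commands,
        # analogous in spirit to SPDK's process-completions calls.
        with self._lock:
            done, self._slots = self._slots, []
        return done

def run_io(ring, n_ios):
    """Submit n_ios and busy-poll until all completions are reaped."""
    # A background thread plays the role of the NVMe device.
    device = threading.Thread(target=lambda: [ring.post(i) for i in range(n_ios)])
    device.start()
    completed = []
    while len(completed) < n_ios:   # polled mode: no interrupts, no blocking waits
        completed.extend(ring.poll())
    device.join()
    return completed

completions = run_io(CompletionRing(), 8)
```

The trade-off the webinar discusses falls out of this shape: polling burns a core but removes interrupt and context-switch latency from the I/O path, which matters once devices complete millions of operations per second.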
A Case Study presented by Kurt Jackson, Platform Lead, Autodesk
Companies such as Autodesk are fast replacing the once tried-and-true physical data warehouses with logical data warehouses/data lakes. Why? Because they are able to accomplish the same results in one-sixth of the time and with one-quarter of the resources.
In this webinar, Autodesk’s Platform Lead, Kurt Jackson, will describe how they designed a modern fast data architecture as a single unified logical data warehouse/data lake using data virtualization and contemporary big data analytics like Spark.
A logical data warehouse/data lake is a virtual abstraction layer over the physical data warehouse, big data repositories, cloud, and other enterprise applications. It unifies both structured and unstructured data in real time to power analytical and operational use cases.
Attend and learn:
- Why logical data warehouses/data lakes are the bedrock of modern data architecture
- How you can build a logical data warehouse using data virtualization
- How to create a single, unified, enterprise-wide access and governance point for any data used within the company
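The abstraction-layer idea can be sketched in a few lines: a single access point that federates over heterogeneous sources without copying data into one store. Source names and row shapes below are invented for illustration; real data virtualization platforms add query pushdown, caching, security, and governance on top of this basic shape.

```python
class LogicalWarehouse:
    """Toy federation layer: one access point over multiple 'physical' sources."""
    def __init__(self):
        self.sources = {}

    def register(self, name, rows):
        # rows: a list of dicts; stands in for a warehouse table, a data
        # lake file, or a cloud application behind a connector.
        self.sources[name] = rows

    def query(self, source, predicate=lambda r: True):
        # Unified access: callers never touch the physical systems directly.
        return [r for r in self.sources[source] if predicate(r)]

ldw = LogicalWarehouse()
ldw.register("warehouse.orders", [{"id": 1, "amount": 250}, {"id": 2, "amount": 90}])
ldw.register("lake.clickstream", [{"user": "u1", "page": "/pricing"}])

big_orders = ldw.query("warehouse.orders", lambda r: r["amount"] > 100)
```

The governance point mentioned in the bullets corresponds to this single `query` surface: because every consumer goes through it, access rules can be enforced in one place.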
As you look to transform your approach to infrastructure to capitalize on new innovations and data, you also face the challenges of increased complexity, costs, and compliance. Join us to learn how SUSE Manager 3, the best-in-class open source infrastructure management solution, now with Salt integration, can help you Tame IT. In addition to the features and benefits of SUSE Manager, we will take an in-depth look at the integration with Salt and the advanced configuration management capabilities it provides. We look forward to seeing you at this interactive session.
Software-defined Storage
Service interruptions wreak havoc on business. Enterprises expend huge budgets to deliver datacenter redundancy and continuous application availability, and still it is not easy. Storage is often the biggest limiting factor. The good news is that recent storage innovations, combined with the flexibility of VMware vSphere, simplify and lower the cost of delivering automated mobility of services between sites to ensure downtime avoidance and fault recovery.
Watch this webinar and learn:
• How to deliver real-time virtual machine migration across a geographically stretched cluster with no data loss or service interruption
• How software-defined storage streamlines delivery of vSphere Metro Storage Clusters (vMSC)
• How a multi-site storage system managed as a single logical entity benefits your virtualization deployment
Server virtualization was supposed to consolidate and simplify IT infrastructure in data centers. But that only “sort of happened.” Companies do have fewer servers, but they never hit the consolidation ratios they expected. Why? In one word: performance.
Surveys show that 61% of companies have experienced slow applications after server virtualization with 77% pointing to I/O problems as the culprit.
Now, companies are looking to take the next step to fulfill their vision of consolidating and reducing the complexity of their infrastructure. But this will only happen if their applications get the I/O performance they need.
This is where DataCore’s Parallel I/O technology comes in. By processing I/Os in parallel, leveraging multi-core, multi-processor systems, Parallel I/O delivers industry-leading I/O response times as well as price/performance. The net benefit is that fewer storage nodes can provide much better performance, allowing you to reduce and simplify your infrastructure.
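The parallel-I/O claim is easy to illustrate at a toy level: when individual I/Os spend most of their time waiting, issuing them from multiple workers overlaps the waits and cuts total elapsed time. The sketch below simulates I/O latency with a sleep; it is a generic illustration of the principle, not DataCore's engine.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def fake_io(block_id, latency=0.05):
    """Simulate one I/O: mostly waiting, as with real device latency."""
    time.sleep(latency)
    return block_id

blocks = list(range(8))

# Serial: one I/O at a time, so total time is roughly 8 x latency.
t0 = time.perf_counter()
serial = [fake_io(b) for b in blocks]
serial_s = time.perf_counter() - t0

# Parallel: eight workers overlap the waits, so total time approaches
# a single I/O's latency.
t0 = time.perf_counter()
with ThreadPoolExecutor(max_workers=8) as pool:
    parallel = list(pool.map(fake_io, blocks))
parallel_s = time.perf_counter() - t0
```

Both runs return the same results in the same order; only the elapsed time differs, which is the essence of the "fewer nodes, better performance" argument.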
Do you run a mix of virtualized and diverse workloads, including block storage? Are you looking to increase density and maintain blazingly fast speeds? If so, this webinar is for you!
In this webinar, speakers from DataCore and SanDisk® will discuss the performance and economic advantages of combining software-defined storage with all-flash storage. We’ll also share two customer stories on how they were able to:
- Achieve effortless and non-disruptive data migration from magnetic to flash storage
- Prevent storage-related downtime
- Dynamically control the movement of data from flash to high-capacity storage
- Strike the right economic balance between fast performance and low cost
Don’t let data growth and complex workloads slow you down. Attend this webinar and learn about new possibilities.