The availability and integrity of data are critical to the business enterprise. When a problem occurs and data is compromised or unavailable, it is paramount to recover from such an 'outage' as quickly as possible. Equally important is the ability to recover efficiently by quickly identifying recovery assets and minimizing the recovery scope, required resources and effort. This session will explore in detail the Advanced Recovery Solutions for DB2 for LUW.
Recorded Jan 26, 2012 (89 mins)
In this session, we will review foundational cryptographic concepts and discuss the key highlights of the new DB2 native encryption capability, including encrypting online data, encrypting backup images and key management.
dashDB is a newly announced data warehouse as a service deployed in the cloud that leverages technologies like BLU Acceleration, in-database analytics and Cloudant to allow you to focus more on the business and less on the business of IT. In this DB2 Tech Talk you will learn a little more about IBM’s cloud initiatives and the value proposition around dashDB as well as...
-dashDB’s architecture and use cases
-pricing and offerings as a service
-competitive differentiations and customer feedback
DB2's most recent release added significant new capabilities for both transaction and analytic processing. This presentation will take you through the new features and what you need to know. Attend this DB2 Tech Talk to learn about:
• BLU Shadow Tables
• Expanded Oracle SQL compatibility for data marts
• Enhanced SAP Business Warehouse support
• Standard TCP/IP sockets for pureScale
• Additional capabilities
In-memory, columnar, faster analytics, high availability and scalability. There is a lot happening in the data management space these days. In this presentation, we will go through some of the key capabilities in the newest release of DB2, including BLU Shadow Tables, standard TCP/IP sockets for pureScale, and other enhancements. We'll also talk through some of the latest competitive announcements and how DB2 stacks up to these offerings.
Soloman Barghouthi (presenter), Rick Swagerman (host)
In a 3-tier architecture, an application server is considered tier 2 and a database is considered tier 3. IBM offers a well-known composable, secure and highly scalable application server called WebSphere Application Server. IBM also offers multiple databases, one of which is DB2. This session will explain some of the key integration features between the two products and why this combination helps you create a secure, scalable and highly reliable solution for your middleware and backend database. Topics covered in this session are:
-- Security: Trusted connection support
-- Scalability: Heterogeneous pooling support and Lock sharing
-- Resiliency: Automatic client reroute and failover enhancements
-- and more
In this session from the IDUG 2014 Technical Conference in North America, Randy Ebersole highlights some of the new features of DB2 11. He introduces some of the new functionality to get you thinking about moving to DB2 11!
Guersad Kuecuek (presenter) and Rick Swagerman (host)
Analytics are the hot business need because the competitive advantage goes to the organization that is first to discover and capitalize on business insight. Data is at the root of this, but it is analytics that turn that data into value. For many years DB2 has been used with OLAP and OLTP systems by SAP customers worldwide to deliver a cost-effective database engine for SAP applications.
Now, with DB2 10.5 with BLU Acceleration, you can use this next-generation in-memory technology to deliver insights from data quickly and simply. DB2 with BLU Acceleration is an in-memory technology that makes analytical workloads many times faster right out of the box, while requiring dramatically less storage and nearly eliminating the need for tuning.
Although BLU Acceleration is in-memory optimized, it is not main memory-limited. DB2 BLU is highly optimized for accessing data in RAM, but performance won’t suffer as data size grows beyond RAM. These benefits are achieved through next generation columnar processing, operating on compressed data, carefully exploiting modern microprocessor designs, and accessing memory efficiently. The result is a system that simultaneously looks and feels like DB2 while being in-memory optimized, CPU-optimized and I/O-optimized.
SAP-DB2 customers have seen tremendous performance improvements on the same hardware. Because of this, customers can reuse their existing IT infrastructure and simply upgrade to the newer DB2 10.5 release, converting row-store tables to column-store tables.
Join DB2 - SAP expert Guersad Kuecuek for a discussion of the technical features of this solution including how to use BLU, how to derive performance improvement and how to do the sizing for BLU.
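For readers who want a concrete picture of the row-to-column conversion mentioned above, here is a minimal sketch (not material from the session) using the ibm_db Python driver. The connection string and table are invented for illustration; it assumes a DB2 10.5 database configured for analytics (for example, created with the DB2_WORKLOAD=ANALYTICS registry setting), and for converting existing row-organized tables the documented db2convert utility is the usual path.

```python
import ibm_db

# Hypothetical connection string; substitute your own database, host and credentials.
conn = ibm_db.connect(
    "DATABASE=SAMPLE;HOSTNAME=localhost;PORT=50000;PROTOCOL=TCPIP;"
    "UID=db2inst1;PWD=secret", "", "")

# In DB2 10.5 with BLU Acceleration, ORGANIZE BY COLUMN creates a
# column-organized (BLU) table instead of a traditional row-organized one.
ibm_db.exec_immediate(conn, """
    CREATE TABLE sales_fact (
        sale_date DATE,
        store_id  INTEGER,
        amount    DECIMAL(12,2)
    ) ORGANIZE BY COLUMN
""")

ibm_db.close(conn)
```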
Judy Ruby-Brown gives a thorough overview of system-level backup and using FlashCopy for disaster recovery as provided by various vendors. This presentation is thorough yet easy to understand, even for a beginner. Judy explains the types of services that need to be set up for disaster recovery, as well as straightforward scenarios for the DBA in implementing disaster recovery. The summary at the end alone is worth passing on to the system programmers in your shop as a checklist for implementation.
Jef Treece and Shawn Moe, IBM Labs; with Rick Swagerman (host)
The "Internet of Things" is the growing number of devices and sensors that communicate via the Internet, offering vast new opportunities. Harnessing data from billions of connected devices lies in the ability to store, access and query SQL and NoSQL data together, seamlessly. The result of all this data is the need for fast development of web and mobile applications and speed-of-thought insight for fast business decisions.
Learn how to use Informix and DB2 to bridge the gap between new computing technologies designed for big data, cloud, and mobile computing and the enterprise world of relational data in the "Internet of Things" era. Jef Treece and Shawn Moe will present how to develop mobile applications using techniques that preserve the native look and feel of platforms such as Android and iOS. They will also show device-agnostic techniques and discuss new approaches to handling a range of mobile device types. They will discuss how to keep data on the device in sync with a back-end database or cloud service, such as DB2 or Informix on Bluemix, and finally how to use an embedded database to achieve a truly scalable and reliable Internet of Things architecture.
In this comprehensive overview of DB2 10 for z/OS Lori Ann Galluzzo goes through the features of DB2 10 in great detail. Each feature is discussed, along with how and when to take advantage of the feature, as well as some pitfalls and things to watch out for.
Les King, Kelly Schlamb, Vladimir Stojanovski (presenters) and Rick Swagerman (host)
Today, in order to remain competitive, lines of business are under tremendous pressure to analyze more aspects of business data and to do so very quickly. Control is shifting to the business analysts so they can analyze what they need to, NOW!
Big Data requirements are also emerging, even as everyone is being asked to accomplish more with less. Join us to learn how the combination of DB2 10.5 with BLU Acceleration and Cognos BI 10.2 can help bring sanity to satisfying these ever increasing demands.
We will provide an overview and technical deep dive of DB2 10.5 with BLU Acceleration, Cognos BI 10.2 and Dynamic Cubes. Join experts from the DB2 and the Cognos BI teams who will also explain how these two offerings fit together and the benefits they have provided to clients already. See the value that DB2 with BLU Acceleration can bring to Cognos BI, and ask these experts your questions, so you can understand how this solution helps solve business challenges.
Conference Chairman Bob Vargo gives attendees a detailed overview of the conference, seminars, keynotes, and the technical sessions that will be featured at this year's North American conference. Viewers are introduced to the schedule and speakers, and are given tips and insights into the conference.
Vladimir Stojanovski, Predictive & Business Intelligence Segment Leader
Watch a demonstration of IBM Cognos Business Intelligence running on a DB2 BLU Acceleration in-memory database.
The demo compares, side-by-side, the performance of a dashboard sitting on top of a traditional row-based data warehouse with indexes, versus a dashboard sitting on top of the same database, with BLU Acceleration.
BLU Acceleration is an innovation from IBM Research and Development labs that delivers a new generation of in-memory database technology for "speed of thought" analytics. Coupled with Cognos Dynamic Cubes, it results in sub-second query response times. See it for yourself!
SQL is a powerful language that you will learn to use to greater effect in this talk! Join us to explore some of the less well-known features of SQL and see how they can help in some practical situations. We will also look at some common mistakes and misconceptions and discuss ways of avoiding them. Among the topics (a brief illustrative sketch follows the list):
- tricks with date and time values
- use of common table expressions and recursive queries
- use of data change tables
- use of nested tables
- use of the MERGE statement
- and more.
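As the brief sketch promised above, here is a minimal example, not taken from the talk, of two of the listed features run through the ibm_db Python driver: a recursive common table expression and a data change table reference. The connection string is invented, and ORDERS is a hypothetical table whose ORDER_ID column is a generated identity.

```python
import ibm_db

# Hypothetical connection string; adjust for your environment.
conn = ibm_db.connect(
    "DATABASE=SAMPLE;HOSTNAME=localhost;PORT=50000;PROTOCOL=TCPIP;"
    "UID=db2inst1;PWD=secret", "", "")

# Recursive common table expression: generate every date in January 2014.
stmt = ibm_db.exec_immediate(conn, """
    WITH dates (d) AS (
        SELECT DATE('2014-01-01') FROM SYSIBM.SYSDUMMY1
        UNION ALL
        SELECT d + 1 DAY FROM dates WHERE d < DATE('2014-01-31')
    )
    SELECT d FROM dates
""")
row = ibm_db.fetch_tuple(stmt)
while row:
    print(row[0])
    row = ibm_db.fetch_tuple(stmt)

# Data change table: insert a row and read back its generated key in one statement.
stmt = ibm_db.exec_immediate(conn, """
    SELECT order_id FROM NEW TABLE (
        INSERT INTO orders (customer_id, amount) VALUES (42, 99.95)
    )
""")
print("new order id:", ibm_db.fetch_tuple(stmt)[0])

ibm_db.close(conn)
```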
Nick Ivanov presents this talk based on his consulting experience helping many customers maximize their use of IBM DB2 and other products, move applications to DB2, and tune database and query performance.
Although the DBA’s tuning efforts often reduce both CPU consumption and end-user response times, the C-level perception of DBA activities is sometimes more along the lines of "why didn't they do that before?".
This presentation helps you to better understand the C-level perspective and shows you how we managed to improve the image of the DBA team from a reactive team of techies to a proactive team that is directly involved in improving the business results.
Sriram Padmanabhan (presenter) and Rick Swagerman (host)
IBM InfoSphere Information Server is a market-leading data integration and governance platform. It provides a set of capabilities to build confidence in your data by helping you understand, cleanse, transform and deliver trusted information to critical business initiatives, such as big data, master data management and point-of-impact analytics. Information Server provides comprehensive connectivity to DB2 (including DPF and BLU Acceleration) and it can also push computation to DB2 when appropriate. By using Information Server with DB2, you get its best-of-breed capabilities to integrate DB2 data and manage its metadata as well.
Join us for this DB2 Tech Talk to learn how using these products together can help you deliver trusted information to your business, support big data initiatives, and more.
In March 2013 the IDUG DB2 11 Editorial Committee was formed, comprised of volunteers from IDUG’s worldwide community of DB2 users and consultants. Working alongside IBM's formal Early Support Program (ESP), the Editorial Committee gained valuable insight into exactly what makes the new release tick. The White Paper, available on the IDUG web site, contains their findings, including practical experiences and independent evaluations of the new features.
DB2 11 contains a large number of enhancements, but some will be of more interest than others depending on your job role and background. The paper has been organized according to the features likely to be most applicable to each of the three major technical roles in a DB2 for z/OS environment: Systems Programmers, Database Administrators and Application Developers.
This presentation from the IDUG EMEA 2013 Technical Conference looks at some of the key findings presented in the paper.
IBM InfoSphere Optim Workload Replay can significantly improve your database testing experience and help you successfully manage change in your database environments, whether you are adopting BLU Acceleration, upgrading DB2, or making the move to DB2. Using InfoSphere Workload Replay, you can build realistic testing environments to help assess a wide range of database changes before production deployment and without extensive script creation or application setup. This makes the change process faster and less risky.
In this session, we will examine the use of InfoSphere Workload Replay to facilitate database testing when validating version, platform or infrastructure changes. We will discuss how to use the tool to prepare for BLU Acceleration, and review some of the key use cases. Learn the testing methodology and the various features for controlling capture and replay and for generating reports. Using this realistic testing approach, you will find key issues sooner in the test cycle and resolve them before deployment day.
Anson Kokkat (presenter), Tony Leung (presenter), Rick Swagerman (host)
Stored procedures and user-defined functions are a great way to consolidate database logic and improve database performance, code reuse, security and integrity. Stored procedure development has always been an integral part of DB2, and in this DB2 Tech Talk, we will look at how IBM Data Studio helps accelerate the development and debugging process.
IBM Data Studio is a comprehensive database development and administration environment. It includes the tools needed to develop and deploy Stored Procedures. We will look at the various wizards and options available to create a SQL Stored Procedure and walk through the debugging of that stored procedure in a DB2 environment.
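As a small, hedged illustration of the kind of SQL stored procedure such a session builds and debugs, here is a sketch using the ibm_db Python driver rather than Data Studio itself; the procedure body, names and connection details are invented for this example.

```python
import ibm_db

# Hypothetical connection string; adjust for your environment.
conn = ibm_db.connect(
    "DATABASE=SAMPLE;HOSTNAME=localhost;PORT=50000;PROTOCOL=TCPIP;"
    "UID=db2inst1;PWD=secret", "", "")

# A deliberately simple SQL procedure: compute a 10% bonus for a given salary.
ibm_db.exec_immediate(conn, """
    CREATE OR REPLACE PROCEDURE get_bonus
        (IN p_salary DECIMAL(11,2), OUT p_bonus DECIMAL(11,2))
    LANGUAGE SQL
    BEGIN
        SET p_bonus = p_salary * 0.10;
    END
""")

# callproc returns the statement resource followed by copies of the parameters,
# so the OUT value is the last element of the returned tuple.
result = ibm_db.callproc(conn, "GET_BONUS", (50000, 0))
print("Bonus:", result[-1])

ibm_db.close(conn)
```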
Joint DB2 Tech Talk series sponsored by IDUG with IBM
This one-stop webcast channel makes it fast and easy to build and extend your technical skills on the IBM DB2 product family. IDUG has joined forces with IBM to bring together a wide array of learning sessions on technical product features, related products, new product developments and more.
Join this webinar to be certain of making the right decisions on moving resources to the cloud. You’ll see how to evaluate which workloads are candidates for cloud migration PLUS measure how efficiently you’re utilizing your own resources.
The CloudPhysics Cost Calculator for Private Cloud lets you apply basic costing models to determine your actual costs per virtual machine (VM) in terms of power, compute resources, memory, storage, licensing, and more to generate a cost baseline.
Now you can apply CloudPhysics rightsizing intelligence to your VMs. See your “as is” costs beside your rightsized costs at peak, 99th percentile, and 95th percentile. Capture savings by reducing workloads to match actual demands and reduce overprovisioning.
When mapping your VMs to their public cloud instances, apply the same peak, 99th percentile, and 95th percentile data to reveal cost difference for private versus public cloud.
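As a rough illustration of the rightsizing arithmetic described above (this is not the CloudPhysics costing model; the demand samples and the per-vCPU cost are invented), a sketch comparing "as is" sizing with peak, 99th percentile and 95th percentile sizing might look like this:

```python
import numpy as np

# Invented hourly CPU-demand samples (in vCPUs) for one VM over a 30-day month.
rng = np.random.default_rng(0)
demand = rng.gamma(shape=2.0, scale=1.5, size=24 * 30)

COST_PER_VCPU_MONTH = 25.0   # illustrative blended cost: power, compute, licensing, ...
PROVISIONED_VCPUS = 8        # what the VM was originally given ("as is")

def monthly_cost(vcpus):
    """Monthly cost of provisioning a VM with the given vCPU count."""
    return vcpus * COST_PER_VCPU_MONTH

sizings = [
    ("as is", PROVISIONED_VCPUS),
    ("peak", np.ceil(demand.max())),
    ("99th percentile", np.ceil(np.percentile(demand, 99))),
    ("95th percentile", np.ceil(np.percentile(demand, 95))),
]
for label, vcpus in sizings:
    print(f"{label:>16}: {vcpus:4.0f} vCPUs -> ${monthly_cost(vcpus):,.2f}/month")
```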
Attend this webinar to be sure you’ve optimized decision-making before you move.
Cybersecurity has jumped to the top of companies’ risk agendas after a number of high-profile data breaches and other hacks. In an increasingly digitized world, where data resides in the cloud, on mobile devices and on the multitude of connected devices enabled by the Internet of Things, threat vectors are multiplying, threatening firms’ operations and future financial stability.
Organizations with the ability to view cybersecurity breaches as a risk, with associated probabilities and impacts, can strike the right balance between resilience and protection. By bringing together leadership and capabilities across fraud, IT, cybersecurity and operational risk, organizations can connect the dots and manage their GRC program more effectively. Organizations need to employ a proactive approach to review their existing risk management processes, roles and responsibilities with respect to cybersecurity to re-align them into an overall ERM strategy with boardroom backing.
Attend this panel webinar as we discuss these issues and address ways to develop an evolving GRC program that can cope with the growing threat landscape.
Join us for the third and final webcast in our series on micro-segmentation, how it protects networks, and how it works with perimeter firewalls. We’ll also discuss its advantages beyond protection in automating security workflows and more.
In this webcast series, we’ve explored the security benefits of micro-segmentation with NSX, notably how it protects data centers inside the perimeter firewall. But did you know that with micro-segmentation, IT can also automate security workflows such as provisioning, moves/adds/changes, threat response, and security policy management? Join us as we discuss:
• How to improve accuracy and gain better overall security in the data center
• Security policy approaches with network virtualization
• How to automate security workflows to gain greater agility
Build a fundamentally more agile, efficient and secure application environment with VMware NSX network virtualization on powerful industry standard infrastructure featuring Intel® Xeon® processors and Intel® Ethernet 10GB/40GB Converged Network Adapters.
Data communication speeds are constantly increasing to keep up with the demand for bandwidth. Ethernet speeds of 100 Gb/s are being deployed, and 400 Gb/s or more are being considered. As speeds increase, the reach of multimode fiber gets shorter. One way to mitigate the shrinking distance is to use the highest-bandwidth fiber. What if we told you that transceivers can help mitigate it as well?
Topics to be discussed include:
- Characteristics of cable, connectivity, and transceivers and how they can maximize network reach and flexibility
- Current trends in Ethernet and Fibre Channel and what is coming in the near future.
Telco Cloud represents an enormous opportunity for communications service providers to transform their business practices. By bringing together the best of telco and cloud tools and technologies, communications service providers can deploy network functions anywhere to provide the best user experience without sacrificing service reliability. In this webinar, we will highlight the technical problems and challenges and offer a variety of solutions for the audience to address performance, availability, security, and manageability and automation as they consider their options for transforming their networks in a Telco Cloud environment.
Over a decade ago, API-accessible key/value-based storage was introduced as a service by Amazon and as software by Caringo. Amazon’s service-based approach matured quickly, while the general storage market slowly adopted the API approach. The on-premises market demanded a traditional file interface, leading to various “gateway” products. However, for the majority of use cases, the logical file gateway architecture is flawed. Is it time for file to evolve with object? This webinar examines the past decade in file interface gateways and the needs of the market versus what is available.
The use of broadband Internet connections in an SD-WAN environment has many benefits; however, for any enterprise, performance and reliability cannot be compromised. An SD-WAN solution must include all the functionality needed to meet these essential requirements, delivering outstanding performance and Quality of Service by:
•Actually improving the quality of the bandwidth you already have, instead of routing around it
•Enabling centralized control and administration of network-wide policies
•Providing detailed visibility into real-time and historical application and network trends
•Allowing for the modular deployment of WAN optimization to ensure performance when you need it, where you need it
This all adds up to an enterprise-grade, performance-centric offering that allows your SD-WAN to rapidly connect users to the applications they need. Deployment times are reduced significantly and enterprises enjoy enhanced performance, visibility and control over the entire network.
Within the financial services industry, middle office analytics and simulations continue to grow in volume and complexity. Massive compute and storage demands cause strain on IT resources. While new technologies promise speed and scalability, evaluating this unique middle office environment requires a look at compliance, risk, and pricing analytics to determine potential gains and losses. In this webinar, IDC – Financial Insights Research Director, Bill Fearnley, looks at current middle office IT workflows supporting analytics, backtesting and financial modeling and evaluates a hybrid cloud infrastructure to support growing demands.
In this webinar, you’ll:
· Hear an IDC Analyst’s view on the current financial services IT environment
· Learn of common challenges and approaches to combat growing strain on compute and storage infrastructure
· Join in a discussion about the viability of enabling cloud services to expand compute and storage capacity
· Gain guidance on how large hedge funds and investment banks are overcoming inherent cloud challenges like latency, data accessibility, and cost management
There has been a great deal of interest in graphene. Some would call it hype. But with its flexibility and heat-conduction properties, this atom-thin layer of carbon, which has been touted as the strongest material ever measured, has enormous product and market potential for the ICT industry.
Because graphene is conductive at nano-scale layers, it can be used for lightweight, flexible yet durable display screens, electric circuits and solar cells. It is also currently being made into inks and 3D printable materials. Imagine what this can mean for the design of communications devices, or circuitry, or batteries. Imagine the impact on wearables, the design and development of IoT sensors, or large scale retail store windows. Graphene holds a great deal of promise.
Despite the potential graphene promises, it has taken longer than expected to transform research and development into commercialized products.
This webcast will explore both the tremendous potential harbored in those structured carbon atoms and the business reality. The focus will be on the use of the material for the ICT industry. We will also look at other use cases that may be the first steps on graphene’s path to commercial application.
- Dr. Stephen Hodge, Research Associate at the Cambridge Graphene Centre, Engineering Department, University of Cambridge
- Anthony Schiavo, Research Associate, Advanced Materials Team, Lux Research, Inc.
- Limor Schafman, Director of Content Development, TIA (Moderator)