DB2 Tech Talk, Part One: Certification Prep for DB2 10 Fundamentals Exam
Validated DB2 skills help you stand out with your employer, meet annual education objectives, and create an edge in the employment marketplace.
IBM is offering the all-new DB2 10.1 Fundamentals certification (Exam 610) to attendees of the Information on Demand global conference in Las Vegas, Nevada, from October 21 – 25, 2012. And if you are not attending the conference, IBM is offering the Take it Again promotion through November 30, 2012.
This two-part Tech Talk can help you prepare for the exam. Part one is presented on Thursday, July 26, from 12:30 to 2:00 PM ET. Part two will be presented on Thursday, August 2, from 12:30 to 2:00 PM ET.
·Basics of the DB2 products and tools, the DB2 pureScale feature, plus data warehousing and OLAP concepts
·Security, including authentication, authorization and privileges, and the new Row and Column Access Control (RCAC) feature
·Working with databases and database objects, including use of DDL statements to create objects
·Accessing data using SQL, querying XML data with XQuery, and using Time Travel Query with temporal tables
·Working with tables, views and indexes, including XML and Oracle compatibility data types and the new temporal tables (a temporal-table sketch follows this list)
·Data concurrency mechanisms
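To make the temporal-table topics concrete, here is a minimal sketch of a system-period temporal table and a Time Travel Query in DB2 10.1 SQL; the table and column names are hypothetical.

  -- System-period temporal table: DB2 maintains the period columns itself
  CREATE TABLE policy (
    id        INT NOT NULL PRIMARY KEY,
    premium   DECIMAL(9,2),
    sys_start TIMESTAMP(12) NOT NULL GENERATED ALWAYS AS ROW BEGIN,
    sys_end   TIMESTAMP(12) NOT NULL GENERATED ALWAYS AS ROW END,
    trans_id  TIMESTAMP(12) GENERATED ALWAYS AS TRANSACTION START ID,
    PERIOD SYSTEM_TIME (sys_start, sys_end)
  );

  -- The history table receives old row versions once versioning is enabled
  CREATE TABLE policy_history LIKE policy;
  ALTER TABLE policy ADD VERSIONING USE HISTORY TABLE policy_history;

  -- Time Travel Query: the row as it existed at a past point in time
  SELECT id, premium
    FROM policy FOR SYSTEM_TIME AS OF TIMESTAMP '2012-01-01 00:00:00'
    WHERE id = 42;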
This talk is hosted by Serge Rielau, SQL architect for DB2 for LUW, with presentations by experienced DB2 instructor Andre Albuquerque.
The start time is 12:30 PM ET; note that the player may display it in your local time zone.
In the traditional world of the EDW, ETL pipelines are a troublesome bottleneck when preparing data for use in the data warehouse. They are notoriously expensive and brittle, so as companies move to Hadoop, they look forward to getting rid of their ETL infrastructure.
But is it that simple? Some companies are finding that in order to move data between clusters for backup or aggregation purposes, they are building systems that look an awful lot like ETL.
Join us for a live webinar featuring Dr. Jim Metzler, Distinguished Research Fellow at Ashton Metzler & Associates, as he discusses the future of the WAN and why now is the time to rethink its architecture in order to evolve and accelerate your business.
Why take the traditional approach to branch office expansion, relying on providers that can’t meet the urgency of your business needs? A new era has come in which complexity, cost, flexibility and time-to-market are no longer hurdles. Welcome to the world of the Software-Defined WAN.
Join this webinar featuring Jim Metzler, industry specialist, to discover:
•Why it’s time to re-architect your WAN
•How today’s WAN infrastructure is failing IT
•New market dynamics that the traditional WAN cannot support
•A step-by-step approach to evolving your WAN with minimal disruption
For the Data Scientist, Data Science is complex; for the average business user it is a mystical art form that promises a lot but often underdelivers against expectations. For many established companies, the result has been a lack of investment in an area that is, for others, quickly becoming a source of competitive advantage.
Helping the business understand the value of Big Data and Analytics, whilst also helping translate their business requirements and expectations, is a critical foundational step of the Data Analytics Lifecycle that can lead to greater investment from the business and greater profit for the organization. By way of customer examples, this presentation discusses the importance of engaging the business early and the importance of being able to tell an engaging story about the ‘Art of the Possible’.
New video! Manage your vSphere environment with a powerful solution that leverages the cloud, always-on analytics, and data science to optimize visibility, insight and control. Five minutes to activate, and you can increase utilization, boost performance, diagnose, troubleshoot and fix issues—even proactively—ten times faster. Eliminate capacity shortfalls and drill down to root causes, hazards and more.
Join us for a live Big Data Analytics customer case study webcast featuring Dana Gardner, a leading IT industry analyst at Interarbor Solutions, as he interviews Procera Networks executive, Cam Cullen.
Learn how Procera Networks dealt with massive data volume challenges to provide network performance benefits to its global users, powered by HPE Vertica. HPE Vertica is the industry’s first comprehensive, scalable, open, and secure platform for Big Data Analytics.
Financial advisors espouse that proper asset allocation during times of market volatility can help us sleep better. There is a parallel in the IT world. With volatility driven by technology advancement, virtual & cloud environments, and consumer demand for the newest applications and hardware, a good night’s sleep for an Asset Manager requires properly managed and optimally allocated hardware and software assets in a constantly changing environment.
This session explores this intimidating world, common pitfalls, prescriptive actions and what the latest technology can do to make sure your assets, licenses and infrastructure are optimally aligned to drive wealth in IT…and let the Asset Manager sleep well without the fear of negative audit findings and exorbitant fines.
A movement is underway. Businesses are awakening to a new era of the digital enterprise, which requires them to find new ways of delivering services built for the digital age. Success in this new era requires a digital industrialization strategy in which datacenters become the core asset of the business, enabling a transformation from an infrastructure that is tightly coupled with the business to a modern infrastructure that enables any business.
Digital industrialization is a continuous cycle that organizations can use to turn IT infrastructure from a cost into an asset: standardizing on one set of technologies and economics across facilities, hardware, software, and operations; consolidating datacenters; abstracting functionality; automating operations; and governing it all to ensure security, integrity, and compliance.
This presentation will go through why digital industrialization is needed, what its benefits are, and how the Ericsson Cloud portfolio facilitates it.
Would you like to cut complexity across all phases of app development and deployment?
Join us for this straightforward discussion of how CA Application Lifecycle Conductor reduces risk through a single source of truth. CA Application Lifecycle Conductor automates and manages the software development lifecycles that span mobile-to-mainframe environments — from the initial service desk ticket to the deployment of the application in production.
Join Rose Sakach, Sr. Principal Product Manager, and Vaughn Marshall, Director, Product Management, as they outline CA Application Lifecycle Conductor’s many benefits. Discover how you can:
• Create one view and traceability for the application development lifecycle
• Identify the potential time savings for project managers, release managers and compliance managers
• Determine which customer segments would benefit the most from adopting CA ALC
Are you ready to simplify application lifecycle management—from mobile to mainframe?
With the average company experiencing unplanned downtime 13 times a year, the costs of continuing to invest in a legacy backup solution can be extensive. For this reason, more customers than ever are switching to Veeam® and Quantum. Modernize your data center and achieve Availability for the Always-On Enterprise™ with Veeam coupled with Quantum’s tiered storage, which increases performance, reduces bandwidth requirements and executes best practices for data protection.
After a record-setting year in 2015, where will the tech M&A market go in 2016? Which trends that pushed M&A spending to its highest level since the Internet Bubble burst will continue to drive deals, and which will wind down? Which other sectors are likely to see the most activity this year? And most importantly, what valuations will be handed out in deals over the coming year? Drawing on data and views from across 451 Research, the Tech M&A Outlook webinar maps many of the major developments in the IT landscape (IoT, Big Data, cloud computing) to their influence on corporate acquisition strategies. Join us for a look ahead at what we expect for tech M&A in 2016.
In this session, we will review foundational cryptographic concepts and discuss the key highlights of the new DB2 native encryption capability, including encrypting online data, encrypting backup images, and managing encryption keys.
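As a taste of what the session covers, here is a minimal command-line sketch of enabling native encryption; the keystore path and database name are hypothetical, and exact options vary by platform and fix pack.

  # Point the instance at a local PKCS#12 keystore (path is hypothetical)
  db2 update dbm cfg using keystore_type pkcs12 keystore_location /home/db2inst1/keystore.p12

  # Create a database whose data and logs are encrypted with AES-256;
  # backups of a natively encrypted database are encrypted by default
  db2 "create database mydb encrypt cipher aes key length 256"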
dashDB is a newly announced data warehouse as a service, deployed in the cloud, that leverages technologies like BLU Acceleration, in-database analytics and Cloudant to let you focus more on the business and less on the business of IT. In this DB2 Tech Talk you will learn a little more about IBM’s cloud initiatives and the value proposition around dashDB, as well as:
-dashDB’s architecture and use cases
-pricing and offerings as a service
-competitive differentiations and customer feedback
DB2's most recent release added significant new capabilities for both transaction and analytic processing. This presentation will take you through the new features and what you need to know. Attend this DB2 Tech Talk to learn about:
• BLU Shadow Tables
• Expanded Oracle SQL compatibility for data marts
• Enhanced SAP Business Warehouse support
• Standard TCP/IP sockets for pureScale
• Additional capabilities
In-memory, columnar, faster analytics, high availability and scalability: there is a lot happening in the data management space these days. In this presentation, we will go through some of the key capabilities in the newest release of DB2, including BLU Shadow Tables (sketched below), standard TCP/IP sockets for pureScale, and other enhancements. We'll also talk through some of the latest competitive announcements and how DB2 stacks up against these offerings.
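For a feel of the shadow tables capability, here is a minimal sketch with hypothetical table names. A shadow table is a column-organized copy of a row-organized table, kept current by replication (InfoSphere CDC, configured separately), to which the optimizer can route analytic queries.

  -- Row-organized OLTP table (hypothetical)
  CREATE TABLE sales (
    id     INT NOT NULL PRIMARY KEY,
    region VARCHAR(20),
    amount DECIMAL(12,2)
  ) ORGANIZE BY ROW;

  -- Its column-organized shadow, maintained by replication
  CREATE TABLE sales_shadow AS (SELECT * FROM sales)
    DATA INITIALLY DEFERRED REFRESH DEFERRED
    ENABLE QUERY OPTIMIZATION
    MAINTAINED BY REPLICATION
    ORGANIZE BY COLUMN;

  -- Let the optimizer consider the shadow for this session's queries
  SET CURRENT REFRESH AGE ANY;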
Soloman Barghouthi (presenter), Rick Swagerman (host)
In a three-tier architecture, an application server is tier 2 and a database is tier 3. IBM offers a well-known composable, secure and highly scalable application server called WebSphere Application Server. IBM also offers multiple databases, one of which is DB2. This session will explain some of the key integration features between the two products and why this combination helps you create a secure, scalable and highly reliable solution for your middleware and backend database. Topics covered in this session are:
-- Security: Trusted connection support (sketched after this list)
-- Scalability: Heterogeneous pooling support and lock sharing
-- Resiliency: Automatic client reroute and failover enhancements
-- and more
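As a hedged sketch of the trusted connection support above (the context name, authorization IDs and address are hypothetical), the DB2 side of the setup looks roughly like this:

  -- Let the application server's service ID open a trusted connection
  -- from a known host and switch to end-user identities on it
  CREATE TRUSTED CONTEXT was_ctx
    BASED UPON CONNECTION USING SYSTEM AUTHID wasadmin
    ATTRIBUTES (ADDRESS '192.0.2.10')
    ENABLE
    WITH USE FOR alice, bob WITHOUT AUTHENTICATION;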
In this session from the IDUG 2014 Technical Conference in North America, Randy Ebersole highlights some of the new features of DB2 11. He introduces some of the new functionality to get you thinking about moving to DB2 11!
Guersad Kuecuek (presenter) and Rick Swagerman (host)
Analytics are the hot business need because competitive advantage goes to the organization that is first to discover and capitalize on business insight. Data is at the root of this, but it is analytics that give data its value. For many years, DB2 has been used in OLAP and OLTP systems by SAP customers worldwide to deliver a cost-effective database engine for SAP applications.
Now, with DB2 10.5 with BLU Acceleration, you can use this next-generation in-memory technology to deliver insights from data quickly and simply. DB2 with BLU Acceleration is an in-memory technology that makes analytical jobs many times faster right out of the box, while requiring dramatically less storage and nearly eliminating the need for tuning.
Although BLU Acceleration is in-memory optimized, it is not limited by main memory. DB2 BLU is highly optimized for accessing data in RAM, but performance won’t suffer as data size grows beyond RAM. These benefits are achieved through next-generation columnar processing, operating on compressed data, carefully exploiting modern microprocessor designs, and accessing memory efficiently. The result is a system that looks and feels like DB2 while being in-memory optimized, CPU-optimized and I/O-optimized.
SAP-on-DB2 customers have seen tremendous performance improvements on the same hardware. Because of this, customers can reuse their existing IT infrastructure and simply upgrade to DB2 10.5, converting row-store tables to column-store tables.
Join DB2-SAP expert Guersad Kuecuek for a discussion of the technical features of this solution, including how to use BLU Acceleration, how to achieve the performance improvements, and how to size a system for BLU.
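As a hedged illustration of that upgrade path (the database, schema and table names are hypothetical):

  # Tune the instance for analytic workloads before creating BLU tables
  db2set DB2_WORKLOAD=ANALYTICS

  # New tables can be created column-organized explicitly
  db2 "CREATE TABLE sap_sales (id INT NOT NULL, amount DECIMAL(12,2)) ORGANIZE BY COLUMN"

  # Convert an existing row-organized table to column-organized
  db2convert -d SAPDB -z SAPSCHEMA -t SALES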
Judy Ruby-Brown gives a thorough overview of system-level backup and the use of FlashCopy for disaster recovery as provided by various vendors. The presentation is thorough yet easy to understand, even for a beginner. Judy explains the types of services that need to be set up for disaster recovery, as well as straightforward scenarios for the DBA implementing it. The summary at the end alone is worth passing on to the system programmers in your shop as an implementation checklist.
Jef Treece and Shawn Moe, IBM Labs; with Rick Swagerman (host)
The "Internet of Things" is the growing number of devices and sensors that communicate via the Internet, offering vast new opportunities. Harnessing data from billions of connected devices lies in the ability to store, access and query SQL and NoSQL data together, seamlessly. The result of all this data is the need for fast development of web and mobile applications and speed-of-thought insight for fast business decisions.
Learn how to use Informix and DB2 to bridge the gap between new computing technologies designed for big data, cloud, and mobile computing and the enterprise world of relational data in the "Internet of Things" era. Jef Treece and Shawn Moe will show how to develop mobile applications using techniques that preserve the native look and feel of devices such as Android and iOS, demonstrate device-agnostic techniques, and discuss new approaches to handling a range of mobile devices. They will also cover how to keep data on the device in sync with a back-end database or cloud service, such as DB2 or Informix on BlueMix, and, finally, how to use an embedded database to achieve a truly scalable and reliable Internet of Things architecture.
In this comprehensive overview of DB2 10 for z/OS, Lori Ann Galluzzo goes through the features of DB2 10 in great detail. Each feature is discussed, along with how and when to take advantage of it, as well as some pitfalls and things to watch out for.
Les King, Kelly Schlamb, Vladimir Stojanovski (presenters) and Rick Swagerman (host)
Today, in order to remain competitive, lines of business face tremendous demand to analyze more aspects of business data, and to do so very quickly. Control is shifting to the business analysts so they can analyze what they need to, NOW!
Big Data requirements are also emerging, even as everyone is being asked to accomplish more with less. Join us to learn how the combination of DB2 10.5 with BLU Acceleration and Cognos BI 10.2 can help bring sanity to satisfying these ever-increasing demands.
We will provide an overview and technical deep dive of DB2 10.5 with BLU Acceleration, Cognos BI 10.2 and Dynamic Cubes. Join experts from the DB2 and Cognos BI teams, who will also explain how these two offerings fit together and the benefits they have already provided to clients. See the value that DB2 with BLU Acceleration can bring to Cognos BI, and ask these experts your questions so you can understand how this solution helps solve business challenges.
Conference Chairman Bob Vargo gives attendees a detailed overview of the conference, seminars, keynotes, and the technical sessions that will be featured at this year's North American conference. Viewers are introduced to the schedule and speakers, and are given tips and insights into the conference.
Vladimir Stojanovski, Predictive & Business Intelligence Segment Leader
Watch a demonstration of IBM Cognos Business Intelligence running on a DB2 BLU Acceleration in-memory database.
The demo compares, side-by-side, the performance of a dashboard sitting on top of a traditional row-based data warehouse with indexes, versus a dashboard sitting on top of the same database, with BLU Acceleration.
BLU Acceleration is an innovation from IBM Research and Development labs that delivers a new generation of in-memory database technology for "speed of thought" analytics. Coupled with Cognos Dynamic Cubes, it results in sub-second query response times. See it for yourself!
SQL is a powerful language, and this talk will help you use more of it! Join us to explore some of the less well-known features of SQL and see how they can help in practical situations. We will also look at some common mistakes and misconceptions and discuss ways of avoiding them. Among the topics (a few are sketched after the list):
- tricks with date and time values
- use of common table expressions and recursive queries
- use of data change tables
- use of nested tables
- use of the MERGE statement
- and more.
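To give a flavor of a few of these topics, here are minimal DB2 SQL sketches against hypothetical tables:

  -- Hypothetical tables used below
  CREATE TABLE orders (id INT NOT NULL GENERATED ALWAYS AS IDENTITY, item VARCHAR(30));
  CREATE TABLE inventory (item VARCHAR(30) NOT NULL PRIMARY KEY, qty INT NOT NULL);

  -- Recursive common table expression: generate the numbers 1 through 10
  WITH n(i) AS (
    VALUES (1)
    UNION ALL
    SELECT i + 1 FROM n WHERE i < 10
  )
  SELECT i FROM n;

  -- Data change table: read back the generated key of a just-inserted row
  SELECT id FROM NEW TABLE (INSERT INTO orders (item) VALUES ('widget'));

  -- MERGE: insert-or-update ("upsert") in a single statement
  MERGE INTO inventory AS t
  USING (VALUES ('widget', 5)) AS s (item, qty)
    ON t.item = s.item
  WHEN MATCHED THEN UPDATE SET t.qty = t.qty + s.qty
  WHEN NOT MATCHED THEN INSERT (item, qty) VALUES (s.item, s.qty);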
Nick Ivanov presents this talk based on his consulting experience helping many customers maximize their use of IBM DB2 and other products, move applications to DB2, and tune database and query performance.
Although the DBA’s tuning efforts often result in a decrease in both CPU consumption and end-user response times, the C-level perception of DBA activities is sometimes more along the lines of “why didn't they do that before?”.
This presentation helps you to better understand the C-level perspective and shows you how we managed to improve the image of the DBA team from a reactive team of techies to a proactive team that is directly involved in improving the business results.
Sriram Padmanabhan (presenter) and Rick Swagerman (host)
IBM InfoSphere Information Server is a market-leading data integration and governance platform. It provides a set of capabilities to build confidence in your data by helping you understand, cleanse, transform and deliver trusted information to critical business initiatives, such as big data, master data management and point-of-impact analytics. Information Server provides comprehensive connectivity to DB2 (including DPF and BLU Acceleration) and can also push computation down to DB2 when appropriate. By using Information Server with DB2, you get best-of-breed capabilities to integrate DB2 data and manage its metadata as well.
Join us for this DB2 Tech Talk to learn how using these products together can help you deliver trusted information to your business, support big data initiatives, and more.
In March 2013 the IDUG DB2 11 Editorial Committee was formed, composed of volunteers from IDUG’s worldwide community of DB2 users and consultants. Working alongside IBM's formal Early Support Program (ESP), the Editorial Committee gained valuable insight into exactly what makes the new release tick. The White Paper, available on the IDUG web site, contains their findings, including practical experiences and independent evaluations of the new features.
DB2 11 contains a large number of enhancements, but some will be of more interest than others depending on your job role and background. The paper has been organized according to the features likely to be most applicable to each of the three major technical roles in a DB2 for z/OS environment: Systems Programmers, Database Administrators and Application Developers.
This presentation from the IDUG EMEA 2013 Technical Conference looks at some of the key findings presented in the paper.
IBM InfoSphere Optim Workload Replay can significantly improve your database testing experience and help you successfully manage change in your database environments, whether you are adopting BLU Acceleration, upgrading DB2, or making the move to DB2. Using InfoSphere Workload Replay, you can build realistic testing environments to help assess a wide range of database changes before production deployment and without extensive script creation or application setup. This makes the change process faster and less risky.
In this session, we will examine the use of InfoSphere Workload Replay to facilitate database testing that validates version, platform or infrastructure changes. We will discuss how to use the tool to prepare for BLU Acceleration and review some of the key use cases. Learn the testing methodology and the various features for controlling capture and replay and for generating reports. Using this realistic testing approach, you will find key issues sooner in the test cycle and resolve them before deployment day.
Anson Kokkat (presenter), Tony Leung (presenter), Rick Swagerman (host)
Stored procedures and user-defined functions are a great way to consolidate database logic and improve database performance, code reuse, security and integrity. Stored procedure development has always been an integral part of DB2, and in this DB2 Tech Talk we will look at how IBM Data Studio helps accelerate the development and debugging process.
IBM Data Studio is a comprehensive database development and administration environment. It includes the tools needed to develop and deploy stored procedures. We will look at the various wizards and options available to create an SQL stored procedure and walk through debugging that stored procedure in a DB2 environment; a minimal example follows.
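As a minimal, hypothetical example of the kind of SQL procedure you might create and debug with those wizards:

  -- A simple SQL PL procedure; assumes a hypothetical orders table
  CREATE OR REPLACE PROCEDURE get_order_count (
    IN  p_customer_id INT,
    OUT p_count       INT)
  LANGUAGE SQL
  BEGIN
    SELECT COUNT(*) INTO p_count
      FROM orders
      WHERE customer_id = p_customer_id;
  END

  -- Invoke it from the CLP or Data Studio:
  -- CALL get_order_count(42, ?)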
Joint DB2 Tech Talk series sponsored by IDUG with IBM
This one-stop webcast channel makes it fast and easy to build and extend your technical skills on the IBM DB2 product family. IDUG has joined forces with IBM to bring together a wide array of learning sessions on technical product features, related products, new product developments and more.