DB2 Tech Talk: Part Two Certification Prep for DB2 10 Fundamentals Exam
Validated DB2 skills help you stand out with your employer, meet annual education objectives, and create an edge in the employment marketplace.
IBM is offering the all-new DB2 10.1 Fundamentals certification (exam 610) to attendees of the Information on Demand global conference in Las Vegas, Nevada, October 21-25, 2012. If you are not attending the conference, IBM is offering the Take It Again promotion through November 30, 2012.
This two-part Tech Talk can help you prepare for the exam. Part one was presented on Thursday, July 26, from 12:30 to 2:00 PM ET. Part two will be presented on Thursday, August 2, from 12:30 to 2:00 PM ET.
· Basics of the DB2 products and tools, the DB2 pureScale feature, plus data warehousing and OLAP concepts
· Security, including authentication, authorization and privileges, and the new Row and Column Access Control (RCAC) feature
· Working with databases and database objects, including use of DDL statements to create objects
· Accessing data using SQL, querying XML data with XQuery, and using Time Travel Query with temporal tables
· Working with tables, views and indexes, including XML and Oracle compatibility data types and the new temporal tables
· Data concurrency mechanisms
This talk is hosted by Serge Rielau, SQL architect for DB2 for LUW, with presentations by experienced DB2 instructor Andre Albuquerque.
The talk begins at 12:30 PM ET; note that the event page may display the time in your local time zone.
Data professionals tend to see Hadoop as an extension of the data warehouse architecture rather than a replacement; even so, it can reduce the load on expensive data warehouses by moving some data and processing to Hadoop. The big data framework has been extended beyond the warehouse to incorporate operational use cases such as 360-degree customer insight, real-time offers, monetisation, and data archival. Generating value from big data requires the right tools to move and prepare data so that new insights can be discovered effectively. To operationalize those insights, new data must integrate securely with existing data, infrastructure, applications, and processes.
In this webinar you will see how Oracle and Hortonworks have made it possible for you to accelerate your big data integration without having to learn MapReduce, Spark, Pig or Oozie code. In fact, Oracle is the only vendor that can automatically generate Spark, HiveQL and Pig transformations from a single mapping, which allows customers to focus on business value and the overall architecture rather than multiple programming languages.
The adoption of performance management methodologies across all types of businesses continues, but the expected step-change in organisational performance still often proves elusive.
Join Information Builders and Phil Jones, a thought leader in performance management and business strategy, as we identify some of the key reasons for such mixed results.
In this 30-minute webinar, we will explain the importance of connecting common performance management practices with decision-making processes within organisations to deliver a significant improvement in operational performance.
Attend this webinar to learn:
- How decisions really get made in organisations
- Where traditional performance management thinking can go wrong
- How to make sense of the maelstrom of KPIs, measures and useful information
- Best practices for taking and acting on decisions to improve performance
- How to really learn from decision making and performance management
Found nothing in the product range? Your sales can still grow: the secret of the customer journey... or what you always wanted to know about your products and your customers' behaviour.
Daniel Wrigley (SHI) and Andreas Leichtle (Hortonworks) will present a case study on the topic and show a live demo with informative dashboards.
What it is about: every customer leaves digital traces everywhere, and in context with other data, including data from other systems, these traces yield new insights that will boost business success.
We consider the starting situation of an e-commerce shop:
1. Rising traffic in the shop is being registered (so the marketing activities are working)
2. Revenue remains flat (the expectation was actually rising revenue)
3. And the conversion rate is even falling
These questions now need answers:
1. How many campaigns were run for which product?
2. What are my top sellers?
3. Are my top sellers profitable?
4. What influences my profit?
5. Is there customer feedback?
6. How do users find their way to products, and how are the products found?
The complex interrelationships are shown:
The wealth of data and information, unstructured or structured, provides the answers. In a live demo, illustrated with dashboards, you will see how these questions are answered: why this situation arose and which measures must be taken to correct it so that the original goals can still be reached.
The analysis of this case will identify four actions that must be implemented to reach the solution.
Join guest speaker Holger Kisker Ph.D. as he discusses what companies need today: a flexible data management architecture to cope with both traditional and emerging sources of data (in any structure), advanced data analytics to extract deeper business insights, and efficient ways to deliver these insights as information or data services for better business decisions. All of this is embedded in an efficient data virtualization layer that makes all data available when, where, and in whatever format it is needed.
Watch a short demo on the technology behind search-driven analytics, the future of Business Intelligence. You will see how ThoughtSpot allows anyone - with zero training - to get answers from their data and make smarter decisions.
This webinar will focus on the What, Why and How of Sales Performance Management (SPM) and Incentive Compensation Management (ICM) automation.
Learn What SPM can do for your organisation, its benefits, and real-life stories of how other organisations are thriving in a new, more automated world. Based on research, you will also see Why organisations embark on the journey to automation. Finally, a demonstration will show How an automated solution can enhance sales performance, reduce administrative headaches, and let your organisation move from a day-to-day survival mode of calculation to using IBM's ICM solution as a strategic advantage.
What Is OpenSymmetry?
OpenSymmetry is a trusted leader and global advisor that delivers Sales, HR and Finance solutions to drive accelerated business performance. With a leading success rate across more than 20 industries worldwide, you can count on OpenSymmetry to deliver the comprehensive solutions you need for a proven path to rapid results.
James Mulligan - James has been working in the SPM industry for over 9 years. With experience across numerous products and platforms, as well as multiple implementation and selling roles, James has the knowledge and experience to speak to every level of an SPM engagement. He joined Varicent (now IBM ICM) in June 2006 as an Implementation Consultant and held numerous roles at both Varicent and IBM before moving to OpenSymmetry in 2014 to lead sales in EMEA.
Jon Clark – Jon joined OpenSymmetry in 2009. He leads the Strategy Services practice for EMEA, which focuses on Smarter Sales development, incentive plan design, business case for automation and future state design. In addition, Jon has experience in wider reward and performance management design.
Large-scale forecasting problems require you to understand the structure of your time series data. Whether you’re a retailer trying to determine the right assortment of SKUs for your stores, part of a telco anticipating call center demand for labor capacity planning, or in any industry where you need to make forecasts over time – this webinar will be of interest.
Join us as we discuss an approach for successfully forecasting the wide range of time series patterns through time series segmentation. We will also show how the new SAS® Forecast Server client makes time series segmentation an integrated part of the forecasting process.
Watch and learn:
• The basics of time series segmentation and the value of its application to large-scale forecasting challenges.
• How to apply the right forecast modeling strategy to any time series pattern with the new SAS Forecast Server client.
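The segmentation step can be sketched in a few lines of Python: classify each series by simple shape features and route each class to a different modeling strategy. This is an illustrative sketch only, not SAS Forecast Server code; the features, thresholds, and SKU data are invented for the example.

```python
from statistics import mean, pstdev

def segment_series(series):
    """Classify a time series by simple shape features so a suitable
    forecasting strategy can be chosen per segment (illustrative only)."""
    zero_share = sum(1 for v in series if v == 0) / len(series)
    m = mean(series)
    cv = pstdev(series) / m if m else float("inf")  # coefficient of variation
    if zero_share > 0.4:
        return "intermittent"   # many zero periods: e.g. Croston-style methods
    if cv > 0.5:
        return "volatile"       # high relative variance: robust/ensemble models
    return "smooth"             # stable demand: simple exponential smoothing

demand = {
    "sku_a": [10, 12, 11, 13, 12, 11, 12, 13],  # steady seller
    "sku_b": [0, 0, 5, 0, 0, 8, 0, 0],          # sporadic demand
    "sku_c": [3, 40, 2, 35, 1, 50, 4, 30],      # erratic demand
}
segments = {sku: segment_series(s) for sku, s in demand.items()}
# segments -> {"sku_a": "smooth", "sku_b": "intermittent", "sku_c": "volatile"}
```

Once each series carries a segment label, one modeling recipe can be applied uniformly within each segment, which is what makes segmentation attractive at large scale.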
See how CA manages and monitors its own SAP environment
SAP applications and the business processes they enable often represent the very core of your business. Customers, partners, and employees expect availability, responsiveness and secure access, while the CEO expects that budget objectives will be met and efficiencies exploited to improve the bottom line. CA Technologies provides a robust portfolio of SAP-certified IT management solutions that help reduce the cost and risk of delivering your SAP-enabled business processes, while helping to enable the service levels needed for optimal business operations.
In this session you’ll learn:
• The business drivers, technical challenges and importance of SAP in CA's daily operations
• How CA manages its own SAP implementation with CA APM solutions
Do you need to build and retrain lots of predictive models quickly? Do you want to test a variety of statistical and machine learning algorithms and easily find the best-performing model? Do you need automated machine learning at scale?
If the answer is yes to any of these questions, you’ll want to watch this on-demand webinar. Learn from SAS experts Sascha Schubert and Jonathan Wexler as they talk about the ins and outs of automated machine learning at scale. They’ll also demo one of SAS’ newest offerings – SAS® Factory Miner.
Watch this on-demand webinar and learn how to:
• Easily build and retrain hundreds of predictive models across multiple segments. (With a drag-and-drop interface!)
• Automatically pick the best model for each segment.
• Quickly deploy champion models in different environments.
• Take advantage of automated machine learning techniques.
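The model-tournament idea behind automated, per-segment machine learning can be illustrated with a toy sketch: fit each candidate model on a segment's history minus a holdout, score it on the held-out points, and keep the winner. This is a minimal stand-in with invented data and candidate models, not SAS Factory Miner code.

```python
from statistics import mean

def fit_mean(ys):
    """Baseline candidate: always predict the training mean."""
    m = mean(ys)
    return lambda x: m

def fit_trend(ys):
    """Candidate: least-squares line over the training index 0..n-1."""
    n = len(ys)
    xbar, ybar = (n - 1) / 2, mean(ys)
    denom = sum((x - xbar) ** 2 for x in range(n))
    slope = sum((x - xbar) * (y - ybar) for x, y in enumerate(ys)) / denom
    return lambda x: ybar + slope * (x - xbar)

def champion(ys, candidates, holdout=2):
    """Fit every candidate on the history minus a holdout, score it on
    the held-out points, and return the name of the lowest-error model."""
    train, test = ys[:-holdout], ys[-holdout:]
    fitted = {name: fit(train) for name, fit in candidates.items()}
    def score(model):
        return mean((model(len(train) + i) - y) ** 2 for i, y in enumerate(test))
    return min(fitted, key=lambda name: score(fitted[name]))

candidates = {"mean": fit_mean, "trend": fit_trend}
segments = {
    "flat":   [5, 6, 5, 5, 6, 5, 5],   # no trend: the baseline should win
    "growth": [1, 2, 3, 4, 5, 6, 7],   # linear growth: the trend model wins
}
champions = {seg: champion(ys, candidates) for seg, ys in segments.items()}
# champions -> {"flat": "mean", "growth": "trend"}
```

Real automated-ML tooling runs this kind of tournament over many algorithms and hundreds of segments, then deploys the champion per segment; the holdout scoring here stands in for that model-selection step.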
In this session, we will review foundational cryptographic concepts and discuss the key highlights of the new DB2 native encryption capability, including encrypting online data, encrypting backup images, and key management.
dashDB is a newly announced data warehouse as a service, deployed in the cloud, that leverages technologies like BLU Acceleration, in-database analytics and Cloudant to let you focus more on the business and less on the business of IT. In this DB2 Tech Talk you will learn more about IBM's cloud initiatives and the value proposition around dashDB, as well as:
- dashDB's architecture and use cases
- pricing and offerings as a service
- competitive differentiations and customer feedback
DB2's most recent release added significant new capabilities for both transaction and analytic processing. This presentation will take you through the new features and what you need to know. Attend this DB2 Tech Talk to learn about:
• BLU Shadow Tables
• Expanded Oracle SQL compatibility for data marts
• Enhanced SAP Business Warehouse support
• Standard TCP/IP sockets for pureScale
• Additional capabilities
In-memory, columnar, faster analytics, high availability and scalability. There is a lot happening in the data management space these days. In this presentation, we will go through some of the key capabilities in the newest release of DB2, including BLU Shadow Tables, standard TCP/IP sockets for pureScale, and other enhancements. We'll also talk through some of the latest competitive announcements and how DB2 stacks up to these offerings.
Soloman Barghouthi (presenter), Rick Swagerman (host)
In a three-tier architecture, an application server is considered tier 2 and a database tier 3. IBM offers a well-known, composable, secure and highly scalable application server, WebSphere Application Server, as well as multiple databases, one of which is DB2. This session will explain some of the key integration features between the two products and why this combination helps you create a secure, scalable and highly reliable solution for your middleware and backend database. Topics covered in this session are:
-- Security: Trusted connection support
-- Scalability: Heterogeneous pooling support and lock sharing
-- Resiliency: Automatic client reroute and failover enhancements
-- and more
In this session from the IDUG 2014 Technical Conference in North America, Randy Ebersole highlights some of the new features of DB2 11. He introduces some of the new functionality to get you thinking about moving to DB2 11!
Guersad Kuecuek (presenter) and Rick Swagerman (host)
Analytics are the hot business need because the competitive advantage goes to the organization that is first to discover and capitalize on business insight. Data is at the root of this, but it is analytics that unlock the data's value. For many years, SAP customers worldwide have used DB2 with OLAP and OLTP systems to deliver a cost-effective database engine for SAP applications.
Now with DB2 10.5 with BLU Acceleration, you can use this next-generation in-memory technology to deliver insights from data quickly and simply. DB2 with BLU Acceleration is an in-memory technology that makes analytical workloads many times faster right out of the box, while requiring dramatically less storage and nearly eliminating the need for tuning.
Although BLU Acceleration is in-memory optimized, it is not main memory-limited. DB2 BLU is highly optimized for accessing data in RAM, but performance won’t suffer as data size grows beyond RAM. These benefits are achieved through next generation columnar processing, operating on compressed data, carefully exploiting modern microprocessor designs, and accessing memory efficiently. The result is a system that simultaneously looks and feels like DB2 while being in-memory optimized, CPU-optimized and I/O-optimized.
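One ingredient of that design, operating directly on compressed data, can be illustrated with run-length encoding: a predicate is evaluated once per run instead of once per row. This sketch only illustrates the principle; it is not the actual compression scheme BLU Acceleration uses.

```python
from itertools import groupby

def rle_encode(column):
    """Run-length encode a column into (value, run_length) pairs."""
    return [(value, sum(1 for _ in run)) for value, run in groupby(column)]

def count_equal(encoded, target):
    """Evaluate `value == target` on the compressed form: each run is
    tested once, no matter how many rows it represents."""
    return sum(length for value, length in encoded if value == target)

status = ["open"] * 5 + ["closed"] * 3 + ["open"] * 2   # 10 rows, 3 runs
encoded = rle_encode(status)
# encoded -> [("open", 5), ("closed", 3), ("open", 2)]
matches = count_equal(encoded, "open")   # 7 matching rows, 3 comparisons
```

The fewer distinct runs a column has, the more work the engine saves, which is why columnar stores sort and encode data to maximize such runs.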
SAP-DB2 customers have seen tremendous performance improvements on the same hardware. Because of this, customers can reuse their existing IT infrastructure and simply update to the newer DB2 10.5 version, converting row-store tables to column-store tables.
Join DB2-SAP expert Guersad Kuecuek for a discussion of the technical features of this solution, including how to use BLU, how to derive performance improvements, and how to size for BLU.
Judy Ruby-Brown gives a thorough overview of system-level backup and of using FlashCopy for disaster recovery as provided by various vendors. The presentation is thorough yet very easy to understand, even for a beginner. Judy explains the types of services that need to be set up for disaster recovery, as well as straightforward scenarios for the DBA implementing disaster recovery. The summary at the end alone is enough to pass on to the system programmers in your shop as an implementation checklist.
Jef Treece and Shawn Moe, IBM Labs; with Rick Swagerman (host)
The "Internet of Things" is the growing number of devices and sensors that communicate via the Internet, offering vast new opportunities. The key to harnessing data from billions of connected devices lies in the ability to store, access and query SQL and NoSQL data together, seamlessly. All this data creates a need for fast development of web and mobile applications and for speed-of-thought insight to drive fast business decisions.
Learn how to use Informix and DB2 to bridge the gap between new computing technologies designed for big data, cloud, and mobile computing and the enterprise world of relational data in the "Internet of Things" era. Jef Treece and Shawn Moe will show how to develop mobile applications using techniques that preserve the native look and feel of devices such as Android and iOS. They will also show device-agnostic techniques and discuss new approaches to handling a range of mobile devices, how to keep data on a device in sync with a back-end database or cloud service such as DB2 or Informix on Bluemix, and finally how to use an embedded database to achieve a truly scalable and reliable Internet of Things architecture.
In this comprehensive overview of DB2 10 for z/OS, Lori Ann Galluzzo goes through the features of DB2 10 in great detail. Each feature is discussed, along with how and when to take advantage of it, as well as some pitfalls and things to watch out for.
Les King, Kelly Schlamb, Vladimir Stojanovski (presenters) and Rick Swagerman (host)
Today, to remain competitive, lines of business face tremendous demand to analyze more aspects of business data, and to do so very quickly. Control is shifting to the business analyst so they can analyze what they need to, NOW!
Big Data requirements are also emerging, even as everyone is being asked to accomplish more with less. Join us to learn how the combination of DB2 10.5 with BLU Acceleration and Cognos BI 10.2 can help bring sanity to satisfying these ever increasing demands.
We will provide an overview and technical deep dive of DB2 10.5 with BLU Acceleration, Cognos BI 10.2 and Dynamic Cubes. Join experts from the DB2 and the Cognos BI teams who will also explain how these two offerings fit together and the benefits they have provided to clients already. See the value that DB2 with BLU Acceleration can bring to Cognos BI, and ask these experts your questions, so you can understand how this solution helps solve business challenges.
Conference Chairman Bob Vargo gives attendees a detailed overview of the conference, seminars, keynotes, and the technical sessions that will be featured at this year's North American conference. Viewers are introduced to the schedule, speakers, and are given tips and insights into the conference.
Vladimir Stojanovski, Predictive & Business Intelligence Segment Leader
Watch a demonstration of IBM Cognos Business Intelligence running on a DB2 BLU Acceleration in-memory database.
The demo compares, side-by-side, the performance of a dashboard sitting on top of a traditional row-based data warehouse with indexes, versus a dashboard sitting on top of the same database, with BLU Acceleration.
BLU Acceleration is an innovation from IBM Research and Development labs that delivers a new generation of in-memory database technology for "speed of thought" analytics. Coupled with Cognos Dynamic Cubes, it delivers sub-second query response times. See it for yourself!
SQL is a powerful language that you will learn to use to a greater extent in this talk! Join us to explore some of the less well-known features of SQL and see how they can help in some practical situations. We will also look at some common mistakes and misconceptions and discuss ways of avoiding them. Among the topics:
- tricks with date and time values
- use of common table expressions and recursive queries
- use of data change tables
- use of nested tables
- use of the MERGE statement
- and more.
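As a small taste of one of these topics, here is a recursive common table expression that walks a reporting hierarchy. The example runs against SQLite via Python's sqlite3 module so it is easy to try; DB2's recursive CTE syntax is very similar, though DB2 does not use the RECURSIVE keyword. The table and data are invented for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE staff (id INTEGER, name TEXT, manager_id INTEGER);
    INSERT INTO staff VALUES
        (1, 'Ada', NULL),   -- top of the reporting chain
        (2, 'Ben', 1),
        (3, 'Cam', 2),
        (4, 'Dee', 2);
""")

# The anchor member selects the root; the recursive member joins each
# employee to the chain built so far, tracking depth in the hierarchy.
rows = conn.execute("""
    WITH RECURSIVE chain(id, name, depth) AS (
        SELECT id, name, 0 FROM staff WHERE manager_id IS NULL
        UNION ALL
        SELECT s.id, s.name, c.depth + 1
        FROM staff s JOIN chain c ON s.manager_id = c.id
    )
    SELECT name, depth FROM chain ORDER BY depth, name
""").fetchall()
# rows -> [('Ada', 0), ('Ben', 1), ('Cam', 2), ('Dee', 2)]
```

The same shape of query handles bill-of-materials explosions, org charts, and graph traversals, which is why recursive CTEs are on the list above.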
Nick Ivanov is presenting this talk based on his consulting experience with many customers to maximize use of IBM DB2 and other products; move applications to DB2; and tune database and query performance.
Although the DBA's tuning efforts often reduce both CPU consumption and end-user response times, the C-level perception of DBA activities is sometimes closer to "why didn't they do that before?".
This presentation helps you to better understand the C-level perspective and shows you how we managed to improve the image of the DBA team from a reactive team of techies to a proactive team that is directly involved in improving the business results.
Sriram Padmanabhan (presenter) and Rick Swagerman (host)
IBM InfoSphere Information Server is a market-leading data integration and governance platform. It provides a set of capabilities to build confidence in your data by helping you understand, cleanse, transform and deliver trusted information to critical business initiatives, such as big data, master data management and point-of-impact analytics. Information Server provides comprehensive connectivity to DB2 (including DPF and BLU Acceleration) and it can also push computation to DB2 when appropriate. By using Information Server with DB2, you get its best-of-breed capabilities to integrate DB2 data and manage its metadata as well.
Join us for this DB2 Tech Talk to learn how using these products together can help you deliver trusted information to your business, support big data initiatives, and more.
In March 2013 the IDUG DB2 11 Editorial Committee was formed, composed of volunteers from IDUG's worldwide community of DB2 users and consultants. Working alongside IBM's formal Early Support Program (ESP), the Editorial Committee gained valuable insight into exactly what makes the new release tick. The White Paper, available on the IDUG web site, contains their findings, including practical experiences and independent evaluations of the new features.
DB2 11 contains a large number of enhancements, but some will be of more interest than others depending on your job role and background. The paper has been organized according to the features likely to be most applicable to each of the three major technical roles in a DB2 for z/OS environment: Systems Programmers, Database Administrators and Application Developers.
This presentation from the IDUG EMEA 2013 Technical Conference looks at some of the key findings presented in the paper.
IBM InfoSphere Optim Workload Replay can significantly improve your database testing experience and help you successfully manage change in your database environments, whether you are adopting BLU Acceleration, upgrading DB2, or making the move to DB2. Using InfoSphere Workload Replay, you can build realistic testing environments to help assess a wide range of database changes before production deployment and without extensive script creation or application setup. This makes the change process faster and less risky.
In this session, we will examine the use of InfoSphere Workload Replay to facilitate database testing that validates version, platform or infrastructure changes. We will discuss how to use the tool to prepare for BLU Acceleration and review some of the key use cases. Learn the testing methodology and the various features to control capture, replay, and report generation. Using this realistic testing approach, you will find key issues earlier in the test cycle and resolve them before deployment day.
Anson Kokkat (presenter), Tony Leung (presenter), Rick Swagerman (host)
Stored procedures and user-defined functions are a great way to consolidate database logic and improve performance, code reuse, security and integrity. Stored procedure development has always been an integral part of DB2, and in this DB2 Tech Talk we will look at how IBM Data Studio helps accelerate the development and debugging process.
IBM Data Studio is a comprehensive database development and administration environment that includes the tools needed to develop and deploy stored procedures. We will look at the various wizards and options available to create a SQL stored procedure and walk through debugging that stored procedure in a DB2 environment.
Joint DB2 Tech Talk series sponsored by IDUG with IBM
This one-stop webcast channel makes it fast and easy to build and extend your technical skills on the IBM DB2 product family. IDUG has joined forces with IBM to bring together a wide array of learning sessions on technical product features, related products, new product developments and more.