Mastering the DB2 10.1 Certification Exam – Part 1: Planning

Norberto Gasparotto Filho is a database specialist with more than 10 years of database administration experience and the author of DB2 10.1 fundamentals certification exam 610 prep, Part 1: Planning.

He was the winner of the first edition of the “DB2’s Got Talent” contest in 2011. He has also worked as a programmer using a variety of technologies, and holds certifications in both the programming and database administration areas. In his blog (“Insights on DB2 LUW database admin, programming and more”), Norberto shares tips, knowledge, and lessons learned in day-to-day database administration work. In his spare time, Norberto likes to run, ride a bike and have fun with his kids and wife. Learn more in Norberto’s profile in the developerWorks community.

We caught up with him and asked about the tutorial and what he feels are the most important takeaways for DB2 professionals looking to become certified:

Q) What products are discussed in this tutorial?
DB2 10.1 for Linux, UNIX and Windows and some related tools like Data Studio – with mentions of DB2 for the z/OS world as well.

Q) What problem does it solve?
If you don’t have an entry-level certification in DB2, this tutorial should really help you! The Part 1 (Planning) tutorial covers everything in the Planning set of questions – so read it carefully! Along with information about the different editions of DB2 and their features, you’ll learn specifics about Data Studio, CLPPlus and other tools, notions about data warehouse/OLTP workloads and much more. Of course, you have to study all the tutorials in the series to be completely prepared for the test.

Q) Why should DB2 professionals be interested in getting certification?
When you need to buy something, don’t you prefer a product that has an endorsement? Like shoes or other sports goods: the ones advertised by famous athletes usually catch your eye more easily than others, don’t they?

The same goes for professionals and their certifications. When you have one or more certifications, companies that are looking for candidates will prefer YOU. With your DB2 certification, IBM will be endorsing you… How fancy is that? :)

Q) Do you have any special tips?

  • Don’t limit your studying to just the tutorials. Nothing can replace hands-on experience.
  • Pay attention to which version of DB2 you are studying for and make sure it aligns with the test you are taking. We are talking about v10.1, even though v10.5 has already been released. The exam questions were written before the new features of DB2 10.5 were introduced (and even before the new features of the latest v10.1 fix packs were released).
  • When taking the exam, don’t spend too much time analyzing or thinking about any one question. If you are unsure, choose an answer (never leave it blank) and mark the question for review – then, if you have time at the end, you can come back to it and think some more; maybe after answering other questions you’ll find hints for answering it.
  • After you’ve completed and passed the exam – let the world know about your accomplishment! Spread the word via social media (your LinkedIn profile, Twitter, Facebook, and of course, add it to your CV). IBM has a website where you can make your list of certifications public. Just go to Member Site / My Profile / Public Profile Information. The information is updated automatically, and you can add your picture and more info about yourself – like links to your LinkedIn/Twitter profiles and more. Take a look at mine:

To see the entire tutorial, visit the link below:

DB2 10.1 fundamentals certification exam 610 prep, Part 1: Planning

Webinar: Why IBM DB2 Database Software Leaves the Competition in the Dust

We all know that IBM DB2 database software is an industry leader when it comes to performance, scale, and reliability. But how does it compare to the competition, specifically Oracle and SAP HANA?

IBM’s Chris Eaton joined IDUG’s DB2 Tech Talk to give an update on IBM DB2 and show how DB2 goes above and beyond our competitors to provide reliable functionality while saving businesses money.

During the presentation, Chris walked the audience through DB2’s latest release, the DB2 10.5 “Cancun Release”, and the four innovations that make BLU Acceleration different from our competitors: Next Generation In-Memory, the ability to analyze compressed data, CPU Acceleration, and data skipping.

You can watch the entire presentation on the IDUG website by clicking here, and also review the Tweets from the event by logging on to the IBM DB2 Storify page here.

Still have additional questions? Feel free to leave them in the comment box below and we’ll get the answers to you shortly.

About Chris Eaton – Chris is a Worldwide Technical Sales Specialist for DB2 at IBM, primarily focused on planning and strategy for DB2 on Linux, UNIX and Windows. Chris has been working with DB2 on the LUW platforms for over 21 years. From customer support to development manager to Externals Architect to Product Manager for DB2, Chris has spent his career listening to customers and working to make DB2 a better product. Chris is also the author of The High Availability Guide for DB2, DB2 9 New Features, Break Free with DB2 9.7: A Tour of Cost-Slashing New Features and Understanding Big Data Analytics for Enterprise Class Hadoop and Streaming Data. Follow Chris on his blog here.

Multiple aggregations and the SQL table function: A case study

The DB2 social media team caught up with Nattavut Sutyanyong and Kanishka Mandal, who are part of the DB2 development team, to talk to them about their article “Multiple aggregations and the SQL table function: A case study”. Here is what they had to say:

1) Why should someone read this article?

This article is useful to DBAs and SQL writers who design large DB2 queries that join several tables to create daily, weekly or monthly reports for business analysis. These kinds of queries are often used by data warehouse customers.

2) What problem does it solve?

This article helps in environments where reports are generated with the same tables accessed multiple times. It mainly focuses on data fetch redundancy and table data encapsulation.

3) What products are discussed in the paper?

Although the article focuses on the DB2 for LUW product, the idea behind the case study can be applied in any DBMS that supports SQL, and is especially useful in warehousing queries. The optimization and encapsulation techniques discussed in the paper are performed at the SQL language level: they are not specific to DB2 for LUW.

4) Do you have any special tips?

The example shown in the article creates a common table that is used as a parent table; other tables are then joined to it in order to avoid fetching the same data redundantly.
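The article’s own schema isn’t reproduced in this post, but the idea can be sketched with a hypothetical table, here using SQLite through Python’s sqlite3 module: compute the shared aggregation once in a common table expression, then join the detail rows back to it instead of re-scanning the base table for every aggregate.

```python
import sqlite3

# Hypothetical schema: a sales table reported on at several levels.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, product TEXT, amount REAL);
    INSERT INTO sales VALUES
        ('EMEA', 'sensor', 100), ('EMEA', 'cable', 50),
        ('AMER', 'sensor', 200), ('AMER', 'cable', 80);
""")

# Instead of scanning `sales` once per aggregate, the common table
# expression computes the shared aggregation a single time; the
# detail rows join back to it as the "parent" table.
query = """
WITH region_totals AS (              -- the common "parent" table
    SELECT region, SUM(amount) AS region_total
    FROM sales
    GROUP BY region
)
SELECT s.region, s.product, s.amount,
       ROUND(100.0 * s.amount / rt.region_total, 1) AS pct_of_region
FROM sales s
JOIN region_totals rt ON rt.region = s.region
ORDER BY s.region, s.product;
"""
for row in conn.execute(query):
    print(row)
```

The same reshaping applies to the far larger report queries the article targets: the aggregation is encapsulated in one place, so the optimizer fetches the underlying data once rather than once per reference.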

Read the full article here.

About the authors:

Nattavut Sutyanyong

Nattavut has been working on SQL rewrite and optimization in DB2 LUW for 14 years. He has helped numerous IBM customers and partners tune their SQL and improve the performance of their systems.

Nattavut Sutyanyong
DB2 Development – Query Compiler
Current project: UDX solution in dynamite
IBM Toronto Lab

Kanishka has been working as a DB2 LUW worldwide advanced support analyst for 8 years. His current expertise is in high availability in DB2 and pureScale, and in enhancing and benchmarking customer environments.

Kanishka Mandal
DB2 LUW Advanced support

Balluff loves BLU Acceleration too

By Cassandra Desens
IBM Software Group, Information Management  

BLU Acceleration is a pretty darn exciting advancement in database technology. As a marketing professional, I can tell you why it’s cool:

  • BLU provides instant insight from real-time operational data
  • BLU provides breakthrough performance without the constraints of other in-memory solutions
  • BLU provides simplicity with a load-and-go setup

…etcetera, etcetera – you get the point.

You can read our brochures and watch our videos to hear how DB2 with BLU Acceleration will transform your business. We think it’s the next best thing since sliced bread because we invented it. But is it all it’s cracked up to be? The answer is YES.

Clients all over the world are sharing how BLU Acceleration made a huge, positive difference to their business. Hearing customer stories puts our product claims into perspective. Success stories give us the ultimate answer to the elusive question “How does this relate to me and my business?” That is why I want to share one of our most recent stories with you: Balluff.

Balluff is a worldwide company with headquarters in Germany. With over 50 years of sensor experience, it is considered a world leader and one of the most efficient manufacturers of sensor technology. Balluff relies on SAP solutions to manage its business, including SAP Business Warehouse for data analysis and reporting.

Over the last few years Balluff experienced significant growth, which resulted in slowed data delivery. As Bernhard Herzog, Team Manager Information Technology SAP at Balluff, put it: “Without timely, accurate information we risked making poor investment decisions, and were unable to deliver the best possible service to our customers.”

The company sought a solution that would transform the speed and reliability of their information management system. They chose DB2 with BLU Acceleration to accelerate access to their enormous amount of data. With BLU Acceleration Balluff achieved:

  • Reduced reporting time for individual reports by up to 98%
  • Reduced backup data volumes by 30%
  • Improved batch-mode data processing by 25%
  • A swift transition with no customization needed: Balluff transferred 1.5 terabytes of data within 17 hours with no downtime

These improvements have a direct impact on their business. As Bernhard Herzog put it, “Today, sales staff have immediate access to real-time information about customer turnover and other important indicators. With faster access to key business data, sales managers at Balluff can gain a better overview, sales reps can improve customer service and the company can increase sales”.

Impressive, right? While you could argue it’s no sliced bread, it certainly is a technology that is revolutionizing reporting and analytics, and it is worth a try. Click here for more information about DB2 with BLU Acceleration and to take it for a test drive.


For the full success story, click here to read the Balluff IBM Case Study
You can also click here to read Balluff’s success as told by ComputerWoche (Computer World Germany). Open in Google Chrome for a translation option.

What is DB2ssh?

By Mihai Iacob
DB2 Security Development

The IBM DB2 pureScale Feature provides high levels of distributed availability, scalability and transparency to the application – but why do I need to enable password-less SSH for the root user in my DB2 pureScale cluster? Well, you no longer do, and this article explains how to use db2ssh to securely deploy and configure the DB2 pureScale Feature.

Both the DB2 installer and GPFS, the file system used by DB2 pureScale, need to run commands as root on a remote system. Db2ssh provides an alternative to enabling password-less SSH as root by effectively SSH-ing as a regular user and then elevating privileges to root to run the required commands.

Wait, isn’t that asking for trouble? Can a non-root user run remote commands as root in my cluster? Not at all: rigorous security checks are in place to make sure that only the root user can run commands remotely as root. This is accomplished by having the root user digitally sign any message that is sent to the remote system, and having the remote system verify this signature before executing any commands. SSH can also be configured in a secure way to protect against replay attacks.
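As a rough illustration of that verify-before-execute flow: db2ssh itself uses public-key digital signatures, but since Python’s standard library has no asymmetric crypto, this sketch substitutes an HMAC over a hypothetical shared secret. The shape of the check is the same: the remote side refuses to run anything whose signature does not verify.

```python
import hmac
import hashlib

# Stand-in for the cluster's key material. Real db2ssh uses a
# public/private key pair, not a shared secret.
SECRET = b"cluster-shared-secret"

def sign_command(command: str) -> tuple:
    """Sender side (root): attach a signature tag to the command."""
    tag = hmac.new(SECRET, command.encode(), hashlib.sha256).hexdigest()
    return command, tag

def run_if_verified(command: str, tag: str) -> str:
    """Remote side: verify the signature BEFORE executing anything."""
    expected = hmac.new(SECRET, command.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(tag, expected):
        return "REJECTED"
    return "EXECUTED: " + command

cmd, tag = sign_command("mmlscluster")     # a GPFS admin command
print(run_if_verified(cmd, tag))           # valid signature, runs
print(run_if_verified("rm -rf /", tag))    # tampered command, refused
```

A command altered in transit no longer matches its signature, so the remote system drops it instead of running it as root.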

Take a look at the article to find out how to configure and troubleshoot DB2ssh.

Exclusive Opportunity to Influence IBM Product Usability: Looking for Participants for Usability Test Sessions – Data Warehousing and Analytics

By Arno C. Huang, CPE
Designer, IBM Information Management Design
The IBM Design Team is seeking people with a variety of database, data warehousing and analytics backgrounds to participate in usability test sessions. We are currently looking for people who work in one of the following roles: DBA, Architect, Data Scientist, Business Analyst or Developer. As a test participant, you will provide your feedback about current or future designs we are considering, thus making an impact on the design of an IBM product and letting us know what is important to you.

Participating in a study typically consists of a web conference or on-site meeting scheduled around your availability. IBM will provide you with an honorarium for your participation. There are several upcoming sessions, so if you’re interested, we’ll help you find a session that best suits your schedule. If you are interested, please contact Arno C. Huang at

Nothing Endures But Change – Face it With Confidence.

By Radha Gowda
Product Marketing Manager, DB2 and related offerings

When faced with change, do you share Dilbert’s frustration (take a look at this Dilbert comic and you’ll see what we mean)? Wait… don’t lose hope yet! We understand that keeping up with the competition and customer expectations in this constantly changing global economy requires you to continuously enhance products and services. While change can bring in a wealth of new business opportunities, we also realize that implementing changes may cause a lot of grief, including production delays and deployment-day disasters.

To put this in perspective, according to a survey from the Ponemon Institute sponsored by Emerson Network Power, the average cost of data center downtime across industries is $7,908 per minute (survey infographic).

From a data management perspective, we have a proposal to manage the change: IBM InfoSphere Optim Workload Replay. This tool offers you the ability to capture actual production workload, including workload concurrency, the order of SQL execution, all the input variables, and the workload characteristics that are needed to later replay the workload. It even records how long the statements ran, what SQL code resulted, and so on. You can then replay the captured workload in your pre-production environment and record the outcome.

This comprehensive set of inputs and outputs on both the original and the replayed versions lets you compare and verify whether you are getting the same performance in your pre-production environment as you did in production. You can capture millions of SQL statements that run over a period of time in production and analyze how well they fare when replayed in a pre-production environment.
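The comparison step can be pictured with a small sketch. The record format, the field names, and the regression threshold below are all hypothetical; the actual Optim Workload Replay reports are far richer, but the idea is the same: line up each captured statement with its replayed counterpart and flag the ones that got slower.

```python
def compare_runs(captured, replayed, tolerance=1.25):
    """Flag statements whose replayed time exceeds capture * tolerance."""
    report = []
    for cap, rep in zip(captured, replayed):
        # Replay preserves the original SQL execution order.
        assert cap["sql"] == rep["sql"]
        regressed = rep["elapsed"] > cap["elapsed"] * tolerance
        report.append((cap["sql"], cap["elapsed"], rep["elapsed"], regressed))
    return report

# Hypothetical capture from production vs. replay in pre-production.
captured = [
    {"sql": "SELECT * FROM orders WHERE id = ?", "elapsed": 0.010},
    {"sql": "UPDATE stock SET qty = qty - ? WHERE part = ?", "elapsed": 0.020},
]
replayed = [
    {"sql": "SELECT * FROM orders WHERE id = ?", "elapsed": 0.011},
    {"sql": "UPDATE stock SET qty = qty - ? WHERE part = ?", "elapsed": 0.060},
]

for sql, before, after, regressed in compare_runs(captured, replayed):
    flag = "REGRESSED" if regressed else "ok"
    print(f"{flag:9} {before:.3f}s -> {after:.3f}s  {sql}")
```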

Some of the use cases where you may benefit from Optim Workload Replay are performance and stress testing, database upgrades/migration, on-going database maintenance, capacity planning, introducing new applications, platform consolidation, and periodic disaster recovery validation.

We invite you to check out the IBM InfoSphere Optim Workload Replay page and browse through the solution brief, white paper and more.

Change can be scary, but you now have a reason to smile.

Rapid Insight With Results: Harnessing Analytics in the Cloud

By Basiruddin Syed
DB2 Social Marketing Manager

“It’s Now or Never” is a popular song recorded by Elvis Presley, and its lyrics seem to appeal to us now more than ever. Every day we make decisions in our lives, both personal and financial. We are often contemplating which holiday to book, whether this is the right time to invest, and so on. We are constantly making decisions, and our decisions are based on information, user reviews, and recommendations.

When individuals face such pressure to make quick decisions, imagine how much harder it must be for larger organizations.

Decision-making can be regarded as the cognitive process resulting in the selection of a belief or a course of action among several alternative possibilities.

Individuals face a lot of pressure to make rapid decisions and choose the best course of action within a short span of time, and it is much more difficult for organizations, whose needs are much greater and whose decision time frames are even more compressed.

Today’s data-driven organization is faced with magnified urgency around data volume, user needs and compressed decision time frames. To address these challenges, organizations are exploring cloud-based BI and analytical technology to accelerate decision making and enhance business performance.

Find out more about how organizations are making this possible in “Rapid Insight with Results: Harnessing Analytics in the Cloud”.



Make Your Apps Highly Available and Scalable

By Vinayak Joshi
Senior Software Engineer, IBM

IBM’s premium data-sharing technologies offer unmatched high availability and scalability to applications. If you are a JDBC application developer wanting to explore how these benefits accrue to your application, and whether you need to do anything special to exploit them, my article – “Increase scalability and failure resilience of applications with IBM Data Server Driver for JDBC and SQLJ” – is a great source of information.

In the article, I explain how turning on a single switch on the IBM Data Server Driver for JDBC and SQLJ opens up all the workload balancing and high availability benefits to your JDBC applications. There is very little required for an application to unlock the workload balancing and high availability features built into the DB2 server and driver technologies.

For those curious about how the driver achieves this in tandem with pureScale and sysplex server technologies, the article should provide a good end-to-end view. While all the nuts-and-bolts explanations are provided, it is stressed that all of it happens under the covers; beyond a bare minimum understanding, application developers and DBAs need not concern themselves with it too much if they do not wish to.

The aspects a developer needs to keep in mind are highlighted, and recommendations on configuring and tuning applications are provided. We’ve made an effort to keep the content technically accurate while keeping the language simple enough for a non-technical audience to grasp.

Any and all feedback will be much appreciated and taken into account. Take a look at the article by clicking here, and feel free to share your thoughts in the comment section below.

I’m Your Big Data Problem: Integrate THIS!

By Rachel Bland
Senior Product Manager, IBM Business Analytics Growth Initiatives

Let’s take a look at my profile. Gen X-er. Young kid, busy job, work-life integrated. I will not answer your online surveys, I won’t even answer the phone unless I know who it is because if you really knew me you’d send a text. I don’t listen to voice mail – too disruptive.

I dare you to find me, understand me, and approach me. The only information you are going to get is spread haphazardly all over the web. My likes and dislikes are out there if you’re looking.

Where is it all?  Amazon has a slice, a big slice of very relevant purchasing info, so does Zappos. UPS and FedEx visit my house every day. Fitbit and MyFitnessPal are my most frequently updated apps. Yahoo has my personal email, Gmail is my dumping ground for spammers. Facebook is my personal life, LinkedIn and Twitter have my professional life. Comcast provides my phone and internet service but keeps calling me looking for someone else.

Sure, I’m difficult, but I’m still a pretty attractive, albeit elusive, consumer. I have a good income, a house that needs some work, a penchant for retail therapy, and a pretty decent appetite for convenience products and services. If you’ve got something I want, deliver for free, and take returns by mail, there’s a good chance I’ll give you a chance. If only you could read my mind and figure out what I’ll buy next!

This is a small example of the tremendous amount of customer data available to businesses. How you get that data and what you do with it is what will separate the haves from the have-nots. In order to truly know your customers and your market segment, you have to do the work.

The reality is that the possibilities to tap into new markets, identify innovative efficiencies, and just run better and smoother are there; what’s also very real is the perception that the technical challenge is insurmountable. Well, not so much. We’ve learned a lot in the past few years, as the wealth of information from analysts and vendors demonstrates. The IBM Institute for Business Value has blueprints to help you identify the opportunities for value, and industry experts like Tony Curcio and Ray Wang from Constellation Research can provide you with best practices for the steps along the way.


Spend an hour with us on September 11th and hear what experts Ray Wang and Tony Curcio have to say:

Successful Big Data Projects require Big Data Integration, since most Hadoop initiatives involve collecting, moving, transforming, cleansing, integrating, exploring, and analyzing volumes of disparate data.

Register for this webinar to learn about:

  • The current state of the Big Data market
  • Customer success stories with Big Data and Big Data Integration
  • The 5 best practices when it comes to Big Data Integration
