Improve IT Productivity with IBM PureData System for Transactions


Kelly Schlamb , DB2 pureScale and PureData Systems Specialist, IBM
I’m a command line kind of guy, always have been. When I’m loading a presentation or a spreadsheet on my laptop, I don’t open the application or the file explorer and work my way through it to find the file in question and double-click the icon to open it. Instead, I open a command line window (one of the few icons on my desktop), navigate to the directory where I know the file is (or do a command line file search to find it), and execute/open the file directly from there. When up in front of a crowd, I can see the occasional look of wonder at that, and while I’d like to think it’s them thinking “wow, he’s really going deep there… very impressive skills”, in reality it’s probably more like “what is this caveman thinking… doesn’t he know there are easier, more intuitive ways of accomplishing that?!?”

The same goes for managing and monitoring the systems I’ve been responsible for in the past. Where possible, I’ve used command line interfaces, I’ve written scripts, and I’ve visually pored through raw data to investigate problems. But inevitably I’d end up doing something wrong, like missing a step, doing something out of order, or missing some important output - leaving things not working or not performing as expected. Over the years, I’ve considered that part of the fun and challenge of the job. How do I fix this problem? But nowadays, I don’t find it so fun. In fact, I find it extremely frustrating. Things have gotten more complex and there are more demands on my time. I have much more important things to do than figure out why the latest piece of software isn’t interacting with the hardware or other software on my system the way it’s supposed to. When I try to do things on my own now, any problem is immediately met with an “argh!” followed by a Google search, hoping to find others who are trying to do what I’m doing and have a solution for it.

When I look at enterprise-class systems today, there’s just no way that some of the old techniques of implementation, configuration, tuning, and maintenance are going to be effective. Systems are getting larger and more complex. Can anybody tell me that they enjoy installing fix packs from a command line, or ensuring that all of the software is at exactly the right level before proceeding with an installation of some modern piece of software (or multiple pieces that all need to work together, which is fairly typical today)? Or feel extremely confident in getting it all right? And you’ve all heard about the demands placed on IT today by “Big Data”. Most DBAs, system administrators, and other IT staff are just struggling to keep their current systems functioning, unable to give much thought to implementing new projects to handle the onslaught of all this new information. The thought of bringing a new application and database up, especially one that requires high availability and/or scalability, is pretty daunting. As is the work to grow such a system when more demands are placed on it.

It’s for these reasons and others that IBM introduced PureSystems. Specifically, I’d like to talk here about IBM PureData System for Transactions. It’s an Expert Integrated System that is designed to ensure that the database environment is highly available, scalable, and flexible to meet today’s and tomorrow’s online transaction processing demands. These systems are a complete package and they include the hardware, storage, networking, operating system, database management software, cluster management software, and the tools. It is all pre-integrated, pre-configured, and pre-tested. If you’ve ever tried to manually stand up a new system, including all of the networking stuff that goes into a clustered database environment, you’ll greatly appreciate the simplicity that this brings.

The system is also optimized for transaction processing workloads, having been built to capture and automate what experts do when deploying, managing, monitoring, and maintaining these types of systems. System administration and maintenance is all done through an integrated systems console, which simplifies a lot of the operational work that system administrators and database administrators need to do on a day-to-day basis. What? Didn’t I just say above that I don’t like GUIs? No, I didn’t quite say that. Yeah, I still like those opportunities for hands-on, low-level interactions with a system, but it’s hard not to appreciate something that is going to streamline everything I need to do to manage a system and at the same time keep my “argh” moments down to a minimum. The fact that I can deploy a DB2 pureScale cluster within the system in about an hour and deploy a database in minutes (which, by the way, also automatically sets it up for performance monitoring) with just a few clicks is enough to make me love my mouse.

IBM has recently released some white papers and solution briefs around this system, and a couple of them speak to the same points I mentioned above. To see how the system can improve your productivity and efficiency, allowing your organization to focus on the more important matters at hand, I suggest you give them a read:

Improve IT productivity with IBM PureData System for Transactions solution brief
Four strategies to improve IT staff productivity white paper

The four strategies described in these papers, which speak to the capabilities of PureData System for Transactions, are:

  • Simplify and accelerate deployment of high availability clusters and databases
  • Streamline systems management
  • Reduce maintenance time and risk
  • Scale capacity without incurring downtime

I suspect that I won’t be changing my command line and management/maintenance habits on my laptop and PCs any time soon, but when it comes to this system, I’m very happy to come out of my cave.

It’s the Most Wonderful Time of the Year


Melanie Stopfer, Consulting Learning Specialist, Information Management, IBM

It’s getting close to one of my favorite weeks of the year – the IDUG North America Technical Conference.  Break out the eggnog and mistletoe.  I feel like a child before Christmas who’s counting the days until the presents can be opened.  This year there’s an even bigger present waiting, because my 83-year-old mother and younger sister live in the Phoenix area. The IDUG DB2 2014 Technical Conference will be held at the Sheraton Downtown Phoenix Hotel and Convention Center, May 12-16.

IDUG is the foremost independent, user-driven community that provides a direct channel to thousands of professional DB2 users across the globe.  At the conference you gain access to influential decision makers.  You can also raise your profile by gaining a deeper understanding of marketplace needs and the competitive environment.  There are so many opportunities to engage with current and potential customers looking to learn about DB2 and related technologies to improve business performance.

Use the following tidbits to provide justification for your attendance.  Planning your week can be a little time-consuming and, for the new attendee, a little overwhelming, so use the following information to experience a very successful week.

Monday, May 12 – IDUG One Day Seminars
Take a deep dive into specific aspects of DB2 with some of the most renowned speakers in our industry.  You will not only have the opportunity to listen to these speakers but also the chance to ask them specific questions.  There is an additional registration fee for these seminars of $425 for paid full IDUG Conference delegates ($475 for unpaid), which includes lunch and a hard copy of the materials. Select from the following session topics:

  • Advanced SQL Coding and Performance (Dan Luksetich)
  • DB2 10 for z/OS System Admin Crammer class for Certification Exam 617 (Judy Nall)
  • DB2 10.1 for Linux, UNIX, and Windows Database Administration Certification Exam (Exam 611) Preparation (Roger Sanders)
  • DB2 11 for z/OS Database Administration Certification Crammer Course (Susan Lawson)
  • DB2 LUW Problem Determination and Troubleshooting Workshop (Pavel Sustr and Samir Kapoor)
  • DB2 V10+ for LUW Top Gun Performance Workshop (Martin Hubel and Scott Hayes)
  • The BEST of the BEST of the THINGS I WISH I’D TOLD YOU 8 YEARS AGO (Bonnie Baker)

Tuesday, May 13 – Keynote Speaker
Donald Feinberg, vice president and distinguished analyst in the Gartner Intelligence Information Management group, is responsible for Gartner’s research on database management systems, data warehousing infrastructure and Big Data.  I can’t wait to hear his insights.

Tuesday, May 13 through Friday, May 16 – Attend your selected technical sessions
Build your personalized conference agenda using IDUG’s My Conference feature.  Customize your week at IDUG with technical sessions, technical networking sessions and social networking opportunities. The My Conference feature also allows you to evaluate sessions and the overall conference. You can download your custom agenda or sync it with your calendar (Google Calendar, Outlook 2010, etc.) and have the information at your fingertips.  Check out the agenda by technical track, time, or title.  Just click on a title to get the speaker bio, session abstract, and session objectives.  Don’t forget about the new Big Data and Analytics track.  You can choose technical sessions from the following tracks:

  • Ala Carte
  • Big Data and Analytics
  • DB2 for Developers
  • DB2 for LUW – I
  • DB2 for LUW – II
  • DB2 for z/OS – I
  • DB2 for z/OS – II

Consider selecting my two presentations:

  • Wed May 14, 8:00 AM, Paradise Valley – C05 Best Practices Upgrade to DB2 10.5 with BLU Acceleration
  • Thu May 15, 1:00 PM, Camelback – D11 Using Row and Column Access Controls (RCAC) with DB2 LUW 

Wednesday, May 14, 3:30-4:45  - Special Interest Groups (SIGS)
Get face-to-face with developers and technical experts and discuss what’s hot.  Where does your special interest lie?  Wish I could clone myself and attend more than one SIG.  Walid Rjaibi, IBM’s Chief Security Architect for Information Management, Rebecca Bond, R Bond Consulting, Inc., and I will co-chair SIG1F – Security and Data Masking.

  • SIG1A – DB2 Memory Usage, Futures and Features
  • SIG1B – Fun with SQL
  • SIG1C – Maintaining Big Data
  • SIG1D – DB2 BLU Acceleration
  • SIG1E – DB2 Compression – What are the options? What are the license implications?
  • SIG1F – Security and Data Masking
  • SIG1G – DB2 in the Cloud

Thursday, May 15, 7:30-8:30 AM – Call for Volunteers Breakfast

IDUG relies on its members to leverage the technical and business acumen of DB2 professionals. It is because of dedicated members that IDUG exists.  To continue to provide the best networking and education to IDUG members worldwide, IDUG is always looking for active volunteers. Take part!  Get involved!  Help build IDUG’s future.

Thursday, May 15, 12:00-1:00 PM – Speaker Feedback Lunch

A great opportunity for speakers to discuss their experiences with the IDUG NA conference team.  Meet for lunch in Encanto A and provide valuable feedback to the planning committee to improve the process for next year.

Thursday, May 15, 4:45-6:00 PM – Mini-Panels

You can submit your questions early at the registration help desk.  Always a lively Q&A with lots of technical answers and often hints of what is coming in the future.

  • SIG2A – z/OS Mini Panel
  • SIG2C – LUW Mini Panel

Thursday, May 15, 6:30 PM – DINE AROUND EVENT

FILL IN THE BLANK:  “10 DBAs walk into a restaurant and ____________.”
Seriously, the night is your oyster.  What a great opportunity to make new friends and renew previous relationships.  Every year, I have such a great time meeting new people and learning about them.  Don’t miss this event.  Sign up early with your favorite speaker’s restaurant since the limited number of spots go quickly.  Each diner receives a separate check for their charges. Ember Crooks, IBM Champion and db2commerce.com blogger, and I are the speakers assigned to dine at Pizzeria Bianco.  Who wants to join us for Italian?

Friday, May 16, 8:00 AM to 2:00 PM – Complimentary Technical Workshops
Wowza – who doesn’t love a free workshop?  What a great opportunity to build your technical skills!  You register for the workshops when completing your conference registration.

  • DB2 BLU Acceleration for the Cloud Workshop – Learn about the exciting new enhancements in DB2 BLU, how they relate to cloud implementations and how they can provide data insights to your organization quickly and easily.
  • DB2 11 Migration Planning Workshop – DB2 11 gives back with more out-of-the-box performance improvements, greater migration flexibility, and many optimization, administration, and application features.  Leave the session with materials that you can use to start your installation/migration immediately, or in the future.

FREE IBM CERTIFICATION EXAMS

All DB2 Tech Conference attendees can take a FREE IBM CERTIFICATION EXAM! Each attendee may take one exam for free, and if they pass, they are eligible for a second free exam. Additional exams will be offered at the low discounted price of $25. Prepare by attending one of the pre-certification education seminars on Monday, listed above. The closing time listed is the last seating for the day. Go early in the week because time slots fill up towards the end of the week.  Exams will be administered:

  • Tuesday, May 13: 10:00 – 17:00
  • Wednesday, May 14: 9:00 – 16:00
  • Thursday, May 15: 8:00 – 17:00
  • Friday, May 16: 8:00 – 12:00

Products and Services Expo
Walk the exhibit hall and take advantage of the excellent opportunity to learn more about the latest DB2 technologies.  While you pick up some great souvenirs, absorb the technical content you need to be successful.  Several vendors will be handing out invitations to parties where you can network the nights away.  The expo is open:

  • Tuesday, May 13, 5:30 pm – 7:30 pm
  • Wednesday, May 14, 11:30 am – 1:00 pm, 4:30 pm – 6:00 pm
  • Thursday, May 15, 11:30 am – 1:00 pm

First Time Attendees and IDUG Mentors Discount
Discount!  Did someone mention $$$SAVINGS$$$? The IDUG Mentor program recognizes loyal IDUG attendees, IBM Champions and RUG Leaders and helps them share their DB2 knowledge with first-time attendees. If you fall into one of these categories, you are eligible to apply for a coupon worth 80% off the registration rate for a first-time attendee. First-time attendees can take advantage of this offer by reaching out to a co-worker who falls into one of these categories. To apply, or for more information about the IDUG Mentor Program, visit the IDUG Mentor webpage.

Justification for Attending IDUG Conference

Go to the IDUG website and download the Justification Letter and How to Justify list.

Tips

  • Remember to register for a complimentary technical workshop when completing your conference registration.
  • Build your agenda before arriving so that you don’t overlook your favorite speaker or topic.  There is so much happening during the week that you will want to take advantage of every opportunity.
  • IDUG NA Conference home website
  • Build time in your agenda to take at least one complimentary certification exam.  All results are private unless you decide to share the news.
  • Make “lunch dates” with fellow attendees or speakers.  No one likes to eat alone.  Your invitation will make someone’s day. 
  • Bring a light jacket, sweater or wrap.  Convention Centers are really meat lockers in disguise.
  • Sign up to moderate at least one session.  Besides the track chair becoming your new friend, you get to meet the speaker and become involved with a great organization.  What a great and easy way to get involved!
  • Pack extra business cards.
  • Use the #idug hashtag in your Tweets.
  • On Friday technical sessions end at 11:45 AM and complimentary technical workshops end at 2:00 PM.

If you are interested in networking opportunities with fellow users, product developers and solution providers, user-driven product training and quality continuing education, IDUG DB2 Technical Conference is for you!

Watch out Santa because I’m opening my presents early – IDUG is coming to Phoenix.  See you there.

——–

Melanie Stopfer is a Consulting Learning Specialist and Developer for IBM Software Group. She is recognized worldwide as an advanced database specialist. As a Certified DB2 LUW 10.5 Technical Expert and Learning Facilitation Specialist, she has provided in-depth technical support to IM customers, specializing in recovery, performance, and database upgrade and migration best practices, since 1988. In 2009, Melanie was the first DB2 LUW speaker to be inducted into the IDUG Speaker Hall of Fame, and she was again selected Best Overall Speaker at IDUG EMEA 2011 and 2012 and IDUG NA 2012. In 2013, IOD rated her presentations in the top two for customer satisfaction.


Data Analytics, or How Much Info for a Buck?


Bill Cole – Competitive Sales Specialist, Information Management, IBM

Leave only footprints; take only pictures.  Have you seen that slogan in a national park?  My wife (she’s now an ex) didn’t believe the signs that told us to leave everything exactly where it was.  She didn’t want to just enjoy the beauty.  She wanted to take some home with us.  The flashing light of the Park Ranger car told me we were in trouble for picking up a few rocks along the side of the road.  The nice man in the Smokey hat told me to put the rocks back.  The scenery is for consumption with your eyes, your camera, not for taking home.  I did as instructed, happy to be leaving with my wallet in one piece.

I’ve always produced data and then turned it into information by adding other bits of data together and adding some context.  My users guided me for a while and then I both guided and pushed them.  This seemed to be the natural order of things, sort of like factories and the folks who buy the goods from those factories.

The IT/BI/DA teams accumulate and store the data and then massage it to build what are essentially standard reports.  Standard reports are good for standard thinking, of course.  If you know the answer you’re looking for, a standard report probably has it in there somewhere, like those old balance sheets and ledgers that I ran so long ago.  But there was nothing in those reports that would help you think outside of the data they contained.  In fact, there was so little insight in them that one of the plant managers actually asked me what good these reports were.  There’s really not a good response to that one.

Insights are gained when the lines of business can chase an idea through all sorts of non-standard iterations.  Almost like chasing one of those happy mistakes from science, like penicillin, or those ubiquitous not-very-sticky note sheets that we all stick all over everything so we can easily keep track of passwords, etc.  LOL, like you haven’t done that.

So how do we get to this idea-chasing sort of thing?  This place where the data analysts or, better still, the line of business user can see something interesting and start chasing it?  This is a custom-developed solution, a virtual pair of bespoke shoes made for your situation and only your situation.  The person in the next cubicle needn’t look over your shoulder.  It would do them no good after all.  There’s a scene in the Maureen O’Hara/John Wayne movie “The Quiet Man” in which John asks for directions and the local says “Do you see that road over there?  Don’t take it, it’ll do you no good.”  Insights are like that.  You need to know not to walk down a road that will do you no good.

The trick, it seems to me, is having the right tools.  Let’s start with the database (you know I’m a practicing DBA and that means all discussions start with the database).  DB2 BLU is exactly the right repository for your decision-making data.  After all, it offers both row- and column-oriented models in a single database!  This means you’re getting performance no matter which way your data chooses to be represented.  Moreover, there are different kinds of compression to ensure you save space and improve performance.  What could be better?  And all for the price of an upgrade!  Easy.  No-brainer.
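To make that row-and-column point concrete, here’s a minimal DDL sketch, with table and column names invented for illustration, showing how DB2 10.5 lets both organizations live side by side in one database:

    -- Row-organized table for transactional work (hypothetical schema)
    CREATE TABLE orders (
      order_id   INTEGER NOT NULL PRIMARY KEY,
      cust_id    INTEGER,
      order_date DATE
    ) ORGANIZE BY ROW;

    -- Column-organized table for analytics; BLU applies its compression automatically
    CREATE TABLE sales_history (
      cust_id    INTEGER,
      order_date DATE,
      amount     DECIMAL(12,2)
    ) ORGANIZE BY COLUMN;

Your queries don’t change either way; the optimizer simply works against whichever organization each table uses.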

There’s a neat coda to this, too.  You’re not confined to the old solution of finding a server, building it and installing the software, then building the database.  Let’s talk choices, folks.  Lots of choices.  Maybe every choice.  On premise, just like we’ve always done, works.  Maybe your own cloud would be better.  Build your BI/DA system in a PureFlex or PureApp or PureData cloud hosted in your own data center.  There’s a simple solution with lots of benefits, including workload management.  Set it and forget it and go on about your business.  Maybe DBaaS works better.  Virtualize the workload and database in an existing private cloud to make use of those “excess” MIPS.  (Parkinson’s Law says that any organization grows to fill all the space available.  I think the demand for MIPS grows to fill the available servers, thus negating the concept of “excess MIPS.”)  There’s SoftLayer for either a public or private cloud.  Remember, they’ll go all the way to bare metal if that’s what you need.  Finally, and maybe best, there’s DB2 BLU available in the cloud. I championed this a while back and it’s now reality.  A pre-configured database that IBM manages and maintains, including backups and upgrades.  Talk about easy!  Go ahead, get some sleep.  We’ve got this one.

One last thought about the tools.  InfoSphere Analytics Server will do the analysis for you and present your users with suggested insights right out of the box.  And it will help the folks find their own insights by helping them look at, filter and massage the data in any way that suits them.  It’s a cool tool for those times when you need the freedom to find your own way through the forest of data.

Finally, I’ve always kept two Robert Frost poems on my wall.  Perhaps “Two Roads Diverged in a Yellow Wood” is the one for this post.  We in IT need to give the folks in the lines of business the right tools to chase down the new roads, new insights.  We’ll give them the GPS for the roads less traveled by.  Good luck on your journeys of exploration!

The other poem is “Stopping By Woods On a Snowy Evening,” of course.  We all have miles to go before we sleep, before our work is complete, and using the right tools makes those miles ever so much more productive.  Bundle up on those snowy evenings and enjoy the ride.

Follow Bill Cole on Twitter : @billcole_ibm

Visit the IBM BLU HUB to learn more about the next gen in-memory database technology!

Data Warehousing on the Cloud – Twitterchat

#IBMBLU Twitterchat on Wednesday March 26th – 11 am to 12 pm ET

Enterprises thrive when the primary focus is on innovation and achieving business goals, without distractions related to the underlying infrastructure or large capital expenditures. Cloud computing technology is all the rage right now, due to the flexible infrastructure and services it provides and the agile methodology that enterprises can adopt as a result.

Data warehousing and analytic solutions help businesses glean valuable, game-changing insights from the huge torrent of data that heads their way every day. Traditionally, this would mean complex data management environments, a large team of experts and investments in infrastructure and technology to stay at the forefront of emerging trends.

The emergence of data warehousing solutions on the cloud helps simplify and accelerate the delivery of insights from data, without the need for sophisticated environments and big budgets. BLU Acceleration for Cloud from IBM breaks the cost and complexity barriers and brings market-leading technology to everyone who needs to analyze vast amounts of data. Powered by IBM BLU Acceleration, breakthrough in-memory database technology, and combined with Cognos BI, it gives users a simple, agile cloud solution.

Join us for an #IBMBLU Twitterchat on the topic of data warehousing on the cloud – March 26th, from 11:00 am to 12:00 pm ET.

Our special guests for this Twitterchat are Antonio Cangiano – IM Cloud Computing Center of Competence and Evangelism; Leon Katsnelson – Director, IM Cloud Computing & Mobile; Adam Ronthal – Product Marketing & Strategy, Big Data, Cloud, Appliances; Grant Hutchison – Educator, Big Data University; and Rav Ahuja – Senior Program Manager, Cloud Computing and Big Data.

You can follow along and join the discussion using the hashtag #IBMBLU.

We look forward to your thoughts and comments on the #IBMBLU twitterchat hosted by @IBM_DB2.

Here are some questions that we’ll discuss on the twitterchat, and additional reference topics to help inspire the conversation.

  • What use cases and applications for analytics and data warehousing in the cloud do you see as particularly relevant?
  • What are your biggest concerns about analytics and data warehousing in the cloud?
  • What are the biggest advantages you see in embracing cloud for these types of applications?
  • What sources of data do you see as most relevant for data warehousing in the cloud?
  • How can cloud computing augment your existing on-premise systems?
  • What analytic tools do you need to be able to use in the cloud? (BI, predictive modeling, ETL, etc…)
  • How big are the datasets you would want to use cloud data warehousing for?
  • What is the best approach to securing data in the cloud?

The Enterprise Data Warehouse is Virtualizing into the Big-Data Cloud – James Kobielus
What will you do with your 5 hours? – Grant Hutchison
When is the Cloud Appropriate for Data Warehousing and Analytics? – Adam Ronthal
BLU Acceleration for Cloud Is Now Available to Everyone – Antonio Cangiano

How do you join in?

If you use a Twitter client like Tweetdeck or HootSuite, create a search column for the term ‘#IBMBLU’. Then as participants tweet with the #IBMBLU hashtag, those tweets will appear in your column.

How do you participate?

It’s as easy as BLU. Read – Tweet – Share!

Review the discussion questions posted in advance so you can prepare your thoughts and answers. When a question is posed, begin your response with A1: for question 1, A2: for question 2, and so on. This makes it easier to follow the conversation throughout the chat. No answer is wrong!


Speed & Simplicity


Larry Heathcote, Program Director, IBM Data Management

I’ve been thinking a lot about the database and data warehousing markets lately. Like “shower moment” thinking – you know, when you’re really passionate about something, you think about it in the shower. Well, for me, yesterday I had a “highway moment.”

I was driving down one of the major roads in Raleigh, NC on my way to have dinner with my wife. And traffic started to get a little heavy, so everyone had to slow down, and slow down, and slow down some more. We didn’t stop, but we were moving painfully slow. We were all going somewhere, just not very fast. Frustration set in.

And at that moment, it hit me. I felt just like Joe. Joe is a data warehousing architect I met a few weeks back when I was out on a speaking trip (Joe is not his real name). Joe and his team have been working on a really important project for the past couple months, but they’ve run into some performance challenges that may jeopardize their delivery date. Every day they were making progress, very slow progress. And Joe was getting frustrated. Just like me sitting in slow traffic.

Joe’s company manufactures and sells products through a number of brick-and-mortar stores as well as online. A big percentage of their sales are to repeat customers. And recently, a couple of their products had some quality issues, which swamped their call centers for a few weeks. The merchandising managers now want to find out if the quality issues in one of their product lines are having a ripple effect on other product areas. So they asked Joe to give them the data and the analytics they need to gain this insight. And they want answers, like now!

Joe and his team recently migrated one of the company’s core customer databases onto a larger server and bought more storage. Then they integrated call center logs from their support centers, but they found that they were not getting the query performance they had expected. They fixed a few problems and cut query times significantly, but when stress testing the system they could just not get the response times that the merchandising managers wanted.

And that’s when it hit me for a second time…Joe’s challenges are not unique. Increasing data volumes, upgrading infrastructures, mixing in new data types and doing new types of analytics – there are a lot of companies going through this right now to satisfy increasing demands from line of business managers.

What Joe needs is this – he needs speed and simplicity – a next-generation database for the big data era. One that can handle all data types, SQL and NoSQL, transactional and mixed analytics workloads, and take advantage of modern technologies like in-memory columnar processing and other performance-accelerating techniques. And one that is easy and fast to set up, deploy and manage; one that would take fewer resources to manage, so Joe and his team could focus on additional innovative projects to support their business managers.

What Joe needs is IBM DB2 with BLU Acceleration. View the infographic here to find out why.


DB2 with BLU Acceleration and Intel – A great partnership!


Allen Wei, DB2 Warehousing Technology, System Verification Test, IBM

DB2 with BLU Acceleration is a state-of-the-art columnar-store RDBMS (Relational Database Management System) masterpiece that combines and exploits some of the best technologies from IBM and Intel. In the video linked below, there is mention of an 88x speedup when compared with the previous generation of row-store RDBMS on the exact same workload. That announcement was made at IBM IOD in November 2013.

Guess what? In a test done a few days ago (less than 3 months after the video was filmed), the speedup, again comparing DB2 with BLU Acceleration against a row-store RDBMS using the exact same workload, on new Intel Xeon E7 v2 (Ivy Bridge-EX) based hardware, is now 148x. Really? Need I say more? This shows that not only is DB2 with BLU Acceleration equipped with innovative technologies, but it also combines the exact set of technologies from both RDBMS and hardware advancement that you really need. This helps BLU Acceleration fully exploit hardware capabilities to the extreme and gives you the best ROI (Return on Investment), something every CTO dreams about.

You might start wondering if this is too good to be true. I have shown you the numbers. So, no, it is the truth. You might want to ask, even if this is true, is it complicated? Well, it does take discipline, innovative thinking and effort to offer technologies like this. However, my answer is again no. It’s completely the opposite! In fact, as seen on TV (the video clip), it’s as simple as: create your tables, load your data and, voila, start using your data. There is no need for extensive performance tuning, mind-boggling query optimization or blood-boiling index creation. Leave these tasks to DB2 with BLU Acceleration.  Can’t wait to try it for yourself? It really is that fast and simple.
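To give you a feel for it, here’s what “create, load and go” can look like from the DB2 command line. This is a minimal sketch, with the database, table and file names invented for illustration:

    # Make column organization the default for databases created after this
    db2set DB2_WORKLOAD=ANALYTICS
    db2 "CREATE DATABASE SALESDB"
    db2 "CONNECT TO SALESDB"
    # Create: the table comes out column-organized by default
    db2 "CREATE TABLE sales (sale_date DATE, store_id INT, amount DECIMAL(12,2))"
    # Load: no indexes to define, no tuning knobs to set first
    db2 "LOAD FROM sales.del OF DEL REPLACE INTO sales"
    # Go: start querying immediately
    db2 "SELECT store_id, SUM(amount) AS total FROM sales GROUP BY store_id"

That really is the whole routine.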

Do you need to hear more before you are 100% convinced? Let me recall a few key disruptive technologies that are built into DB2 with BLU Acceleration. These are mentioned in the video clip as well, and I will show you that we are not listing them just for the sake of listing them.

What state-of-the-art technology was built into DB2 with BLU Acceleration that makes it so great? Here is a summary of what you saw in the video clip:

# Dynamic In-Memory Technology – loads terabytes of data into random access memory instead of hard disks, streamlining query workloads even when data sets exceed the size of the memory.

  • This allows the CPU to operate efficiently without waiting on disk I/O operations
  • In one of my tests, I could fit a 2TB database into 256GB of RAM, or 1/8 of the database size
  • In another test, I could fit a 10TB database into 1TB of RAM, or 1/10 of the database size.

# Actionable Compression – Deep data compression, plus the ability to perform actions directly on compressed data

  • Deep compression
  • I noticed storage space consumption that was 2.8x – 4.6x smaller than the corresponding row-store database, depending on the size of the database
  • Data can be accessed as-is, in compressed form; no decompression needed
  • The CPU can dedicate its power to query processing rather than decompression algorithms.

# Parallel Vector Processing – Fully utilize available CPU cores

  • Vectors are processed more efficiently, so CPU efficiency increases
  • All CPU cores are fully exploited.

# Data Skipping – Jump directly to where the data is

  • We do not need to process irrelevant data.

Have you been convinced yet? I know you have. However, you don’t need to just take my word for it. Try it. The time you spent reading this blog and trying to find a loophole is enough to stand up a high-performance database from scratch. Period.

Read more about the Intel and BLU Acceleration partnership here: DB2 BLU Acceleration on Intel Xeon E7 v2 Solutions Brief

Allen Wei joined IBM as a developer for the BI OLAP product line, including OLAP Miner. He was a key member of the InfoSphere product line, and has led InfoSphere Warehouse and DB2 LUW SVTs. Currently he focuses on tuning the performance of BLU Acceleration, mainly with respect to the Intel partnership.

Visit the IBM BLU HUB to learn more about the next gen in-memory database technology!

Also check out this post on the Intel blog about how IBM and Intel have been working together to extract big data insights.

Introducing the IBM BLU Acceleration Hub


John Park, Product Manager – DB2, BLU Acceleration for Cloud

Hemingway once wrote, “There is nothing to writing. All you do is sit down at the typewriter and bleed.” He also wrote, “The best way to find out if you trust someone is to trust them.”

So when Susan Visser “pinged” me on IBM’s venerable Sametime system asking me to blog for the launch of ibmbluhub.com, my immediate response was, “I don’t blog, I only know how to post pictures of my pets and kid to Facebook.” She responded, “It’s easy, trust me.” Hence the quotes.

So here I am. And who am I? Well, my name is John Park. I am an IBMer, I am a DB2 Product Manager, and as of today, I am a blogger (?).

My IBM life has revolved around DB2 and the analytics space, starting off as a developer building the engn_sqm component (think snapshot, event monitor and governor) – so if your stuff is broken, it’s probably my fault.

Then I moved into the honorable realm of product management, leading the charge on products such as Smart Analytics Systems, PureData for Operational Analytics and now BLU Acceleration for Cloud … which is why I guess I’m here.

On a personal note, I like to build stuff – specifically, I like to build cool stuff, and BLU Acceleration is freakin’ cool. When I think about the significance of this technology, I think back to fixing Version 7 of DB2, building V8, and writing my last piece of code in V9.5. All along the way, the DB2 team was building features and products that helped our customers and our users use DB2.

Personally, I see BLU as a convergence point, the pinnacle where all the years of engineering and thought leadership have finally come to “eureka”.  Let me guide you through my thinking …

Autonomic features such as Automatic Maintenance, Self-Tuning Memory Management and Automatic Workload Management were all incremental steps along DB2 version releases; each fixed a problem the DB2 user had and improved the usage of DB2.

DB2’s compression story started with row compression and index compression, then went to adaptive row compression, and now to actionable compression, and with each compression story came a better value proposition for the DB2 user.  (Note that the word compression is used 6 times!)
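For the syntax-minded, here is roughly how that progression surfaces in the DDL; a sketch only, with placeholder table, column and index names:

    -- Classic (static) and adaptive row compression; the ADAPTIVE keyword arrived in DB2 10.1
    CREATE TABLE txns  (id INT, note VARCHAR(200)) COMPRESS YES STATIC;
    CREATE TABLE txns2 (id INT, note VARCHAR(200)) COMPRESS YES ADAPTIVE;

    -- Index compression
    CREATE INDEX txns_ix ON txns (id) COMPRESS YES;

    -- Column-organized (BLU) tables: actionable compression is automatic, no clause needed
    CREATE TABLE txns_hist (id INT, note VARCHAR(200)) ORGANIZE BY COLUMN;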

DB2’s performance optimization journey went from database partitioning and MPP to table and range partitioning, continuous ingest, multi-temp and workload management, making DB2 a leader in performance across all workloads.

Usability in its simplest form, value-driven compression and unprecedented performance are the three key tenets of the development of BLU. These features improved DB2 incrementally between versions, and as the product grew, our engineers’ experience and creativity expanded. With BLU, we see these features, and the knowledge gained from developing them, transform into and support the simplest statement I have ever seen in enterprise software – “Create, Load and GO”. Simply amazing.

Welcome to the world, “ibmbluhub.com”. For the readers, enjoy the ride; this is the first step in a new direction for data management and technology. And I haven’t even talked about BLU Acceleration for Cloud ….

Until then, here is a picture of my cat.

John Park

DBaaS Explained, or Miracles in Minutes


Bill Cole – Competitive Sales Specialist, Information Management, IBM


Don’t you love installing databases?  I mean the whole stack, from the database software through the tools and applications.  Lots of real innovation there, eh?  How many times have you done this sort of thing?  My friend Don was sent to install some very complex software for a client.  The scope was one (count ‘em, one) environment.  So Don calls me the first day he’s there and says the project wants – and this is absolutely true – seventeen copies of the environment.  As an MIS guy, he was offended.  What in the world did they need with so many copies of the environment?  Turns out that every developer wanted his/her own copy.  Given that the install process for any single environment was two weeks and Don had three weeks, all those copies weren’t happening.  Don politely explained the situation, including the fact that the disk space available would barely accommodate a single installation.  Disappointment and relief.  Guess who experienced which emotion.

I’ve written on this topic before in relation to patterns.  This time I’d like to talk about an old or new concept, depending on who you ask: Database as a Service.  Frankly, the naming is unfortunate since the concept is about providing a complete environment for an application, not just the database.  Being a DBA, the database is the most important part, of course.  LOL.

During my tenure as a data center director, I thought of my infrastructure team as providing a service not only to the IT department but to the rest of the company as well.  The real trick there was getting my team – and the CIO – to understand that.  We’re IT after all.  Miracles are our stock in trade.  It’s what we did every day.  The application folks came to rely on our miracles, too.

As we built the data center – I had the luxury of starting from scratch – we agreed on the number of environments and their specific uses, etc.  This lasted for almost two weeks once we actually got started.  From the target of three we had agreed to, my team was challenged to create and manage seven, each with its own rules, users, priorities, sources and schedules.  The application team didn’t think we could do it.  I created the processes and procedures.  And I participated in them, too.  Shared misery.  LOL.  The apps folks would challenge us to do something in a hurry and I’d quote a short turn-around and then beat it by 50% most of the time.  It both annoyed and amazed.

This was our own DBaaS, but we should first define the term.  DBaaS is initiated by a user asking for an application environment to be provisioned.  Note it’s the full application environment including the database, application code, any web- or form-based pieces, plus the ancillary tools.  Crucially, the request includes three other critical pieces of information.  First is the sizing.  Is this a small, medium, large or massive environment?  Second, we need to understand how the environment will be used.  Is this a sandbox, training, development or Production implementation?  Third, some other bits of information need to be collected as well.  Which department is getting billed for this?  After all, IT didn’t just collectively decide to create this environment.  Perhaps we should add something about managing this collection.  Are we applying patches or updates?  What’s the life-span?  A few days or weeks or years?  SLA?  It’s not simply gimme a copy of Prod.  We need a complete description.

Note the assumptions we make in DBaaS.  First, we assume there’s a “golden” or benchmark source from which to create the environment.  If not, it’s a whole new installation and the price just went up.  Second, that we have server resources available, probably virtualized servers, too!  And that provisioning this new environment won’t step on any existing environments.  Third, that there’s disk space available.  Again, this is space that won’t step on the performance of any other environment.  I add this caveat because I once had a CIO call me out of the blue and ask me about the rules for putting data on his expensive SAN.  It seems all those CYA documents and spreadsheets were killing the performance of a critical production database.  If you know of a clearly enunciated set of rules, please share them with me.  Finally, we need to ensure network connectivity and bandwidth.  Don’t forget this critical piece of the pie.  We can’t assume that there’s room in the pipe for an added load.

The pledge we in IT make covers both timescale and accuracy.  We give the requestor a time for completion that may be tomorrow or, at worst, the following day.  I know that sounds wildly aggressive for some types of provisioning, but we have to build our sources and processes so we can install them quickly in a variety of operating environments.

All of the above is where DB2, SoftLayer and PureSystems shine brightly.  DB2 can be provisioned quickly from copies in a number of ways, including cold or warm backups.  PureSystems delivers all the advantages of DB2, and PureApp includes DBaaS tools.  And SoftLayer is really the gold standard for DBaaS.  Just take a look.

Finally, my wife works for a large application development organization masquerading as a bank.  She gets very frustrated with environment management and provisioning.  Fortunately, I understand both sides of that story, so I sit and listen patiently while she tells me what the bloody-minded idiots who manage their various dev/test/prod environments have done to make her life difficult today.  The thing that gets her maddest is that no one seems to know how to provision any new environments.  With dozens of infrastructure staff, they can’t, because no one has really thought through the process.  So the dev teams just sit around waiting.  Losing precious time.  It’s really not a full-service bank, I guess.  Don’t expect a miracle.

Follow Bill Cole on Twitter : @billcole_ibm

Psst…. What’s your database strategy for the world of Big Data?


Radha Gowda, Product Marketing Manager, DB2 and related offerings

Alice said, “Would you please tell me which way to go from here?” The cat said, “That depends on where you want to get to.”  ― Lewis Carroll

The rapid rise and ubiquity of mobile and social applications are stimulating increased levels of interaction and creating an explosion of new unstructured data.  Big data technologies make it possible to obtain deeper insights for improved business productivity from this unstructured data, within the context of the structured/traditional data that organizations already have.  In fact, according to the NewVantage Partners Big Data Executive Survey 2013, the areas of most interest for big data were customer transaction data, financial data, market data and behavioral data.

While this explosion of unstructured data is creating new opportunities, it is also posing challenges – existing systems are struggling to keep pace with this data growth, and in many cases are causing performance issues, unplanned downtime, missed service-level agreements (SLAs) and escalating IT costs.

So what does this mean to you?   It’s clear your business needs a cost-effective database solution that can help you manage data growth and tap into the insights locked within big data; a database solution that is future-proof and works well with the new generation of applications; a database solution with in-memory and columnar technologies; a database solution that is highly available and scalable; a database solution that is cloud ready, mobile-friendly, and secure.

You need a strategy to get there. To start, ask yourself a few questions:

  • Can your present systems handle your projected data growth?
  • Are you meeting SLAs for availability and performance?
  • Do you take advantage of emerging technologies that can improve the productivity of your business users?

If you struggle with any of these questions, read Database strategies for the world of big data to gain competitive advantage.

Follow Radha on Twitter @rgowda

 

Data Warehousing on the Cloud with BLU Acceleration


Adam Ronthal - Technical Marketing, BLU Acceleration for Cloud, IBM

I recently took on a new role at IBM focused on technical marketing for BLU Acceleration for Cloud, IBM’s new cloud-based agile data warehousing solution.

Why cloud?  And specifically, why analytics in the cloud?  Analytics, long recognized as a competitive differentiator, has traditionally required significant resources, both in skills and in capital investment, to enter the game.  Most on-premise data warehouses have at least a six-figure price tag associated with them, with many implementations costing millions.  And while you do get significant value and performance with an on-premise implementation, that capital investment means longer procurement lead times, and longer lead times in general to ramp up an analytics project.

Cloud computing represents a paradigm shift… now even small organizations with limited budgets and resources can access the same powerful analytic technology leveraged in the most advanced analytic environments.  BLU for Cloud is a columnar, in-memory solution that brings appliance simplicity and ease of use for data warehousing and analytics to everyone — all for less than the price of a cup of coffee per hour.[1]

BLU for Cloud is perfect for:

  • Pop-up Analytics Environments – need a quick, agile data warehouse for a temporary project?  Put it in the cloud!
  • Dev/Test Environments – Yes, it’s compatible with the enterprise databases already in use within your organization because it’s based on DB2, an industry standard!
  • Analytic Marts – Augment and modernize your existing data warehouse infrastructure by leveraging cloud flexibility
  • Self Contained Agile Data Warehousing - leverage BLU for Cloud for almost any analytics application

Come find out more at my PULSE and TDWI sessions in Las Vegas next week!

At PULSE:  Tuesday, Feb 25, 5:00pm at the Expo Theater

At the TDWI World Conference:  Wednesday, Feb 26, 12:35pm

Or check out the BLU for Cloud website at http://www.bluforcloud.com for more details.


[1] To be fair, we’re probably talking Starbucks, not Dunkin Donuts…

More on the author - 
Adam Ronthal has worked in the technology industry for over 19 years in technical operations, system administration, and data warehousing and analytics. In 2006, Adam joined Netezza as a Technical Account Manager, working with some of IBM Netezza’s largest data warehousing and analytic customers and helping them architect and implement their Netezza-based solutions. Adam led the team to write the Netezza NZLaunch Handbook, a practical implementation guide for IBM Netezza customers, and served as editor of the final guide. Today, Adam works in technical marketing for IBM’s Cloud, Big Data, and Appliance offerings. Adam is an IBM Certified Specialist for Netezza, and holds a BA from Yale University.

Here’s an interesting video on BLU for Cloud:
