Exclusive Opportunity to Influence IBM Product Usability: Looking for Participants for Usability Test Sessions – Data Warehousing and Analytics

By Arno C. Huang, CPE
Designer, IBM Information Management Design
The IBM Design Team is seeking people with a variety of database, data warehousing and analytics backgrounds to participate in usability test sessions. We are currently looking for people who work in one of the following roles: DBA, Architect, Data Scientist, Business Analyst or Developer. As a test participant, you will provide your feedback about current or future designs we are considering, thus making an impact on the design of an IBM product and letting us know what is important to you.

Participating in a study typically consists of a web conference or on-site meeting scheduled around your availability. IBM will provide you with an honorarium for your participation. There are several upcoming sessions, and we’ll help you find one that best suits your schedule. If you are interested, please contact Arno C. Huang at achuang@us.ibm.com.

Troubles Are Out of Reach With Instant Insights

By Radha Gowda
Product Marketing Manager, DB2 and related offerings

Bet you have been hearing a lot about shadow tables in DB2 “Cancun Release” lately.  Umm… do shadow and Cancun remind you of “On the Beach” by Cliff Richard and The Shadows?  Seriously, DB2 shadow tables can have you dancing rock ’n’ roll on the beach, because you will be trouble free with real-time insights into your operations and, of course, lots of free time.

What is a shadow table?

Shadow tables have been around since the beginning of modern computing – primarily for improving performance.  So what does the DB2 shadow table offer? The best of both the OLTP and OLAP worlds!  You can now run your analytic reports directly in the OLTP environment with better performance.

Typically, organizations have separate OLTP and OLAP environments – either due to resource constraints or to ensure the best OLTP performance.  The front-end OLTP workload is characterized by very small but high-volume transactions, with indexes created to improve performance.  In contrast, the back-end OLAP workload has long-running complex transactions that are relatively few in number; indexes are created here too, but they may differ from the OLTP indexes.  Of course, an ETL operation must transfer data from the OLTP database to the OLAP data mart/warehouse at intervals that may vary from minutes to days.

DB2 can help you simplify your infrastructure and operations with shadow tables. A shadow table is a column-organized copy of a row-organized table within the OLTP environment, and it may include all or a subset of the source table’s columns.  Because the table is column organized, you get the enhanced performance that BLU Acceleration provides for analytic queries.

How do shadow tables work?


A shadow table is implemented as a materialized query table (MQT) that is maintained by replication. IBM InfoSphere Change Data Capture for DB2, available in the advanced editions, maintains shadow tables through automatic, incremental synchronization with their row-organized source tables.
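To make this concrete, here is a minimal sketch of the DDL involved, assuming a hypothetical row-organized source table named SALES; the exact clauses are worth checking against the DB2 “Cancun Release” documentation:

    -- a shadow table is an MQT: column-organized, maintained by replication
    CREATE TABLE sales_shadow AS
      (SELECT * FROM sales)
      DATA INITIALLY DEFERRED
      REFRESH DEFERRED
      ENABLE QUERY OPTIMIZATION
      MAINTAINED BY REPLICATION
      ORGANIZE BY COLUMN;

    -- take the new table out of set-integrity pending state;
    -- from here on, InfoSphere CDC (not DB2 itself) keeps it in sync
    SET INTEGRITY FOR sales_shadow ALL IMMEDIATE UNCHECKED;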

All applications access the row-organized table by default, but the DB2 optimizer performs latency-based routing to determine whether a query should be answered by the shadow table or by the row-organized source.
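As a rough sketch of how a session opts in to that routing (the refresh-age value here is illustrative, not a recommendation):

    -- let the optimizer consider replication-maintained MQTs (shadow tables)
    SET CURRENT MAINTAINED TABLE TYPES FOR OPTIMIZATION REPLICATION;

    -- route analytic queries to the shadow table only when its data is
    -- no more than 5 minutes behind the source (duration format hhmmss)
    SET CURRENT REFRESH AGE 500;

If the replication latency ever exceeds the refresh age you set, DB2 simply answers the query from the row-organized source instead.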

A truly flexible and trouble-free OLTP world

Shadow tables offer the incredible speed you have come to expect from BLU Acceleration while the source tables remain row-organized to best suit OLTP operations.  In fact, with shadow tables, the performance of analytical queries can improve by 10x or more, with equal or greater transactional performance*.

With instant insight into “as it happens” data for all your questions, and all the free time you’ll have with no more indexing and tuning, what’s not to like? Try DB2 today.

* Based on internal IBM testing of sample transactional and analytic workloads by replacing 4 secondary analytical indexes in the transactional environment with BLU Shadow Tables. Performance improvement figures are cumulative of all queries in the workload. Individual results will vary depending on individual workloads, configurations and conditions.

Is Your Database a Hero or a Hindrance?

Kelly Schlamb
DB2 pureScale and PureData Systems Specialist, IBM

Here’s a big question for you – Is your database a hero or a hindrance? In other words, is your database environment one that’s helping your organization meet your performance, scalability, and availability needs or is it holding you back from meeting your SLAs and keeping up with ever changing business needs?

Join me for an InformationWeek webinar on this topic next week — Thursday, September 4th at 12pm EDT — where I’ll be talking about these types of challenges faced by IT organizations and how DB2 has the capabilities to address them.  News about some of these capabilities will be hot off the press, so you won’t want to miss it.

Click here to register


Steps toward the Future: How IBM DB2 is changing the Game

Tori McClellan
Super Awesome Social Media Intern

Welcome to the New Age of database technology!

IBM DB2 with BLU Acceleration changes the game for in-memory computing.  Due to the importance of in-memory computing, we created a dedicated website to take you through all the details, references, and more: www.ibmbluhub.com !  This website is in place to help clients and prospects understand what next-gen in-memory computing can do for them and why IBM BLU is the ideal in-memory database to deliver fast answers.

A few examples of how IBM BLU has helped other clients find their ideal balance between speed and quality:

  1. Regulatory reporting is a huge challenge for all banks - Handelsbanken, one of the most profitable banks in the world, currently produces reports monthly but is expected to produce them daily in the near future. DB2 with BLU Acceleration has helped Handelsbanken analysts get the data they need for daily reports via the columnar store. Learn more by watching this video: http://bit.ly/1u7urAA
  2.  Deploying DB2 with BLU Acceleration is simple - with only a handful of commands, you can turn on analytics mode, create a new database or auto-configure an existing one to make the best use of your hardware for analytics, and then load the data (see the sketch after this list). Learn more from this IBM Redbook, which introduces the concepts of DB2 with BLU Acceleration from the ground up and describes the technologies that work hand-in-hand with BLU Acceleration: Architecting and Deploying IBM DB2 with BLU Acceleration in Your Analytical Environment.
  3.  Get the FACTS and stay current by subscribing to the ibmbluhub.com newsletter.
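For a feel of what that handful of commands looks like, here is a minimal sketch from the DB2 command line; the database, table, and file names are hypothetical:

    # turn on analytics mode, then create the database so it is
    # auto-configured for analytic workloads
    db2set DB2_WORKLOAD=ANALYTICS
    db2 "CREATE DATABASE SALESDW"
    db2 "CONNECT TO SALESDW"

    # in analytics mode, new tables default to ORGANIZE BY COLUMN
    db2 "CREATE TABLE sales (sale_date DATE, store_id INT, amount DECIMAL(10,2))"

    # load the data ... and go
    db2 "LOAD FROM sales.csv OF DEL INSERT INTO sales"

Note there is no index creation or tuning step after the load; you simply start querying.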

- IBM DB2 with BLU Acceleration is a revolutionary technology and delivers breakthrough performance improvements for analytic queries by using dynamic in-memory columnar technologies.

- Unlike other vendor solutions, BLU Acceleration allows unified processing of online transaction processing (OLTP) and analytics data inside a single database, removing barriers and accelerating results for users. With observed hundredfold improvements in query response time, BLU Acceleration provides a simple, fast, and easy-to-use solution for the needs of today’s organizations; quick access to business answers can be used to gain a competitive edge, lower costs, and more.

- Subscribe to the newsletter to continue learning about this hot in-memory database.  You will receive a periodic iNews email, which links to what’s new.  Just click and learn: http://www.ibmbluhub.com/blu-inews/


If this information suits your needs, be sure to follow @IBM_DB2 on Twitter to get the news as it is published.

How to Revolutionize Analytics with Next-Generation In-Memory Computing

By Les King
Director, Big Data, Analytics and Database Solutions – Information Management, Software Group

We are now in the era of cognitive analytics: analytic processes that provide useful information with a timeliness that qualifies as “speed of thought.” More and more clients are leveraging the next generation of analytic computing to address business challenges that could never be handled before.

To understand this idea, here’s a fun video that explains this theory a little better and gives a real business example of exactly this: What do chicken dinners have to do with IBM?

As another example, just recently a few friends and I were looking for a coffee shop that had both WiFi and a table near a working power outlet. We were surprised to discover that a coffee shop in the area was analyzing the information from our mobile devices and was able to let us know that it had what we were looking for. Coffee shops are all over the place, but that real-time analytics and communication with us was what made the difference. The coffee shop doing the real-time analytics ended up getting our business.

What do the two business examples above have in common? They both require analyzing large volumes of information and taking action on it very quickly. One of the key technologies allowing clients to accomplish this is in-memory computing. Hardware can handle an ever-increasing volume of memory and processing power, and there have also been amazing strides in the area of data compression. Vendors who provide the ability to analyze data in memory, while it is still compressed, will have a huge advantage with these analytic workloads.

An example of this is IBM’s DB2 with BLU Acceleration, which provides an average 10x (90%) compression rate. This means 1 TB of data can be stored in about 100 GB of space. DB2 with BLU Acceleration keeps data in memory in its compressed form, using less memory to store vast amounts of business data. More importantly, DB2 with BLU Acceleration can analyze the data while it is compressed. This combination of capabilities positions DB2 with BLU Acceleration as a key technology in the era of big data and cognitive analytics.

When you consider the business examples above, you can see the competitive advantage these companies will have. These next generation analytic infrastructures, which leverage in-memory computing, will allow these companies to grow their business and take clients from their competitors.

To hear another example of how this modernization of a company’s analytic infrastructure is helping solve real-world business challenges, check out the upcoming webinar “How to Revolutionize Analytics with Next-Generation In-Memory Computing,” taking place on Sept 25 at 12:00 EDT.


Tweetchat on Fraud Prevention in Banking

By Radha Gowda
Product Marketing Manager, DB2 and related offerings

On August 7, 2014, at 11 AM EDT, the IBM Data Management team is privileged to have Robert L. Palmer, James Kobielus, and Wilson Davis join us on a tweetchat to share their expertise on #FraudPrevention in banking.  Some topics on which we will be soliciting your opinions are:

  • Q1: Are fraudulent activities in banking increasing or decreasing? Why?
  • Q2: What are some key business impacts of fraud?
  • Q3: What measures can be taken to identify potential fraudulent transactions?
  • Q4: What analytics do you need to detect fraud?
  • Q5: What data sources can contribute to the analytics?
  • Q6: How can your systems analyze transactions as they occur?
  • Q7: How can new technologies such as in-memory analytics help in fraud detection?
  • Q8: Where can I learn more?

Here’s what you need to do to join our conversation to contribute or just listen:

  • Go to twubs.com or tweetdeck.com
  • Sign in with your Twitter handle
  • Search on #FraudPrevention
  • A new window will open that makes it easy for you to follow and contribute.

If you plan to contribute to our tweetchat, please review the tips on SlideShare, since the chat can be very fast paced. Suggested resources relevant to the topic include:

  1. How to Mitigate Fraud and Cyber Threats with Big Data and Analytics
  2. IBM data management for banking
  3. Best practices to deploy IBM Banking Data Warehouse model to #IBMBLU for production
  4. Attract and retain customers with always-on digital mobile banking services
  5. Fight against fraud in real-time and save on operating expenses
  6. Customize offers to your clients with the data already at your fingertips
  7. One of the world’s top 5 most secure banks is becoming more strategic and more profitable
  8. Regulatory reporting headaches? See how @Handelsbanken solved their reporting challenges

More about our panelists:

Robert L. Palmer (@bigdatabusiness) Global Banking Industry Marketing, Big Data, IBM

Bob’s expertise is applying B2B software to optimize key business processes.  He is a subject matter expert in financial services, and writes about business challenges, Big Data, analytics, CRM, cognitive computing, and information management.

James Kobielus (@jameskobielus) Senior Program Director, Big Data Analytics, IBM

James is a popular speaker and thought leader in big data, Hadoop, enterprise data warehousing, advanced analytics, business intelligence, data management and next best action technologies.

Wilson Davis (@wilsondavisibm) Executive Technical Consultant – Counter Fraud iCoC, IBM

Wilson’s specialties include financial and operational data analytics, counter-fraud and anti-money laundering, straight-through-processing, and game changing improvements in business processes and application systems for the financial services industry.

The data advantage: Creating value in today’s digital world

The IBM Institute for Business Value is looking to understand how organizations around the globe are creating business value from analytics. If you can spare a few minutes to participate in the survey, you’d be among the first to receive a copy of the study when it is released in October 2014: 2014 Analytics Survey

Follow Radha on Twitter @rgowda

Data Analytics, or How Much Info for a Buck?


Bill Cole – Competitive Sales Specialist, Information Management, IBM

Leave only footprints; take only pictures.  Have you seen that slogan in a national park?  My wife (she’s now an ex) didn’t believe the signs that told us to leave everything exactly where it was.  She didn’t want to just enjoy the beauty.  She wanted to take some home with us.  The flashing light of the Park Ranger car told me we were in trouble for picking up a few rocks along the side of the road.  The nice man in the Smokey hat told me to put the rocks back.  The scenery is for consumption with your eyes, your camera, not for taking home.  I did as instructed, happy to be leaving with my wallet in one piece.

I’ve always produced data and then turned it into information by adding other bits of data together and adding some context.  My users guided me for a while and then I both guided and pushed them.  This seemed to be the natural order of things, sort of like factories and the folks who buy the goods from those factories.

The IT/BI/DA teams accumulate and store the data and then massage it to build what are essentially standard reports.  Standard reports are good for standard thinking, of course.  If you know the answer you’re looking for, a standard report probably has it in there somewhere, like those old balance sheets and ledgers that I ran so long ago.  But there was nothing in those reports that would help anyone think outside of the data they contained.  In fact, there was so little insight in them that one of the plant managers actually asked me what good the reports were.  There’s really not a good response to that one.

Insights are gained when the lines of business can chase an idea through all sorts of non-standard iterations.  Almost like chasing one of those happy mistakes from science, like penicillin, or those ubiquitous not-very-sticky note sheets that we all stick all over everything so we can easily keep track of passwords, etc.  LOL, like you haven’t done that.

So how do we get to this idea-chasing sort of thing?  This place where the data analysts or, better still, the line-of-business user can see something interesting and start chasing it?  This is a custom-developed solution, a virtual pair of bespoke shoes made for your situation and only your situation.  The person in the next cubicle needn’t look over your shoulder; it would do them no good after all.  There’s a scene in the Maureen O’Hara/John Wayne movie “The Quiet Man” in which John asks for directions and the local says “Do you see that road over there?  Don’t take it, it’ll do you no good.”  Insights are like that.  You need to know not to walk down a road that will do you no good.

The trick, it seems to me, is having the right tools.  Let’s start with the database (you know I’m a practicing DBA, and that means all discussions start with the database).  DB2 BLU is exactly the right repository for your decision-making data.  After all, it offers both row- and column-oriented models in a single database!  This means you’re getting performance no matter which way your data chooses to be represented.  Moreover, there are different kinds of compression to ensure you save space and improve performance.  What could be better?  And all for the price of an upgrade!  Easy.  No-brainer.
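To make that concrete, here is a minimal sketch (hypothetical tables) of the two models living side by side in one DB2 database:

    -- OLTP table: row-organized for fast single-row reads and writes
    CREATE TABLE orders (
      order_id    INT NOT NULL,
      customer_id INT,
      total       DECIMAL(10,2)
    ) ORGANIZE BY ROW;

    -- reporting copy: column-organized so BLU Acceleration speeds the scans
    CREATE TABLE order_history (
      order_id    INT,
      customer_id INT,
      total       DECIMAL(10,2)
    ) ORGANIZE BY COLUMN;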

There’s a neat coda to this, too.  You’re not confined to the old solution of finding a server, building it, installing the software, and then building the database.  Let’s talk choices, folks.  Lots of choices.  Maybe every choice.  On premises, just like we’ve always done, works.  Maybe your own cloud would be better.  Build your BI/DA system in a PureFlex or PureApp or PureData cloud hosted in your own data center.  There’s a simple solution with lots of benefits, including workload management.  Set it and forget it and go on about your business.  Maybe DBaaS works better: virtualize the workload and database in an existing private cloud to make use of those “excess” mips.  (Parkinson’s Law says that any organization grows to fill all the space available.  I think the demand for mips grows to fill the available servers, thus negating the concept of “excess” mips.)  There’s SoftLayer for either a public or private cloud.  Remember, they’ll go all the way to bare metal if that’s what you need.  Finally, maybe best, is DB2 BLU available in the cloud.  I championed this a while back and it’s now reality: a pre-configured database that IBM manages and maintains, including backups and upgrades.  Talk about easy!  Go ahead, get some sleep.  We’ve got this one.

One last thought about the tools.  InfoSphere Analytics Server will do the analysis for you and present your users with suggested insights right out of the box.  And it will help the folks find their own insights by helping them look, filter and massage the data in any way that suits them.  It’s a cool tool for those times when you need the freedom to find your own way through the forest of data.

Finally, I’ve always kept two Robert Frost poems on my wall.  Perhaps “The Road Not Taken” (“Two roads diverged in a yellow wood”) is the one for this post.  We in IT need to give the folks in the lines of business the right tools to chase down the new roads, new insights.  We’ll give them the GPS for the roads less traveled by.  Good luck on your journeys of exploration!

The other poem is “Stopping By Woods On a Snowy Evening,” of course.  We all have miles to go before we sleep, before our work is complete, and using the right tools makes those miles ever so much more productive.  Bundle up on those snowy evenings and enjoy the ride.

Follow Bill Cole on Twitter : @billcole_ibm

Visit the IBM BLU HUB to learn more about the next gen in-memory database technology!

DB2 with BLU Acceleration and Intel – A great partnership!


Allen Wei, DB2 Warehousing Technology, System Verification Test, IBM

DB2 with BLU Acceleration is a state-of-the-art columnar-store RDBMS (Relational Database Management System) masterpiece that combines and exploits some of the best technologies from IBM and Intel. In the video linked below, there is mention of an 88x speedup compared with the previous generation of row-store RDBMS on the exact same workload. That announcement was made during IBM IOD in November 2013.

Guess what? In a test done a few days ago (less than 3 months after the video was filmed), the speedup, again comparing DB2 with BLU Acceleration against a row-store RDBMS using the exact same workload, this time on new Intel Xeon E7 v2 (Ivy Bridge-EX) based hardware, is now 148x. Really? Need I say more? This shows that DB2 with BLU Acceleration is not only equipped with innovative technologies, but also combines exactly the set of RDBMS and hardware advances that you really need. This helps BLU Acceleration fully exploit hardware capabilities to the extreme and give you the best ROI (Return on Investment) that every CTO dreams about.

You might start wondering if this is too good to be true. I have shown you the numbers, so no, it is the truth. You might want to ask: even if this is true, is it complicated? Well, it does take discipline, innovative thinking and effort to offer technologies like this. However, my answer is again no. It’s completely the opposite! In fact, as seen on TV (the video clip), it’s as simple as: create your tables; load your data; and voilà!! Start using your data. There is no need for extensive performance tuning, mind-boggling query optimization or blood-boiling index creation. Leave those tasks to DB2 with BLU Acceleration.  Can’t wait to try it for yourself? It really is that fast and simple.

Do you need to hear more before you are 100% convinced? Let me recall a few key disruptive technologies that are built into DB2 with BLU Acceleration. These are mentioned in the video clip as well, and I will show you that we are not listing them just for the sake of listing them.

What state-of-the-art technology is built into DB2 with BLU Acceleration that makes it so great? Here is a summary of what you saw in the video clip:

# Dynamic In-Memory Technology – loads terabytes of data into random access memory instead of hard disks, streamlining query workloads even when data sets exceed the size of the memory.

  • This allows the CPU to operate efficiently without waiting on disk I/O operations
  • In my case, in one test, I could fit a 2 TB database into 256 GB of RAM, or 1/8 of the database size
  • In another test, I could fit a 10 TB database into 1 TB of RAM, or 1/10 of the database size.

# Actionable Compression – deep data compression, with the ability to perform actions directly on compressed data

  • Deep compression
  • I noticed storage space consumption 2.8x – 4.6x smaller than the corresponding row-store database, depending on the size of the database
  • Data can be accessed as is in the compressed form, no decompression needed
  • The CPU can dedicate its power to query processing rather than decompression algorithms.

#  Parallel Vector Processing – Fully utilize available CPU cores

  • Vectors are processed more efficiently, hence an increase in CPU efficiency
  • All CPU cores are fully exploited.

#  Data Skipping – Jump directly to where the data is

  • We do not need to process irrelevant data (see the sketch below).
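Here is a minimal sketch of what data skipping does for a query. The table and columns are hypothetical, and the synopsis bookkeeping is automatic rather than something you create:

    -- BLU keeps an internal synopsis of min/max values per block of rows
    SELECT SUM(amount)
    FROM   sales
    WHERE  sale_date >= '2014-01-01';

    -- any block whose synopsis shows max(sale_date) < '2014-01-01'
    -- is skipped outright: never read, never decompressed, never scanned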

Have you been convinced yet? I know you have. However, you don’t need to take my word for it. Try it. The time you spent reading this blog and trying to find a loophole is enough to give yourself a high-performance database from scratch. Period.

Read more about the Intel and BLU Acceleration partnership here: DB2 BLU Acceleration on Intel Xeon E7v2 Solutions Brief

Allen Wei joined IBM as a developer for the BI OLAP product line, including OLAP Miner. He was a key member of the InfoSphere product line, and has led InfoSphere Warehouse and DB2 LUW system verification tests. Currently he focuses on tuning the performance of BLU Acceleration, mainly with respect to the Intel partnership.

Visit the IBM BLU HUB to learn more about the next gen in-memory database technology!

Also check out this post on the Intel Blog about how IBM and Intel have been working together to extract big data insights.

Introducing the IBM BLU Acceleration Hub


John Park, Product Manager – DB2, BLU Acceleration for Cloud

Hemingway once wrote “There is nothing to writing. All you do is sit down at the typewriter and bleed” — he also wrote — “The best way to find out if you can trust somebody is to trust them.”

So when Susan Visser “pinged” me on IBM’s venerable Sametime system asking me to blog for the launch of ibmbluhub.com, my immediate response was “I don’t blog; I only know how to post pictures of my pets and kid to Facebook.” She responded, “It’s easy, trust me.” Hence the quotes.

So here I am, and who am I? Well, my name is John Park. I am an IBMer, I am a DB2 Product Manager, and as of today, I am a blogger (?).

My IBM life has revolved around DB2 and the analytics space, starting off as a developer building the engn_sqm component (think snapshot, event monitor and governor) – so if your stuff is broke, it’s probably my fault.

Then I moved into the honorable realm of product management, leading the charge on products such as Smart Analytics Systems, PureData for Operational Analytics and now BLU Acceleration for Cloud … which is why I guess I’m here.

On a personal note, I like to build stuff – specifically, I like to build cool stuff, and BLU Acceleration is freakin’ cool. When I think about the significance of this technology, I think back to fixing Version 7 of DB2, building V8 and my last piece of code in V9.5. All along the way, the DB2 team was building features and products that helped our customers and our users use DB2.

Personally, I see BLU as a convergence point, the pinnacle of where all the years of engineering and thought leadership have finally come to “eureka.”  Let me guide you through my thinking …

Autonomic features such as Automatic Maintenance, Self-Tuning Memory Management and Automatic Workload Management were all incremental steps across DB2 version releases; each fixed a problem the DB2 user had and improved the usage of DB2.

DB2’s compression story started with row compression and index compression, then went to adaptive row compression and now to actionable compression, and with each compression story came a better value proposition for the DB2 user.  (Note that the word compression is used 6 times!)

DB2’s performance optimization journey went from database partitioning and MPP to table and range partitioning, continuous ingest, multi-temp and workload management, making DB2 a leader in performance across all workloads.

Usability in its simplest form, value-driven compression and unprecedented performance are the three key tenets of the development of BLU. These features improved DB2 incrementally between versions, and as the product grew, our engineers’ experience and creativity expanded. With BLU, we see these features and the knowledge gained from developing them transform into and support the simplest statement I have ever seen in enterprise software – “Create, Load and GO”. Simply amazing.

Welcome to the world, “ibmbluhub.com”. For the readers: enjoy the ride; this is the first step in a new direction for data management and technology. And I haven’t even talked about BLU Acceleration for Cloud yet …

Until then, here is a picture of my cat.


Data Warehousing on the Cloud with BLU Acceleration


Adam Ronthal – Technical Marketing, BLU Acceleration for Cloud, IBM

I recently took on a new role at IBM focused on technical marketing for BLU Acceleration for Cloud, IBM’s new cloud-based agile data warehousing solution.

Why cloud?  And specifically, why analytics in the cloud?  Analytics, long recognized as a competitive differentiator, has traditionally required significant resources — both skills and capital investment — to enter the game.  Most on-premises data warehouses have at least a six-figure price tag associated with them, with many implementations costing millions.  And while you do get significant value and performance with an on-premises implementation, that capital investment means longer procurement lead times, and longer lead times in general to ramp up an analytics project.

Cloud computing represents a paradigm shift: now even small organizations with limited budgets and resources can access the same powerful analytic technology leveraged in the most advanced analytic environments.  BLU for Cloud is a columnar, in-memory solution that brings appliance simplicity and ease of use for data warehousing and analytics to everyone — all for less than the price of a cup of coffee per hour.[1]

BLU for Cloud is perfect for:

  • Pop-up Analytics Environments – need a quick, agile data warehouse for a temporary project?  Put it in the cloud!
  • Dev/Test Environments – Yes, it’s compatible with the enterprise databases already in use within your organization because it’s based on DB2, an industry standard!
  • Analytic Marts – Augment and modernize your existing data warehouse infrastructure by leveraging cloud flexibility
  • Self-Contained Agile Data Warehousing – leverage BLU for Cloud for almost any analytics application

Come find out more at my PULSE and TDWI sessions in Las Vegas next week!

At PULSE:  Tuesday, Feb 25, 5:00pm at the Expo Theater

At the TDWI World Conference:  Wednesday, Feb 26, 12:35pm

Or check out the BLU for Cloud website at http://www.bluforcloud.com for more details.


[1] To be fair, we’re probably talking Starbucks, not Dunkin Donuts…

More on the author – 
Adam Ronthal has worked in the technology industry for over 19 years in technical operations, system administration, and data warehousing and analytics. In 2006, Adam joined Netezza as a Technical Account Manager, working with some of IBM Netezza’s largest data warehousing and analytic customers and helping them architect and implement their Netezza-based solutions. Adam led the team to write the Netezza NZLaunch Handbook, a practical implementation guide for IBM Netezza customers, and served as editor of the final guide. Today, Adam works in technical marketing for IBM’s Cloud, Big Data, and Appliance offerings. Adam is an IBM Certified Specialist for Netezza, and holds a BA from Yale University.

Here’s an interesting video on BLU for Cloud:
