dashDB grows and improves in a flash!

By Dennis Duckworth, 

IBM® dashDB™ continues to grow and improve, with several announcements made on December 18, 2014. Additional plans are now available to everyone, and new features have been added to dashDB.

New Deployment Options

  • Enterprise Plan available to all – The Enterprise Plan for dashDB is a dedicated cloud infrastructure with tera-scale capacity. This offering is now available to anyone. Contact your IBM Information Management Sales representative to get started!
  • Cloudant deployments of dashDB now offer higher capacity, paid usage plans – We are adding an Entry plan that is fully integrated with Cloudant supporting up to 50 GB of uncompressed data, for $50/month. The freemium offering for data usage below 1 GB will remain.
  • Expanded Geographic Presence – dashDB can now be deployed in our UK availability region in addition to our existing North American region.

New Cool and Useful Features

As a result of input from beta program participants, we have added new features to dashDB to make it even more useful:

  • Improved SQL Editor – The SQL query and editor capabilities have been expanded to allow a full range of SQL to be submitted via the web browser, including the ability to load and save SQL scripts. SQL validation and error checking is also included.


  • Better Workload Monitoring – Get a much better idea of what’s running in your dashDB instance, including specific statements and connections. Set it to monitor in near real-time, drill into the details of a session, or terminate a session if needed.


  • Command Line Support – Sometimes you need a command line interface for scripting and automation. dashDB now includes CLPPlus support, giving you a command line interface that lets you connect to databases and define, edit, and run statements, scripts, and commands.
  • UDX Support – Some applications or algorithms require user-defined functions (UDFs) and user-defined aggregates (UDAs). dashDB now supports these out of the box so you can implement and run your own algorithms right inside the database.
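To make the UDF idea concrete, here is a minimal sketch of a SQL scalar function of the kind dashDB's UDX support lets you define in-database. The function name and body are hypothetical, not taken from the announcement; a Python mirror of the same logic is included so the arithmetic can be sanity-checked locally.

```python
# Hypothetical SQL scalar UDF (DB2-style SQL), illustrating the kind of
# user-defined function dashDB's UDX support enables in-database.
CREATE_UDF = """
CREATE OR REPLACE FUNCTION F_TO_C(F DOUBLE)
RETURNS DOUBLE
LANGUAGE SQL
RETURN (F - 32) * 5.0 / 9.0
"""

def f_to_c(f):
    # Python mirror of the UDF body, for local sanity checking.
    return (f - 32) * 5.0 / 9.0

print(f_to_c(212.0))  # 100.0
```

Once such a function is created, it can be invoked in queries like any built-in, keeping the computation next to the data instead of pulling rows out to an application.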

New Security Features and Capabilities

Data security is always a consideration, and dashDB now includes new security features:

  • SSL Support for all Connections – It’s not enough to automatically encrypt data at rest; we need to encrypt it in motion, too. dashDB now supports SSL for all connections to the database.


  • Select Guardium Reports for all Plans – dashDB now has bundled Guardium reports for all plans, including the Enterprise and Cloudant integrated plans. This allows for automatic discovery of sensitive data, as well as access reports and details of SQL statements that were run against that data.


We will continue to add new features and capabilities to dashDB over the coming months, so watch this space!

If you have not started analyzing your data in dashDB, what are you waiting for? Get started with dashDB on Bluemix or Cloudant at dashDB.com.

About Dennis Duckworth

Dennis Duckworth, Program Director of Product Marketing for Data Management & Data Warehousing, has been in the data game for quite a while, doing everything from Lisp programming in artificial intelligence to managing a sales territory for a database company. He has a passion for helping companies and people get real value out of cool technology. Dennis came to IBM through its acquisition of Netezza, where he was Director of Competitive and Market Intelligence. He holds a degree in Electrical Engineering from Stanford University but has spent most of his life on the East Coast. When not working, Dennis enjoys sailing on Buzzards Bay, off his backyard, and he is relentless in his pursuit of wine enlightenment. You can follow Dennis on Twitter.


What the Future Holds for the Database Administrator (DBA)

By Rich Hughes,

Scanning the archives as far back as 2000 reveals articles speculating on the future of the DBA.  With mounting operational costs attributed to the day-to-day maintenance of data warehouses, even 15 years ago this was a fair question to ask.  The overhead of creating indexes and tuning individual queries, on top of the necessary nurturing of the infrastructure, had many organizations looking for more cost-effective alternatives.

The data warehouse appliance was born of the drive to fix the I/O bottleneck that traditionally handicapped data warehouses, and of the design goals of reduced administration and easy data access for users.  Netezza built the original data warehouse appliance, which, by brilliantly combining hardware and software, brought query processing much closer to the data.  This breakthrough paved the way for lower administrative costs and forced others in the data warehouse market to think of additional ways to solve the I/O problem.

To be sure, Netezza's disruptive technology of no indexing, great performance, and ease of administration left many DBAs feeling threatened.  But what was really threatened was the frustrating and never-ending search for data warehouse performance via indexing.  Netezza DBAs got their nights and weekends back, and adjusted by making themselves more valuable to their organizations, using the time saved by no-indexing to get closer to the business.  Higher-level skills taken on by DBAs included data stewardship and data modeling, and in this freer development environment, advanced analytics took root.  In the data warehouse appliance world, much more DBA emphasis was placed on the business applications because the infrastructure was designed to run, for the most part, unassisted.

Fast forward to the current day, where the relentless pursuit of IT cost efficiency while providing more business value continues.  Disruptive technologies have been invented in the past decade to fill this demand, like the Hadoop ecosystem and the maturing cloud computing environment.  Hardware advances have pushed in-memory computing, solid-state drives are phasing out spinning-disk storage, and 128-bit CPUs and operating systems are on the drawing boards.  Databases like IBM’s dashDB have benefitted by incorporating several of these newer hardware and software advances.

So, 15 years into the new millennium, what’s a DBA to do? Embrace change, and realize there is plenty of good news and much data to administer.  While the cloud’s infrastructure and platform services will decrease on-premises DBA work over time, the added complexity will demand new solutions for determining the right mixture of on-premises, off-premises, and hybrid platforms.  Juggling the organization’s data warehouse workload requires different approaches if the cloud’s elasticity and cheaper off-hour rates are to be leveraged.

Capacity planning and data retention take on new meaning in a world where it is now possible to store and access everything, so what is the return value of all that information? The DBA will be involved in cataloging the many new data sources as well as getting a handle on the unstructured data provided by the Internet of Things.  When to move data, whether to persist it, and how it interacts with existing schemas are all good questions for the thoughtful DBA to consider.  And that is just on the ingest side of the ledger.  Who gets access, what the security levels are, how applications can be rapidly developed, how to re-use SQL in a NoSQL world, and how best to federate all this wonderful data are worthwhile areas of study.

In summary, the role of the database administrator has always been evolving, forced by technology advances and rising business demands.  The DBA role has been, and will continue to be, one that requires general knowledge of several IT disciplines, with the opportunity to specialize.  Historically, the DBA who keeps current can go deeper in a particular technology, a move that benefits both their career and their organization’s needs.  The DBA can logically move into an architect or data scientist position, the higher skill sets for today’s world.  What has not changed is the demand to deliver reliable, affordable, and valuable information.

About Rich Hughes

Rich Hughes is an IBM Marketing Program Manager for Data Warehousing.  Hughes has worked in a variety of Information Technology, Data Warehousing, and Big Data jobs, and has been with IBM since 2004.  Hughes earned a Bachelor’s degree from Kansas University and a Master’s degree in Computer Science from Kansas State University.  Writing about the original Dream Team, Hughes authored a book on the 1936 US Olympic basketball team, a squad composed of oil refinery laborers and film industry stage hands. You can follow him on Twitter at @rhughes134.

How Small & Medium Sized Businesses Can Tap Into Big Data & Analytics – Part 2

By Rahul Agarwal, 

In part one of this blog, I explained that choosing the right analytic solution, one that is simple, smart, and agile, is an important step in your journey to becoming an analytics-driven organization.

Another important aspect to consider is the completeness, accuracy, and timeliness of the data being fed into this analytic solution. As data sources increase in number, chances are your data is spread across multiple applications and systems (it may reside in departmental databases or be as granular as user spreadsheets). It is therefore extremely important to have robust information integration capabilities in order to create utmost confidence in the data, and thus in the results of your decision-making. Only with that assurance of trust can you be sure that your analytics initiatives will succeed.

A study conducted by Ventana Research has found that data spread across applications and systems, and multiple versions of the truth are the most frequently cited barriers to efficient information management.[1]

IBM InfoSphere Information Server for Data Integration

IBM InfoSphere Information Server for Data Integration is a market-leading data integration platform that helps organizations ensure that the information that drives their business and strategic initiatives is trusted, consistent and conforms to governance policies. It delivers agile integration capabilities so that businesses can integrate data quickly and flexibly wherever it resides.

The Good News

Here is the good news for you. You can accelerate time to value when deploying a trusted data warehouse analytic solution by taking advantage of the tight, fast connectivity between InfoSphere Information Server and PureData System for Analytics.

Shanghai-based Haitong Securities Co. Ltd has used the combination of IBM PureData System for Analytics appliance and IBM InfoSphere DataStage and IBM Cognos Business Intelligence V10 software to capture data from multiple systems in near-real time and perform extensive data mining and real-time search. The new system is helping the company reduce the time taken to compile comprehensive reports from 10 days to a single day.  In addition, better segmentation and targeting of customers has helped increase cross-selling and up-selling effectiveness, boosting commission revenue by 10%.

PureData System for Analytics N3001 includes software entitlement to 280 PVUs of IBM InfoSphere DataStage 11.3, including two concurrent users of the Designer Client and use of InfoSphere Data Click for self-service data integration to enhance business agility.

In Conclusion

As your business grows, quick and efficient data integration capabilities become increasingly important. They may be the difference between having time to work on innovative new projects and being caught in the drudgery of managing existing data systems. For your business, that can mean quicker, more informed decision-making, leading to a stronger bottom line, better customer service, and competitive advantage.

More information on the IBM PureData System for Analytics N3001-001 and the software entitlements can be found at this link.

About Rahul Agarwal

Rahul Agarwal is a member of the worldwide product marketing team at IBM that focuses on data warehouse and database technology. Rahul held a variety of business management, product marketing, and other roles at companies including HCL Technologies and HP before joining IBM.  Rahul studied at the Indian Institute of Management, Kozhikode and holds a bachelor of engineering (electronics) degree from the University of Pune, India. You can follow Rahul on Twitter at @rahulag80.

[1] marksmith.ventanaresearch.com/2012/08/08/cios-need-to-make-information-management-a-real-priority/

Data Security in dashDB

By Walid Rjaibi,

dashDB is a managed data warehousing and analytics service in the cloud, available on both the Bluemix and Cloudant platforms. For IT professionals, it takes data warehouse infrastructure out of the equation when you must rapidly add analytics services for your organization. For business professionals, it provides a self-service analytics powerhouse in a cloud-easy, load-and-go format. But when you place your data in the cloud, you need to know that it is secure. In this blog, you will learn how dashDB keeps your data secure through encryption, database activity monitoring, deployment hardening, and secure design principles.

Encryption for data at rest

With dashDB, encryption for data at rest is automatic. The encryption uses the Advanced Encryption Standard (AES) in Cipher-Block Chaining (CBC) mode with a 256-bit key. Encryption and key management are totally transparent to applications and schemas. Additionally, the client has the option to indicate, upon provisioning, the master key rotation period. The default is 90 days, but the client may choose a different value. The master key rotation is automatic and transparent. Database and table-space backup images are automatically compressed and encrypted. As with online data, backup images are encrypted using AES in CBC mode with 256-bit keys.
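To show what CBC mode actually does, here is an illustrative sketch of the chaining logic. A toy XOR "cipher" stands in for AES so the block chaining is easy to follow; this is a teaching aid only, and real encryption must use a vetted AES implementation such as the one dashDB employs.

```python
# Toy illustration of Cipher-Block Chaining (CBC): each plaintext block is
# XORed with the previous ciphertext block before being "encrypted".
# A simple XOR with the key stands in for AES here.
import os

BLOCK = 16  # AES block size in bytes

def xor_bytes(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def toy_cbc_encrypt(key16, iv, plaintext):
    # Pad to whole blocks, PKCS#7 style.
    pad = BLOCK - len(plaintext) % BLOCK
    data = plaintext + bytes([pad]) * pad
    out, prev = [], iv
    for i in range(0, len(data), BLOCK):
        mixed = xor_bytes(data[i:i + BLOCK], prev)  # chain with previous ciphertext
        cblock = xor_bytes(mixed, key16)            # stand-in for AES encryption
        out.append(cblock)
        prev = cblock
    return b"".join(out)

def toy_cbc_decrypt(key16, iv, ciphertext):
    out, prev = [], iv
    for i in range(0, len(ciphertext), BLOCK):
        cblock = ciphertext[i:i + BLOCK]
        out.append(xor_bytes(xor_bytes(cblock, key16), prev))
        prev = cblock
    data = b"".join(out)
    return data[:-data[-1]]  # strip PKCS#7 padding

key, iv = os.urandom(BLOCK), os.urandom(BLOCK)
msg = b"sensitive warehouse row"
assert toy_cbc_decrypt(key, iv, toy_cbc_encrypt(key, iv, msg)) == msg
```

The chaining step is why identical plaintext blocks produce different ciphertext blocks, one of the properties that makes CBC suitable for encrypting large, repetitive data sets like warehouse tables.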

Encryption for data in transit

Secure Sockets Layer (SSL) is automatically configured when your dashDB database is provisioned. That is, your database applications can immediately leverage SSL to protect the confidentiality and integrity of the database traffic. The SSL certificate you need to enable your applications for SSL is easily downloadable from the dashDB console. The dashDB console itself is automatically deployed with HTTPS, so all your exchanges with the console are also protected.
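As a sketch, an SSL-enabled connection is typically requested through keywords in the driver connection string. The hostname, port, database name, and certificate path below are placeholders; the actual values and the certificate come from your dashDB console.

```python
# Hypothetical sketch: assembling a CLI/ODBC-style connection string with
# SSL enabled. All concrete values here are placeholders, not real endpoints.
def build_ssl_dsn(host, port, database, uid, pwd, cert_path):
    parts = {
        "DATABASE": database,
        "HOSTNAME": host,
        "PORT": port,          # SSL connections typically use a dedicated port
        "PROTOCOL": "TCPIP",
        "UID": uid,
        "PWD": pwd,
        "SECURITY": "SSL",     # request an encrypted connection
        "SSLServerCertificate": cert_path,  # certificate downloaded from the console
    }
    return ";".join(f"{k}={v}" for k, v in parts.items()) + ";"

dsn = build_ssl_dsn("example.dashdb.host", 50001, "MYDB",
                    "user1", "secret", "/tmp/server_cert.crt")
print(dsn)
```

The same string can then be handed to whatever database driver your application uses; only the security keywords change between a plain and an SSL connection.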

Database activity monitoring

Your dashDB database is continuously monitored through IBM InfoSphere Guardium. The monitoring reports are made available to you easily through the dashDB console. Three different reports are available. The first is a sensitive data report. This allows you to understand what sensitive data might be present in your database (e.g., credit card numbers). The second report is a database connections report. This allows you to understand who is making connections to your database. The third report is an activity report. This allows you to understand who is accessing which objects in your database. There are two versions of this report: A summary version and a detailed version.

Database access control

Database access control starts with the dashDB console, where you define your database users. Your dashDB database also provides a rich set of traditional security capabilities to let you manage who on your team should have access to which objects in your database.  These capabilities include table-level privileges and role-based access control. For example, suppose a Guardium sensitive data report shows that one of your tables includes sensitive data. In this case, you would want to create a role representing the users authorized to access that table, grant access on the table to that role, and then revoke access from everyone else.
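The role-based flow just described can be sketched as the SQL a DBA might run. The table, role, and user names below are hypothetical; the helper simply emits the statements in the order described above.

```python
# Minimal sketch of the role-based access flow described above.
# All object names are illustrative.
def restrict_table_to_role(table, role, members):
    stmts = [f"CREATE ROLE {role}"]
    stmts += [f"GRANT ROLE {role} TO USER {u}" for u in members]
    stmts.append(f"REVOKE SELECT ON {table} FROM PUBLIC")   # shut out everyone else
    stmts.append(f"GRANT SELECT ON {table} TO ROLE {role}") # allow only the role
    return stmts

for stmt in restrict_table_to_role("SALES.CARDHOLDER", "PII_READER",
                                   ["ALICE", "BOB"]):
    print(stmt + ";")
```

Granting through a role rather than to individual users keeps the policy in one place: adding or removing a team member is a single GRANT or REVOKE of role membership.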

Deployment hardening

Both the dashDB database server and the database are hardened. The database server employs a host firewall to protect listening services against port scans and other network security threats. As such, only the required TCP ports are open. CONNECT authority to the database is revoked from PUBLIC, and SELECT privilege on the catalog tables and views is also revoked from PUBLIC. Additionally, the AUTHENTICATION database manager configuration parameter is set to SERVER_ENCRYPT which means that user authentication credentials are never exchanged in clear text between a user application and the database server. These credentials are automatically encrypted with AES 256 when sent over the network regardless of whether SSL is used or not.

Secure design principles

The development of dashDB follows secure development best practices as outlined in the IBM Secure Engineering Framework (http://www-01.ibm.com/software/test/wenses/security/). For example, this includes the completion of a risk assessment and a threat modeling document. Additionally, the IBM Security AppScan tools are regularly used to conduct static and dynamic code analysis during the development process.

About Walid Rjaibi

Walid Rjaibi is the Chief Security Architect for IBM Information Management (IM). He drives the strategy and provides technical oversight for security over a broad set of IM products and cloud services. Prior to his current role, Walid was a Research Staff Member at the IBM Zurich Research Lab in Switzerland, where he established and led a new research program focused on database security and privacy. His research results were the foundation for key security enhancements in IBM’s database products, for which he led the actual development efforts upon his return to the IBM Toronto Lab. Walid’s work has so far resulted in over 20 patents and several publications in the proceedings of leading scientific conferences, such as the International Conference on Very Large Databases (VLDB), the International Conference on Data Engineering (ICDE), and the International Conference on Security and Cryptography (SECRYPT). Walid also speaks frequently at industrial conferences such as the International DB2 User Group (IDUG) and IBM Information on Demand (IOD). You can follow him on Twitter at @WalidRjaibi.

Harness the Experience of Industry Knowledge to Accelerate Success in Data Warehousing

By Elaine Hanley,

“I am always doing that which I cannot do, in order that I may learn how to do it.”
Pablo Picasso

Taking on a data warehouse project is not for the fainthearted. It involves marrying the views of disparate parts of an organization, where not only are the business objectives driven by different needs, but the language spoken is often at odds. A data warehouse project includes the daunting task of integrating the silos that exist in virtually every organization, bringing together information from multiple systems, and addressing organizational misalignment by seeking agreement on the overall approach to measuring business operations.

What tools can such organizations use to reduce the risk of failure? What can help them to accelerate their projects? How can they ensure that they are not paralyzed by indecision and the unknown?

IBM Industry Data Models can help:

  • Learn from the experienced

Our approach to acquiring knowledge has always been to avoid starting from nothing, and we have turned to the experience or output of others, so that we can reuse and modify information to suit our own needs.

  • Speak the same language

Explaining what is required when measuring the business, and translating that specification into the systems and analytics that help the organization navigate risks and spot opportunities, relies on a communication system that interconnects different parts of the organization. When we speak of a “customer”, do we mean an active customer or a prospective customer, and is a “client” the same thing?

  • Build small, build iteratively

One of the biggest risks in any project is overambition, and that risk is only amplified in a cross-enterprise initiative such as a data warehouse. How can we deliver a focused set of analytics while ensuring the solution will serve future needs?

“By three methods we may learn wisdom: First, by reflection, which is noblest; Second, by imitation, which is easiest; and third by experience, which is the bitterest.”
Confucius

IBM Industry Models address these and other issues and challenges. IBM Industry Models encapsulate the experience gained in the Banking, Insurance, Healthcare, Telecommunications and Retail industries and translate this experience into template analytics and methodologies for defining and building data warehouses.

The models are the subject of a paper by McKnight Associates:
“IBM Industry Data Models can jump-start an organization towards a comprehensive analytics environment by applying proven best practices in data modeling to self-contained units of business functionality.”

For more information, go to https://www14.software.ibm.com/webapp/iwm/web/signup.do?source=sw-infomgt&S_PKG=ov28366

About Elaine Hanley

Elaine is the Product Manager for Banking, Financial Markets and Insurance Industry Models, and has been involved in software and design for data warehousing at IBM for over 20 years. Elaine has worked in a variety of roles, including software development, consulting, project management, technical sales and product management. Elaine holds a BAI (Bachelor of Arts in Engineering) from Trinity College, Dublin and a Master’s degree in Computer Applications from Dublin City University. You can follow Elaine on Twitter at @ElaineHanley.

Join IBM at the TDWI Orlando Conference, Dec 7-12, 2014

By Amit Patel,

You are invited to join IBM at the TDWI Orlando Conference, Dec 7-12, to learn how IBM’s next-generation data management and BI solutions can advance your business. TDWI events are focused on helping attendees get the best business value from their data. If you are a data or business professional looking for a week of focused education and interaction around organizing and visualizing data, I encourage you to attend this event.

We would love to see you in the IBM Booth (#205) in the Expo to learn more about your particular requirements around data management and BI, and discuss how IBM solutions can help you gain quick and actionable insight from your data. You can enter a raffle to win a Kindle by visiting the IBM booth during the Expo Partner Member Reception on Monday, Dec 8, 5:15-7:15 PM.

On Wednesday, December 10, from 12:10-1:45 PM you are invited to attend special educational sessions from IBM:

  • Big Data Use cases…What in the world are people doing with Hadoop, presented by Rola Shaar
  • Taking a more refined approach to Big Data – Why you need a Data Refinery, presented by Brian Vile
  • What’s new in the IBM PureData System for Analytics, presented by Rich Hughes
  • From Insight to Foresight with BI and Predictive Analysis, presented by David Clement

IBM is hosting a special luncheon on Monday, December 8 at 12:15 PM where I am presenting a session on dashDB, a brand-new, fully-managed data warehouse service in the cloud. You will get to learn why the cloud offers unique advantages for analytics and data warehousing, and what’s involved in moving analytics to the cloud. This is an invitation-only luncheon with very limited capacity, so please let us know if you’d like to join us for this special event.

I look forward to seeing you in Orlando!