SQL Saturday 588 – May 20 in NYC

I’m really excited about speaking at SQL Saturday #588 at the Microsoft office in Times Square, where I’ll be discussing Accelerating TDM (Test Data Management) Using Data Virtualization.  This is my favorite topic, because for the 20+ years that I worked as a database administrator (DBA), I ran into the same challenges time and again.

The challenges of data volume.

More than 20 years ago, a friend named Cary Millsap was asked to define the term very large database, because its acronym (VLDB) had become popular.  Knowing that numbers are relative and that they obsolesce rapidly, he refused to cite a number.  Instead, he replied, “A very large database is one that is too large for you to manage comfortably,” which of course states the real problem, as well as the solution.

The problem of data volume has, if anything, become more severe, and there is no sign that it will abate and allow storage and networking technology to catch up.

So it becomes necessary to cease doing things the way we’ve done them for the past several decades, and find a more clever way.

And that is why I work for the company that solves it.  Delphix data virtualization addresses the root of the problem, providing a far faster, less expensive way to provision and refresh huge volumes of data.  The result is revolutionary for the beleaguered software testing industry.

Every mishap or delay in the software development life-cycle means time pressure on the last step: testing.  Creating full-size, fully-functional copies of the environment under test is itself time-consuming, which squeezes the testing window even further if delivery dates are to hold.  The end result is untested or poorly tested systems in production, despite heroic efforts by under-appreciated testers.

Constraining the creation of test data is data volume, which is growing beyond the capacity of most enterprises to copy and store.  “Storage is cheap” is not merely a cheap lie that dismisses the problem without solving it; it is also beside the point.

The real issue is time, because it takes a lot of time to push terabytes from place to place.  That is just physics, and time is the most expensive resource of all.  And, by the way, storage really isn’t cheap either, especially not at the quality that is demanded, and certainly not in the quantities in which it is being demanded.
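Just how much time?  Here is a back-of-envelope sketch; the database sizes, the 10 GbE link, and the 50% efficiency derating are all illustrative assumptions, not measurements from any particular environment.

```python
# Back-of-envelope: wall-clock time to copy a database over a network.
# All figures are illustrative assumptions, not measurements.

def transfer_hours(size_tb: float, link_gbps: float, efficiency: float = 0.5) -> float:
    """Hours to move size_tb terabytes over a link_gbps link, derated
    by an efficiency factor for protocol, disk, and contention overhead."""
    size_bits = size_tb * 1e12 * 8                # terabytes -> bits
    effective_bps = link_gbps * 1e9 * efficiency  # usable bits per second
    return size_bits / effective_bps / 3600

for tb in (1, 10, 50):
    print(f"{tb:>3} TB over 10 GbE at 50% efficiency: "
          f"{transfer_hours(tb, 10):5.1f} hours")
# ->   1 TB ...  0.4 hours
# ->  10 TB ...  4.4 hours
# ->  50 TB ... 22.2 hours
```

And that is for a single copy; refreshing every environment multiplies it.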

Provisioning a full environment for each tester, one for each task of each project, seems unrealistic when each environment might require terabytes of storage.  As a result, testers are limited to working in shared DEV or TEST environments that are refreshed only every few months, if ever.  Code quality suffers because testing is incomplete, and the pace of delivery fails to live up to business needs.
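The arithmetic behind “seems unrealistic” is simple multiplication.  A minimal sketch, with every figure a hypothetical assumption chosen only to show the shape of the problem:

```python
# Why one full physical copy per tester per task doesn't scale.
# Every figure below is a hypothetical assumption, for illustration only.

db_size_tb = 5    # assumed size of the production database
testers    = 20   # assumed number of testers on the team
tasks_each = 3    # assumed concurrent tasks needing an isolated copy

full_copies_tb = db_size_tb * testers * tasks_each
print(f"Storage for full physical copies: {full_copies_tb} TB")  # 300 TB
```

Three hundred terabytes to test against a single 5 TB database is exactly the bill that drives teams back to a handful of shared, stale environments.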

Server virtualization unleashed a surge of productivity in IT infrastructure over the past ten years.  But each virtual server still requires actual storage, making storage and data the constraint that holds everything back.

Data virtualization is the solution.
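How?  The core idea is block sharing with copy-on-write: every virtual copy reads from one shared physical image and stores only the blocks it changes.  The toy sketch below is my own illustration of that principle, not Delphix’s actual implementation.

```python
# Toy sketch of copy-on-write block sharing -- the principle behind
# data virtualization, NOT any vendor's actual implementation.

class VirtualCopy:
    def __init__(self, baseline: dict):
        self.baseline = baseline  # shared, read-only physical blocks
        self.changes = {}         # private blocks, created on first write

    def read(self, block: int) -> bytes:
        # Serve the private block if this copy changed it, else the shared one.
        return self.changes.get(block, self.baseline[block])

    def write(self, block: int, data: bytes) -> None:
        self.changes[block] = data  # only changed blocks consume new space

baseline = {i: b"prod" for i in range(1_000_000)}  # one physical copy
dev = VirtualCopy(baseline)   # provisioned instantly, near-zero extra storage
test = VirtualCopy(baseline)

test.write(42, b"masked")
print(dev.read(42), test.read(42))          # b'prod' b'masked'
print(len(dev.changes), len(test.changes))  # 0 1
```

Provisioning a new copy is just creating an empty change map, which is why it takes minutes instead of days, and megabytes instead of terabytes.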

Come learn how to accelerate testing, and with it the pace of innovation.

Come learn how to inspire more A-ha! moments by liberating QA and testing, and by eliminating the data constraint under which the software development life-cycle, from waterfall to agile, has labored for decades.

Liberate testing now!