Motion Map Chart with Tableau

8 years ago Hans Rosling demoed the Motion Chart on TED, using Gapminder’s Trendalyzer. 7 years ago Google bought Trendalyzer and incorporated it into Google Charts.

A while ago, for my own education and for demo purposes, I implemented various Motion Charts using:


To implement a Motion Chart in Tableau, you can use the Pages Shelf and place there either a timing dimension (I used the dimension “Year” in the Tableau example above) or even Measure Names (Average Monthly Home Value per ZIP Code in my implementation of the Motion Map Chart below).


Tableau’s ability to move through pages (automatically when Tableau Desktop or Tableau Reader is in use, and manually when the visualization is hosted by Tableau Server and accessed through a web browser) enables us to create all kinds of Motion Charts, as long as the visualization author puts onto the Pages Shelf a Time, Date or Timestamp variable describing a timeline. For me the most interesting exercise was to turn a Filled Map (a chart type supported by Tableau, similar to a Choropleth Map) into a Motion Map Chart; see the result below.

As we all know, 80% of any Data Visualization is data, and I found the appropriate dataset @Zillow Real Estate Research here: . The dataset contains monthly sales data for All Homes (SFR, Condo/Co-op) for the entire US from 1997 until the current month (so far for 12604 ZIP codes, which is only about 25% of all USA ZIP codes) – an average for each ZIP code area.

This dataset covers 197 months and contains about 2.5 million datapoints. All 5 dimensions in the dataset are very “geographical”: State, County, Metro Area, City and ZIP code (they define the “Region” and enable Tableau to generate Longitude and Latitude), and each record has 197 measures – the average monthly home price for the given region (a ZIP code area) for each available month since 1997.
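
Before loading such a wide file into a visualization tool, it can help to reshape it so each row is one (ZIP, month, value) observation; here is a minimal sketch with pandas (the column names are hypothetical placeholders, not Zillow’s actual headers):

```python
import pandas as pd

# Hypothetical miniature of the Zillow dataset: one row per ZIP code,
# one column per month (the real file has 197 monthly columns).
wide = pd.DataFrame({
    "ZipCode": ["02139", "02474"],
    "State":   ["MA", "MA"],
    "1997-01": [180000, 210000],
    "1997-02": [181000, 212000],
})

# Reshape wide -> long so each row is (ZIP, State, Month, AvgHomeValue);
# a long layout is usually easier to animate along a time axis.
tidy = wide.melt(id_vars=["ZipCode", "State"],
                 var_name="Month", value_name="AvgHomeValue")
print(tidy)
```

With 12604 ZIP codes and 197 months, the same reshape yields the ~2.5 million datapoints mentioned above.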

In order to create a Motion Filled Map Chart, I put Longitude on Columns and Latitude on Rows, Measure Values on Color, Measure Names (except Number of Records) on Pages, State and Measure Names on Filters, State and ZIP code on Detail, and finally the attribute values of County, Metro Area and City on Tooltip. The result I published on Tableau Public here: ,

so you can review it online AND you can download it and use it within Tableau Reader or Tableau Desktop as an automated Motion Map Chart.

For presentation and demo purposes I created the slides and a movie (while playing it, don’t forget to set the video quality to HD resolution) with a Filled Map Chart colored by home values for the entire USA in 2013 as a starting point, and with 22 follow-up steps/slides: zoom to a Northeast map colored by 2013 values, zoom to Southeastern New England 2013, start the motion from Southeastern New England colored by 1997 home values per ZIP code, then automatic motion through all years from 1997 to 2014, then zoom to Eastern Massachusetts, and finally zoom to Middlesex County in Massachusetts; see the movie below:

Here is the content of this video as a presentation with 24 slides:

Now I think it is appropriate to express my New Year wish (I have been repeating it for a few years in a row) that Tableau Software Inc. will port the ability to create AUTOMATED Motion Charts from Tableau Desktop and Tableau Reader to Tableau Server. Please!

Notes about Spotfire 6 Cloud pricing

2 months ago TIBCO (symbol TIBX on NASDAQ) announced Spotfire 6 at its TUCON 2013 user conference. This, as well as the follow-up release (around 12/7/13) of Spotfire Cloud, was supposed to be good for TIBX prices. Instead, TIBX has since lost more than 8%, while NASDAQ as a whole grew more than 5%:


For example, at TUCON 2013 TIBCO’s CEO re-declared the “5 primary forces for the 21st century” (IMHO all 5 “drivers” sound to me like obsolete IBM-ish sales pitches) – I guess to underscore the relevance of TIBCO’s strategy and products to the 21st century:

  1. Explosion of data (sounds like “the Sun rises in the East”);

  2. Rise of mobility (any kid with a smartphone will say the same);

  3. Emergence of platforms (not sure if this is a good pitch; at least it was not clear from TIBCO’s presentation);

  4. Emergence of Asian economies (what else do you expect? This is a side effect of more than a decade of greedy offshoring);

  5. Math trumping science (Mr. Ranadive and various other TUCON speakers kept repeating this mantra, showing that they think statistics and “math” are the same thing and that they do not know how valuable science can be. I personally think that recycling this pitch is dangerous for TIBCO sales, and I suggest replacing this statement with something more appealing and more mature).

Somehow the TUCON 2013 propaganda and the introduction of the new and more capable version 6 of Spotfire and Spotfire Cloud did not help TIBCO’s stock. For example, in trading on Thursday, 12/12/13, shares of TIBCO Software, Inc. (NASD: TIBX) crossed below their 200-day moving average of $22.86, changing hands as low as $22.39 per share, while market capitalization was oscillating around $3.9B – basically the same as the capitalization of its competitor Tableau Software, which is 3 times smaller in terms of employees.

As I said above, just a few days before this low TIBX price, on 12/7/13, as promised at TUCON 2013, TIBCO launched Spotfire Cloud and published its licensing and pricing.

The most disappointing news is that TIBCO has in effect withdrawn itself from the competition for mindshare with Tableau Public (more than 100 million users, more than 40000 active publishers and visualization authors with a Tableau Public profile), because TIBCO no longer offers free annual evaluations. In addition, the new Spotfire Cloud Personal service ($300/year, 100GB storage, 1 business author seat) became less useful under the new license, since its desktop client has limited connectivity to local data and can upload only local DXP files.

The 2nd cloud option is called Spotfire Cloud Work Group ($2000/year, 250GB storage, 1 business author/1 analyst/5 consumer seats); it gives one author an almost complete TIBCO Spotfire Analyst, with the ability to read 17 different types of local files (dxp, stdf, sbdf, sfs, xls, xlsx, xlsm, xlsb, csv, txt, mdb, mde, accdb, accde, sas7bdat, udl, log, shp), connectivity to standard data sources (ODBC, OleDb, Oracle, Microsoft SQL Server Compact Data Provider 4.0, .NET Data Provider for Teradata, ADS Composite Information Server Connection, Microsoft SQL Server including Analysis Services, Teradata) and TIBCO Spotfire Maps. It also enables the author to do predictive analytics, forecasting, and local R language scripting.

This 2nd Spotfire cloud option does not hurt Spotfire’s chances to compete with Tableau Online, which at first glance costs 4 times less ($500/year). However (thanks to 2 blog visitors – both named Steve – for their help), you cannot use Tableau Online without a licensed copy of Tableau Desktop ($1999 perpetual, non-expiring desktop license with 1st-year maintenance included, then 20% = $400 per year for maintenance) and an Online license for each consumer (an additional $500/year for access to the same site – and extra storage will not be added to that site!). Let’s compare the Spotfire Work Group edition and Tableau Online cumulative cost for 1, 2, 3 and 4 years, for 1 developer/analyst and 5 consumer seats:


Cumulative cost for 1, 2, 3 and 4 years of usage/subscription, 1 developer/analyst and 5 consumer seats (table columns: Spotfire Cloud Work Group, 250GB storage; Tableau Online with Desktop, 100GB storage; Cost Difference, negative if Spotfire is cheaper):
UPDATE: You may need to consider some other properties, like available storage and the number of users who can consume/review visualizations published in the cloud. In the sample above:

  • Spotfire gives the Work Group a total of 250GB of storage, while Tableau gives a total of 100GB to the site.
  • Spotfire costs less than Tableau Online for a similar configuration (almost twice less!)
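
As a sanity check, the cumulative costs can be recomputed from the list prices quoted above; this sketch assumes all 6 seats (1 author + 5 consumers) each need a $500/year Tableau Online subscription:

```python
# Hedged cost model built from the list prices quoted in this post.
# Assumptions: 6 Tableau Online seats (1 author + 5 consumers) at $500/yr;
# Tableau Desktop is $1999 perpetual with year-1 maintenance included,
# then $400/yr maintenance; Spotfire Work Group is a flat $2000/yr.
def spotfire_workgroup(years):
    return 2000 * years

def tableau_online(years, seats=6):
    desktop = 1999 + 400 * (years - 1)  # perpetual license + maintenance
    online = 500 * seats * years        # per-seat annual subscription
    return desktop + online

for y in (1, 2, 3, 4):
    s, t = spotfire_workgroup(y), tableau_online(y)
    print(f"{y} yr(s): Spotfire ${s:,}  Tableau ${t:,}  diff ${s - t:,}")
```

Under these assumptions the 4-year totals come out around $8,000 for Spotfire versus about $15,000 for Tableau, consistent with the “almost twice less” observation above.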

Overall, Spotfire gives more for your $$$ and as such can be a front-runner in the Cloud Data Visualization race, considering that Qlikview does not have any comparable cloud options (yet) and Qliktech is relying on its partners (I doubt this can be competitive) to offer Qlikview-based services in the cloud. Here is the same table as above, but as an image (to make sure all web browsers can see it):


The 3rd Spotfire cloud option is called Spotfire Cloud Enterprise; it has customizable seating options and storage, more advanced visualization, security and scalability, and connects to 40+ additional data sources. It requires annoying negotiations with TIBCO sales, which may result in even higher pricing. The existence of the 3rd Spotfire cloud option decreases the value of the 2nd one, because it tells the customer that Spotfire Cloud Work Group is not the best and does not include many features. The opposite of that is Tableau’s cloud approach: you get everything with Tableau Online, which is the only option (with one exception: multidimensional (cube) data sources are not supported by Tableau Online).

Update 12/20/13: TIBCO announced results for the last quarter, ended 11/30/13, with quarterly revenue of $315.5M (only 6.4% growth compared with the same quarter of 2012) and $1070M revenue for the 12 months ended 11/30/13 (only 4.4% growth compared with the same period of 2012). Wall Street people did not like it: TIBX lost 10% of its value today, with the share price ending at $22 and market capitalization going down to less than $3.6B. At the same time Tableau’s share price went up $1 to $66, and the market capitalization of Tableau Software (symbol DATA) went above $3.9B. As always, I think it is relevant to compare the number of job openings today: Spotfire – 28, Tableau – 176, Qliktech – 71.

DV footprints on Disk and in Memory, Part 2

My previous blogpost compared the footprints of the DV Leaders (Tableau 8.1, Qlikview 11.2, Spotfire 6) on disk (in terms of the size of an application file with an embedded 1-million-row dataset) and in memory (calculated as the RAM difference between a freshly-loaded (without data) application and the same application after it loads the appropriate application file: XLSX, DXP, QVW or TWBX). It got a lot of feedback from DV Blog visitors. It even got a mention/reference/quote in Tableau Weekly #9 here: , and the full list of Tableau Weekly issues is here:

The majority of the feedback asked me to do a similar benchmark – a footprint comparison for a larger dataset, say with 10 million rows. I did that, but it required more time and work, because the footprint in memory for all 3 DV Leaders depends on the number of visualized datapoints (Spotfire has for years used the term Marks for visible datapoints, and Tableau adopted this terminology too, so I use it from time to time as well, though I think the correct term here is “Visible Datapoints”).

Basically I used the same dataset as in the previous blogpost, with the main difference that I took a subset with 10 million rows, as opposed to the 1 million rows in the previous benchmarks. The diversity of the 10-million-row dataset is here (each row has 15 fields, as in the previous benchmark):

From the 10-million-row benchmarks I removed Excel 2013 (Excel cannot handle more than 1,048,576 rows per worksheet) and PowerPivot 2013 (it is less relevant for this benchmark). Here are the DV footprints on disk and in memory for the dataset with 10 million rows and different numbers of datapoints (Marks: <16, 1000, around 10000, around 100000, around 800000):

The main observations and notes from benchmarking the footprints with 10 million rows are as follows:

  • Tableau 8.1 requires less (almost twice less) disk space for its application file (.TWBX) than Qlikview 11.2 (.QVW) or Spotfire 6 (.DXP).

  • Tableau 8.1 is much smarter in its use of RAM than Qlikview 11.2 and Spotfire 6, because it takes advantage of the number of Marks. For example, for 10000 visible datapoints Tableau uses 13 times less RAM than Qlikview and Spotfire, and for 100000 visible datapoints Tableau uses 8 times less RAM than Qlikview and Spotfire!

  • The usage of more than, say, 5000 visible datapoints (or even more than a few hundred Marks) in a particular chart or dashboard is often a sign of bad design or poor understanding of the task at hand; the human eye of the end user cannot comprehend too many Marks anyway, so what Tableau does (reducing the memory footprint when fewer Marks are used) is good design.

  • In the results above, for Tableau I reported the total RAM used by 2 Tableau processes: TABLEAU.EXE itself and the supplemental process TDSERVER64.EXE (this 2nd 64-bit process almost always uses about 21MB of RAM). Note: Russell Christopher also suggested monitoring TABPROTOSRV.EXE, but I could not find any traces of it or its RAM usage during the benchmarks.

  • Qlikview 11.2 and Spotfire 6 have similar footprints in Memory and on Disk.

DV footprints on Disk and in Memory, Part 1

More than 2 years ago I estimated the footprints of a sample dataset (428999 rows and 135 columns) when encapsulated in a text file, in compressed ZIP format, in Excel 2010, in PowerPivot 2010, Qlikview 10, Spotfire 3.3 and Tableau 6. Since then everything has been upgraded to the “latest versions” and everything is 64-bit now, including Tableau 8.1, Spotfire 5.5 (and 6), Qlikview 11.2, Excel 2013 and PowerPivot 2013.

I decided to use a new dataset with exactly 1000000 rows (1 million rows) and 15 columns, with the following diversity of values (distinct counts for every column below):

Then I put this dataset into every application and format mentioned above – both on disk and in memory. All results are presented below for review by DV Blog visitors:

Some comments about application specifics:

  • Excel and PowerPivot XLSX files are ZIP-compressed archives of a bunch of XML files

  • Spotfire DXP is a ZIP archive of a proprietary Spotfire text format

  • QVW is Qlikview’s proprietary, RAM-optimized datastore format

  • TWBX is a Tableau-specific ZIP archive containing a TDE (Tableau Data Extract) and a data-less TWB workbook (in XML format)

  • The footprint in memory I calculated as the RAM difference between a freshly-loaded (without data) application and the same application after it loads the appropriate application file (XLSX, DXP, QVW or TWBX)
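
The RAM-difference measurement described above can be scripted; here is a minimal sketch using the third-party psutil package (the process name is an assumption – on Windows the main Tableau process is TABLEAU.EXE, as mentioned in the Part 2 notes above):

```python
import psutil  # third-party: pip install psutil

def rss_mb(process_name):
    """Sum the resident memory (MB) of all processes with the given name."""
    total = 0
    for p in psutil.process_iter(["name", "memory_info"]):
        if p.info["name"] == process_name and p.info["memory_info"]:
            total += p.info["memory_info"].rss
    return total / (1024 * 1024)

# Usage sketch: sample before the workbook is loaded and after,
# and report the difference as the in-memory footprint.
# before = rss_mb("TABLEAU.EXE")
# ... open the TWBX file in the application, wait for it to settle ...
# after = rss_mb("TABLEAU.EXE")
# print(f"Footprint: {after - before:.1f} MB")
```

Summing over all processes with a matching name also covers multi-process tools, such as Tableau’s TABLEAU.EXE plus TDSERVER64.EXE pair mentioned above (each name would be sampled separately and the totals added).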

Data Visualization Landscape changed: October 2013

Something dramatic happened to the Data Visualization (DV) market during October 2013, and I felt it everywhere. Share prices for QLIK went down almost 30% from $35 to $25, DATA went down about 20% from $72 to below $60, MSTR went up 27% from $100 to $127, and DWCH went up 25% from $27.7 to $34.5. This blog got 30% more visitors than usual and reached 26000 visitors in the month of October 2013!

So in this blog post I revisit who the actual DV leaders and active players in the Data Visualization field are, and what events and factors are important here; I will also form the DVIndex, containing 4-6 DV Leaders, and use it for future estimates of marketshare and mindshare in the DV market.

In terms of candidates for the DVIndex I need measurable players, so I prefer public companies, but I will mention private corporations where relevant. I did some modeling, and it turned out that the best indicator of a DV Leader is YoY (year-over-year revenue growth) larger than 10% – it separates obsolete traditional BI vendors and me-too attempts from real DV Leaders.

Let’s start with the traditional BI behemoths: SAP, IBM, Oracle and SAS. According to IDC, their BI revenue totals $5810M, but none of those vendors had YoY (2012 over 2011) of more than 6.7%! These 4 BI vendors are literally desperate to get into the Data Visualization market (for example SAP Lumira; IBM, getting desperate too, with Project Neo (in beta in early 2014), the Rapidly Adaptive Visualization Engine (RAVE), SmartCloud Analytics – Predictive Insights, BLU Acceleration and InfoSphere Data Explorer; or SAS Visual Analytics), but so far they have not been competitive with the 3 known DV Leaders (those 3 are part of the DVIndex for sure): Qlikview, Tableau and Spotfire.

The 5th traditional BI vendor, Microsoft, had 2012 BI revenue of $1044M (YoY 16%) and has lately added a lot of relevant features to its Data Visualization toolbox: PowerPivot 2013, Power View, Power Query, Power Map, SSAS 2012 (and soon SQL Server 2014), etc. Unfortunately, Microsoft does not have a dedicated Data Visualization product, instead pushing everything toward Office 365, SharePoint and Excel 2013, which cannot compete in the DV market…

The 6th traditional BI vendor, Microstrategy, made a desperate attempt during October 2013 to get into the DV market by releasing 2 free Data Visualization products, Microstrategy Desktop and Microstrategy Express, which forces me to qualify Microstrategy for the status of DV Candidate, and I will include it (at least temporarily) in the DVIndex. Microstrategy’s BI revenue for the TTM (trailing 12 months) was $574M, and its YoY is below 5%, so while I can include it in the DVIndex, I cannot (yet?) say that Microstrategy is a DV Leader.

Datawatch Corporation is public (DWCH) and recently bought an advanced Data Visualization vendor, Panopticon, for $31M. Panopticon’s TTM revenue is approximately $7M, and its YoY was a phenomenal 112% in 2012! Combining that with the $27.5M TTM revenue of Datawatch (45% YoY!) gives approximately 55% YoY for the combined company and qualifies DWCH as a new member of the DVIndex!

Other potential candidates for the DVIndex could be Panorama (and their Necto 3.0 product), Visokio (they have a very competitive DV product called Omniscope 2.8) and Advizor Solutions (with their mature Advizor Visual Discovery 6.0 platform), but unfortunately all 3 companies chose to be private, and I have no way to measure their performance, so they will remain DV Candidates only.

In order to monitor the progress of open-source BI vendors toward the DV market, I also decided to include in the DVIndex one potential DV Candidate (not a leader, for sure): Actuate, with their BIRT product. Actuate’s TTM revenue is about $138M, and its YoY is about 3%. Here is the tabular marketshare result for the 6 members of the DVIndex:


Please keep in mind that I have no way to get exact numbers for Spotfire, but I feel comfortable estimating Spotfire as approximately 20% of TIBCO’s numbers. Indirect confirmation of my estimate came from… TIBCO’s CEO, and I quote: “In fact, Tibco’s Spotfire visualization product alone boasts higher sales than all of Tableau.” As a result I estimate Spotfire’s YoY at 16%, which is higher than the 11% TIBCO has. The numbers in the table above are fluid and reflect the market situation at the end of October 2013. Also see my attempt to visualize the market share of the 6 companies above in a simple bubble chart (click on it to enlarge; X-axis: vendor’s revenue for the last TTM (12 trailing months); Y-axis: number of full-time employees working for the given vendor; sized by market capitalization in $B (billions of dollars); colored by year-over-year revenue growth):
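
For readers who want to reproduce this kind of chart outside Tableau, here is a minimal matplotlib sketch of the same bubble-chart layout; the numbers are illustrative placeholders only, not the actual values from the table above:

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen, no display needed
import matplotlib.pyplot as plt

# Placeholder values for illustration only – the real numbers are in
# the marketshare table above; the chart structure matches its description.
vendors   = ["Vendor A", "Vendor B", "Vendor C", "Vendor D"]
revenue   = [470, 200, 215, 575]      # TTM revenue, $M (hypothetical)
employees = [1700, 1100, 700, 3100]   # full-time employees (hypothetical)
mktcap    = [2.3, 3.9, 3.9, 1.3]      # market capitalization, $B (hypothetical)
yoy       = [0.21, 0.90, 0.16, 0.05]  # YoY revenue growth (hypothetical)

fig, ax = plt.subplots()
sc = ax.scatter(revenue, employees,
                s=[c * 200 for c in mktcap],  # bubble area ~ market cap
                c=yoy, cmap="RdYlGn", alpha=0.7)
for name, x, y in zip(vendors, revenue, employees):
    ax.annotate(name, (x, y))
ax.set_xlabel("TTM revenue, $M")
ax.set_ylabel("Full-time employees")
fig.colorbar(sc, label="YoY revenue growth")
plt.savefig("dv_marketshare_bubbles.png")
```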


For the same date I also have an estimate of the mindshare of all 6 members of the DVIndex, based on mentions of those 6 companies by LinkedIn members, in LinkedIn groups, in job openings posted on LinkedIn, and by companies with a LinkedIn profile:


Again, please see below my attempt to represent the mindshare of those 6 companies with a simple bubble chart (click on it to enlarge; the 6 DV vendors are positioned relative to their MINDSHARE on LinkedIn; X-axis: number of LinkedIn members mentioning the vendor in their LinkedIn profile; Y-axis: number of LinkedIn job postings requesting vendor-related skills; sized by the number of companies mentioning them on LinkedIn; colored by year-over-year revenue growth):


Among other potential DV candidates I can mention some recent me-too attempts like Yellowfin, NeitrinoBI, Domo, BIME, RoamBI, Zoomdata and multiple similar companies (mostly private startups), as well as hardly commercial but very interesting toolkits like D3. None of them has an impact on the DV market yet.

Now, let’s review some of October events (may add more October events later):

1. For the fourth quarter, Qliktech predicts earnings of 28 to 31 cents a share on revenue between $156 million and $161 million. The forecast came in significantly lower than analysts’ expectations of 45 cents a share on $165.78 million in revenue. For the full year, the company projects revenue between $465 million and $470 million, and earnings between 23 and 26 cents a share. Analysts had expected 38 cents a share on $478.45 million. As far as I am concerned it is not a big deal, but traders/speculators on Wall Street drove QLIK prices down almost 30%.

2. Tableau Software filed a registration statement for a proposed secondary offering. Also, Tableau’s revenue in the three months ended in September rose to $61 million, $10 million more than expected – revenue jumped 90%! Tableau CEO Christian Chabot said the results were boosted by one customer that increased its contract with the company. “Our third quarter results were bolstered by a large multimillion-dollar deal with a leading technology company,” he said. “Use of our products in this account started within one business unit and over the last two years has expanded to over 15 groups across the company. Recently, this customer set out to establish an enterprise standard for self-service business intelligence, which led to the multimillion-dollar transaction. This deal demonstrates the power and value of Tableau to the enterprise.” However, DATA prices went down anyway, in anticipation that a significant portion of the share-price premium will quickly evaporate as the stock options lock-up expires in November 2013.

3. The TIBCO TUCON 2013 conference somehow did not help TIBCO stock, but to my mind it brought attention to Datawatch and to the meteoric rise of DWCH stock (in the chart below, compare it with QLIK and TIBX prices, which basically did not change during March–October 2013), which more than tripled in a matter of just 8 months (Datawatch bought and integrated Panopticon during exactly that period):

4. Datawatch now potentially has a better software stack than the 3 DV Leaders, because Datawatch Desktop is integrated with Panopticon Desktop Designer and Datawatch Server is integrated with Panopticon Data Visualization Server; it means that in addition to “traditional” BI + ETL + Big Data 3V features (Volume, Velocity, Variety), Datawatch has a 4th V feature relevant to the DV market: advanced Data Visualization. Most visualization tools are unable to cope with the “three V’s of Big Data” – volume, velocity and variety. However, Datawatch’s technology handles:

  • Data sources of any size (it has to be tested and compared with Qlikview, Spotfire and Tableau)

  • Data that is changing in real time (Spotfire has similar, but Qlikview and Tableau do not have it yet)

  • Data stored in multiple types of systems and formats

We will have to wait and see how this plays out, but competition from Datawatch will make the Data Visualization market more interesting in 2014… I feel I now need to review Datawatch products in my next blog post…

Qlikview.Next has a gift for Tableau and Datawatch

Yesterday I was invited by Qliktech to their semi-annual New England QlikView Boston User Group meeting. There were so many participants that Qliktech was forced to hold the keynote (of course, the presentation and demo of Qlikview.Next) and 4 cool presentations by customers and partners (Ocean State Job Lot, Analog Devices, Cybex and Attivio) outside of its own office – but in the same building, on the 1st floor @Riverside Offices in Newton, MA @Rebecca’s Cafe.

There were plenty of very excited people in a very large room, and a very promising demo and presentation of Qlikview.Next, which will not actually be generally available until 2014. The entire presentation was done using the new and capable HTML5 client, based on functionality Qliktech got when it bought NComVA 6 months ago.

I was alarmed when the presenter never mentioned my beloved Qlikview Desktop, and when I asked directly about it, the answer shocked and surprised me. One of the most useful pieces of software I have ever used will not be part of Qlikview.Next anymore. As part of Qlikview 11.2 it will be supported for 3 years, and then it will be out of the picture! I did not believe it and asked one more time during the demo and 2 more times after the presentation, in person, during the networking and cocktail hour inside Qliktech’s offices. While the food and drink were excellent, the answer to my question was the same – NO!


I have the utmost respect for the very smart software developers, architects and product managers of Qlikview, but in this particular case I have to invoke 20+ years of my own advanced and very extensive experience as a software architect, coder and software director, and nothing in my past can support such a decision. I do not see why Qlikview.Next cannot have both the Qlikview Desktop client and the Qlikview HTML5 client (and we as Qlikview users need and love both).

I personally urge Qliktech (and I am sure the majority of the 100000+ strong (according to Qliktech) Qlikview community will agree with me) to keep the Qlikview Desktop client as long as Qlikview exists. And not just keep it, but, 1st, keep it as the best Data Visualization desktop client on the market and, 2nd, keep it in sync with (or better, ahead of) the HTML5 client.

If Qlikview Desktop disappears from Qlikview.Next, it will be a huge gift to Tableau and Datawatch (Spotfire Cloud Personal will no longer have access to the Spotfire Analyst desktop product, and therefore Spotfire Cloud Personal is making a similar (partial) mistake as Qlikview.Next).



Tableau recently invested heavily in the progress of all variations of Tableau Desktop (Professional, Personal, Public, Online, free Reader), including (finally) migration to 64-bit and even porting Desktop to the Mac, so it will instantly get a huge advantage over Qlikview in desktop, workstation, development, design, debugging, testing, QA and offline environments.


It will also almost immediately propel Datawatch as a very attractive contender in the Data Visualization market, because Datawatch got (when they bought Panopticon this year) the extremely capable Panopticon Desktop Designer in addition to its own very relevant line of products.

Again, I hope I misunderstood the answer I got 4 times during the 4-hour meeting and the follow-up networking/cocktail hour – or, if I understood it correctly, that Qliktech will reconsider; but I will respect their decision if they don’t…

So I have to disagree with Cindi Howson (as usual): even if “QlikTech Aims To Disrupt BI, Again“, it will actually disrupt itself first, unless it listens to me begging them to keep Qlikview Desktop alive, well and ahead of the competition.


You can find in Ted Cuzzillo’s article here: the actual quote from Qliktech’s CEO Lars Björk: “We can disrupt the industry again”. My problem with this quote is that Qliktech considers itself an insider and reinventor of the dead and slow BI industry, while Tableau, with its new motto “DATA to the people”, is actually trying to stay out of this grave and inside its own new and fast-growing Data Visualization space/field/market; see also the blogpost from Tony Cosentino, VP of Ventana Research, here:

You can see below an interview with Tim Beyers, who has his own doubts about Qlikview.Next from an investor’s point of view:

Basically, Qlikview.Next is 2 years late; it will not have Qlikview Desktop (a big mistake); it still does not promise any Qlikview cloud services similar to Tableau Online and Tableau Public; and it still does not have server-less distribution of visualizations, because it does not have a free Qlikview Desktop viewer/reader similar to the free Tableau Reader. So far it looks to me like QLIK may have trouble in the future…

The BI is a dead horse, long live the DV!

Last month Tableau and Qliktech both declared that traditional BI is too slow for development (I have been saying this for many years) and that their new Data Visualization (DV) software is going to replace it. Quote from Tableau’s CEO Christian Chabot: “Traditional BI software is obsolete and dying and this is very direct challenge and threat to BI vendors: your (BI that is) time is over and now it is time for Tableau.” A similar quote from Anthony Deighton, Qliktech’s CTO & Senior VP, Products: “More and more customers are looking at QlikView not just to supplement traditional BI, but to replace it“.

One of my clients, a large corporation (obviously I cannot name it due to an NDA), asked me to advise on what to choose between traditional BI tools with a long development cycle (like Cognos, Business Objects or Microstrategy), modern BI tools (like JavaScript and the D3 toolkit), which are an attempt to modernize traditional BI but still have sizable development time, and leading Data Visualization tools with minimal development time (like Tableau, Qlikview or Spotfire).

The client’s main criteria were:

  • minimize the IT personnel involved and increase their productivity;

  • minimize off-shoring and outsourcing, as they limit interactions with end users;

  • increase end users’ involvement, feedback and action discovery.

So I advised the client to take a typical visual report project from the most productive traditional BI platform (Microstrategy), use its prepared data, and clone the project with D3 and Tableau (using experts for both). The results (development time in hours) I put below; all three projects include the same time (16 hours) for data preparation & ETL, the same time for deployment (2 hours) and the same number (8) of repeated development cycles (due to 8 consecutive rounds of feedback from end users):


It is clear that traditional BI requires too much time, and that D3 tools just try to prolong old/dead BI traditions by modernizing and beautifying the BI approach, so my client chose Tableau as a replacement for Microstrategy, Cognos, SAS and Business Objects, and as a better option than D3 (which requires smart developers and too much development). This movement to leading Data Visualization platforms is going on right now across most of corporate America, despite IT inertia and existing skillsets. Basically it is an application of the simple principle that “Faster is better than Shorter“, known in science as Fermat’s Principle of Least Time.
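
The arithmetic behind this comparison is simple; here is a sketch of the cost model (the per-cycle hours are hypothetical placeholders, not the client’s actual numbers):

```python
# Total project time = fixed data prep + deployment + N development cycles.
# Per-cycle hours below are hypothetical, for illustration only.
PREP_HOURS, DEPLOY_HOURS, CYCLES = 16, 2, 8

def total_hours(hours_per_cycle):
    return PREP_HOURS + DEPLOY_HOURS + CYCLES * hours_per_cycle

for tool, per_cycle in [("Traditional BI", 40), ("D3", 16), ("DV tool", 4)]:
    print(f"{tool}: {total_hours(per_cycle)} hours")
```

Because the fixed costs (18 hours) are identical across tools, the total is dominated by the 8 feedback-driven cycles, which is exactly where the Data Visualization tools win.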

These changes made me wonder (again) whether Gartner’s recent marketshare estimates and trends for Dead Horse sales (old traditional BI) will hold for long. Gartner estimates the size of the BI market at $13B, which is drastically different from TBR’s estimate ($30B).

TBR predicts that it will keep growing at least until 2018, at a yearly rate of 4%, with the BI software market to exceed $40 billion by 2018 (they estimate the BI market at $30B in 2012 and include the wider category of business analytics software, as opposed to strictly BI tools). I added estimates for Microstrategy, Qliktech, Tableau and Spotfire to Gartner’s marketshare estimates for 2012 here:


However, when Forrester asked people what BI tools they used, its survey results were very different from Gartner’s estimate of “market share”:


“Traditional BI is like a pencil with a brick attached to it,” said Chris Stolte at the recent TCC13 conference, and Qliktech said something very similar in its recent announcement of Qlikview.Next. I expect TIBCO to say something similar about the upcoming new release of Spotfire (next week, at the TUCON 2013 conference in Las Vegas?).


These bold predictions by leading Data Visualization vendors are just a simple application of Fermat’s Principle of Least Time: the principle states that the path taken between two points by a ray of light (or a development path, in our context) is the path that can be traversed in the least time.

Fermat’s principle can easily be applied to “path” estimates in multiple situations, like in the video below, where the path from the initial position of a lifeguard on the beach to a swimmer in distress (a path through sand, shoreline and water) is explained:
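
The lifeguard example can also be verified numerically: minimize the total travel time over the water-entry point and check that Snell’s law emerges at the optimum (all speeds and distances below are hypothetical, for illustration):

```python
from math import hypot, sin, atan2

# Lifeguard problem: run on sand at V1, swim at V2 < V1; where should
# the lifeguard enter the water to reach the swimmer in the least time?
A, V1 = 20.0, 7.0   # lifeguard is 20 m from the shoreline, runs at 7 m/s
B, V2 = 30.0, 1.5   # swimmer is 30 m past the shoreline, swims at 1.5 m/s
D = 50.0            # horizontal offset between lifeguard and swimmer, m

def travel_time(x):
    """Total time if the lifeguard enters the water at horizontal offset x."""
    return hypot(A, x) / V1 + hypot(B, D - x) / V2

# Minimize by a fine grid search over the entry point (no calculus needed).
best_x = min((i * D / 10000 for i in range(10001)), key=travel_time)

# At the optimum, Snell's law holds: sin(t1)/V1 == sin(t2)/V2.
t1 = atan2(best_x, A)      # angle of the sand leg from the shore normal
t2 = atan2(D - best_x, B)  # angle of the water leg from the shore normal
print(best_x, sin(t1) / V1, sin(t2) / V2)
```

Because swimming is much slower than running, the least-time entry point lies far down the beach (most of the path is on sand), exactly as a ray of light bends toward the normal when entering a slower medium.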

Even ants follow Fermat’s Principle (as described in an article at the Public Library of Science here: ), so my interpretation of this law of nature (“Faster is better than Shorter“) is that traditional BI is a dying horse, and I advise everybody to obey the laws of nature.

If you would like to watch another video about Fermat’s Principle of Least Time and the related Snell’s law, you can watch this: