A while ago, for my own education and for demo purposes, I implemented various Motion Charts using:
To implement a Motion Chart in Tableau, you can use the Pages Shelf and place on it either a timing dimension (I used the Dimension “Year” in the Tableau example above) or even Measure Names (Average Monthly Home Value per ZIP Code), as in my implementation of the Motion Map Chart below.
Tableau’s ability to move through pages (automatically when Tableau Desktop or Tableau Reader is in use, and manually when the Data Visualization is hosted by Tableau Server and accessed through a Web Browser) enables us to create all kinds of Motion Charts, as long as the Visualization Author puts onto the Pages Shelf a Time, Date or Timestamp variable describing a Timeline. For me the most interesting exercise was to turn a Filled Map (a Chart Type supported by Tableau, similar to Choropleth Map Charts) into a Motion Map Chart; see the result below.
As we all know, 80% of any Data Visualization is Data, and I found the appropriate Dataset at Zillow Real Estate Research here: http://www.zillow.com/blog/research/data/ . The Dataset contains Monthly Sales Data for All Homes (SFR, Condo/Co-op) for the entire US from 1997 until the Current Month (so far for 12604 ZIP Codes, which is only 25% of all US ZIP Codes) – an average for each ZIP Code area.
This Dataset covers 197 Months and contains about 2.5 million DataPoints. All 5 Dimensions in the Dataset are very “Geographical”: State, County, Metro Area, City and ZIP Code (they define the “Region” and enable Tableau to generate Longitude and Latitude), and each record has 197 Measures – the Average Monthly Home Price for the given Region (a ZIP Code Area) for each available Month since 1997.
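As a side note: since the download comes in this wide shape (one measure column per month), here is a minimal pandas sketch of how such a file could be unpivoted into a long Month/Value shape, which would let a single Date dimension drive the Pages Shelf instead of Measure Names. The file and column names are assumptions for illustration, not the actual Zillow schema:

import pandas as pd

# Hypothetical file name; the download is a wide CSV with one column per
# month (e.g. "1997-04" ... "2013-08") and one row per ZIP Code.
df = pd.read_csv("zillow_home_values_by_zip.csv")

geo_cols = ["State", "County", "Metro", "City", "ZipCode"]  # assumed names
month_cols = [c for c in df.columns if c not in geo_cols]

# Unpivot the 197 monthly measures into (Month, AvgHomeValue) pairs,
# producing one row per ZIP Code per Month.
long_df = df.melt(id_vars=geo_cols, value_vars=month_cols,
                  var_name="Month", value_name="AvgHomeValue")

long_df.to_csv("zillow_long.csv", index=False)

Either shape works in Tableau; the long shape just trades 197 measure columns for one Month dimension. In my workbook below I kept the wide shape and used Measure Names on the Pages Shelf instead.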
In order to create the Motion Filled Map Chart, I put Longitude on Columns and Latitude on Rows, Measure Values on Color, Measure Names (except Number of Records) on Pages, State and Measure Names on Filters, State and ZIP Code on the Detail shelf, and finally the Attribute values of County, Metro Area and City on Tooltips. I published the result on Tableau Public here:
so you can review it online AND you can download it and use it within Tableau Reader or Tableau Desktop as an automated Motion Map Chart.
For Presentation and Demo purposes I created Slides and a Movie (while playing it, don’t forget to set the Video Quality to HD resolution) with a Filled Map Chart colored by Home Values for the entire USA in 2013 as the starting point, and with 22 follow-up steps/slides: Zoom to the Northeast Map colored by 2013 Values, Zoom to Southeastern New England 2013, start the Motion from Southeastern New England colored by 1997 Home Values per ZIP Code, then automatic Motion through all years from 1997 to 2014, then Zoom to Eastern Massachusetts, and finally Zoom to Middlesex County in Massachusetts; see the movie below:
Now I think it is appropriate to express my New Year Wish (I have been repeating it for a few years in a row): that Tableau Software Inc. will port the ability to create AUTOMATED Motion Charts from Tableau Desktop and Tableau Reader to Tableau Server. Please!
My previous blogpost, comparing the footprints of the DV Leaders (Tableau 8.1, Qlikview 11.2, Spotfire 6) on disk (in terms of the size of the application file with an embedded dataset of 1 million rows) and in Memory (calculated as the RAM difference between a freshly-loaded application (without data) and the same application after it loads the appropriate file (XLSX, DXP, QVW or TWBX)), got a lot of feedback from DV Blog visitors. It even got a mention/reference/quote in Tableau Weekly #9 here:
http://us7.campaign-archive1.com/?u=f3dd94f15b41de877be6b0d4b&id=26fd537d2d&e=5943cb836b and the full list of Tableau Weekly issues is here: http://us7.campaign-archive1.com/home/?u=f3dd94f15b41de877be6b0d4b&id=d23712a896
The majority of the feedback asked for a similar Benchmark – a footprint comparison for a larger dataset, say with 10 million rows. I did that, but it required more time and work, because the footprint in memory for all 3 DV Leaders depends on the number of visualized Datapoints (Spotfire has for years used the term Marks for Visible Datapoints, and Tableau adopted this terminology too, so I use it from time to time as well, but I think the correct term here is “Visible Datapoints”).
Basically I used the same dataset as in the previous blogpost, with the main difference that I took a subset with 10 million rows as opposed to 1 million rows in the previous Benchmarks. The Diversity of the Dataset with 10 Million rows is shown here (each row has 15 fields, as in the previous benchmark):
I removed from the benchmarks for 10 million rows the usage of Excel 2013 (Excel cannot handle more than 1,048,576 rows per worksheet) and PowerPivot 2013 (it is less relevant for the given Benchmark). Here are the DV Footprints on disk and in Memory for the Dataset with 10 Million rows and different numbers of Datapoints (or Marks: <16, 1000, around 10000, around 100000, around 800000):
Main observations and notes from benchmarking the footprints with 10 million rows are as follows:
Tableau 8.1 requires less disk space (almost half) for its application file (.TWBX) than Qlikview 11.2 (.QVW) or Spotfire 6 (.DXP).
Tableau 8.1 is much smarter in its use of RAM than Qlikview 11.2 and Spotfire 6, because it takes advantage of the number of Marks. For example, for 10000 Visible Datapoints Tableau uses 13 times less RAM than Qlikview and Spotfire, and for 100000 Visible Datapoints Tableau uses 8 times less RAM than Qlikview and Spotfire!
The usage of more than, say, 5000 Visible Datapoints (even more than a few hundred Marks) in a particular Chart or Dashboard is often a sign of bad design or poor understanding of the task at hand; the human eye (of the end user) cannot comprehend too many Marks anyway, so what Tableau does (reducing the footprint in Memory when fewer Marks are used) is good design.
For Tableau in the results above, I reported the total RAM used by 2 Tableau processes in memory: TABLEAU.EXE itself and the supplemental process TDSERVER64.EXE (this 2nd, 64-bit process almost always uses about 21MB of RAM). Note: Russell Christopher also suggested monitoring TABPROTOSRV.EXE, but I could not find any traces of it or its RAM usage during the benchmarks.
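For readers who want to reproduce the RAM-difference measurement described above, here is a minimal sketch; it assumes the cross-platform psutil package (not something used in my benchmarks, which relied on standard OS tools), and the process names are the ones listed above. Run it once against the freshly started (empty) application and once after loading the .TWBX, then subtract:

import psutil

# Processes whose combined RAM footprint we sum (names from the note above).
TABLEAU_PROCESSES = {"TABLEAU.EXE", "TDSERVER64.EXE"}

def tableau_ram_mb() -> float:
    """Sum the resident (working set) memory of matching processes, in MB."""
    total = 0
    for p in psutil.process_iter(["name", "memory_info"]):
        name = (p.info["name"] or "").upper()
        mem = p.info["memory_info"]
        if name in TABLEAU_PROCESSES and mem is not None:
            total += mem.rss
    return total / (1024 * 1024)

print(f"Tableau RAM: {tableau_ram_mb():.1f} MB")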
Qlikview 11.2 and Spotfire 6 have similar footprints in Memory and on Disk.
Last month Tableau and Qliktech both declared that Traditional BI is too slow (I have been saying this for many years) for development, and that their new Data Visualization (DV) software is going to replace it. Quote from Tableau’s CEO Christian Chabot: “Traditional BI software is obsolete and dying and this is very direct challenge and threat to BI vendors: your (BI that is) time is over and now it is time for Tableau.” A similar quote from Anthony Deighton, Qliktech’s CTO & Senior VP, Products: “More and more customers are looking at QlikView not just to supplement traditional BI, but to replace it”.
The main criteria for the client were to:
minimize the number of IT personnel involved and increase their productivity;
minimize off-shoring and outsourcing, as they limit interactions with end users;
increase end users’ involvement, feedback and action discovery.
So I advised the client to take a typical Visual Report project from the most productive Traditional BI Platform (Microstrategy), use its prepared Data, and clone it with D3 and Tableau (using experts for both). The results (in the form of Development time in hours) I put below; all three projects include the same time (16 hours) for Data Preparation & ETL, the same time for Deployment (2 hours) and the same number (8) of Repeated Development Cycles (due to 8 consecutive rounds of feedback from End Users):
It is clear that Traditional BI requires too much time, and that D3 tools merely try to prolong old/dead BI traditions by modernizing and beautifying the BI approach, so my client chose Tableau as a replacement for Microstrategy, Cognos, SAS and Business Objects, and as a better option than D3 (which requires smart developers and too much development). This movement to leading Data Visualization platforms is going on right now in most of corporate America, despite IT inertia and existing skillsets. Basically it is an application of the simple, well-known principle that “Faster is better than Shorter”, known in science as Fermat’s Principle of Least Time.
These changes made me wonder (again) whether Gartner’s recent marketshare estimates and trends for Dead Horse sales (old traditional BI) will hold for long. Gartner estimates the size of the BI market at $13B, which is drastically different from TBR’s estimate ($30B).
TBR predicts that it will keep growing at least until 2018 at a yearly rate of 4%, with the BI Software Market to exceed $40 Billion by 2018 (they estimate the BI Market at $30B in 2012 and include the wider category of Business Analytics Software, as opposed to strictly BI tools). I added estimates for Microstrategy, Qliktech, Tableau and Spotfire to Gartner’s MarketShare estimates for 2012 here:
However, when Forrester asked people what BI Tools they used, its survey results were very different from Gartner’s estimate of “market share”:
“Traditional BI is like a pencil with a brick attached to it,” said Chris Stolte at the recent TCC13 conference, and Qliktech said something very similar in its recent announcement of Qlikview.Next. I expect TIBCO will say the same about the upcoming new release of Spotfire (next week at the TUCON 2013 conference in Las Vegas?).
These bold predictions by leading Data Visualization vendors are just a simple application of Fermat’s Principle of Least Time: this principle states that the path taken between two points by a ray of light (or a development path, in our context) is the path that can be traversed in the least time.
Fermat’s principle can easily be applied to “PATH” estimates in multiple situations, like in the video below, where the path from the initial position of a Life Guard on the beach to a Swimmer in Distress (a path through Sand, Shoreline and Water) is explained:
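For readers who want to see the numbers behind the lifeguard problem, below is a minimal numeric sketch (all speeds and distances are made-up illustration values): it brute-forces the shoreline entry point with the least total time and checks that the optimum satisfies Snell’s law (sin θ / v equal in both media):

import math

# Made-up illustration values: the lifeguard runs on sand faster than she swims.
V_SAND, V_WATER = 7.0, 1.5   # speeds, m/s
A, B, D = 30.0, 40.0, 100.0  # lifeguard is A m from the shoreline, the swimmer
                             # B m beyond it, with D m of horizontal separation

def total_time(x: float) -> float:
    """Time to run to shoreline point x, then swim to the swimmer."""
    run = math.hypot(A, x) / V_SAND
    swim = math.hypot(B, D - x) / V_WATER
    return run + swim

# Brute-force the entry point on a fine grid (simple and dependency-free).
best_x = min((i * D / 100000 for i in range(100001)), key=total_time)

# Snell's law check: sin(theta_sand)/v_sand == sin(theta_water)/v_water,
# where each sine is the horizontal displacement over the path length.
sin_sand = best_x / math.hypot(A, best_x)
sin_water = (D - best_x) / math.hypot(B, D - best_x)
print(f"optimal entry at x = {best_x:.2f} m, time = {total_time(best_x):.2f} s")
print(f"{sin_sand / V_SAND:.5f} ~= {sin_water / V_WATER:.5f}")

The least-time path bends at the shoreline exactly the way a ray of light bends between two media, which is the whole point of the analogy.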
Even Ants follow Fermat’s Principle (as described in an article at the Public Library of Science here: http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0059739 ), so my interpretation of this Law of Nature (“Faster is better than Shorter”) is that traditional BI is a dying horse, and I advise everybody to obey the Laws of Nature.
If you would like to watch another video about Fermat’s Principle of Least Time and the related Snell’s Law, you can watch this: