Nohyun Myung
Jun 17, 2021

HEAVY.AI vs. Tableau and Snowflake: A Tale of the Tape


NOTE: OMNISCI IS NOW HEAVY.AI

We've all seen countless static reports and stiff Business Intelligence (BI) dashboards over the past 20 years from different industry players.  

Those companies intended to offer a method for communicating business-critical information to various stakeholders through data visualization tools.

Unfortunately, their business analytics tools have been unable to accommodate the growing volume, velocity, or variety of observational datasets available to most public and private organizations today.

Recently our team took part in an exercise to understand what users can accomplish with common business intelligence software that uses industry-standard data warehouses as their backend and compare those results with HEAVY.AI, a user experience purpose-built for modern data.

We chose a Tableau frontend paired with a Snowflake backend as the solution to compare against HEAVY.AI for two primary reasons:

  • the prevalence of the two technologies in the data & analytics marketplace
  • the fact that they are disparate technologies for visualization and compute, which represents a legacy approach

In this post, we'll define our HEAVY.AI, Tableau and Snowflake environments, provide performance observations and benchmarks, and share a few key takeaways from our comparison exercise that you can apply to your business immediately.

Definition of HEAVY.AI, Tableau and Snowflake Environments


Before reporting our benchmarks, it's important to note where and how we deployed each technology.

We stood up a set of machines in AWS to host Tableau and HEAVY.AI and launched the largest Snowflake compute warehouse we could. 


Environment comparison (licensing not included)


You may be wondering why we didn't configure a Snowflake compute warehouse more closely cost-aligned with our HEAVY.AI environment.

The answer is pretty simple. 

If we had used any lower tier of Snowflake compute warehouse, our queries, and ultimately the exercise, would not have completed. The amount of time it took to perform the desired queries and data analysis in our initial test was astronomical.

The only way we could achieve something comparable was by using the 4X-Large instance type. 

Similarly, we used a Tableau Server environment as our alternative to Tableau Desktop, which could not finish the tests or deliver an acceptable user experience with the volume of data we wanted to visualize.

The remainder of the exercise had a basic premise: Given the same starting point, how quickly and easily can users build a dashboard in Tableau and Snowflake versus HEAVY.AI, all while providing incredible scale, performance, and beautiful visualizations without any compromises?

Tableau Comparison: Benchmarks and Observations


We ran query and application response benchmarks on three distinct data and technology profiles approximately 30 times to evaluate our premise:

  • HEAVY.AI with 11.6 billion vessel observations
  • Tableau and Snowflake with 1 million vessel observations
  • Tableau and Snowflake with 11.6 billion vessel observations without a map
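The post doesn't include our harness code, but a measurement loop along these lines (names and details are illustrative, not the actual tooling) captures the idea behind repeating each workload roughly 30 times and summarizing with a statistic that is robust to cache warm-up:

```python
import statistics
import time

def benchmark(run_query, rounds: int = 30) -> float:
    """Time a query callable over repeated rounds and return the
    median wall-clock seconds. The median, unlike the mean, is not
    skewed by an unusually slow first (cold-cache) round."""
    timings = []
    for _ in range(rounds):
        start = time.perf_counter()
        run_query()  # e.g. issue the dashboard's query and wait for the result
        timings.append(time.perf_counter() - start)
    return statistics.median(timings)

# Hypothetical usage: swap in a callable that runs the real workload.
median_seconds = benchmark(lambda: sum(range(100_000)), rounds=5)
```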


Query Performance and Application Response Time in Seconds


The one-million-row sample dataset was used to capture Tableau's geospatial performance because it couldn't map the entire 11.6 billion row dataset. We wanted to establish a row count at which users could expect reasonable interactivity with a map.

Downsampling the data to one million rows made it possible to pan and zoom the map in just under 15 seconds and geospatially filter the ship observations in 50 seconds, compared to 3 seconds for each workload in HEAVY.AI.
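The downsampling step itself is simple uniform sampling. A minimal sketch in Python (the column names and row counts below are illustrative stand-ins, not the actual vessel schema or pipeline):

```python
import numpy as np
import pandas as pd

def downsample(df: pd.DataFrame, target_rows: int, seed: int = 42) -> pd.DataFrame:
    """Uniformly sample target_rows rows without replacement so a
    mapping tool can stay interactive. Aggregate spatial patterns are
    roughly preserved, but sparse vessel tracks may be lost."""
    if len(df) <= target_rows:
        return df
    return df.sample(n=target_rows, random_state=seed)

# Tiny synthetic stand-in for the ship-observation table.
rng = np.random.default_rng(0)
observations = pd.DataFrame({
    "lon": rng.uniform(-180, 180, 10_000),
    "lat": rng.uniform(-90, 90, 10_000),
})
sample = downsample(observations, target_rows=1_000)
```

The trade-off is the one described above: sampling restores interactivity in the legacy stack, but only by throwing away 99.99% of the observations.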

We attempted to create a map of the entire corpus of ship observations in Tableau, but as far as we can tell, it is not possible. Tableau was allowed to work on the map for a whole workday during testing, but it never returned and eventually crashed.

Key Takeaways

This Tableau comparison exercise was illuminating for all of us here at HEAVY.AI. 

We knew that HEAVY.AI was a unique technology, but the Tableau alternative comparison validated its differentiating factors and exposed the limitations of traditional tools we often get lumped in with; here are a few of our key takeaways:

  1. Speed and performance - HEAVY.AI rendered the most extensive set of observational data we had available in seconds, ultimately driving faster decisions and, most importantly, action. 
  2. Instant interactivity - As a result of HEAVY.AI's speed and performance, we instantly interacted with every map, chart, and graph we configured, with no waiting. We've seen this as a requirement of many organizations that want to grow, scale, and ask more complex data-driven questions.
  3. Scale of data - HEAVY.AI didn't sacrifice features or capabilities when asked to process and visualize 11.6 billion ship observations in real time. Traditional tools have to bin, aggregate, or downsample big data to use it at all, and even then, they struggle to deliver a fully featured solution for exploration.
  4. Time to value - The process of loading data, building maps, charts, and graphs, visually analyzing and exploring information, and moving on is accelerated. We may enjoy this process, but action and value are at the end of the chain. Getting there quicker frees time for other essential tasks. 
  5. Geospatial context - Modern data is massive, fast, and, more often than not, location-enabled. HEAVY.AI takes advantage of the fact that we're storing these variables by offering mapping, filtering, and spatiotemporal exploration. Tableau did not provide that experience for our 1 million row sample set, let alone the whole corpus of 11.6 billion records.
  6. Cost - The Tableau with Snowflake solution we used for the comparison cost $387 per hour before Tableau licensing compared to $43 per hour for HEAVY.AI with licensing included. These machines may be larger than those in an average deployment, but this demonstrates the disparity in the total cost of ownership for interactive visual analytics solutions, particularly when considering the volume, velocity, and variety of modern data.

Check out the webinar below if you are interested in taking a deeper dive and learning more about how we evaluated these solutions, created the visualizations, configured the data, and more.


Try HEAVY.AI for yourself today: download HEAVY.AI Free, a full-featured version available for use at no cost.

Please share your thoughts with us on LinkedIn, Twitter, or our Community Forums!


Nohyun Myung

Nohyun is Vice President of the global Solution Engineering practice for HEAVY.AI and brings extensive experience as a technologist, strategic executive, and board advisor spanning 20 years in the data, analytics, and high-growth technology space. He has played critical roles leading teams that drive the adoption of emerging technologies applied to some of the most challenging business problems across the telecommunications, automotive, transportation and logistics, retail, and utilities markets. Prior to joining HEAVY.AI, Nohyun was Vice President of Solution Engineering at Kinetica, Director of Commercial Solutions at GeoDecisions, and held lead solution engineering roles at Esri.