What to expect from the recent consolidation in the Business Intelligence Market

Why the Looker acquisition by Google makes sense and the Tableau acquisition by Salesforce does not

There were two very high-profile acquisitions in June 2019 in the BI tools space: Google's acquisition of Looker ($2.6 billion) and Salesforce's acquisition of Tableau ($16 billion).

Of the two, one makes a lot of sense (Looker by Google) and the other not so much (Tableau by Salesforce). Here is my view of these two acquisitions:

Looker acquisition by Google Cloud

Google has been investing heavily in the enterprise cloud for the last several years, and the scale and scope of that investment was expected to increase drastically when it hired Thomas Kurian away from Oracle. Google Cloud Platform (GCP) entered the enterprise BI market via BigQuery. BigQuery started as a purely cloud-based, serverless columnar database with many features aimed at the big data developer community, but it was not initially considered a true enterprise data warehouse with the full SQL support of Oracle, SQL Server, or Teradata. Google likely realized how big the enterprise data warehouse market truly was by watching the success of Snowflake, and has since caught up on full SQL support. BigQuery is now considered a fully functional cloud data warehouse to which companies can move their traditional data warehouse workloads.

In addition, Google was building its own BI tool on GCP: Google Data Studio. Data Studio is a simple BI tool that primarily connects to BigQuery and other Google properties and provides reporting and visualizations. It was never a serious contender as an enterprise BI tool against the likes of Tableau and Qlik. GCP also announced several data integration tools this year, Cloud Dataflow and Cloud Data Fusion. Although these tools look powerful and feature-rich, they are not a good fit for traditional ETL developers familiar with Informatica or SSIS. They seem aimed more at modern data engineers in startups, a different and less lucrative market than traditional enterprise BI.

Not surprisingly, a significant development earlier this year, after Thomas Kurian joined, was the acquisition of Alooma. Alooma specializes in moving data from cloud application sources like Salesforce or Workday into cloud data warehouses like Snowflake and BigQuery, as well as syncing on-premise data sources to the cloud. Clearly the future focus for Alooma will be syncing more and more data sources into BigQuery. Google also announced an in-memory caching layer on top of BigQuery called BI Engine. BI Engine can cache data sets from BigQuery for extremely fast results and can compete on performance with Tableau's Hyper extracts or Qlik's in-memory engine.

The missing piece in GCP's data platform was an enterprise-grade BI tool. Google Cloud was an early partner of Looker and would have had a ringside view of its popularity and growth. Looker's big advantages are its all-cloud architecture, its semantic layer defined in LookML, and the integration of LookML code with Git for versioning. Looker is very popular with startups and is starting to penetrate enterprises that have code-savvy data analysts.
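For readers unfamiliar with LookML, here is a minimal sketch of what a semantic-layer definition looks like; the `orders` table and its fields are hypothetical examples, not part of any real model.

```lookml
# A LookML view maps a database table to reusable dimensions and measures.
view: orders {
  sql_table_name: analytics.orders ;;

  # A dimension exposes a raw column for grouping and filtering.
  dimension: status {
    type: string
    sql: ${TABLE}.status ;;
  }

  # A measure defines an aggregation computed at query time.
  measure: total_revenue {
    type: sum
    sql: ${TABLE}.amount ;;
  }
}
```

Because these definitions are plain text files, they can be checked into Git and versioned like any other code, which is the integration mentioned above.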

With the acquisitions of Alooma and Looker, plus its own BI Engine and BigQuery, GCP now has a complete stack for enterprise BI. GCP can provide a one-stop solution for replacing traditional enterprise BI built on Oracle/Teradata warehouses, Informatica for ETL, and OBIEE/BO/Cognos/MicroStrategy as BI tools.

If Google makes these tools easy to purchase and easy to assemble into end-to-end BI solutions, it has a gold mine on its hands. It will be critical for Google to create a seamless experience across all four tools for users and developers, keeping in mind that this customer base is SQL-savvy rather than code-savvy. The tools need to work together without much configuration or coding to truly win the enterprise BI market.

Google Cloud BI Architecture post Looker and Alooma acquisitions

It remains to be seen if Google can keep its focus for the long haul and really invest in these four capabilities (database, ETL, caching layer and visualization) rather than building more and more tools which distract them and confuse customers as to what their road map is.

Google clearly can provide excellent advanced analytics, ML and AI capabilities on top of this core set of capabilities. That market can be captured if GCP starts hosting the data and the visualizations. It is well known that without the data in proper form, advanced capabilities like ML and AI are hard to achieve.

Tableau acquisition by Salesforce

While Google's acquisition of Looker was all cash and much smaller, Salesforce's acquisition of Tableau was a massive all-stock $16 billion deal. It is a real head-scratcher, not only for the price paid but also because of significant architectural incompatibilities and Salesforce's relatively poor post-acquisition track record with the companies it has purchased so far.

Salesforce needs to continue acquiring to have an end-to-end BI solution

The motivation for entering the BI space makes sense for Salesforce as they can use their much bigger sales channel and customer base to expand into new markets. The question is why Tableau?

Customers love Tableau for its easy-to-use and powerful desktop product. Tableau does have server and cloud products, but they were built more as a sharing mechanism for dashboards and workbooks between users than as the primary development platform. Tableau also relies heavily on .hyper data extracts to achieve reasonable performance. This adds a whole layer of complexity to the environment: it leads to two-stage ETL and data sitting on various desktops, creating security issues. Salesforce's whole premise was to get rid of on-premise software, and especially desktop-based applications.

Salesforce might have been better off buying a cloud-based BI tool like Domo. With Tableau, it must now explain to customers who love the desktop tool, in an ultra-competitive BI market, what the product's future architecture will be. Microsoft, whose Power BI tool set includes a nearly free desktop product, was already giving Tableau a run for its money; now Microsoft's sales team can sow fear and uncertainty in the market about Tableau's future architecture.

One explanation for the Tableau acquisition is that Salesforce is not done yet and plans to add a cloud data warehouse as well as an ETL tool to its product set. That might lead it to buy Alteryx or Talend for ETL and possibly Snowflake as the data warehouse. But this would be an extremely costly route, as each of these companies is already very big. Salesforce's shareholders will demand that the Tableau acquisition deliver significant growth acceleration for both products before the next big acquisition in the analytics space.

Analytics and Intelligent Automation Architecture, Tools and Best Practices. https://spockanalytics.com