
Tableau Integration: choose the right tool to integrate with Tableau

Published by admin
14 January 2021


Tableau is a visual analytics platform that empowers people and organizations to see, explore, and manage data, and to discover and share insights from it faster.

This business intelligence and analytics tool increases the power of customer data by connecting all data sources and providing detailed analytics to everyone in the organization.

Tableau offers many integration options, which serve different purposes: modifying a workbook without using Tableau Desktop, building custom applications or workflows that react to events that happen in Tableau, making statistical models accessible during analysis in Tableau, and more.



Tableau Server REST API

The first tool we will consider is the Tableau Server REST API.

Tableau Server is a business intelligence application that allows its users to organize, edit, share, and collaborate on Tableau dashboards. In other words, it is an analytics portal for end users. Tableau Server can be installed in the cloud of your choice or on your own on-premises server.

While Tableau Desktop is designed for you to create dashboards, Tableau Server is made for you and your organization to organize and share those dashboards. Tableau Server allows you to edit your stories, dashboards, and workbooks and at the same time choose who has access to them and to what extent. 

Using the Tableau Server REST API, you can manage and change Tableau Server resources programmatically, via HTTP. The API gives you simple access to the functionality behind the data sources, projects, workbooks, site users, and sites on a Tableau server. You can use this access to create your own custom applications or to script interactions with Tableau Server resources.

For example, you can allow clients to enroll in your Tableau site: when a client signs up, a project and a group can be created automatically, users can be added, and a demo worksheet and data source can be published. The API can also be used to give admin users the ability to manage their users through a hosted website instead of granting them any security access in Tableau itself.
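Every REST API session starts with a sign-in call. The sketch below only builds the URL and JSON body for that call; the server address, API version, and token values are placeholders you would replace with your own. It follows the documented shape of the sign-in endpoint (POST /api/{version}/auth/signin with a credentials body holding a personal access token).

```python
import json

# Hypothetical server address -- replace with your own Tableau Server.
SERVER = "https://tableau.example.com"
API_VERSION = "3.7"

def signin_request(token_name, token_secret, site_content_url=""):
    """Build the URL and JSON body for the REST API sign-in call
    (POST /api/{version}/auth/signin with a personal access token)."""
    url = f"{SERVER}/api/{API_VERSION}/auth/signin"
    body = {
        "credentials": {
            "personalAccessTokenName": token_name,
            "personalAccessTokenSecret": token_secret,
            "site": {"contentUrl": site_content_url},
        }
    }
    return url, json.dumps(body)

url, body = signin_request("my-token", "my-secret", "marketing")
print(url)  # https://tableau.example.com/api/3.7/auth/signin
```

You would send this with any HTTP client; the response contains a session token that goes into the X-Tableau-Auth header on subsequent calls. In practice, the tableauserverclient Python package wraps all of this for you.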

See resources 1,2


Document API 

Another tool is Document API. The Document API provides a supported way to programmatically make updates to Tableau workbook and data source files. 

With the release of Tableau 10, Tableau shipped a Python utility called the Tableau Document API (TDA for short). TDA allows users to programmatically modify Tableau workbooks with ease. Modifying workbooks without Tableau Desktop was possible before, because .twb files are actually just XML; however, manually editing that XML could easily result in a corrupted workbook. With this tool, it is now much less risky to modify workbooks without using Tableau Desktop.

Document API is part of the Developer toolkit, which also includes the JavaScript API, Extract API, and the REST API. All you need in order to work with the Document API is some knowledge of Python.

For example, this API can be used to create and deploy templates, or to migrate workbooks from test to production data sources.
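To see why this matters, here is the kind of change involved in a test-to-production migration, applied to a heavily simplified stand-in for a workbook's XML. The fragment and names below are invented for illustration; real .twb files carry far more structure, which is exactly why hand-editing is risky and the Document API is the supported route.

```python
import xml.etree.ElementTree as ET

# A heavily simplified stand-in for a .twb file's XML.
TWB_FRAGMENT = """
<workbook>
  <datasources>
    <datasource name='Orders'>
      <connection server='test-db.example.com' dbname='orders_test'/>
    </datasource>
  </datasources>
</workbook>
"""

def point_to_production(twb_xml, server, dbname):
    """Rewrite every datasource connection to a new server/database,
    mimicking a test-to-production migration."""
    root = ET.fromstring(twb_xml)
    for conn in root.iter("connection"):
        conn.set("server", server)
        conn.set("dbname", dbname)
    return ET.tostring(root, encoding="unicode")

migrated = point_to_production(TWB_FRAGMENT, "prod-db.example.com", "orders")
print("orders_test" in migrated, "prod-db.example.com" in migrated)
# prints: False True
```

With the actual Document API, roughly the same change is made through its Workbook class, whose datasources expose connection objects with server and database attributes, followed by a save, so you never touch the XML directly.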

See resources 3,4


Hyper API

The Hyper API contains a set of functions you can use to automate your interactions with Tableau extract (.hyper) files. 

A Tableau data extract is a compressed snapshot of data stored on disk and loaded into memory as required to render a Tableau viz. 

You can use the API to create new extract files or to open existing files, and then insert, delete, update, or read data from those files. Using the Hyper API developers and administrators can:

  • Create extract files for data sources not currently supported by Tableau;
  • Automate custom extract, transform and load (ETL) processes (for example, implement rolling window updates or custom incremental updates);
  • Retrieve data from an extract file.

There are two aspects of Tableau data extract (TDE) design that make extracts ideal for supporting analytics and data discovery.

The first is that a TDE is a columnar store, which reduces the input/output required to access and aggregate the values in a column.

The second key aspect is that TDEs are architecture-aware, which impacts how they are loaded into memory and used by Tableau: they use all parts of your computer's memory, from RAM to hard disk, and put each part to work as best fits its characteristics.

SDKs are available for use directly from apps implemented in the corresponding languages: Python, Java, C++, and .NET.

For example, you can connect to data sources with the Hyper API and write the data into extract files (in the .hyper file format for Tableau 10.5 and later), or write custom scripts that update data in existing extract files or read data from them.
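One concrete use case from the list above is a rolling-window update. Hyper files accept SQL, which the Hyper API exposes through its connection object's command-execution call, so the update boils down to issuing a DELETE for rows that have aged out. The helper below only builds that statement; the table and column names are hypothetical, and in a real script you would pass the string to an open Hyper connection.

```python
from datetime import date, timedelta

def rolling_window_delete(table, date_col, window_days, today):
    """Build the SQL that drops rows older than the rolling window.
    With the Hyper API you would execute this string against an
    open .hyper file, then append the fresh rows."""
    cutoff = today - timedelta(days=window_days)
    return (f'DELETE FROM "Extract"."{table}" '
            f'WHERE "{date_col}" < \'{cutoff.isoformat()}\'')

print(rolling_window_delete("Orders", "order_date", 30, date(2021, 1, 14)))
# DELETE FROM "Extract"."Orders" WHERE "order_date" < '2020-12-15'
```

After the delete, the new window of rows would be inserted with the Hyper API's Inserter, giving you a custom incremental refresh without re-extracting the whole source.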

See resource 5


Extract API

The Hyper API can be used to create .hyper files for Tableau 10.5 and later. Extract API 2.0 can also be used to create extracts, but the Hyper API provides more capabilities and better performance.

Extract API 2.0 contains a set of functions for creating extracts. Using the Extract API you can:

  • Create and populate extract (.hyper) files to improve performance and provide offline access to your data sources;
  • Write a program that connects to data sources that are not currently supported by Tableau, and then writes the data into a .hyper file for use later by Tableau;
  • Write a program to create an extract that contains multiple tables. 

See resources 6,7,8


Webhooks API

Webhooks are a common method whereby one computer system can notify another that an event has occurred using standard web technologies such as HTTP and JSON.

Webhooks let you build custom applications or workflows that react to events that happen in Tableau. For example, you could use webhooks to send an SMS or Slack notification any time a datasource refresh fails. 
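The receiving side of that workflow is just an HTTP endpoint that parses the JSON Tableau POSTs and reacts. The sketch below routes a delivery to an alert; the field names (event_type, resource_name) follow Tableau's documented webhook payload, but treat them as assumptions and check them against your own deliveries before relying on them.

```python
import json

def route_event(payload_json):
    """Decide what to do with an incoming webhook delivery."""
    event = json.loads(payload_json)
    # React only to failed data source refreshes; everything else
    # is acknowledged and dropped.
    if event.get("event_type") == "DatasourceRefreshFailed":
        return f"ALERT: refresh failed for {event.get('resource_name')}"
    return "ignored"

sample = json.dumps({"event_type": "DatasourceRefreshFailed",
                     "resource_name": "Orders"})
print(route_event(sample))  # ALERT: refresh failed for Orders
```

In production this function would sit behind a small web server, and the returned message would go to your SMS or Slack integration instead of stdout.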

Supported events cover data source and workbook changes, such as a refresh succeeding or failing and a workbook being created, updated, or deleted.


See resources 9, 10, 11, 12


Data science integration

To make statistical models accessible during analysis in Tableau, integrate and visualize the data from your:

  • R
  • Python
  • MATLAB models.

See resource 13


Tableau and R Integration

R is a popular open-source environment for statistical analysis. Tableau Desktop can now connect to R through calculated fields and take advantage of R functions, libraries, packages and even saved models. 

Tableau Server can also be configured to connect to an instance of Rserve through the tabadmin utility, allowing anyone to view a dashboard containing R functionality. Combining R with Tableau brings deep statistical analysis into a drag-and-drop visual analytics environment.

One challenge that arises in this type of deployment is that R is a tool intended for trained personnel who are familiar with the R language, so the calculated fields that call it usually need to be authored by specialists.

See resources 14, 15


Python Integration (TabPy)

The TabPy framework allows Tableau to remotely execute Python code. One of its benefits is that it allows authoring calculated fields in Python; for example, it can be used for data cleaning and predictive algorithms inside Tableau.
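Here is a minimal sketch of the kind of function you might expose through TabPy: a standardization routine that Tableau could feed a measure into. The function itself is plain Python; the name and use case are invented for illustration.

```python
def zscores(values):
    """Standardize a list of numbers -- e.g. a measure that a Tableau
    calculated field passes to TabPy via SCRIPT_REAL(...)."""
    n = len(values)
    mean = sum(values) / n
    sd = (sum((v - mean) ** 2 for v in values) / n) ** 0.5
    # Guard against a constant column, where the deviation is zero.
    return [0.0 if sd == 0 else (v - mean) / sd for v in values]

print(zscores([10, 20, 30]))  # roughly [-1.22, 0.0, 1.22]
```

Once deployed to a running TabPy server (the tabpy-tools client provides a deploy call for this), Tableau invokes it from a calculated field using the SCRIPT_REAL family of functions, which ship the selected measure to TabPy and read the result back; check TabPy's documentation for the exact query syntax for deployed functions.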

See resource 16


MATLAB Integration

With the Tableau MATLAB integration, models published on MATLAB Production Server can be called from inside Tableau calculated fields, passing data from your dashboards in real time to get predictive insights, with all the performance and scalability benefits of an enterprise-grade compute infrastructure.

Also, data can be pre-processed using MATLAB and persisted into a Tableau data extract for further analysis.

MATLAB combines a high-level language that enables you to perform computationally intensive tasks in an authoring environment built for engineers and data scientists. 

Example: users with models published on MATLAB Production Server want to share model results as Tableau visualizations.

See resources 17, 18


Embedded analytics

Embedding Tableau content allows you to add the power of interactive visualization to external applications. Common use cases of embedding are:

  • Tableau dashboards as components of line-of-business or vertical applications;
  • Embedding into internal knowledge bases and CRM systems;
  • Adding interactive visualizations to blog posts;
  • Embedding into custom mobile applications.

The act of embedding a single dashboard or visualization into a single webpage is quite simple, but a well-engineered integration requires handling other things such as authentication, authorization, content management, and performance. Depending on your integration goals, you may require the use of a variety of features and techniques.
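The simple end of that spectrum looks like this: a helper that renders the markup for Tableau's Embedding API v3 web component for a given view. The server address and view path are placeholders, and the script bundle path follows Tableau's documented v3 location; verify it against your server version.

```python
def embed_snippet(server, view_path, width=800, height=600):
    """Build the HTML needed to embed one view with the
    Embedding API v3 <tableau-viz> web component."""
    return (
        f'<script type="module" '
        f'src="{server}/javascripts/api/tableau.embedding.3.latest.min.js">'
        f'</script>\n'
        f'<tableau-viz src="{server}/views/{view_path}" '
        f'width="{width}" height="{height}" toolbar="bottom"></tableau-viz>'
    )

print(embed_snippet("https://tableau.example.com", "Sales/Overview"))
```

Everything beyond this snippet — single sign-on, row-level authorization, content management — is where the engineering effort in a real embedded deployment actually goes.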

See resources 19, 20


Custom data connectors 

Not every system has a ready-made integration, but you can build connectors to data sources that Tableau does not currently support, including websites and custom applications. With the Web Data Connector, the ODBC driver, and more, you can get the data your organization needs.

Example: Create custom connections to data on the web and make that data available in Tableau.

See resource 21

For most purposes, then, Tableau offers an integration path, which once again underlines the versatility of the tool and its applicability for analytics.


How to choose the right tool?

Match the tool to your goal: the REST API for managing server resources programmatically, the Document API for editing workbook and data source files, the Hyper and Extract APIs for creating and updating extract files, webhooks for event-driven workflows, the R/TabPy/MATLAB integrations for data science, embedding for delivering analytics inside your own applications, and custom connectors for data sources Tableau does not yet support.

About author

Kate Rusakovich is a Certified Salesforce Field Service Lightning Consultant, Sales and Service Cloud Consultant, and Salesforce Administrator, with 4 years of Salesforce experience and 9 years in IT.

Kate Rusakovich
Business Analyst / Salesforce Consultant
