
Tableau Guide: Nailing your first analytics assignment.

Updated: Sep 3, 2020

A guide to successfully completing your first analytics assignment

It's your first analytics assignment, and you are overwhelmed, not sure how to go about it. You're pretty sure Tableau is the tool you'll use to visualize your data, but you doubt the framework you have in mind, and you aren't even sure how your consumers will respond. You're scared but optimistic enough to continue with the assignment. This is the process newbies go through in every industry, and I hope this article will shed some light on how to approach analytics projects in Tableau. But first, let's take a look at the pillars of Tableau, which will help you evaluate the tool's capacity to handle your project.

Tableau is built on six pillars:

  • Broad access to big data platforms - Tableau supports over 40 data sources, as well as countless others through extensibility options.

  • Self-service visualization of big data for business users - Tableau is a drag-and-drop tool and does not require users to write complex code.

  • Hybrid data architecture for optimizing query performance - Tableau can connect to data live or bring it in-memory to accelerate slow databases.

  • Data blending - Tableau enables users to perform analytics across data sources, so distributed data should not give you a headache.

  • Overall platform query performance - From version 10.5 onward, Tableau runs on the Hyper engine, which facilitates a real-time conversation with data.

  • Powerful and homogeneous visual interface to data - With Tableau you can filter, run forecasts, and perform trend-line analysis using simple actions.

Now that you understand the capacity and capability of the tool, you're quite sure it will deliver on your project. But which framework works best? Below is a guideline you may find useful in your first analytics project.

1. Understanding the business questions or KPIs you're trying to answer

Let me first make it clear that every step matters equally, though you may find me emphasizing some steps, and this is one of them. Understanding the questions you're trying to answer will enable you to map them back to the data and decide on the best data structure for answering them. It will also help you solve the mystery of which data is required to answer these questions.

2. Source the Data

With a clear understanding of your business questions, it's time to look for the data that will help you answer them. You can evaluate this data using the following criteria:

  • What is the size of the data required to answer my business questions? (Evaluated in terms of rows or megabytes.)

  • Do I need all the data or a subset of it to answer my business questions? (Note: all the data will allow you to perform granular analysis, while a subset will make the system more efficient.)

  • Do I have access credentials for all the data sets required to answer my business questions?

Once you're satisfied that the data requirements are met, you can proceed to the next step.
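Even before opening Tableau, you can size up a candidate source with a few lines of code. Below is a minimal Python sketch of the criteria above; the function name and CSV layout are my own illustration, not part of any Tableau API:

```python
import csv
import os

def evaluate_source(path):
    """Size up a CSV data source in terms of rows and megabytes."""
    with open(path, newline="") as f:
        reader = csv.reader(f)
        header = next(reader, [])       # first row holds the column names
        rows = sum(1 for _ in reader)   # remaining rows are data
    megabytes = os.path.getsize(path) / (1024 * 1024)
    return {"columns": header, "rows": rows, "megabytes": round(megabytes, 2)}
```

Knowing the row count up front makes the "all data or a subset?" question concrete: a few thousand rows can go in whole, while tens of millions may argue for an aggregated subset.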

3. Data Wrangling

Data will always be messy and will therefore require some cleaning. Remember GIGO? Garbage In, Garbage Out. If you feed your system poorly prepared data, expect a poorly designed product, and hence poor usage and adoption. You might need the expertise of other tools you're familiar with to clean your data, but if you want to remain in the Tableau ecosystem, Tableau Prep will help you clean your messy data.
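If you do stay in code for this step, the typical cleaning moves are easy to sketch. The snippet below is an illustrative pure-Python example; the "region" field is a hypothetical column, and Tableau Prep would do the same work visually:

```python
def clean_rows(rows):
    """A few common wrangling steps: trim whitespace, normalise text case,
    drop incomplete records, and remove exact duplicates."""
    cleaned, seen = [], set()
    for row in rows:
        # Trim stray whitespace from every text value.
        row = {k: v.strip() if isinstance(v, str) else v for k, v in row.items()}
        # Normalise a categorical column (hypothetical "region" field).
        if isinstance(row.get("region"), str):
            row["region"] = row["region"].title()
        # Drop incomplete records -- garbage in, garbage out.
        if any(v in ("", None) for v in row.values()):
            continue
        # Skip exact duplicates.
        key = tuple(sorted(row.items()))
        if key in seen:
            continue
        seen.add(key)
        cleaned.append(row)
    return cleaned
```

Each pass here mirrors a step you would otherwise do by hand, which is exactly the kind of repetitive work worth scripting before the data ever reaches the dashboard.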

4. Connect the data

Now that your data is ready, you can move it into the analytics engine. Here you'll need to decide whether to work with the data live or move it into Tableau's in-memory engine, depending on the demands of your project. There's no bad decision here, only choosing the best approach for your project. But let me take a quick shot at it.

A live data connection will be useful:

  • When you need up-to-the-minute data in your analysis.

  • When you're connected to a fast database. (Remember, the speed of your query will be determined by the speed of your database, among other factors.)

When should I prefer bringing my data into Tableau's in-memory engine?

  • When I'm connected to a slow database.

  • When my database doesn't support computations available in the Tableau engine.

  • When I would like to work offline. Sometimes you may prefer to work offline and still keep your project moving; bringing your data in-memory allows you to do that.

  • When I want to take load off a transactional database.
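The checklist above can be condensed into a tiny decision helper. This is just my own illustrative encoding of the criteria, not anything built into Tableau:

```python
def recommend_connection(slow_database=False, need_offline=False,
                         unsupported_computations=False,
                         offload_transactional_db=False):
    """Return 'extract' if any reason to bring data in-memory applies,
    otherwise 'live'. Flags mirror the bullet points above."""
    if (slow_database or need_offline
            or unsupported_computations or offload_transactional_db):
        return "extract"
    return "live"
```

The point of writing it out is that the extract reasons are a logical OR: a single one of them is usually enough to justify an extract, while "live" is the default when none applies.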

5. Verify your data

Now you have empowered your analytics engine with clean data, and you would like to throw yourself into answering your business questions. But something needs to be done first: verify that your data conforms to known industry metrics. Remember, you are human, and an error might have occurred in any of the four steps above. Once you've verified your data and are fully satisfied with it, you can move on to answering your business questions.
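A verification pass can be as simple as recomputing a metric you already know and comparing totals. Here is an illustrative Python sketch; the "sales" field, the expected total, and the tolerance are hypothetical examples:

```python
def verify_total(rows, expected_total, field="sales", tolerance=0.01):
    """Check that the loaded data reproduces a metric you already trust.
    Returns (ok, actual_total); ok allows a small relative tolerance
    for rounding introduced along the pipeline."""
    actual = sum(float(r[field]) for r in rows)
    ok = abs(actual - expected_total) <= tolerance * abs(expected_total)
    return ok, actual
```

If the check fails, walk the four steps above in reverse: a blown-up total often points to duplicated rows from a join or blend, while a shrunken one suggests records dropped during cleaning.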

At this stage you'll need to work out the best way to communicate your insights. Remember, the final product is not for you but for somebody who might not be data savvy, so keeping your persona in mind is key here. Understanding how to build different charts will help you establish your story faster. (Like I say, charts and graphs are the building blocks of a data story.)

I hope this approach will guide you in your Tableau analytics project. I would love to hear from you about data frameworks that might work better, and I welcome any criticism too. See you in the next article.

Thanks for reading.


About Me


Bernard K

Analytics Consultant | 3X Tableau Certified

Bernard is a data analytics consultant helping businesses reveal the true power of their data and bring clarity to their reporting dashboards. He loves building things and sharing knowledge on how to build dashboards that drive better outcomes.

Let’s discuss your data challenges! Let’s work together!
