A Guide to Exploratory Data Analysis: Discovering the Hidden Ge(r)ms with EDA

Krzysztof Suwada - March 15, 2023

Are you ready to go on a treasure hunt? Well, you’re in luck, because that’s exactly what we’ll be doing in this guide to exploratory data analysis (EDA). Think of your data as a mine filled with hidden ge(r)ms waiting to be discovered.

With the right tools and techniques, you can extract insights that could make all the difference in your business or organization’s success. However, it’s important to note that not all data is created equal. Sometimes within your data, you can uncover issues such as data quality problems or data that doesn’t hold the insights you’re looking for.

That’s why EDA is such a critical step in the data analysis process. It helps you better understand your data, identify potential issues, and make more informed decisions.

In this guide, we’ll explore the various techniques and benefits of EDA and discuss how it can help you avoid potential pitfalls in your data analysis. So grab your pickaxe, and let’s start exploring the data mine with EDA.


• Exploratory data analysis (EDA) helps you uncover hidden insights within your data.
• Skipping EDA can lead to incorrect conclusions, missed opportunities or even failed investments in AI/ML projects.
• EDA involves techniques for cleaning and preparing data, and for exploring data sets with various statistical methods. The common data analytics process includes univariate analysis, bivariate analysis, and multivariate data analysis (referred to as MVA or MVDA).
• Almost all EDA techniques involve some data visualization – drawing dedicated plots and graphs to highlight specific insights.
• EDA helps identify patterns, relationships, and other features that can provide valuable insights and inform decision-making.
• Want to take your data analysis to the next level? Contact nexocode data science experts to learn how you can implement artificial intelligence in your business and gain a competitive edge.

What Is Exploratory Data Analysis and Why Your Data Needs It?

Exploratory Data Analysis (EDA) is the process of investigating a data set in order to discover patterns, relationships, and other features hidden within your data that can provide valuable insights and inform decision-making.

EDA is not just a data analysis technique – it’s an entire mindset rather than a rigid set of rules that define a formal process. This approach involves techniques for cleaning and preparing data, as well as investigative methods for revealing potential issues and emerging trends systematically.

In addition, almost all EDA approaches include some data visualization, such as drawing dedicated plots and other graphical representations, to highlight specific insights or check certain assumptions.

Since EDA is best carried out iteratively, you need to have the freedom to investigate and ask any questions of your data that come to mind. Although you will probably have some ideas in the back of your head about what you are looking to discover, it is crucial to remain open-minded.

When performing exploratory data analysis, allow yourself to uncover unexpected correlations – these are often the most valuable insights. Likewise, while you may start the process off with some objectives in mind, it is just as vital to be flexible with your data exploration.

EDA is a continuous process of asking questions about data sets, looking for the answers through visualization, transformation, and statistical modeling, and then using your insights to refine your ideas and pose new questions.

The more you carry out exploratory analyses, the more likely you are to uncover interesting features and relationships in your data. Doing so may even eventually lead you to homing in on particular EDA techniques that you can report on and share with your team or other interested parties in your organization.

Your data needs EDA because this process identifies correlations between variables, trend lines, and outliers that may not be visible from simply looking at the numbers.

Furthermore, exploratory data analysis can also be beneficial when used in conjunction with predictive analytics to better understand the implications of different scenarios and make more informed choices.

Don’t Skip This Step: The Risks of Omitting EDA

The consequences of not thoroughly investigating data sets include drawing incorrect conclusions, missing out on golden opportunities, or even making failed investments in artificial intelligence (AI) / machine learning (ML) projects.

When exploratory data analysis is not carried out properly, or not done at all, datasets can be wrongly interpreted and poor decisions or false predictions made based on guesswork or gut feelings instead of the facts.

The Datasaurus Dozen

In other words, exploratory data analysis enables you to better understand your data and ask the right questions so that you can pick the right path to go down. This, in turn, will minimize risks and maximize the potential return on investment (RoI).

On the other hand, the potential negative impact of omitting the critical step of EDA from your overall data strategy can easily outweigh the time and effort required to carry out this process.

How to Approach Exploratory Data Analysis?

Exploratory data analysis involves several techniques and processes, from initially organizing the data sets in question all the way to actually carrying out the necessary analyses with visual techniques.

Start with Initial Data Analysis – Data Cleaning, Spotting Missing Values, and Data Preparation

The initial exploratory data analysis process ought to begin with understanding the data through tidying it up, correcting any issues, and preparing the data for further exploratory analysis.

Data cleaning means getting the data into a standard form, such as tabular format, to simplify the EDA process; removing or replacing any outliers or duplicates; and handling any null values appropriately rather than replacing them with arbitrary data.

These preliminary steps include inspecting the raw data, identifying any missing values, and formatting the data so that it can be properly worked with. Taking care at the beginning of the process will streamline the analyses that follow.
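As a minimal sketch of these preliminary steps, assuming a pandas workflow and an entirely hypothetical data set (the column names and imputation choice are illustrative, not prescriptive):

```python
import numpy as np
import pandas as pd

# Hypothetical raw data: one exact duplicate row and one missing age.
raw = pd.DataFrame({
    "age":    [34, 41, 41, np.nan, 29],
    "income": [52000, 61000, 61000, 48000, 39000],
})

# Drop exact duplicates, then handle the null explicitly
# (here: impute the median) instead of leaving gaps or arbitrary values.
clean = raw.drop_duplicates()
clean = clean.assign(age=clean["age"].fillna(clean["age"].median()))
```

Whether to impute, drop, or flag missing values depends on the data set; the point is to make that decision deliberately before any analysis begins.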

Questions to Ask When You Try to Analyze Data

Some of the most common questions to ask during the EDA process are as follows:

What is the overall shape of the data?

The shape of the data – that is, the overall structure and pattern of the data points (which could simply be the number of rows and columns) – is fundamental to exploratory analysis. It allows you to determine whether the data is skewed or distributed evenly.

Are there any outliers?

Outliers can distort the results of exploratory data analysis and should be handled with care. To spot possible outliers, employ summary statistics such as the mean and median, or use a box-and-whisker plot to visualize the distribution of values.
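The 1.5 × IQR rule that box-and-whisker plots visualize can be sketched as follows (a synthetic sample with one planted outlier, purely for illustration):

```python
import numpy as np

# Hypothetical sample with one obvious outlier (999).
values = np.array([12, 15, 14, 13, 16, 14, 999])

# A large gap between mean and median is a first hint of outliers or skew.
mean, median = values.mean(), np.median(values)

# Flag points beyond 1.5 * IQR from the quartiles, as a box plot's whiskers do.
q1, q3 = np.percentile(values, [25, 75])
iqr = q3 - q1
outliers = values[(values < q1 - 1.5 * iqr) | (values > q3 + 1.5 * iqr)]
```

Here the mean is dragged far above the median by the single extreme value, while the IQR rule isolates it cleanly.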

What are the data types?

Common data types – including numeric, categorical, and ordinal data – tell you what kind of analysis is required and which exploratory analysis techniques would be most suitable for the particular data set.

Which is the target, and which is the feature?

The former, also known as the output or dependent variable, is the variable to be analyzed or predicted, while the latter, known as the input or independent variable, is a variable (or set thereof) used to determine the former.

Can you identify patterns in the data?

Recognizing trends is essential when carrying out exploratory data analysis. Understanding the relationships between different variables and their correlations lets you identify patterns or connections that could prove to be invaluable for decision making.

Univariate Analysis – Exploring Individual Variables in the Dataset

Univariate analysis deals with one single variable at a time, enabling you to explore its characteristics and check for any unusual patterns or unexpected results. This exploratory technique provides an effective way of understanding individual variables in order to better comprehend the whole data set.

Techniques for Univariate Analysis

Descriptive statistics measure quantities such as the mean – the sum of all values divided by their total number, the median – the midpoint value of a given data set, and the mode – the most commonly occurring value in the data set.
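All three measures are one-liners in pandas; for example, on a small hypothetical series with a repeated value:

```python
import pandas as pd

# Hypothetical variable; note the repeated value 7 (the mode).
s = pd.Series([3, 7, 7, 9, 14])

mean = s.mean()            # sum of values divided by their count -> 8.0
median = s.median()        # midpoint of the sorted values -> 7.0
mode = s.mode().iloc[0]    # most frequent value -> 7
```

`pandas.Series.describe()` bundles the mean with the quartiles and extremes in a single call, which is often the quickest first look at a variable.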

Histogram example

Histograms are used to visualize the distribution of a given variable by dividing the data set into equally spaced bins and displaying the number of observed values that fall into each. This graphical representation of frequencies helps identify any underlying patterns.
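The binning behind a histogram can be computed directly with NumPy (synthetic, normally distributed data here; in practice you would draw the result with matplotlib's `plt.hist` or seaborn's `histplot`):

```python
import numpy as np

# Synthetic variable: 1,000 draws from a normal distribution.
rng = np.random.default_rng(0)
data = rng.normal(loc=50, scale=10, size=1000)

# np.histogram performs the same equal-width binning a plotted histogram shows:
# 20 bins, with a count of observations falling into each.
counts, edges = np.histogram(data, bins=20)
```

Every observation lands in exactly one bin, so the counts sum back to the sample size; 20 bins are bounded by 21 edges.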

Violin plot example

A box plot is a graphical representation of the five-number summary, showing the minimum and maximum values and visualizing the distribution of values according to three lines – the first quartile, the third quartile (the distance between them being the interquartile range), and the median (or second quartile) – giving a sense of spread, symmetry, and skewness.
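The five-number summary a box plot draws can be computed in one call (a small hypothetical sample with a long right tail):

```python
import numpy as np

# Hypothetical values; 40 stretches the upper whisker / skews the tail.
values = np.array([2, 4, 5, 7, 8, 9, 11, 13, 40])

# Minimum, first quartile, median, third quartile, maximum.
minimum, q1, median, q3, maximum = np.percentile(values, [0, 25, 50, 75, 100])
iqr = q3 - q1  # the interquartile range: the "box" in a box plot
```

The median sitting closer to the first quartile than the third, together with the stretched maximum, is exactly the kind of asymmetry a box plot makes visible at a glance.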

Bivariate Analysis – Exploring the Relationship Between Two Variables in the Dataset

Bivariate exploratory data analysis takes a deeper look at the relationships between two variables in order to assess the correlation (or lack thereof) between them. Through exploratory bivariate analysis, you can get further insights and start to identify any significant relationships.

Techniques for Bivariate Analysis

Scatter plots are used to visualize the relationship between two variables, enabling you to identify any potential trends or patterns.

Scatterplot example

Correlation coefficients are used to measure the degree of linear association, if any, between two variables, quantifying the strength and direction of the relationship between them.

Linear regression is an algorithm that fits a straight line – the regression line – through the relationship between two or more variables, which can then be used to predict future values.
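Both techniques fit in a few lines of NumPy. The paired values below are invented for illustration; they follow a roughly linear trend with slope near 2:

```python
import numpy as np

# Hypothetical pair of variables with a clear linear trend.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.0, 9.9])

# Pearson correlation coefficient: close to +1 for a strong linear relationship.
r = np.corrcoef(x, y)[0, 1]

# Least-squares regression line y = slope * x + intercept.
slope, intercept = np.polyfit(x, y, 1)

# Use the fitted line to predict y for an unseen x value.
predicted = slope * 6 + intercept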

Multivariate Analysis – Exploring the Multidimensional Relationship Between Variables in the Dataset

Also referred to as multivariate data analysis, MVA or MVDA is used to identify patterns and discern relationships between three or more variables.

Techniques for Multivariate Analysis

Cluster analysis is used to group data points based on their similarities, allowing you to identify patterns in the data as well as any outliers.

Scatterplot matrix example

Scatterplot matrix example

The principal component analysis (PCA) is a dimension reduction technique used to reduce the multidimensional data set into smaller components, which can then be visualized more easily using the abovementioned univariate or bivariate techniques.

Gathering Data Analysis Outcomes

Once all the exploratory analysis has been carried out, it is necessary to document the results in an easily understandable manner so that the stakeholders and decision makers are able to interpret the data set.

Gathering the outcomes of EDA in a comprehensive report can be used as the basis to move forward with more advanced techniques such as predictive analytics.

At the end of the exploratory data analysis process, the outcomes should be used to better understand any hidden ‘germs’ in the data set and use them to gain insights that can be used to make more effective decisions in the future.

Data Visualization and Statistical Tools Used in Data Science

In order to carry out exploratory data analysis, a combination of graphical and numerical methods are used. Commonly used EDA visualization techniques include histograms, box plots, and scatter diagrams to draw dedicated graphs, plus numerical tools like descriptive statistics, correlation coefficients, and cluster analysis to derive insights from the data.

The most popular programming languages used by data scientists to perform exploratory data analysis are Python and R. Both languages provide extensive exploratory data analysis capabilities, including exploratory visualization and statistical modeling tools, as well as the ability to perform complex calculations and operations on massive datasets.

The former contains powerful data science libraries such as Pandas, Matplotlib, and Seaborn to provide extensive support for exploratory data analysis. The latter, meanwhile, is a statistical programming language that comes with packages such as ggplot2 and dplyr to facilitate data wrangling and exploratory analysis.

Deliverables of Exploratory Data Analysis

Exploratory data analysis is an investigative process that is intended to facilitate more clear and deep thought about a problem at hand in order to come up with innovative solutions to address it. In addition, EDA should provide the following deliverables:

  • A summary of descriptive statistics – this includes any results such as means, medians, modes.
  • Identification of outliers – results that may skew an otherwise normally distributed data set.
  • Gathered insights on potential trends – relationships between variables according to their correlation coefficients.
  • Insightful graphs and visualizations – providing a pictorial representation of the data set.
  • Documentation of exploratory analysis findings – used as a reference for the stakeholders.
  • A comprehensive exploratory data analysis report – to be used as the basis to move forward with more advanced techniques such as predictive analytics.

Benefits of EDA – Why Is Exploratory Data Analysis Important?

Exploratory data analysis is an essential aspect of data science as it allows you to better understand the data set, develop hypotheses and uncover hidden patterns or features that can be used to make informed decisions.

Other key benefits of analyzing data in an exploratory manner include the following:

  • Cleaning up germs in the data – detecting missing values, errors, and anomalies in a data set so that it can be prepared for further analysis.
  • Discovering the hidden gems in a data set – revealing trends and correlations that may not be immediately apparent on the surface.
  • Gaining a better understanding of the data – learning more about a data set and its characteristics.
  • Uncovering hidden relationships – understanding how variables interact with one another and the effect that they have on the outcomes.
  • Generating new ideas – formulating testable hypotheses from the gathered data set.
  • Kickstarting an analytics project – presenting you with the information you need to decide on the best course of action.
  • Informing deeper insights – having a holistic view enables you to look at a data set from different angles and draw appropriate conclusions.
  • Visualizing investigative findings – provides a way to present the results in an aesthetically pleasing and easy-to-interpret manner.
  • Supporting decision making – assessing the current state of a system and making decisions based on the derived insights.
  • Facilitating report writing – creating a comprehensive summary that can be used for future reference and to inform stakeholders.
  • Improving cost and time efficiency – completing the process relatively quickly and cheaply as compared to more complex techniques.

Summary – Unlock the Mysteries of Your Data With EDA Techniques

Exploratory data analysis is an essential data science methodology for learning more about data sets, uncovering hidden patterns and relationships, fostering new ideas, and deriving deeper insights.

Leveraging exploratory techniques such as exploratory visualization, statistical modeling, clustering, and correlation analysis can all provide key information to inform decision-making.

For those new to exploratory data analysis, understanding the principles and techniques involved can take some time. However, with patience and practice, along with this guide, you will soon be able to uncover the mysteries of your data with exploratory data analysis.

If you want to take your data analysis to the next level right away, the data science experts at nexocode can assist you with implementing AI solutions in your business in order to gain an edge over your competitors without further ado. Get in touch with us about your data analytics today to learn more.


Data Science Tutorial on GitHub - source for plots from this article

About the author

Krzysztof Suwada

Krzysztof Suwada

Data Science Expert

Linkedin profile

Krzysztof is a data scientist who applies machine learning and mathematical methods to solve business problems. He is particularly interested in developing end-to-end solutions for companies in various industries using deep learning and NLP techniques.
Mathematician, software developer, and trainer. Krzysztof's expertise in machine learning earned him a Google Developer Expert title. A fan of Albert's Einstein quote: "If you can't explain it simply, you don't understand it well enough."

Would you like to discuss AI opportunities in your business?

Let us know and Dorota will arrange a call with our experts.

Dorota Owczarek
Dorota Owczarek
AI Product Lead

Thanks for the message!

We'll do our best to get back to you
as soon as possible.

This article is a part of

Becoming AI Driven
92 articles

Becoming AI Driven

Artificial Intelligence solutions are becoming the next competitive edge for many companies within various industries. How do you know if your company should invest time into emerging tech? How to discover and benefit from AI opportunities? How to run AI projects?

Follow our article series to learn how to get on a path towards AI adoption. Join us as we explore the benefits and challenges that come with AI implementation and guide business leaders in creating AI-based companies.

check it out

Becoming AI Driven

Insights on practical AI applications just one click away

Sign up for our newsletter and don't miss out on the latest insights, trends and innovations from this sector.


Thanks for joining the newsletter

Check your inbox for the confirmation email & enjoy the read!

This site uses cookies for analytical purposes.

Accept Privacy Policy

In the interests of your safety and to implement the principle of lawful, reliable and transparent processing of your personal data when using our services, we developed this document called the Privacy Policy. This document regulates the processing and protection of Users’ personal data in connection with their use of the Website and has been prepared by Nexocode.

To ensure the protection of Users' personal data, Nexocode applies appropriate organizational and technical solutions to prevent privacy breaches. Nexocode implements measures to ensure security at the level which ensures compliance with applicable Polish and European laws such as:

  1. Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) (published in the Official Journal of the European Union L 119, p 1); Act of 10 May 2018 on personal data protection (published in the Journal of Laws of 2018, item 1000);
  2. Act of 18 July 2002 on providing services by electronic means;
  3. Telecommunications Law of 16 July 2004.

The Website is secured by the SSL protocol, which provides secure data transmission on the Internet.

1. Definitions

  1. User – a person that uses the Website, i.e. a natural person with full legal capacity, a legal person, or an organizational unit which is not a legal person to which specific provisions grant legal capacity.
  2. Nexocode – NEXOCODE sp. z o.o. with its registered office in Kraków, ul. Wadowicka 7, 30-347 Kraków, entered into the Register of Entrepreneurs of the National Court Register kept by the District Court for Kraków-Śródmieście in Kraków, 11th Commercial Department of the National Court Register, under the KRS number: 0000686992, NIP: 6762533324.
  3. Website – website run by Nexocode, at the URL: nexocode.com whose content is available to authorized persons.
  4. Cookies – small files saved by the server on the User's computer, which the server can read when when the website is accessed from the computer.
  5. SSL protocol – a special standard for transmitting data on the Internet which unlike ordinary methods of data transmission encrypts data transmission.
  6. System log – the information that the User's computer transmits to the server which may contain various data (e.g. the user’s IP number), allowing to determine the approximate location where the connection came from.
  7. IP address – individual number which is usually assigned to every computer connected to the Internet. The IP number can be permanently associated with the computer (static) or assigned to a given connection (dynamic).
  8. GDPR – Regulation 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of individuals regarding the processing of personal data and onthe free transmission of such data, repealing Directive 95/46 / EC (General Data Protection Regulation).
  9. Personal data – information about an identified or identifiable natural person ("data subject"). An identifiable natural person is a person who can be directly or indirectly identified, in particular on the basis of identifiers such as name, identification number, location data, online identifiers or one or more specific factors determining the physical, physiological, genetic, mental, economic, cultural or social identity of a natural person.
  10. Processing – any operations performed on personal data, such as collecting, recording, storing, developing, modifying, sharing, and deleting, especially when performed in IT systems.

2. Cookies

The Website is secured by the SSL protocol, which provides secure data transmission on the Internet. The Website, in accordance with art. 173 of the Telecommunications Act of 16 July 2004 of the Republic of Poland, uses Cookies, i.e. data, in particular text files, stored on the User's end device.
Cookies are used to:

  1. improve user experience and facilitate navigation on the site;
  2. help to identify returning Users who access the website using the device on which Cookies were saved;
  3. creating statistics which help to understand how the Users use websites, which allows to improve their structure and content;
  4. adjusting the content of the Website pages to specific User’s preferences and optimizing the websites website experience to the each User's individual needs.

Cookies usually contain the name of the website from which they originate, their storage time on the end device and a unique number. On our Website, we use the following types of Cookies:

  • "Session" – cookie files stored on the User's end device until the Uses logs out, leaves the website or turns off the web browser;
  • "Persistent" – cookie files stored on the User's end device for the time specified in the Cookie file parameters or until they are deleted by the User;
  • "Performance" – cookies used specifically for gathering data on how visitors use a website to measure the performance of a website;
  • "Strictly necessary" – essential for browsing the website and using its features, such as accessing secure areas of the site;
  • "Functional" – cookies enabling remembering the settings selected by the User and personalizing the User interface;
  • "First-party" – cookies stored by the Website;
  • "Third-party" – cookies derived from a website other than the Website;
  • "Facebook cookies" – You should read Facebook cookies policy: www.facebook.com
  • "Other Google cookies" – Refer to Google cookie policy: google.com

3. How System Logs work on the Website

User's activity on the Website, including the User’s Personal Data, is recorded in System Logs. The information collected in the Logs is processed primarily for purposes related to the provision of services, i.e. for the purposes of:

  • analytics – to improve the quality of services provided by us as part of the Website and adapt its functionalities to the needs of the Users. The legal basis for processing in this case is the legitimate interest of Nexocode consisting in analyzing Users' activities and their preferences;
  • fraud detection, identification and countering threats to stability and correct operation of the Website.

4. Cookie mechanism on the Website

Our site uses basic cookies that facilitate the use of its resources. Cookies contain useful information and are stored on the User's computer – our server can read them when connecting to this computer again. Most web browsers allow cookies to be stored on the User's end device by default. Each User can change their Cookie settings in the web browser settings menu: Google ChromeOpen the menu (click the three-dot icon in the upper right corner), Settings > Advanced. In the "Privacy and security" section, click the Content Settings button. In the "Cookies and site date" section you can change the following Cookie settings:

  • Deleting cookies,
  • Blocking cookies by default,
  • Default permission for cookies,
  • Saving Cookies and website data by default and clearing them when the browser is closed,
  • Specifying exceptions for Cookies for specific websites or domains

Internet Explorer 6.0 and 7.0
From the browser menu (upper right corner): Tools > Internet Options > Privacy, click the Sites button. Use the slider to set the desired level, confirm the change with the OK button.

Mozilla Firefox
browser menu: Tools > Options > Privacy and security. Activate the “Custom” field. From there, you can check a relevant field to decide whether or not to accept cookies.

Open the browser’s settings menu: Go to the Advanced section > Site Settings > Cookies and site data. From there, adjust the setting: Allow sites to save and read cookie data

In the Safari drop-down menu, select Preferences and click the Security icon.From there, select the desired security level in the "Accept cookies" area.

Disabling Cookies in your browser does not deprive you of access to the resources of the Website. Web browsers, by default, allow storing Cookies on the User's end device. Website Users can freely adjust cookie settings. The web browser allows you to delete cookies. It is also possible to automatically block cookies. Detailed information on this subject is provided in the help or documentation of the specific web browser used by the User. The User can decide not to receive Cookies by changing browser settings. However, disabling Cookies necessary for authentication, security or remembering User preferences may impact user experience, or even make the Website unusable.

5. Additional information

External links may be placed on the Website enabling Users to directly reach other website. Also, while using the Website, cookies may also be placed on the User’s device from other entities, in particular from third parties such as Google, in order to enable the use the functionalities of the Website integrated with these third parties. Each of such providers sets out the rules for the use of cookies in their privacy policy, so for security reasons we recommend that you read the privacy policy document before using these pages. We reserve the right to change this privacy policy at any time by publishing an updated version on our Website. After making the change, the privacy policy will be published on the page with a new date. For more information on the conditions of providing services, in particular the rules of using the Website, contracting, as well as the conditions of accessing content and using the Website, please refer to the the Website’s Terms and Conditions.

Nexocode Team


Want to unlock the full potential of Artificial Intelligence technology?

Download our ebook and learn how to drive AI adoption in your business.