Event-Driven Architecture in a Logistics Company: A Case Study of EDA in Modern Supply Chain Management

Dorota Owczarek - December 19, 2022 - updated on January 9, 2023

It’s no secret that the logistics industry is under pressure to modernize and digitize its operations. The global economy has become more complex, with customers expecting faster delivery of goods at lower prices. At the same time, technological advances are making it possible for businesses to automate more tasks and use machine learning to optimize their processes.

In this article, we will explore how event-driven architecture (EDA) can be used within a logistics company to improve performance and enable digital transformation. We will look at a case study of a trucking company that implemented EDA (not without a struggle), saw dramatic performance improvements, gained 360-degree visibility of its business, and stepped up the AI maturity ladder.

The Hyperconnected Space of Transport and Logistics

The logistics industry is under immense pressure to deliver goods faster and at lower costs. At the same time, logistics and supply chain business IT infrastructures must process an ever-growing amount of real-time data to support hyperconnected operations. The emergence of the Internet of Things (IoT) connects physical objects and devices to the internet, allowing them to communicate and exchange data in real time. This is resulting in a new level of visibility and control for businesses, as well as creating new opportunities for automation and efficiency gains.

To keep up with the pace of change, logistics companies need to digitize their operations and move away from traditional, siloed IT infrastructures. They need to adopt event-driven architectures designed to handle large amounts of real-time data and enable hyperconnected operations.

The landscape of logistics data sources

EDA is a way of designing software applications in which events or changes in data are used to trigger actions. This is in contrast to the more traditional “request-response” model, where a user initiates an action and then waits for a response from the system.

Event-driven architecture is particularly well suited to logistics applications because of the many different types of events that can occur in a supply chain: a truck arriving at a loading dock, a package being scanned at a sorting facility, an order being placed by a customer, etc.

Quick Recap of Event-Driven Architecture

Before we dig into the case study, let’s recap some basics behind the event-driven architecture model. Event-driven architecture is event-centric: event producers emit messages, or “events,” which are consumed by event consumers. An event consumer can perform some action in response to the event, such as updating a database or sending a notification.

What would be the events? Events can be anything that changes the state of data, such as a new pricing inquiry being sent, a new order being placed, a package being delivered, or a truck arriving at a loading dock.
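Concretely, each such occurrence can be captured as a small, immutable event record. The sketch below (in Python, with hypothetical field names; the actual schemas used by any given system will differ) illustrates the typical shape of such an event:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
import json
import uuid

@dataclass
class LogisticsEvent:
    """A generic, immutable record of something that happened in the supply chain."""
    event_type: str  # e.g. "order.placed", "truck.arrived"
    payload: dict    # event-specific data
    event_id: str = field(default_factory=lambda: str(uuid.uuid4()))
    occurred_at: str = field(default_factory=lambda: datetime.now(timezone.utc).isoformat())

    def to_json(self) -> str:
        # Serialized events like this one are what flows through the event bus.
        return json.dumps(self.__dict__)

event = LogisticsEvent("truck.arrived", {"truck_id": "T-042", "dock": "D7"})
print(event.to_json())
```

Note that the event describes what happened, not what any consumer should do about it; that decision is left entirely to the consumers.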

Logistics event examples. How various logistics systems and business processes produce a single event.

This event-driven architecture model has several benefits over the more traditional request-response model:

  1. It is asynchronous, meaning that the event producer and consumer do not need to be running simultaneously. This makes it easier to scale applications because different parts of the system can be updated independently.
  2. It is loosely coupled, meaning that the event consumers and event producers do not need to know about each other to communicate. Loose coupling makes it easier to change or update event consumers or producers without affecting the rest of the system.
  3. It is scalable, meaning that it can handle a large number of events with a minimal performance impact.
  4. It is event-centric, meaning the focus is on the event data rather than the requestor or the response. This makes adding new event consumers or producers easier without affecting the existing system.
  5. It is flexible and can be used to build simple applications or complex systems. It can support various events, including system events, application events, and user-generated events.
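These properties are easiest to see in code. Below is a deliberately simplified, in-memory publish/subscribe sketch in Python. It is not a real broker (a production system would use something like Apache Kafka), but it shows how producers and consumers interact only through a topic name, never directly with each other:

```python
from collections import defaultdict

class InMemoryEventBus:
    """Toy pub/sub bus illustrating loose coupling; a production system
    would use a real broker such as Apache Kafka instead."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, event):
        # The producer never references consumers directly; it only names a topic.
        for handler in self._subscribers[topic]:
            handler(event)

bus = InMemoryEventBus()
notifications = []
bus.subscribe("order.placed", lambda e: notifications.append(f"notify sales: {e['order_id']}"))
bus.subscribe("order.placed", lambda e: notifications.append(f"update inventory: {e['order_id']}"))
bus.publish("order.placed", {"order_id": "ORD-1001"})
print(notifications)
```

Adding a third consumer (say, a shipping-label service) would require no change to the producer or to the other consumers, which is exactly the property that makes event-driven systems easy to extend.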

Now that we’ve reviewed some of the basics of event-driven architecture, let’s look at how it was applied in a real-world logistics company.

Example of an event-driven architecture for supply chain management

Digital Transformation with Event-Driven Architecture in a Logistics Company

We decided to present the following case study to highlight some of the challenges faced by logistics companies on the path of digital transformation. Not only can logistics or trucking companies relate to the issues mentioned below, but also companies from other industries, like manufacturing, healthcare, eCommerce, etc., can take inspiration from this case study.

To keep some of the details we share here private, we decided to change our client’s name to “ABC Trucking.”

Company Description and Business Scenario

ABC Trucking is a trucking company that transports goods across the United States. The company has a fleet of trucks and a team of drivers who pick up and deliver cargo daily. They serve a wide range of customers, and their business model is divided between contract and on-spot transportation (60% vs. 40%).

The company has been in business for over 30 years and has always used traditional, siloed IT systems to track its operations. However, as the business grew and the industry changed, it became clear that this legacy infrastructure could no longer meet the needs of the business. The company needed a more modern IT infrastructure that could support its growth and allow it to compete in the new digital economy.

Coordinating the movements of trucks and drivers and managing price inquiries and orders are complex tasks that require real-time data and constant communication between different parts of the organization. To keep up with the demands of the modern logistics industry, ABC Trucking decided to digitize its operations and move its on-premise infrastructure to a cloud-based event-driven architecture.

Business Use Cases and Event-Driven Architecture Pattern Implementations

Digital Transformation Journey

Over the years, the company has undergone several changes. The most recent change occurred five years ago when the company decided to focus on the digital market and provide an improved customer experience. This shift required them to invest in new technology, including updates in multiple systems and a general shift in business logic.

They needed a new enterprise resource planning (ERP) system and a new transportation management system. They decided to select a SaaS product for invoicing, finances, and ERP, and to transform their current TMS and Order Management System into custom, tailor-made products that would be designed and developed by the company in-house (with the support of external consultants).

Integrating Legacy Infrastructure

The first step in ABC Trucking’s digital transformation journey was integrating its legacy on-premise systems with the new cloud-based SaaS products. This first iteration of hybrid architecture allowed them to keep their existing legacy systems while quickly benefiting from off-the-shelf SaaS products.

At this point, they decided to adopt a hybrid cloud setup where they used eventing technology (in the form of an event bus) to communicate and synchronize data between on-premise and cloud applications. This allowed them to maintain their existing infrastructure while slowly moving towards a fully digital technology based on cloud infrastructure.

Moving to Cloud and Hybrid Architecture

The ongoing digital transformation journey required ABC Trucking to move all its systems to the cloud. They decided to pick AWS Cloud as their cloud services provider and implement a modern logistics infrastructure at scale. This meant migrating existing legacy systems and developing new applications enabling them to take full advantage of the cloud and event-driven architecture.

Their cloud migration process was not an easy one. Over the years, they developed multiple legacy systems for various business processes that couldn’t be simply rehosted through a lift and shift migration strategy. They had to create complex migration plans and refactor existing systems, and in many cases, develop new services and applications from scratch to get the most out of cloud computing and support the new business models.

Supporting a Growing Number of Services

As the business grew, ABC Trucking needed to expand their IT landscape to support a growing number of services and use cases. They didn’t move straight to event-driven architecture; instead, they first implemented a service-oriented architecture (SOA) to hide the complexities of siloed systems and legacy applications behind standardized API contracts and message canonicals.

Customer Ordering System and Microservices Scoping

ABC Trucking decided to use microservices architecture to process orders and other core business processes by breaking down its operations into smaller, independent units that can be developed, tested, and deployed independently. These units, known as microservices, are designed to perform specific tasks or functions and can be combined to form a complete business process.

For example, they developed a microservice for order processing, another for inventory management, and another for shipping and delivery. Each of these microservices can be designed and maintained independently, allowing the company to update and improve its operations more efficiently.

Microservices architecture also made it easier for the logistics company to scale its operations, as it could add new microservices or increase the capacity of existing ones as needed. Additionally, microservices can be deployed in different environments, such as on-premises servers or in the cloud, making it possible to take advantage of the benefits of both approaches.

Overall, using microservices architecture helped ABC Trucking increase its efficiency, agility, and scalability, enabling it to meet its customers’ needs better and stay competitive in an increasingly dynamic market.

Going Event-Driven with Apache Kafka on Confluent

Aggregating data from multiple different data sources (supply chain data, orders, pricing, IoT sensors, and telematics) and continuously processing it in real time as it changes can be somewhat daunting. As the amount of data to be processed grew and real-time stream processing became an apparent necessity, ABC Trucking realized that event-driven architecture was the ideal tool to complete their digital transformation. EDA allowed them to process ever-increasing amounts of data at scale, react quickly to market changes, and enable a more agile approach within their IT infrastructure.

Having decided that event-driven architecture would be the best fit for their logistics business, ABC Trucking chose to partner with Confluent and deploy Apache Kafka on the Confluent Platform. Apache Kafka, as a fully managed service provided by Confluent Cloud, allowed them to build an event mesh across all their services and applications, enabling better communication between different parts of the organization and allowing them to serve and collect data across multiple web and mobile apps (systems that supported both internal business processes and customer-facing services).

The event-driven architecture enabled them to integrate all of their services into one seamless event stream, allowing instantaneous communication between different parts of the organization. This event stream then became the backbone for AI-powered systems, such as predictive analytics, machine learning algorithms, and dynamic pricing technology. But, as you can imagine, these advanced solutions came in later as the company invested more efforts in their digital transformation.

Vehicle Tracking System - Predicting ETAs and Transit Times

By applying an event-driven approach, ABC Trucking was able to extend their logistics capabilities and build a vehicle tracking system that allowed them to predict ETAs and transit times based on market and business events coming from various sources.

Event-driven architecture with Kafka, ksqlDB, and TensorFlow for batch and stream analytics in logistics

For example, the system collected data from IoT sensors installed in vehicles, telematics systems, weather conditions, and other external events that could affect the delivery of goods. By leveraging event-driven architecture, they could process this data in real time while also maintaining an audit trail of all these events to further improve their service quality.

From reactive and real-time visibility of the fleet, cargo details, schedules, etc., to predictive and prescriptive modeling for fleet and route optimization.

Stream analytics and predictive analytics were enabled by implementing the Apache Flink framework for big data processing. Apache Flink allowed the company to understand event signals coming from different sources in real time, identify potential supply chain disruptions, and react to them quickly.
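As a much-simplified illustration of a stream-derived estimate, the sketch below computes an ETA from a window of recent speed readings. A real model of the kind described above would also incorporate traffic, weather, and driver-hours constraints; this toy function only shows the basic shape of the calculation:

```python
from collections import deque

def estimate_eta_minutes(speed_events_kmh, remaining_km, window=5):
    """Naive ETA from the most recent speed readings on a stream.
    A production model would weigh many more signals (traffic, weather,
    mandatory driver breaks); this is only the skeleton of the idea."""
    recent = deque(speed_events_kmh, maxlen=window)  # keep the latest readings
    avg_speed = sum(recent) / len(recent)
    return round(remaining_km / avg_speed * 60, 1)

# 120 km to go, cruising around 80 km/h on average
print(estimate_eta_minutes([78, 82, 80, 79, 81], remaining_km=120))  # -> 90.0
```

In the streaming setup described above, this kind of windowed computation would run continuously inside the stream processor rather than as a one-off function call.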

Real-Time Visibility of All Shipments

Using event-driven architecture, ABC Trucking was able to get real-time visibility of all shipments. The final step of ABC Trucking’s digital transformation journey was to leverage machine learning (ML) and artificial intelligence (AI) technologies to gain insights from the event streams they had created. Using event streams with sensor data and telematics, they can process the data in real time and detect anomalies that would otherwise have gone unnoticed. This event stream became the primary source of truth for the company’s logistics systems and allowed them to make faster decisions across their supply chain operations. This core implementation enabled the computation of delivery estimates based on a vehicle’s current location, so they could provide a better customer experience (and, at the time, a new competitive advantage) with real-time status updates on all orders.

Batch Processing for Lane Analytics

Hybrid solutions that integrate streaming analytics of IoT data with batch analytics to support machine learning solutions and real-time visualizations are essential to automate intricate, multifaceted workflows. Lane analytics was a critical use case behind batch processing at ABC Trucking. Freight analysis is the process of understanding lane-by-lane movement and identifying trends. This helps businesses make decisions about where to dispatch their trucks, how to price their services, and how to optimize their network. With Apache Spark, a unified analytics engine for large-scale data processing, batch processing was used to analyze the event stream of hundreds of thousands of shipments (and even more pricing requests from customers).

At ABC Trucking, event streams from sensors and telematics are processed with Apache Kafka for real-time message ingestion and then stored in Apache Hadoop HDFS. The data stored in HDFS is then transformed into batches by Apache Spark and analyzed using machine learning algorithms to detect anomalies or patterns that could give insights into improving delivery routes and lane utilization. This allows ABC Trucking to get an accurate view of their operations while increasing efficiency and reducing costs related to inefficient fuelling strategy, empty miles, unused truck capacity, and unnecessary deadheads.
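The core of a lane-analytics batch job is an aggregation keyed by origin-destination pair. The plain-Python sketch below (with made-up field names) mirrors the shape of the logic that would run as a Spark job over data in HDFS:

```python
from collections import defaultdict

def lane_summary(shipments):
    """Aggregate a batch of shipment records by lane (origin -> destination).
    In the architecture described above this would be a Spark job over HDFS
    data; plain Python is used here only to show the shape of the logic."""
    stats = defaultdict(lambda: {"count": 0, "revenue": 0.0, "loaded_km": 0.0, "empty_km": 0.0})
    for s in shipments:
        lane = f"{s['origin']}->{s['destination']}"
        stats[lane]["count"] += 1
        stats[lane]["revenue"] += s["price"]
        stats[lane]["loaded_km"] += s["loaded_km"]
        stats[lane]["empty_km"] += s["empty_km"]
    # Derive an empty-miles ratio per lane, a common lane-profitability signal.
    return {lane: {**v, "empty_ratio": round(v["empty_km"] / (v["loaded_km"] + v["empty_km"]), 2)}
            for lane, v in stats.items()}

shipments = [
    {"origin": "Chicago", "destination": "Dallas", "price": 2400.0, "loaded_km": 1500, "empty_km": 100},
    {"origin": "Chicago", "destination": "Dallas", "price": 2550.0, "loaded_km": 1500, "empty_km": 300},
]
print(lane_summary(shipments))
```

At the scale mentioned above (hundreds of thousands of shipments), the same group-by-and-aggregate structure simply gets distributed across a Spark cluster instead of running in a single process.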

Real-Time Analytics for Dynamic Pricing Models

The transportation and supply chain sector is a highly competitive one. Shippers require instant quotes for their freight, and carriers need to respond quickly to remain competitive. ABC Trucking decided to implement a dynamic pricing model based on event-driven architecture to meet customer expectations and improve operations efficiency.

Real-time stream processing is used to calculate market prices for each lane by considering factors like freight details (loadings, unloadings, cargo type, miles), fuel price trends, traffic patterns, location attractiveness and headhaul probability, shipment consolidation options, customer segmentation, customer churn prediction, negotiation trends, and competitor prices. This data is then fed into machine learning algorithms (based on reinforcement learning) that can predict optimal pricing for each shipment. This helps them stay competitive in an increasingly crowded logistics industry by improving margins and customer loyalty, while providing customers with accurate quotes in real time.
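The actual pricing models are learned, but a transparent rule-based sketch helps show how such factors might combine into a quote. All factor names and coefficients below are illustrative assumptions, not values from the real system:

```python
def quote_price(base_rate_per_km, distance_km, fuel_index=1.0,
                demand_index=1.0, headhaul_probability=0.5):
    """Illustrative rule-based quote. The production system described above
    uses learned (reinforcement-learning) models; these factor names and
    coefficients are hypothetical stand-ins."""
    price = base_rate_per_km * distance_km
    price *= fuel_index    # pass fuel price trends through to the quote
    price *= demand_index  # surge on lanes where capacity is tight
    # Lanes with a good chance of a paid backhaul can be priced more aggressively.
    price *= (1.1 - 0.2 * headhaul_probability)
    return round(price, 2)

print(quote_price(1.6, 800, fuel_index=1.05, demand_index=1.2, headhaul_probability=0.8))
```

A learned model replaces the hand-tuned multipliers with coefficients (or a policy) fitted to historical win/loss and margin data, but the inputs it consumes are the same kinds of lane-level signals listed above.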

A roadmap of key features that drove a successful dynamic pricing strategy for FTL transportation

Predictive Maintenance for Fleet Management

Fleet management is a critical component of logistics operations. To reduce downtime and ensure fleet uptime, event-driven architecture can be leveraged to process data from vehicle sensors in real-time.

Predictive maintenance for trucks - Architecture example based on Apache Kafka and ksqlDB

By analyzing event streams from the truck’s engine, other IoT sensors onboard, historical data on routes realized by a particular vehicle, and its maintenance history, ABC Trucking was able to detect anomalies and predict maintenance needs before they became costly problems. This event stream is fed into predictive maintenance models based on machine learning algorithms that analyze the data for patterns, predict failures before they occur, and plan preventive maintenance. This proactive approach helps them reduce operational costs while ensuring that their trucks run smoothly.
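A minimal stand-in for such anomaly detection is a z-score check against a vehicle's own baseline. The real system uses trained ML models; this sketch only illustrates the idea of flagging readings that deviate strongly from the norm:

```python
import statistics

def flag_anomalies(readings, z_threshold=2.5):
    """Flag sensor readings that deviate strongly from the vehicle's own
    baseline. A simple z-score stand-in for the trained models described
    above; the threshold is an illustrative choice, not a tuned value."""
    mean = statistics.fmean(readings)
    stdev = statistics.stdev(readings)
    return [(i, r) for i, r in enumerate(readings)
            if stdev > 0 and abs(r - mean) / stdev > z_threshold]

# Engine temperature stream (deg C): one reading spikes well above baseline.
engine_temp_c = [88, 90, 89, 91, 90, 88, 132, 89, 90]
print(flag_anomalies(engine_temp_c))
```

In the event-driven setup, a flagged reading would itself be published as a new event (e.g. a maintenance alert) for downstream scheduling services to consume.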

Data from sensors on fleet vehicles and additional data sources feed into predictive maintenance systems

Monitoring Cold Supply Chain Network and Reefer Containers

Cold supply chains transport temperature-sensitive products, such as food and pharmaceuticals, across the logistics network. To ensure that these items are not exposed to unsafe temperatures and spoilage risks, event-driven architecture can be used to monitor containers in real time.

By ingesting event streams from sensors attached to refrigerated containers and other components of their cold supply chain networks, ABC Trucking could detect any changes in temperature or pressure levels before they became an issue. This helps them ensure safe product delivery while taking proactive steps to address potential problems before they arise.
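A simplified version of this monitoring logic can be sketched as a scan over an ordered stream of temperature readings, reporting sustained excursions above a safe threshold. The thresholds and event format below are illustrative, not regulatory values:

```python
def detect_breaches(temp_events, max_temp_c, max_breach_minutes=10):
    """Scan an ordered stream of (minute, temperature) readings from a reefer
    container and report sustained excursions above the safe threshold.
    Brief spikes shorter than max_breach_minutes are tolerated; a breach
    still open at the end of the stream is not reported in this sketch."""
    breaches, start = [], None
    for minute, temp in temp_events:
        if temp > max_temp_c:
            start = minute if start is None else start  # excursion begins or continues
        else:
            if start is not None and minute - start >= max_breach_minutes:
                breaches.append((start, minute))
            start = None
    return breaches

# Reefer readings every 5 minutes; safe limit 5.0 deg C for this cargo.
events = [(0, 3.5), (5, 4.2), (10, 7.1), (15, 7.4), (20, 6.9), (25, 3.9)]
print(detect_breaches(events, max_temp_c=5.0))
```

In a streaming deployment the same logic would run as stateful processing per container (e.g. in ksqlDB or Flink), emitting an alert event the moment a breach duration crosses the tolerance.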

Swap Body Disposition

Swap bodies are containers that can be swapped on a delivery truck and reused for the next shipment. This reduces costs related to shipping empty containers back to their origin, as well as storage costs associated with storing them in warehouses while they await return transportation.

By analyzing event streams from orders and supply chain analytics, ABC Trucking could track and predict where swap bodies will be needed in the future to prioritize order fulfillment based on anticipated demand. This helps them reduce operational costs by ensuring the timely delivery of goods while also reducing storage costs associated with maintaining a large inventory of swap bodies.
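At its simplest, anticipating swap-body demand amounts to projecting confirmed orders per location over a planning horizon. The sketch below uses a plain count as a stand-in for the predictive models mentioned above (field names are hypothetical):

```python
from collections import Counter

def swap_body_demand(orders, horizon_days=7):
    """Project where swap bodies will be needed by counting confirmed orders
    per origin terminal within the planning horizon. A simple proxy for the
    predictive models mentioned above; real forecasts would also use
    historical seasonality and in-transit equipment positions."""
    upcoming = [o["origin"] for o in orders if o["pickup_in_days"] <= horizon_days]
    return Counter(upcoming).most_common()

orders = [
    {"origin": "Berlin", "pickup_in_days": 2},
    {"origin": "Berlin", "pickup_in_days": 5},
    {"origin": "Hamburg", "pickup_in_days": 3},
    {"origin": "Munich", "pickup_in_days": 12},  # outside the planning horizon
]
print(swap_body_demand(orders))
```

The ranked output is what a dispatcher (or an automated repositioning service consuming the same event stream) would use to decide where to stage empty swap bodies next.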

Digital Twins of Supply Chain Network and Modeling Business Processes

Using event-driven architecture, ABC Trucking was also able to create digital twins of their entire logistics network. This enabled them to simulate different scenarios and analyze the effects on their supply chain performance.

By modeling their business processes and analyzing event streams from all facets of their operations, they could identify bottlenecks in the system that changes in processes or digital technologies upgrades could address. This allows them to optimize their operations for increased operational efficiency and cost savings while remaining agile enough to respond quickly to changes in demand.

Event Tracking and Monitoring of the Infrastructure at Scale

Implementing a digital business based on EDA at scale and monitoring the whole infrastructure can be daunting. ABC Trucking faced several challenges while transitioning to event-driven architecture, such as managing massive amounts of events, ensuring data accuracy and quality, and developing analytics and observability models that could process event streams quickly.

Observability is key to success in the long run. ABC Trucking needed to implement dedicated solutions to fully understand what’s going on with everything — their data, their artificial intelligence models, infrastructure, code, users, and so on.

Summary - A Fully-Fledged EDA for a Transportation Company

Event-driven architecture is revolutionizing the logistics industry by providing unprecedented visibility into supply chains and business operations while enabling organizations to reduce operational costs by optimizing processes for greater efficiency.

ABC Trucking implemented event streams for their most business-critical operations, including tracking trucks in real-time, pricing shipments, fleet inventory management, predictive maintenance, managing customer orders, and coordinating driver schedules. By leveraging event streams to track events across the company’s IT infrastructure, ABC Trucking was able to build up a continuous flow of data that provided visibility into their operations and helped them identify areas where improvements could be made.

ABC Trucking’s digital transformation journey is a testament to what event-driven architecture can do for businesses of all sizes and the wider supply chains and logistics sector. By leveraging Apache Kafka on Confluent Cloud and transforming their infrastructure into distributed systems based on AWS Cloud, they could move away from legacy applications and databases, improve their digital capabilities, and embrace event streams as their primary source of information.

Challenges of Moving Ahead With Digital Transformation and AI Maturity for Logistics Businesses

The event-driven architecture enables logistics businesses to become more agile and efficient, but it’s not without its challenges. Companies need to ensure that event tracking is done accurately and reliably, as well as develop analytics models that can process event streams quickly and in a scalable manner to support big data. Additionally, they need to have the right tools to monitor their infrastructure at scale to gain a complete understanding of what’s happening in their operations.

But by leveraging event streaming technologies such as Apache Kafka and stream processing tools like Apache Flink, companies can unlock the potential of event-driven architecture and enjoy long-term benefits like improving operational efficiency, providing cost savings, and increasing customer service levels.

ABC Trucking’s experience shows just how robust event-driven architectures are for modernizing logistics businesses. The event-driven approach enabled them to better understand their operations, track events more accurately and reliably, and be agile enough to respond quickly to changes in demand. Moving forward with digital transformation and AI maturity can help companies stay ahead of the curve and remain competitive in today’s ever-changing environment.

If you’re interested in learning more about event-driven architecture and how it can benefit your business, please don’t hesitate to reach out to us. We’d be happy to discuss your specific needs and see how we can help you get started on your own digital transformation journey. Our team of experts has years of experience helping businesses like yours transition to an event-driven system and reap all the benefits that come with it.

About the author

Dorota Owczarek

AI Product Lead & Design Thinking Facilitator

With over ten years of professional experience in designing and developing software, Dorota is quick to recognize the best ways to serve users and stakeholders by shaping strategies and ensuring their execution by working closely with engineering and design teams.
She acts as a Product Leader, covering the ongoing AI agile development processes and operationalizing AI throughout the business.

Would you like to discuss AI opportunities in your business?

Let us know and Dorota will arrange a call with our experts.


This article is a part of the AI in Logistics series (51 articles).

Artificial Intelligence is becoming an essential element of Logistics and Supply Chain Management, where it offers many benefits to companies willing to adopt emerging technologies. AI can change how companies operate by providing applications that streamline planning, procurement, manufacturing, warehousing, distribution, transportation, and sales.

Follow our article series to find out the applications of AI in logistics and how this tech benefits the whole supply chain operations.



In the interests of your safety and to implement the principle of lawful, reliable and transparent processing of your personal data when using our services, we developed this document called the Privacy Policy. This document regulates the processing and protection of Users’ personal data in connection with their use of the Website and has been prepared by Nexocode.

To ensure the protection of Users' personal data, Nexocode applies appropriate organizational and technical solutions to prevent privacy breaches. Nexocode implements measures to ensure security at the level which ensures compliance with applicable Polish and European laws such as:

  1. Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) (published in the Official Journal of the European Union L 119, p 1); Act of 10 May 2018 on personal data protection (published in the Journal of Laws of 2018, item 1000);
  2. Act of 18 July 2002 on providing services by electronic means;
  3. Telecommunications Law of 16 July 2004.

The Website is secured by the SSL protocol, which provides secure data transmission on the Internet.

1. Definitions

  1. User – a person that uses the Website, i.e. a natural person with full legal capacity, a legal person, or an organizational unit which is not a legal person to which specific provisions grant legal capacity.
  2. Nexocode – NEXOCODE sp. z o.o. with its registered office in Kraków, ul. Wadowicka 7, 30-347 Kraków, entered into the Register of Entrepreneurs of the National Court Register kept by the District Court for Kraków-Śródmieście in Kraków, 11th Commercial Department of the National Court Register, under the KRS number: 0000686992, NIP: 6762533324.
  3. Website – website run by Nexocode, at the URL: nexocode.com whose content is available to authorized persons.
  4. Cookies – small files saved by the server on the User's computer, which the server can read when when the website is accessed from the computer.
  5. SSL protocol – a special standard for transmitting data on the Internet which unlike ordinary methods of data transmission encrypts data transmission.
  6. System log – the information that the User's computer transmits to the server which may contain various data (e.g. the user’s IP number), allowing to determine the approximate location where the connection came from.
  7. IP address – individual number which is usually assigned to every computer connected to the Internet. The IP number can be permanently associated with the computer (static) or assigned to a given connection (dynamic).
  8. GDPR – Regulation 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of individuals regarding the processing of personal data and onthe free transmission of such data, repealing Directive 95/46 / EC (General Data Protection Regulation).
  9. Personal data – information about an identified or identifiable natural person ("data subject"). An identifiable natural person is a person who can be directly or indirectly identified, in particular on the basis of identifiers such as name, identification number, location data, online identifiers or one or more specific factors determining the physical, physiological, genetic, mental, economic, cultural or social identity of a natural person.
  10. Processing – any operations performed on personal data, such as collecting, recording, storing, developing, modifying, sharing, and deleting, especially when performed in IT systems.

2. Cookies

The Website is secured by the SSL protocol, which provides secure data transmission on the Internet. The Website, in accordance with art. 173 of the Telecommunications Act of 16 July 2004 of the Republic of Poland, uses Cookies, i.e. data, in particular text files, stored on the User's end device.
Cookies are used to:

  1. improve user experience and facilitate navigation on the site;
  2. help to identify returning Users who access the website using the device on which Cookies were saved;
  3. creating statistics which help to understand how the Users use websites, which allows to improve their structure and content;
  4. adjusting the content of the Website pages to specific User’s preferences and optimizing the websites website experience to the each User's individual needs.

Cookies usually contain the name of the website from which they originate, their storage time on the end device and a unique number. On our Website, we use the following types of Cookies:

  • "Session" – cookie files stored on the User's end device until the Uses logs out, leaves the website or turns off the web browser;
  • "Persistent" – cookie files stored on the User's end device for the time specified in the Cookie file parameters or until they are deleted by the User;
  • "Performance" – cookies used specifically for gathering data on how visitors use a website to measure the performance of a website;
  • "Strictly necessary" – essential for browsing the website and using its features, such as accessing secure areas of the site;
  • "Functional" – cookies enabling remembering the settings selected by the User and personalizing the User interface;
  • "First-party" – cookies stored by the Website;
  • "Third-party" – cookies derived from a website other than the Website;
  • "Facebook cookies" – You should read Facebook cookies policy: www.facebook.com
  • "Other Google cookies" – Refer to Google cookie policy: google.com

3. How System Logs work on the Website

User's activity on the Website, including the User’s Personal Data, is recorded in System Logs. The information collected in the Logs is processed primarily for purposes related to the provision of services, i.e. for the purposes of:

  • analytics – to improve the quality of services provided by us as part of the Website and adapt its functionalities to the needs of the Users. The legal basis for processing in this case is the legitimate interest of Nexocode consisting in analyzing Users' activities and their preferences;
  • fraud detection and the identification and countering of threats to the stability and correct operation of the Website.

4. Cookie mechanism on the Website

Our site uses basic cookies that facilitate the use of its resources. Cookies contain useful information and are stored on the User's computer – our server can read them when connecting to this computer again. Most web browsers allow cookies to be stored on the User's end device by default. Each User can change their Cookie settings in the web browser settings menu:

Google Chrome
Open the menu (click the three-dot icon in the upper right corner), Settings > Advanced. In the "Privacy and security" section, click the Content Settings button. In the "Cookies and site data" section you can change the following Cookie settings:

  • Deleting cookies,
  • Blocking cookies by default,
  • Default permission for cookies,
  • Saving Cookies and website data by default and clearing them when the browser is closed,
  • Specifying exceptions for Cookies for specific websites or domains

Internet Explorer 6.0 and 7.0
From the browser menu (upper right corner): Tools > Internet Options > Privacy, click the Sites button. Use the slider to set the desired level, confirm the change with the OK button.

Mozilla Firefox
From the browser menu: Tools > Options > Privacy and security. Activate the "Custom" field. From there, you can check a relevant field to decide whether or not to accept cookies.

Open the browser’s settings menu: Go to the Advanced section > Site Settings > Cookies and site data. From there, adjust the setting: Allow sites to save and read cookie data

Safari
In the Safari drop-down menu, select Preferences and click the Security icon. From there, select the desired security level in the "Accept cookies" area.

Web browsers, by default, allow Cookies to be stored on the User's end device, and Users can freely adjust their Cookie settings: the web browser allows you to delete Cookies or block them automatically, with detailed instructions provided in the help or documentation of the specific web browser. Disabling Cookies in your browser does not deprive you of access to the resources of the Website. However, disabling Cookies necessary for authentication, security, or remembering User preferences may impact the user experience, or even make the Website unusable.

5. Additional information

External links may be placed on the Website, enabling Users to directly reach other websites. Also, while using the Website, cookies may be placed on the User's device by other entities, in particular third parties such as Google, in order to enable the use of the functionalities of the Website integrated with these third parties. Each of these providers sets out the rules for the use of cookies in its privacy policy, so for security reasons we recommend that you read the privacy policy document before using these pages. We reserve the right to change this privacy policy at any time by publishing an updated version on our Website; after a change is made, the updated privacy policy will be published on this page with a new date. For more information on the conditions of providing services, in particular the rules of using the Website, contracting, as well as the conditions of accessing content and using the Website, please refer to the Website's Terms and Conditions.

Nexocode Team