10 Real-World Event-Driven Architecture Examples in Logistics. Implementing Kafka at Scale to Handle Supply Chain Network

Dorota Owczarek - October 9, 2022 - updated on May 27, 2024

Digital transformation and data management are two of the biggest challenges that companies in the logistics and transportation sector currently face in their tech departments. The old ways of doing things just don't work anymore - companies need new ways to manage data and ensure it is accessible and actionable in real time. Digital transformation projects are struggling to keep up with the demands of the modern supply chain network. But all is not lost! Event-driven architecture (EDA) provides a solution that can help businesses handle the torrents of data coming from their transportation and logistics systems. Apache Kafka is at the heart of many EDA implementations, providing a platform for streaming data in real time.

In this blog post, we will take a look at ten real-world event-driven architecture examples from some of the biggest names in logistics. We will see how they are using Kafka to handle their supply chain networks and learn what new opportunities event-driven architecture can bring to logistics businesses.

Challenges Facing Today's Logistics and Supply Chain Systems in Digital Transformation

There are many challenges that logistics and transportation companies face when it comes to digital transformation and data management. Let’s take a look at some of the most common ones:

  • Lack of real-time visibility into the supply chain network. This can lead to delays, errors, and inefficiencies.
  • Inability to track and trace shipments or fleets throughout the supply chain. This leads to lost or damaged goods, as well as missed opportunities for customer engagement.
  • Difficulty integrating new applications and technologies into existing systems. This can lead to silos of information and stovepiped processes.
  • High costs associated with maintaining legacy systems. These outdated systems are often complex and challenging to maintain, leading to high IT costs.
  • Massive amounts of logistics data that need to be processed. This data comes from various sources, including GPS devices, RFID tags, sensors, and ERP systems. It can be challenging to make sense of everything and turn it into actionable insights.
  • Lack of agility and flexibility. Many logistics companies are stuck in their ways, using legacy systems that are inflexible and cannot keep up with the rapidly changing needs of the business.
  • A need for a cost-efficient data-processing architecture. As logistics companies strive to become more efficient and cut costs, they are looking for ways to reduce the complexity of their IT systems. With growing amounts of data, they need a way to process it all without breaking the bank.
  • Difficulty in approaching a migration to cloud services. Many logistics companies are hesitant to move their data and applications to the cloud due to security concerns and the complexity of the migration process. The length of the process and changing demands make it harder for companies that are undergoing migration to the cloud to keep up and foresee any issues that could arise.
  • Inability to leverage artificial intelligence solutions without a significant overhaul. Many logistics companies have been using outdated software and simplified architectures for years that are not compatible with newer technologies. Even though, in many cases, they have been collecting historical data for years, accessing and using it for AI applications is not possible without a major system update. As a result, they are unable to take advantage of newer solutions, such as artificial intelligence or machine learning, which could help them optimize their business processes.

Digital Transformation with Event-Driven Architecture to the Rescue!

Event-Driven Architecture can help solve many of these challenges by providing a way to create distributed systems that enable data stream processing in real-time and respond quickly to changes in the environment. EDA can give logistics companies the agility and flexibility they need to keep up with the demands of their business.

How Does Event-Driven Architecture Work?

Event-Driven Architecture is a type of architecture that enables applications to communicate with each other by producing and reacting to events. In an event-driven system, an event can be anything that happens in the environment, such as a change in state, a message being sent, or data being received. When we think about the logistics and supply chain environment, events can be orders, truck position updates, information on delivery statuses, payment updates, etc.

Logistics event examples. How various logistics systems and business processes produce a single event.

EDA has two key components:

  • Event producer - responsible for generating the event.
  • Event consumer - responsible for reacting to the event.

In order for an application to be able to react to an event, it must first be subscribed to that event. Once it is subscribed, the application will receive the event whenever it occurs.

Many different types of events can occur in a system, but they can generally be categorized into two types: system events and business events.

System events are generated by the underlying infrastructure and are typically used to trigger a reaction from the application, such as starting or stopping a process.

Business events, on the other hand, are generated by the business itself and usually contain information that is relevant to the operation of the business. For example, a business event might be an order being placed, a shipment being received, or a customer complaint being filed.
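The producer/consumer relationship described above can be sketched with a minimal in-memory event bus. This is an illustrative toy, not Kafka itself, and all event names and payloads here are hypothetical:

```python
from collections import defaultdict

class EventBus:
    """Minimal in-memory event bus: producers publish events, consumers react to them."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, event_type, handler):
        # A consumer must subscribe before it can receive events of this type
        self._subscribers[event_type].append(handler)

    def publish(self, event_type, payload):
        # The producer knows nothing about who is listening
        for handler in self._subscribers[event_type]:
            handler(payload)

bus = EventBus()
received = []

# Consumer: reacts to a business event
bus.subscribe("order.placed", lambda event: received.append(event["order_id"]))

# Producer: emits the event when an order is placed
bus.publish("order.placed", {"order_id": "ORD-42", "items": 3})
```

Note how the producer's `publish` call contains no reference to any consumer: adding a second consumer (say, a warehouse service) is just another `subscribe` call, with no change to the producer.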

Example of an event-driven architecture for supply chain management

One of the benefits of using Event-Driven Architecture is that it decouples applications from each other. This means that applications can communicate without knowing anything about each other's implementation details. Traditional approaches, such as service-oriented architecture, require applications to know the details of the other services they need to interact with. This can lead to tight coupling between services, making it difficult to change existing services or add new ones that support new business logic.

Another benefit of EDA is that it allows for greater flexibility when adding or removing applications from the system. Unlike in a traditional architecture, where changes to the system can be difficult and time-consuming, in an event-driven system, applications can be added or removed without affecting the rest of the system.

Finally, Event-Driven Architecture can also improve performance by reducing the number of unnecessary calls between services. In a traditional architecture, service A might need to call service B every time it requires data that service B holds. In an event-driven system, service A instead subscribes to the event generated when data changes in service B, and receives the data as soon as the event occurs. This cuts out repeated request/response round trips, reducing network traffic and improving overall performance.
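The call-reduction idea can be made concrete: instead of querying service B on every request, service A maintains a local read model that change events keep fresh. A minimal sketch, with hypothetical service and event names:

```python
class PricingCache:
    """Service A's local read model, kept fresh by Service B's price-change events."""
    def __init__(self):
        self._prices = {}

    def on_price_changed(self, event):
        # Invoked for every "price.changed" event published by Service B
        self._prices[event["sku"]] = event["price"]

    def quote(self, sku):
        # No synchronous call to Service B is needed at quote time
        return self._prices.get(sku)

cache = PricingCache()

# Service B publishes a change; Service A's handler applies it locally
cache.on_price_changed({"sku": "PALLET-EU", "price": 12.5})

print(cache.quote("PALLET-EU"))  # 12.5
```

The trade-off is eventual consistency: the cache reflects the last event received, not necessarily the latest state in service B, which is usually acceptable for read-heavy logistics lookups.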

What is Apache Kafka?

Apache Kafka is a distributed streaming platform that can be used to build event-driven architectures. Kafka is designed to handle high throughput and low latency, making it a good choice for building real-time applications.

Kafka consists of a few key components:

  • Producers - responsible for generating events.
  • Consumers - responsible for consuming events.
  • Topics - named streams into which related events are grouped.
  • Brokers - responsible for storing and replicating events.

How Does Kafka Work?

Kafka works by providing a publish/subscribe messaging system built on top of a distributed commit log. When an event occurs, the producer publishes it to a specific topic in Kafka. The brokers store the event durably, and every consumer subscribed to that topic then receives it.
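The commit-log model can be sketched in a few lines: events are appended to a topic, and each consumer tracks its own read offset, so consumers progress independently of producers and of each other. This is a toy model of the concept, not the Kafka protocol:

```python
class Topic:
    """Toy append-only log: producers append events, each consumer reads at its own offset."""
    def __init__(self):
        self._log = []

    def append(self, event):
        self._log.append(event)
        return len(self._log) - 1  # offset at which the event was stored

    def read_from(self, offset):
        # Consumers pull everything from their own offset onward
        return self._log[offset:]

topic = Topic()
topic.append({"parcel": "P1", "status": "collected"})
topic.append({"parcel": "P1", "status": "in_transit"})

# Two independent consumers at different offsets see different slices of the same log
analytics_events = topic.read_from(0)  # has processed nothing yet: gets both events
tracking_events = topic.read_from(1)   # already processed offset 0: gets only the new event
```

Because the log is durable and offsets belong to consumers, a slow or restarted consumer simply resumes from its last offset; the producer never waits for it.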

Kafka and Kafka Connect joining multiple systems

This architecture has a few key benefits:

  • It allows for the decoupling of producers and consumers.
  • It provides high availability and fault tolerance through replication.
  • It allows for horizontal scaling by adding more brokers to the cluster.

Now that we've briefly seen how Kafka works, let's look at some real-world examples of where it can be used.

10 Use Cases of Event-Driven Architecture with Apache Kafka in Logistics Companies

Event-driven architecture is an approach to software development that has recently been gaining popularity in the world of logistics. Industry leaders and smaller enterprises alike are turning their attention to event-driven architecture as a way to improve their operations. To see where this technology could be implemented in your business, check out these ten real-world examples of event-driven architecture being used by logistics companies.

Hermes - Real-Time Analytics and Visibility Into the Entire Logistics Process

The German postal and logistics company Hermes runs one of the largest Kafka clusters in Europe, using it to provide real-time analytics and visibility into the entire logistics process. Hermes's cloud-based event-driven infrastructure replicates data rapidly and securely into Kafka, enabling real-time decision-making and greater visibility across the entire logistics process.

Advanced analytics and delivery heatmap at Hermes. The architecture runs as a distributed system spread across multiple data centers with high scalability potential

By processing over a billion events daily, Hermes can track every package and parcel throughout its journey, from collection to delivery. This allows them to optimize their operations and provide their customers with up-to-the-minute information on the status of their deliveries. Thanks to the implemented solution, the company can optimize the distribution and disposition of swap bodies, as well as control parcel and delivery traffic in real time. Digital tour sorting supports sorting parcels and planning delivery tours ahead of time, based on forecasts calculated from Kafka event streams.

Austrian Post - Achieving Greater Efficiency and Visibility with Parcel Track & Trace Systems

Austrian Post, the national postal service of Austria, uses Kafka running on Microsoft Azure as part of its parcel track & trace system. Data streaming allows them to achieve greater efficiency and visibility in their operations. By using Kafka, they can process massive numbers of events per second and update their customers in real time on the status of their deliveries. Austrian Post creates digital twins of all parcels to accurately reflect the current state of its business process and, when needed, to run simulations and analytics for process improvement.

This event-driven infrastructure has helped Austrian Post to become one of the most efficient postal services in Europe, with one of the lowest rates of undelivered parcels.

Penske - Creating a Connected Fleet With IoT-Scale Event-Stream Processing Based on Apache Kafka

Before we jump into business applications, let’s have a quick look at some technical numbers. Penske, a leading transportation services provider in the US, uses Apache Kafka and event-stream processing to create a connected fleet. Penske’s entire fleet of trucks produces 2 billion pings per day to the vehicle tracking system; the connected fleet architecture consists of 850+ app instances with 50+ streams ingesting 700GB of data daily. And these are just numbers the company shared in 2020.

Penske's vehicles are equipped with sensors that collect data such as GPS position, speed, mileage, fuel level, and engine temperature. This sensor data is streamed in real time to Kafka, where it is processed and analyzed. Thanks to that, Penske has a real-time locating system and can run remote diagnostics on any truck anywhere in the country. It also allows them to understand fleet performance better and make data-driven fleet management decisions.

This helps Penske improve the efficiency of its operations by reducing downtime and maximizing the utilization of its assets. A machine learning algorithm processes vehicle sensor data to determine the best course of action for each truck. With predictive and prescriptive maintenance, Penske can proactively address issues before they result in truck breakdowns.

Kuehne + Nagel - Connecting Over 200 Applications in the Cloud and Orchestrating Their Interconnectivity Using Apache Kafka to Support Shipping and Air Freight Management

Synchronizing data in real-time across over 200 applications in the cloud and orchestrating their interconnectivity is no small feat. But thanks to Apache Kafka, Kuehne + Nagel, a leading international logistics company, can do just that.

Kuehne + Nagel's event-driven architecture uses Kafka as a central messaging platform connecting all their applications (the KNITE platform). This allows them to share data between different parts of the business and to trigger events based on changes in any part of the system.

Sennder - Machine Learning Powered Digital Freight Marketplace That Processes 3M+ Real-Time Data Points Daily with Kafka

Sennder, a digital freight marketplace, uses Apache Kafka and machine learning to power its business.

As a freight forwarding company, they offer a digital marketplace where shippers post their loads and carriers offer their transportation services. To do that, they need to process a massive amount of data - over three million data points daily.

Kafka is used to ingest all of this data in real time and make it available for processing by Sennder's machine learning algorithms. This allows them to match the right carrier with the right load (carrier matching solution), which results in lower costs and fewer empty miles for carriers. They are also constantly improving their dynamic pricing solution, which needs instant access to all the data available within the Sennder data ecosystem and from external data sources to produce the right quotes. The value that Kafka brings to their business is immeasurable.

Deutsche Bahn - Passenger Information Application for Real-Time Rail Data Streaming

Deutsche Bahn (DB), the German railway company, has developed a Passenger Information Application (PIA or RI-Plattform) that uses Apache Kafka for streaming real-time rail data.

The PIA is a mobile app and website that provide DB's more than 5.7 million daily rail passengers with up-to-the-minute information about their journeys, such as delays, platform changes, and disruptions, based on over 180 million events processed every single day. The mobile application also offers a chatbot interface for improved customer experience.

To provide this level of service, the PIA needs to process a large amount of data in real time - everything from train schedules to live sensor data. All this data is ingested into Kafka, where it is processed and made available to the PIA in near-real-time.

The event-streaming architecture also enabled the addition of supporting microservices. For example, the DB team created a microservice that uses sensor data from station platforms to detect when a train arrives. The microservice then triggers an announcement over the station’s loudspeakers.

The initiative, which combines Kafka Streams with the Confluent Platform, has proven to be very reliable, helping Deutsche Bahn maintain 99.9 percent availability from the start (reliability is another crucial benefit of using Apache Kafka and EDA).

Singapore Airlines - ksqlDB and Kafka Streams for Predictive Maintenance Pipeline

Have you heard of ksqlDB? It's an open-source streaming SQL engine built on Apache Kafka, and Singapore Airlines is using it to power its predictive maintenance pipeline.

The goal of the pipeline is to detect potential issues with aircraft components before they cause problems. Data from sensors on the planes is ingested into Kafka, where it is filtered by ksqlDB (by plane type, tail number, etc.) and transformed by Kafka Streams, which calls a machine learning model for predictive maintenance. The analysis results are then used to trigger alerts when there may be a problem with a particular component.
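In stream-processing terms, that filter-then-transform-then-score flow looks roughly like the sketch below. It is a plain Python illustration of the pipeline shape, not the real ksqlDB/Kafka Streams code, and the scoring function, thresholds, and field names are all hypothetical:

```python
def risk_score(reading):
    # Stand-in for the real machine learning model: flag high-vibration readings
    return 1.0 if reading["vibration"] > 0.8 else 0.0

def maintenance_alerts(readings, plane_type="A350"):
    """Filter readings by plane type (the ksqlDB step), then score each one
    and emit an alert when the model flags a component (the Kafka Streams step)."""
    for reading in readings:
        if reading["plane_type"] != plane_type:
            continue  # filter step: drop planes we are not monitoring
        score = risk_score(reading)  # transform + model-scoring step
        if score >= 1.0:
            yield {"tail_number": reading["tail_number"],
                   "component": reading["component"],
                   "risk": score}

stream = [
    {"plane_type": "A350", "tail_number": "9V-SMA", "component": "engine-1", "vibration": 0.9},
    {"plane_type": "B777", "tail_number": "9V-SWB", "component": "engine-2", "vibration": 0.95},
    {"plane_type": "A350", "tail_number": "9V-SMC", "component": "pump-3", "vibration": 0.2},
]
alerts = list(maintenance_alerts(stream))  # one alert, for 9V-SMA
```

In the production pipeline the filter would be a continuous ksqlDB query over a Kafka topic and the scoring a Kafka Streams processor, so alerts fire as events arrive rather than over a finished list.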

Predictive Maintenance in Logistics - architecture example from Singapore Airlines

So far, the system has successfully detected issues early, proactively resolved them, and prevented them from causing delays or cancellations. The whole setup is a fault-tolerant solution that is scalable and enables the integration of new tools and streaming analytics.

Here Technologies - Kafka Streaming APIs to Share Mapping Content, Routing, Navigation, Traffic Data, and Location Services

Here Technologies is a leading provider of mapping content, routing, navigation, traffic, and location services. Millions of people use their products daily in over 100 countries.

They have developed a set of Kafka Streaming APIs that allow their partners to explore and consume mapping and location data in real time (instead of the classic REST API approach, which simply stops scaling at some point). This allows their partners to build innovative new applications and services on top of Here Technologies' data.

Lyft - Supporting Ride Sharing with Kafka

Lyft is a ride-sharing company that uses Kafka for various purposes, including real-time ride-matching.

When you request a ride on the Lyft app, the request is sent to a Kafka cluster. A set of microservices then consumes the request and starts searching for an available driver. Once a driver is found, the information is sent back to the customer in real-time.

This process happens very quickly - usually within seconds - and wouldn’t be possible without Kafka.

Uber - One of the Largest Kafka Deployments in the World

While we cannot name them all, some of the largest Kafka deployments in the world are by tech unicorn companies such as Uber.

Kafka is used extensively at Uber for a variety of tasks. It supports more than 300 microservices, processing petabytes of data daily in close to real time: passing event data from the rider, driver, and customer service apps, streaming database changelogs to subscribers, and processing data from the Hadoop data lake (e.g., rider and driver location tracking and real-time ride matching).

In fact, they have so much data that they built their own platform for building streaming data lakes, called Hudi (since donated to the Apache Software Foundation). Uber's big data architecture also relies on tools like Spark, Parquet, Presto, and Hive.

The Path Toward Event-Driven Architecture for Logistics Companies

The examples highlighted above vary from one another, just as each company's path to a production-ready EDA implementation differs. Companies that have been industry leaders for decades have relied on legacy solutions and on-premise hosting. Moving to cloud infrastructure, in many cases, meant a hybrid architecture first, with some data still residing on-premise while new applications ran in the cloud; these companies chose Kafka to enable digital transformation. Others, born in the digital age, have taken a more modern approach, using novel technologies from the get-go. But there is one commonality between them: they all needed a way to process large amounts of data quickly, reliably, and at scale - precisely what Apache Kafka excels at.

Logistics companies are now turning to EDA for its ability to support real-time data processing and complex event processing. It is important to note that as the company grows and the market changes, the IT infrastructure has to adapt to transforming demands. Therefore, the work on your event-driven architecture is never "done." The event-driven approach is not a one-time project with a finite end; it is an evolutionary journey that should be constantly monitored, tweaked, and improved.

This is the only way to ensure that you are making the most of your investment and reaping all the benefits of event-driven architecture.

Why Should You Consider Event-Driven Architecture and Apache Kafka for Your Business?

Digital transformation is essential for the logistics and transportation sector, as it is for many other industries. Apache Kafka and Event-Driven Architecture provide a way to manage the data it generates effectively and efficiently, saving time and money while improving customer experience.

If you’re still on the fence about event-driven architecture and Apache Kafka, hopefully, this article has given you a few reasons to consider them. Both offer great benefits for businesses of all sizes and industries.

At nexocode, we specialize in helping businesses implement Kafka and EDA. We have years of experience helping companies just like yours get the most out of their data. Contact us today to learn more about how we can help you achieve industry success with event-driven architecture and Apache Kafka.

About the author

Dorota Owczarek

AI Product Lead & Design Thinking Facilitator

With over ten years of professional experience in designing and developing software, Dorota is quick to recognize the best ways to serve users and stakeholders by shaping strategies and ensuring their execution by working closely with engineering and design teams.
She acts as a Product Leader, covering the ongoing AI agile development processes and operationalizing AI throughout the business.
