Artificial Intelligence in Drug Discovery and Development


Dariusz Jacoszek - June 7, 2021

The pharmaceutical industry is undergoing a dramatic shift with the advent of artificial intelligence (AI) and machine learning. While many are quick to paint these technologies as an imminent threat, they may instead be the solution to chronic drug shortages and other problems in pharmaceutical supply chain management. In fact, AI has already been applied successfully across many facets of drug discovery and development, from helping scientists find new potential medicines to predicting which drugs will fail clinical trials. As more pharma companies adopt these technologies, there is no question that they will significantly shape the future of medicine. What exactly does that mean? Let's explore how AI can improve your company's success rate.

Artificial Intelligence in Drug Discovery

The initial research and development stage of the drug discovery process lasts up to six years, and the subsequent phase of clinical trials takes, on average, more than five years. Of 10,000 candidates initially tested, only about 10 make it to clinical trials. By and large, at the end of this lengthy drug design process, just one medical product out of every ten that enters clinical trials is finally approved by regulators for use in patients.

According to the US Food and Drug Administration (FDA), approximately 33 percent of drugs move from Phase II to III, while around 25 to 30 percent move from Phase III to the next phase (Deloitte, n.d.).

phases of drug design and development

Why Does Discovering New Molecules Take That Long?

Drug development is a costly, time-consuming, and gradual process that begins with identifying a promising molecule and ends with a final new molecular entity. The main aim is to find an active compound that affects the human body as intended, and to prove its quality, purity, and usefulness in treating patients. These criteria ensure that a newly approved medication improves patients' quality of life, not only by treating their condition but also by ensuring that the drug does not become the source of other problems, such as adverse reactions.

Overall, according to the FDA, it takes more than a decade for a new drug to reach patients. This experimental process of finding the right drug target, which begins on the laboratory bench and finishes on the pharmacy counter, costs more than US$20 billion, an enormous investment made mainly by US and EU companies. One-fifth of this money is budgeted for screening assays and testing the toxicity of newly identified drug candidates.

Barriers to Drug Discovery

Besides costs, regulatory barriers have become a concern, contributing to the high failure rate in selecting new drug candidates. Not surprisingly, despite rising development costs, the number of newly approved drugs has steadily declined for many years. This combination of growing discovery costs, in vivo testing, and a complex approval process creates a mounting problem for both the pharmaceutical industry and the patients waiting for new medications to improve their quality of life. It is worth considering how new advances, specifically artificial intelligence in drug discovery along with genomics, proteomics, and the related field of toxicogenomics, can help speed up drug discovery and make it more effective (Kraljevic et al., 2004).

drug discovery and development timeline

Price Tag and Timelines of Developing a New Treatment Will Be Rewritten Thanks to AI

It is estimated that the cost of discovering a new drug exceeds US$2.5 billion. Since money is spent on the nine out of ten candidate treatments that fail somewhere between clinical trial phase I and regulatory acceptance, a large portion of it essentially goes down the drain. Unsurprisingly, few in the industry question the need to do things differently.

According to many big pharma organizations, the answer is within reach when supported by cutting-edge technology. Pfizer leverages machine learning (ML) and deep learning (DL) advances to invent immuno-oncology drugs using the IBM Watson supercomputer. Sanofi has signed an agreement with Exscientia, a British pioneer in AI-driven drug design, to accelerate the search for metabolic disease therapies, and Genentech (acquired by Roche in 2009) is using AI in drug discovery from a similar start-up headquartered in Cambridge, Massachusetts, which builds computational biology models of interactions among genes and proteins in cells to fuel the search for cancer treatments.

If the researchers backing these approaches are right, artificial intelligence and machine learning can usher in an era of drug development that is faster, more cost-effective, and more successful. A few are doubtful, but most researchers believe these methods will only become more relevant, especially when combined with automation and robotics. This transition presents scientists with both challenges and opportunities (Fleming, 2018).

Creating Value From Mountains of Data

Scientific progress is data-driven, but big data poses challenges for data scientists. According to a recent report by the IBM Institute for Business Value, 84% of life science companies benefit from structured and unstructured data. Researchers cannot keep up with the professional articles, patents, and other unstructured data in their fields, and therefore overlook scientific discoveries that could enhance or supplement their own results (IBM Institute for Business Value, 2018). Almost 80% of medical data remains unstructured and untapped after it is created (Kong, 2019).

Most medical advances are not achieved in isolation, so combining knowledge across organizations and research fields is recommended. AI can investigate enormous data sets and has relentless capabilities for coordinating and matching data. Researchers can use AI to support activities such as target identification, drug design, and drug repurposing. AI's role is to augment scientists' skills rather than make their positions redundant: subject matter experts remain critical for defining the data AI analyzes and for providing peer review and end-to-end verification of results.

Traditional drug discovery can only evaluate a limited number of experiments or pieces of evidence at a time, which increases the possibility of bias. AI methods in drug discovery and development can help researchers avoid the implicit bias that arises when only local data is used, and they support new kinds of experiments that cannot be performed with traditional methods (IBM Institute for Business Value, 2018).

most disruptive technologies
Source: GlobalData Pharma Intelligence Center

AI Applications Becoming Critical to Drug Discovery

Drug discovery and development have been sped up thanks to progress in computer technology. Artificial intelligence is now commonly used across many industries and in academia, and machine learning, a key component of AI, has found its way into various areas, including data generation and analysis. Algorithm-based methods such as machine learning draw on a substantial body of mathematical and computational theory.

Numerous promising innovations followed: deep learning helped with the development of self-driving cars and expedited methods for recognizing and translating spoken language into text, alongside support vector machines, supervised learning models with associated algorithms that analyze data for classification and regression (Support-Vector Machine, n.d.).

These advances have strengthened the fight against disease, but the high cost of developing drug candidates still strains medical care. The costs of drug discovery and development vary widely by candidate but have risen steadily and dramatically. Early drug discovery comprises target identification and characterization, lead discovery, and lead optimization.

Many computer-oriented strategies and other methods have been used to identify and optimize lead compounds, including molecular docking, pharmacophore modeling, random forests, and comparative molecular field analysis. Machine learning and deep learning have become appealing techniques for drug discovery. Their algorithms are not tied to a particular step; they can be used at any stage of this lengthy process.

Machine learning is becoming an essential technology for drug discovery and development. AI platform architectures can process vast amounts of data, providing researchers with valuable insights throughout discovery and development. In the following paragraphs, we discuss several artificial neural network architectures that have been used for ML tasks such as classification and regression in drug development. Understanding the technicalities of each approach is important for making the right decision in ML projects.

the iterative process of artificial intelligence in drug discovery
Machine learning and deep learning algorithms may participate in each iterative phase, e.g., by mining proteomic data in target discovery, identifying candidate small molecules in lead discovery, developing quantitative structure-activity relationship models to optimize a lead structure for improved bioactivity, and analyzing massive assay results.

Deep Learning (DL) Methods

In almost every scientific and technical field, deep learning algorithms are seen as being at the forefront of research and development. DL algorithms enable computational models to learn representations of multidimensional data through successive levels of abstraction. DL solves many problems that challenge standard ML algorithms, including image and speech recognition. In drug discovery, DL has become an exemplary method for predicting drug activity and targets and for discovering candidate molecules for new drugs. The foundation of DL is the use of neural networks, systems that can identify, translate, and generate complex data (Patel et al., n.d.).

Artificial Neural Networks (ANNs)

Deep learning is a kind of machine learning that analyzes data representations using artificial neural networks with several layers of nonlinear processing units. A contemporary ANN's fundamental structure loosely resembles the human brain. A basic ANN has three layers: the input layer, the hidden layer, and the output layer. The nodes, also known as neurons, in adjacent layers are either fully or partly connected, depending on the ANN. Input nodes convey the input variables, which hidden nodes transform before they are measured at the output nodes. An ANN is trained by iteratively modifying the network's weight values to minimize the error between predicted and actual values, usually using generalizations of backpropagation algorithms.
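
To make the structure concrete, here is a minimal sketch, in plain NumPy, of a three-layer ANN trained with backpropagation on the toy XOR problem; the layer sizes, learning rate, and dataset are illustrative, not taken from any drug-discovery model.

```python
import numpy as np

# Toy XOR dataset stands in for real assay data; all sizes are illustrative.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # input variables
y = np.array([[0], [1], [1], [0]], dtype=float)              # target outputs

W1 = rng.normal(0.0, 1.0, (2, 8))  # input layer -> hidden layer weights
b1 = np.zeros(8)
W2 = rng.normal(0.0, 1.0, (8, 1))  # hidden layer -> output layer weights
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for _ in range(5000):
    # Forward pass: input nodes convey variables, hidden nodes transform them
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass: propagate the error between predicted and actual values
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Iteratively modify the weight values to reduce the error
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)
```

After training, `out` should be close to the XOR targets; a real drug-activity model follows the same loop, only with molecular descriptors as inputs and far more units per layer.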

Deep Neural Networks (DNN)

The first neural network (NN) architecture is the fully connected deep neural network (DNN), which has several hidden layers with hundreds of nonlinear processing units in each. DNNs can process a large number of input features, and neurons in the different layers of a DNN naturally extract features at successive hierarchical levels.

the general scheme of a deep neural network (DNN)

Convolutional Neural Networks (CNN)

The convolutional neural network (CNN) is another well-known architecture, commonly used for visual recognition. It typically has several convolution and subsampling layers. A convolution layer comprises a set of filters (or kernels) with a limited receptive field and learnable parameters. Each filter is convolved across the width and height of the input volume, computing the dot product between the filter and its receptive field in the input and producing a two-dimensional activation map for that filter.

The subsampling layer is used to reduce the size of the activation maps. The resulting activation maps then feed into fully connected layers, as in a conventional ANN, to produce an output value. Because each filter's parameters are shared across the input, a CNN greatly reduces the number of free parameters to be learned, lowering memory use and speeding up learning. CNNs have outperformed other kinds of machine learning algorithms in image recognition.
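
As a sketch of what the convolution and subsampling layers compute, the NumPy snippet below hand-rolls a single "valid" convolution and a 2x2 max-pool over a toy image; the edge-detecting kernel and the 6x6 input are invented for illustration.

```python
import numpy as np

def conv2d(image, kernel):
    """Slide the kernel over the image and take dot products (valid padding)."""
    kh, kw = kernel.shape
    h = image.shape[0] - kh + 1
    w = image.shape[1] - kw + 1
    out = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            # Dot product between the filter and its receptive field
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool(fmap, size=2):
    """Subsampling layer: keep the maximum in each size x size window."""
    h, w = fmap.shape[0] // size, fmap.shape[1] // size
    return fmap[:h * size, :w * size].reshape(h, size, w, size).max(axis=(1, 3))

image = np.zeros((6, 6))
image[:, 3:] = 1.0                          # a dark-to-bright vertical edge
kernel = np.array([[-1., 1.], [-1., 1.]])   # responds to that edge

fmap = conv2d(image, kernel)   # 5x5 activation map
pooled = max_pool(fmap)        # 2x2 pooled map
```

The filter fires only along the edge column, and pooling keeps that strongest response while shrinking the map, which is exactly the parameter-sharing and size-reduction behavior described above.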

convolutional neural network schema

Recurrent Neural Networks (RNN)

Another variant of the ANN is the recurrent neural network (RNN). In contrast to a feed-forward NN, connections between neurons in the same hidden layer can form a directed cycle. An RNN can accept serial data as input, which makes it well suited to sequence-dependent tasks such as language modeling. Using a technique called long short-term memory (LSTM), an RNN can mitigate the vanishing gradient problem.
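
The recurrence is easiest to see in code. Below is a minimal vanilla RNN cell unrolled over a short random sequence; the dimensions and weights are illustrative (production language models would use LSTM or GRU cells for the reasons above).

```python
import numpy as np

rng = np.random.default_rng(1)

input_dim, hidden_dim = 3, 4
W_xh = rng.normal(0.0, 0.5, (input_dim, hidden_dim))   # input -> hidden
W_hh = rng.normal(0.0, 0.5, (hidden_dim, hidden_dim))  # hidden -> hidden (the cycle)
b_h = np.zeros(hidden_dim)

sequence = rng.normal(0.0, 1.0, (5, input_dim))  # 5 time steps of serial input
h = np.zeros(hidden_dim)                         # initial hidden state

states = []
for x_t in sequence:
    # The new state depends on the current input AND the previous state,
    # which is what lets the network carry context across time steps.
    h = np.tanh(x_t @ W_xh + h @ W_hh + b_h)
    states.append(h)

states = np.array(states)  # shape (5, 4): one hidden state per time step
```

Because `h` feeds back into itself through `W_hh`, the state at step t summarizes the whole prefix of the sequence, not just the current input.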

recurrent neural network schema

Autoencoder (AE)

The fourth ANN architecture is the autoencoder (AE), an NN used for unsupervised learning. It has an encoder, an NN that transforms data from the input layer into a limited number of hidden units, followed by a decoder NN whose output layer has the same number of nodes as the input layer. Instead of predicting labels for the input cases, the decoder aims to reconstruct its own inputs from the smaller number of hidden units. The aim of an AE is typically nonlinear dimensionality reduction, and the AE principle has recently gained popularity for learning generative models from data. The graphic below shows how these DL technologies are being used in drug discovery studies (Chen, 2018).
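
Here is a minimal sketch of the encoder-decoder idea, using a linear autoencoder for simplicity (real AEs insert nonlinearities); the data is synthetic and deliberately lies on a 2-D subspace so that a 2-unit bottleneck can reconstruct it.

```python
import numpy as np

# Synthetic 8-D data that actually lies on a 2-D subspace. All sizes
# here are illustrative, not taken from any real drug-discovery model.
rng = np.random.default_rng(0)
basis = rng.normal(0.0, 1.0, (2, 8))
X = rng.normal(0.0, 1.0, (100, 2)) @ basis

W_enc = rng.normal(0.0, 0.3, (8, 2))  # encoder: 8 inputs -> 2 hidden units
W_dec = rng.normal(0.0, 0.3, (2, 8))  # decoder: 2 hidden units -> 8 outputs

mse_init = float(np.mean((X @ W_enc @ W_dec - X) ** 2))

lr = 0.01
for _ in range(3000):
    code = X @ W_enc       # compressed representation (the hidden units)
    X_hat = code @ W_dec   # attempted reconstruction of the inputs
    err = X_hat - X
    # Gradient steps on the mean squared reconstruction error
    W_dec -= lr * code.T @ err / len(X)
    W_enc -= lr * X.T @ (err @ W_dec.T) / len(X)

mse = float(np.mean((X @ W_enc @ W_dec - X) ** 2))
```

Reconstruction error falls as training proceeds; the 2-unit `code` is the learned low-dimensional representation that generative extensions of the AE build on.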

autoencoder schema

AI Drug-Hunting Captures Big Pharma Interest

The number of AI companies focused on discovering new drugs, using innovative deep learning approaches to drug discovery and preclinical testing, has increased rapidly. According to the Landscape of AI for Drug Discovery and Advanced R&D Q2 2019, issued by Deep Knowledge Analytics in July 2019 (Deep Knowledge Analytics Pharma Division, n.d.), the AI R&D market comprises over 170 AI companies, 50 corporations, 400 investors, and 35 major R&D centers across different geographical regions. This booming area of new opportunities in drug discovery will grow year by year; its worth already surpasses US$700 million, and many researchers predict that the market capitalization of the AI R&D market will reach US$20 billion within the next five years.

Different Models to Leverage AI Capabilities

The most innovative biopharma organizations are currently exploring different models for harnessing AI-powered drug discovery technologies, striking deals to gain full access to disruptive innovation. Large players are not just acquiring start-ups for their intellectual property. They also establish dedicated internal teams by recruiting AI researchers and data scientists, and form partnerships with Silicon Valley tech companies to bring AI-based drug discovery methods into drug design practice. Although it is too early to determine which approach will be the most effective, a consistent pattern is the introduction of multiple AI solutions at different levels.

Another trending strategy for turning AI into successful drug discovery is to partner with contract research organizations (CROs). Highly adaptable and specialized, CROs help big pharma companies move fast in the right direction, though at a cost that has to be factored into ever-growing budgets for new drug discovery (Deloitte, 2019).

AI deals in pharma companies

AI in Drug Discovery - How to Approach It?

New drug discovery methods are needed to improve the recognition rate of new drug candidates while reducing the costs associated with early key discovery.

Given recent advances in machine learning, the field is now ripe for applying algorithmic solutions to molecular property prediction in order to identify novel structural classes of drug candidates. Indeed, methodologies that allow early drug discovery to be performed largely in silico enable the exploration of vast chemical spaces beyond the reach of current experimental approaches (Stokes, 2020).

Intelligence in Drug Discovery by Using Different AI Models

AI models are far more capable than humans of finding correlations in large amounts of data from various sources. Proponents of AI-driven drug discovery claim that such methods can find targets, explore and optimize new drugs, and even design drug candidates from the ground up. AI can also increase the chances of effective preclinical testing by determining the best animal model for a particular disease. According to a 2019 Deloitte survey, 40% of drug discovery start-ups use AI to screen chemical repositories for potential drug candidates, 28% use it to find new drug targets, and 17% use it for computer-assisted molecular design (Burki, 2020).

Exscientia Example

Exscientia is a global AI-driven drug discovery company that pairs original AI with practical knowledge of how to research new drug targets. In 2020, Sumitomo Dainippon Pharma and Exscientia announced that they had jointly developed a new drug candidate, DSP-1181, selected with the use of artificial intelligence, and a phase 1 clinical study was started in Japan. They reported that strong collaborative synergy and common ground in research allowed the exploratory research stage to be completed in less than 12 months, whereas the average industry lead time is 4.5 years.

This new drug candidate, created with artificial intelligence combined with deep expertise in the chemistry and pharmacology of monoamine GPCR drug discovery, may become the first AI-discovered new medical entity submitted to treat obsessive-compulsive disorder.

Applied Intelligence in Drug Discovery

Exscientia was also the first organization to announce an AI-designed molecule for immuno-oncology entering human clinical trials, in 2021. The company is now responsible for two AI-developed drugs in human clinical trials. The drug in question is highly selective for its target receptors, which promises to reduce systemic side effects and minimize brain exposure, avoiding potential psychological side effects. This can bring great benefit to cancer patients awaiting such immuno-oncology discoveries as promising therapies for their unmet needs (Exscientia, 2021).

BenevolentAI Case Study

Profile of the Company

BenevolentAI is another innovative pharma company that applies AI algorithms, including machine learning, to large amounts of basic research data from public and private sources to improve target prediction, deepen the understanding of disease mechanisms, and identify new drug targets.

Like Exscientia, BenevolentAI is a UK-based company, established in 2013. It develops and implements artificial intelligence (AI) technology to improve how drugs are designed, from discovery and development all the way to bringing new drugs to market. The company proudly states that building the right team is an art: it brings together the unique skills of over 200 researchers, including biologists, chemists, engineers, informaticians, and data scientists, who pioneer precision medicine and drug discovery. BenevolentAI is headquartered in London with an additional office in New York, while its research laboratory is located in Cambridge, UK, one of the world's biggest biotech hubs (Deloitte, 2019).

AI Technology and Platforms

BenevolentAI actively develops drugs for diseases such as amyotrophic lateral sclerosis (ALS), Parkinson's disease, ulcerative colitis, and sarcopenia. Thanks to its AI drug discovery approach and capabilities spanning early research to late-stage development, it has established alliances with various big biopharma firms. The company's edge lies in the Benevolent Platform®, a cutting-edge computational and experimental AI-powered platform that enables its researchers to explore and find desired properties in new molecules and to personalize medicines for patients. The Benevolent Platform®'s three focal points are target identification, molecular design, and precision medicine (Deloitte, 2019).

Benevolent Platform for AI-driven drug design

Impactful Cooperations

In April 2019, BenevolentAI announced a long-term agreement with AstraZeneca to create novel therapies for chronic kidney disease (CKD) and idiopathic pulmonary fibrosis (IPF). Data scientists and other researchers from both companies will collaborate to integrate AstraZeneca's genomics, chemistry, and clinical data with BenevolentAI's target-identification database and its network of contextualized analytical data and relationships. Machine learning analyzes the data in a structured manner to discover relationships between data sets, and AI-based inference is used to extrapolate previously unknown relationships. The firms will analyze the findings together to better explain the fundamental causes of these complex diseases and to define new possible drug candidates more rapidly.

BenevolentAI concluded a Framework Collaboration Agreement with Novartis Pharma AG in September 2019. In an initial oncology project with Novartis, AI and ML technologies will be used to stratify patients and develop a greater understanding of patient and condition heterogeneity in order to tailor drugs more accurately. Under the terms of the arrangement, BenevolentAI will use its technologies to make more data-driven decisions, identify new approaches to curing illness, and personalize medications for patients. The Benevolent Platform® consumes genetic, clinical, and pharmacological data, as well as the scientific literature, to extract qualitative relationships between genes, diseases, medications, and biological pathways, identifying novel or optimal drug targets. The platform then assists scientists in designing and optimizing the best drug molecule for a specific patient population.


The corporation intends to leverage the power of AI to put patients first and tangibly change their lives, developing methods that reduce drug research and production costs, decrease rejection rates, and accelerate the delivery of drugs to patients. BenevolentAI has published work in prestigious scientific journals and presented at world-renowned conferences.

Despite the disruption caused by the global pandemic and the associated lockdowns, BenevolentAI achieved significant results in 2020. Its COVID-19 study, which began in January 2020, identified baricitinib as a possible therapy over the course of a single weekend. Just nine months later, baricitinib was approved by the FDA as one of the few coronavirus drugs for emergency use.

E. coli Example

A paper published in Cell in February 2020 described how a deep learning algorithm was used to search for a possible new antibiotic. There is increasing urgency to find new antibiotics due to the proliferation of antibiotic-resistant bacteria. To address this, a deep neural network was trained to predict antibacterial molecules. The scientists wanted to build a training dataset from scratch that was affordable, chemically diverse, and did not require sophisticated laboratory tools. This would allow a rigorous model for predicting new antibiotics without the practical challenges of large-scale antibiotic screening efforts.

The researchers trained a neural network on a dataset of 2,335 compounds to find molecules that inhibit the growth of Escherichia coli. They then applied the model to a library of 6,111 molecules under study for human diseases, intending to find compounds effective against E. coli. Based on their structures, the model predicted antibacterial activity in 51 compounds. One compound in particular seemed encouraging: the c-Jun N-terminal kinase inhibitor SU3327 was bactericidal against E. coli, and it was also effective against Clostridium difficile and pan-resistant Acinetobacter baumannii infections in mouse models (Stokes, 2020).
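
The study itself used a directed message-passing neural network over molecular graphs; as a much-simplified stand-in, the sketch below trains a logistic-regression "activity" model on synthetic fingerprint bit vectors and then ranks a screening library, mimicking the train-then-screen pattern. All compounds, labels, and the hidden activity rule here are randomly generated, not real assay data; only the dataset sizes echo the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

n_bits = 64
true_w = rng.normal(0.0, 1.0, n_bits)  # hidden "structure-activity" rule

def make_set(n):
    fps = rng.integers(0, 2, (n, n_bits)).astype(float)  # fingerprint bits
    labels = (fps @ true_w > 0).astype(float)            # 1 = growth inhibition
    return fps, labels

X_train, y_train = make_set(2335)   # training screen (cf. 2,335 compounds)
X_lib, y_lib = make_set(6111)       # library to prioritize (cf. 6,111 molecules)

# Train a logistic-regression activity model by gradient descent
w = np.zeros(n_bits)
for _ in range(300):
    p = 1.0 / (1.0 + np.exp(-(X_train @ w)))
    w -= 0.01 * X_train.T @ (p - y_train) / len(X_train)

# Rank the library by predicted activity and take the top predictions
scores = 1.0 / (1.0 + np.exp(-(X_lib @ w)))
top = np.argsort(scores)[::-1][:51]   # top 51, cf. the 51 predicted hits
hit_rate = float(y_lib[top].mean())   # fraction of true actives among them
```

The point of the pattern is that the ranked shortlist is far more enriched in actives than the library's base rate, which is what makes model-guided screening cheaper than brute-force assays.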

AI Use Cases and Data Sources

AI platforms are particularly effective in the following areas:

  • Predictions of emerging biological relationships, such as drug-target, gene-disease, drug-adverse event, gene-pathway predictions for use cases such as target discovery, indication extension of a drug, or toxicity predictions. Read more on predictive analytics in pharma here.
  • Optimizing clinical trial design: patient selection and stratification, biomarker assessments, trial length, and recruiting. You can read more on AI in clinical trials in our recent article.
  • Increasing confidence in current expertise and evidence, such as assessing the certainty of biomedical knowledge.
  • Personalized care includes patient stratification as well as planning and providing the appropriate medication to the appropriate patient.

AI platforms will also assist researchers in integrating and creating value from a variety of data sources:

  • Unstructured evidence or knowledge bases, such as biomedical literature, textbooks, and patents
  • Structured databases of drugs, substances, diseases, and genomic, metabolomic, proteomic, and biomarker details, with well-defined classifications based on characteristics or biological principles
  • Clinical records derived from unstructured and standardized data channels, such as prescription (Rx) claims, electronic health record data, clinical trial data, and toxicology reports
  • Imaging includes radiologic images, computed tomography (CT), magnetic resonance imaging, positron emission tomography (PET), PET-CT, ultrasound, x-rays, and histologic images of cells and tissues.
  • Data obtained from computing systems found in commonplace artifacts, such as the Internet of Things.
prominent applications of AI in the pharmaceutical industry
Artificial intelligence (AI) applications in various subfields of the pharmaceutical industry, ranging from drug development to pharmaceutical product management.

The Future of Drug Discovery

The vast chemical space, which contains more than 10^60 molecules, encourages the synthesis of a significant number of drug molecules. The shortage of modern technology, on the other hand, inhibits the drug discovery process, making it a time-consuming and costly challenge that AI can help solve. AI can identify hit and lead molecules, allowing for faster confirmation of the drug target and faster drug optimization.

Identifying a novel drug molecule requires integrating it into an appropriate dosage form with the desired delivery characteristics. Here, AI can take the place of the traditional trial-and-error process. With the aid of quantitative structure-property relationships, computational tools can solve problems encountered in formulation design, such as stability, dissolution, and porosity. Decision-support tools use rule-based methods to select the form, composition, and quantity of excipients based on the physicochemical properties of the compound, with a feedback mechanism to track and adjust the whole procedure (Paul et al., 2020).
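
A hypothetical sketch of such a rule-based decision-support step is shown below; the property names, thresholds, and excipient choices are invented for illustration and do not come from any real formulation system.

```python
# Toy rule-based formulation advisor: map simple physicochemical
# properties of a compound to candidate excipient strategies.
# All thresholds and suggestions below are hypothetical.
def suggest_formulation(logp, aqueous_solubility_mg_ml, melting_point_c):
    suggestions = []
    if aqueous_solubility_mg_ml < 0.1:
        # Poorly soluble compounds: add a solubilizer
        suggestions.append("surfactant / solubilizer")
    if logp > 3:
        # Lipophilic compounds may suit a lipid-based carrier
        suggestions.append("lipid-based carrier")
    if melting_point_c > 200:
        # High-melting crystalline compounds: consider an amorphous dispersion
        suggestions.append("amorphous solid dispersion polymer")
    if not suggestions:
        suggestions.append("conventional immediate-release tablet")
    return suggestions

print(suggest_formulation(logp=4.2, aqueous_solubility_mg_ml=0.01,
                          melting_point_c=230))
```

In a real decision-support tool, the feedback mechanism mentioned above would update such rules (or a learned model replacing them) as dissolution and stability test results come in.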

Concept of ‘4P’ Medicine

So far, AI drug discovery firms have mostly concentrated on analyzing massive databases, because deep learning requires large amounts of data. In the future, large datasets will be combined with observations from small datasets gathered in real time from patients and people within the target demographic, a combination that could disrupt the field. To drive the design of new precision drugs, these small datasets will be paired with algorithms trained on large datasets.

As the number of compounds discovered using AI grows, novel medications capable of treating particular pathologies will become available, with personalized medicines developed from scratch in a matter of weeks. These therapies can be highly targeted, tied to individual genetic backgrounds, and can reduce risks such as side effects. This transition would usher in a new era for the healthcare sector, as a greater understanding of disease causes expands the number of drugs available and, in many cases, cures diseases that previously had no successful treatment. It also implies that intellectual property can be used to safeguard not only chemical formulas but also goods and services as part of the approval process. Although individual drug prices may increase as treatments become more effective for narrower patient groups, aggregate drug spending may decline as the number of patients served by each drug decreases. Improved adherence and earlier intervention can also increase the cost-effectiveness of these treatments (Deloitte, 2019).


The Future of Healthcare With AI-Driven Drug Discovery

Great breakthroughs arise when existing dots are connected. Penicillin, for example, got its start in 1928, when Alexander Fleming returned from vacation to find that a mold had contaminated some of his bacteria cultures, preventing the bacteria's normal growth. It wasn't until a decade later, when Howard Florey came across Fleming's article on the Penicillium mold in a medical journal, that the discovery gained traction. With his colleague Ernst Chain, Florey developed penicillin further, and the first human patient was successfully treated with the drug in 1942. A sequence of connected dots resulted in one of the most significant medical breakthroughs of our day.

That is similar to how artificial intelligence (AI) and machine learning (ML) are used to link "data dots" and yield new insights. To discover interactions between various information realms, however, these systems must access massive, complex data sets. This can be difficult in sectors such as health and pharma, where data is gathered and processed in many locations and is considered highly sensitive. Participants in digital health networks, such as providers, insurers, states, academics, and others, can exchange patient information freely and compliantly by using an integrated distributed data framework (Waters, 2020).

Areas of Healthcare Where AI Is Visibly Saving Lives

Technological advancements have always been critical to bettering our lives, including our well-being. Inventions such as the X-ray machine, the microscope, and the ultrasound machine have made important contributions to disease diagnosis, treatment, and prevention over the past century. The introduction of artificial intelligence has become one of humanity’s most notable technological leaps of this century. From intelligent assistants to self-driving cars, AI is transforming our everyday lives, both professionally and personally. Fortunately, there is a wide range of applications for artificial intelligence in healthcare, where it can contribute to our well-being by intelligently assisting patients, physicians, healthcare professionals, hospital administrators, drug manufacturers, and scientists (Glauner, 2021).

The uses of AI in healthcare are numerous, and the number of scientific journal articles, major corporate research projects, startups, and blog posts on the topic has exploded in recent years.

100 Most Cited Articles in Medical AI

A recent bibliographic review of the 100 most cited articles on the use of AI in the practice of medicine between 1950 and 2019 reveals that medical informatics, defined as the management and utilization of patient healthcare records in technologies such as precision medicine and diagnostics, has gained the most recognition from researchers in recent years. Radiology, oncology, and non-radiological diagnostic image processing follow on the list. Interestingly, cardiovascular medicine is underrepresented in AI research despite its high disease burden. The authors also found that clinical trials account for just 11 of the top 100 most cited articles (Sreedharan, 2020). In other words, 89 of the top 100 research articles have not been validated through clinical trials, showing how much room remains for AI to be translated into clinical practice. At the same time, many firms offering appealing services in the area of medical artificial intelligence are attracting substantial interest from venture capitalists. This trend can be observed in the visualization below, prepared by the market analysis platform CB Insights, which shows that equity funding in healthcare AI has increased by a factor of eight since 2015.

Healthcare AI equity funding trends
Source: CB Insights


When it comes to battling an illness, a proper diagnosis is always the essential first step. It is not surprising, then, that one of the earliest uses of AI in healthcare, dating back to the 1970s, was disease diagnosis. Stanford University created a rule-based expert system (MYCIN) that successfully detected blood-borne bacterial infections. However, it was never used in clinical settings because it did not outperform human physicians and integrated poorly with other healthcare information systems.

Today, there are many AI approaches to diagnostics, most of them relying on machine learning techniques rather than expert systems. For one thing, keeping up with the ever-growing body of medical science and evidence is an impossible challenge for any individual. For another, machine learning methods have taken a giant leap forward thanks to recent breakthroughs in deep learning. They can now be applied effectively in healthcare, with image processing in radiology being the most prominent use case (Glauner, 2021).

Individualized Healthcare/ Personalized Medicine

Healthcare, as we know it in our developed society, is a structure that has remained largely unchanged for decades. You get ill. You go to see a physician, who makes a diagnosis and prescribes a drug proven to treat your condition; you go to the pharmacy for what was prescribed, and you eventually recover.

What if the prescribed medicine were personalized not only to your symptoms but also to your unique organism, physiology, and medical history? What if you didn’t need to see a physician at all, because you could get a professional, individualized treatment recommendation from the comfort of your own home? And wouldn’t it be even better if you didn’t get sick in the first place, because you received exactly the diet and exercise recommendations your body needs to stay healthy, based on its unique characteristics? All of this is made possible by personalized healthcare driven by AI.

Precision Medicine

Many individuals use nutrition recommendations and symptom checker software for self-service wellbeing.

Precision medicine refers to healthcare providers’ attempts to predict the best medical option for each particular patient. It is based on the same concept and employs similar data, such as nutritional and lifestyle statistics. However, it typically incorporates further data gathered during health check-ups and may also include DNA analysis. In therapeutic applications, professional expertise from physicians will optimally supplement AI-based guidance. Precision medicine also extends to clinical projects aimed at tailoring therapies to specific patient subgroups.

personalized medicine

Precision medicine may also be a paradigm change for cardiovascular diseases (CVDs), a diverse and heterogeneous group of conditions. CVDs are strongly affected by both external and internal patient factors, such as a person’s lifestyle and diet, as well as their genes. AI allows cardiovascular experts to extract trends from volumes of available data that the human brain will never be able to process.

However, this is also where the risk and drawback of AI lies, in medicine and beyond. If the underlying use case or medical challenge is misunderstood and not reflected in the data, then even the strongest AI method cannot solve the assigned challenge correctly. To use AI to help patients, physicians and data scientists must collaborate and discuss both the diagnostic and the mathematical assumptions.

Geographical Spread of AI Drug Discovery Sector

The AI-driven drug development environment is largely focused on the United States. This apparent regional disparity is partially attributed to the fact that many businesses, such as Chinese startups, are currently incorporated in the United States (BioPharma Trend, 2021).

Source: BioPharma Trend

In 2019, North America led the industry, accounting for the highest sales share of almost 60%. One of the main reasons for this strong market share is the widespread adoption of AI systems in the United States, aided by the presence of many companies in the region. According to the 2020 RELX Emerging Tech Executive Report, AI is now leading industry transformation. In 2018, less than half of corporate executives (48%) said their firms used these innovations; by 2020 this figure had risen to 81%, up from 72% in 2019. This is mostly attributed to companies’ increasingly favorable view of AI (Grand View Research, 2020).


The Asia Pacific region is projected to be the market’s fastest-growing area. Demand for AI platforms for drug discovery there is expected to be driven by the increasing adoption of emerging technologies in India and China for new drug development, as well as an emphasis on boosting pharmaceutical capacities within these countries. By the end of the forecast period, demand in this area is projected to grow at a Compound Annual Growth Rate (CAGR) of nearly 32%.
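To make the growth figure concrete, compound annual growth can be projected with a one-line calculation. The starting market value and time horizon below are purely hypothetical illustrations; only the roughly 32% CAGR comes from the forecast above:

```python
def project_market_size(current_value: float, cagr: float, years: int) -> float:
    """Project a future market size by compounding annual growth."""
    return current_value * (1 + cagr) ** years

# Hypothetical example: a $500M regional market growing at a 32% CAGR
# roughly quadruples over five years, to about $2.0B.
future = project_market_size(500.0, 0.32, 5)  # ~2003.7 (in $M)
```

The same formula, run in reverse, is how analysts back out a CAGR from two market-size estimates taken a few years apart.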

Top-100 AI Experts in Drug Discovery Distribution by Countries

The map below depicts the global distribution of the top AI pioneers in pharma and healthcare. The United States and the United Kingdom continue to house the greatest number of top researchers. It should be remembered, however, that China has the capacity to significantly alter these numbers in the coming years due to the reverse migration of top AI experts from the United States.

AI experts in drug discovery: distribution by country
Source: Deep Knowledge Analytics

Data-Driven Revolution in Drug Discovery

Innovative organizations can improve the odds of survival for their clinical candidates by making use of growing data archives. Efficient data storage and analysis can speed up drug development processes. Established data processing strategies, however, are struggling to keep pace with the exponentially growing volume of scientific information being produced.

Pipelines at pharmaceutical and biotechnology companies are buckling, leaving many firms unable to handle the mounting demand on existing systems. Specialist systems built to grow and advance alongside drug discovery and testing data outputs are urgently required to solve these crucial challenges and bring the industry in line with market demands (Chilukuri et al., 2017).

Big data provides a new perspective on complex problems and is changing the way clinical data management processes are organized. For more information on the subject, head over to our article on specialist systems for pharmaceutical data management here.

Embracing a Data-Based Transformation

Specialist expertise, which was once only available to universities and other research institutions, is now being made accessible to a new wave of start-up firms. Companies that use this academic expertise are poised to make breakthroughs not only in pipelines and systems but also in the fundamental way that organizations manage, shape, and direct data flow across the whole enterprise. These developments are so radical and rapid that businesses will need to develop an entirely different perspective on the data they generate and how it is used in scientific and commercial contexts.

R&D-focused organizations in the pharmaceutical and biotechnology sectors are emphasizing the importance of being more familiar with big data technologies. Large enterprises, such as Novartis, are rebranding themselves as “medicines and data science” firms. This has increased the visibility and significance of bioinformaticians within organizations that have historically provided support roles for biologist colleagues on data collection, statistics, and pipelining of tools.

The revolution is focused on the deliberate construction of a deeper relationship between the researchers who make scientific decisions and the people who produce their technical solutions. Bridging both areas promises to bring transformation and cutting-edge creativity to the industry in the coming years (Chilukuri et al., 2017).

Machine Learning: A Way to Make Informed Decisions

Addressing universal data processing problems and prevalent industry challenges has resulted in efficient automation and the streamlining of various analytical processes. New machine learning technologies allow the incorporation of datasets into a fully data-driven decision-making process. These datasets may come from various workstreams, such as mass spectrometry, next-generation sequencing (NGS), high-throughput imaging, immunoassays, and biophysical assays.

Early Detection of Patterns Through ML

Decision-making becomes easier and of better quality in the right setting. Researchers now have a better understanding of the scientific information available to them without manually combining disparate data sets. The occurrence of adverse drug reactions or side effects, for example, is a major reason why clinical trials stall during candidate screening. Integrating complex data assets with a range of statistical methods raises the probability of discovering these side effects early in development, before a potential medication enters the clinical process.

Platforms with a powerful machine learning component can detect patterns and trends that are not apparent to the naked eye. By emphasizing these trends, scientists can concentrate on the critical details while filtering out disruptive “noise”, or meaningless detail. Since fewer experiments are needed, time and effort are saved.

There may also be instances where data from previously unsuccessful candidates can inform decisions on the development of new candidates with similar data profiles. As a result, certain candidates will be able to ‘fail early,’ as development programs can be canceled at an earlier stage to prevent further expense and study. Since costs rise sharply as candidates progress into clinical trials, this fail-fast paradigm is economically important for drug developers.
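The fail-fast idea above can be sketched in a few lines. This is a toy illustration only: the compound names, assay features, profile values, and distance threshold below are all invented, and a real system would use learned models over far richer data. The principle, flagging a new candidate whose profile sits close to profiles of previously failed compounds, is the same:

```python
import math

# Hypothetical profiles of compounds that previously failed for adverse
# reactions; each tuple is a made-up assay feature vector.
failed_profiles = {
    "cmpd_A": (0.9, 0.2, 0.8),
    "cmpd_B": (0.7, 0.1, 0.9),
}

def distance(p, q):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def flag_for_early_failure(candidate, threshold=0.3):
    """Flag a candidate whose profile lies near any known failure."""
    return any(distance(candidate, f) < threshold
               for f in failed_profiles.values())

risky = flag_for_early_failure((0.85, 0.15, 0.82))  # close to cmpd_A -> True
safe = flag_for_early_failure((0.1, 0.9, 0.2))      # far from both -> False
```

A flagged candidate would not be dropped automatically; it would simply be prioritized for closer review before further investment.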

Machine Learning for Disease Diagnosis and Therapy

Augmented Reality Microscope for Cancer Diagnosis

Microscopic analysis of patient samples is critical in cancer diagnostics for evaluating cancer staging. However, microscopy depends on an expert’s image interpretation, which can be subjective, so there is a growing demand for objective analysis of microscopy samples. Moreover, the number of such experts is limited.

AI will aid in pathological image processing by improving diagnosis precision, quantification, and performance. One such example is the augmented reality microscope, an optical light microscope that allows for real-time AI integration.

The generation of large amounts of data containing knowledge about human genetics enables machine learning techniques based on either supervised or unsupervised algorithms. Algorithms in supervised learning learn from labeled data, while algorithms in unsupervised learning attempt to grasp relationships from unlabeled data.
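The distinction can be shown with a minimal pure-Python sketch (the numbers are synthetic, not real genetic data): a supervised nearest-neighbour prediction learns from labeled examples, while an unsupervised assignment groups unlabeled points by proximity alone:

```python
# Labeled examples: (feature vector, class label). Values are synthetic.
labeled = [((1.0, 1.0), "benign"), ((1.2, 0.9), "benign"),
           ((5.0, 5.1), "pathogenic"), ((4.8, 5.3), "pathogenic")]
unlabeled = [(1.1, 1.0), (5.1, 5.0), (0.9, 1.2), (4.9, 5.2)]

def dist(p, q):
    return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5

# Supervised: 1-nearest-neighbour prediction from the labeled examples.
def predict(point):
    return min(labeled, key=lambda ex: dist(point, ex[0]))[1]

# Unsupervised: a single k-means-style assignment to two fixed seeds.
def cluster(points, seeds=((1.0, 1.0), (5.0, 5.0))):
    return [min(range(2), key=lambda i: dist(p, seeds[i])) for p in points]

label = predict((5.0, 5.0))   # class learned from the labels
groups = cluster(unlabeled)   # structure found without any labels
```

Note that the clustering recovers the same two groups the labels encode, but without ever seeing them, which is exactly why unsupervised methods are useful when annotation is scarce.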

augmented reality microscope
Hardware components of the Augmented Reality Microscope (ARM) system enable real-time capture of the field of view and display of information in the eyepiece of the microscope.

Cost-Effective and Timely Alternative to Current Experimental Procedures

The use of in silico drug development and modeling frameworks would further enhance machine learning for therapeutic purposes. The DrugBank database, for example, provides quantitative, analytic, and molecular knowledge about drugs and drug targets.

DrugBank Database

DrugBank is divided into four main categories:

  1. Low molecular weight organic compounds authorized by the FDA as small-molecule drugs (over 700 substances),
  2. Biotech (protein/peptide) products authorized by the FDA as drugs (more than 100 entities),
  3. Nutraceuticals or micronutrients such as vitamins and metabolites (more than 60 entries),
  4. Unapproved, delisted, and forbidden medicines, as well as enzyme inhibitors and possible poisons (about 3,200 entries).

Machine learning for drug development can provide a more cost-effective and time-efficient alternative to conventional laboratory procedures. Another perspective is to use machine learning technology to forecast a drug’s therapeutic efficacy and support individualized treatment approaches.

This approach, known as “drug scoring” or “personalized (individual) medicine,” would take into account features that explain the activation of cell signaling and metabolic pathways to distinguish between patients who benefit from treatment and patients who do not.
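A minimal sketch of such a “drug score” might combine pathway-activation features into a single responder/non-responder decision. The feature names, weights, and threshold below are entirely hypothetical; in practice the weights would be learned from patient outcome data rather than hand-set:

```python
# Hypothetical learned weights for pathway-activation features.
weights = {"mapk_activation": 0.6, "pi3k_activation": 0.3,
           "metabolic_load": -0.4}

def drug_score(patient_features):
    """Weighted sum of pathway-activation features; missing features count as 0."""
    return sum(weights[k] * patient_features.get(k, 0.0) for k in weights)

def likely_responder(patient_features, threshold=0.5):
    """Classify a patient as a likely responder if the score clears the threshold."""
    return drug_score(patient_features) >= threshold

responder = likely_responder({"mapk_activation": 0.9, "pi3k_activation": 0.8,
                              "metabolic_load": 0.2})       # score 0.70 -> True
non_responder = likely_responder({"mapk_activation": 0.2, "pi3k_activation": 0.1,
                                  "metabolic_load": 0.9})   # score -0.21 -> False
```

A linear score like this is the simplest possible stratifier; real precision-medicine models layer nonlinear methods and many more biomarkers on the same idea of separating patients who benefit from those who do not.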

Imperfections of Using Neural Networks

Even though machine learning has the potential to transform disease detection and treatment, it still has some drawbacks, including difficulty separating causation from correlation, excluding biased data, and validating predictive analytics. For machine learning to be used safely in disease detection and/or treatment, the data used must be carefully vetted to ensure it suits the particular problem. Data must be compiled and annotated fairly for proper identification and interpretation, and it must be representative of minorities in heterogeneous populations.

Despite its many benefits, machine learning in biomedicine is still in its early stages. It needs a multidisciplinary team to address ethical, legal, moral, and technical problems until it can critically assist in better medical care.

How AI and Big Data Technology Could Make Drug Development Cost-Effective and Quicker

Artificial intelligence and machine learning are already changing the pharmaceutical industry and the whole health sector. Companies that fail to adopt these technologies early risk losing market share, frustrating healthcare providers, and distressing patients waiting for new drugs.

Drug development is rapidly evolving, and therefore pharma companies must possess the most up-to-date and accurate information to make informed decisions. Companies utilizing these strategies can achieve better predictability, decrease time to market, and gain efficiencies. These innovations are not designed to supplant human cognitive processes; rather, they aim to improve the capacity to analyze information and make a significant difference in forming a more secure and streamlined environment for drug development. As the volume and complexity of data generated by clinical trials increases exponentially year by year, and real-world evidence (RWE) data continues to grow, AI/ML/NLP solutions play their role in how clinical trials are changing and how this technology brings new drug discoveries closer to patients.

There is an increasing realization among academic institutions, biopharma companies, CROs, and smaller biotech firms of the potential of AI to transform clinical trials. Specifically, AI has the potential to transform many of the key steps in clinical trials, from protocol design to study execution, thereby improving trial success rates and lessening the biopharma R&D burden (Deloitte, 2020).

Role of Healthcare Professionals Strengthened With AI

More and more medical activities involve AI-powered solutions, which support tasks traditionally carried out by healthcare professionals (HCPs). AI-driven improvements have the potential to aid in completing assignments performed by highly qualified HCPs, including activities that require a high level of expertise or exceed human processing capabilities (for example, analyzing large amounts of data to detect disease). AI solutions will also support people in jobs that require extreme focus and attention, trigger exhaustion, or are very difficult to sustain for longer periods (Working Group on Digital and AI in Health, 2020).

To strengthen health systems and improve outcomes for patients, introducing AI/ML/NLP into today’s healthcare system requires applying AI-based algorithms in every application that augments the capabilities of the individuals involved in strengthening health systems and their continuous improvement. The leading five applications of artificial intelligence in healthcare are:

  • Public health interventions
  • Pre-clinical research and clinical trials
  • Clinical care workflows
  • Patient-facing solutions
  • Optimization of health back-end processes

Faster Clinical Trials With AI

To combat today’s growing clinical research challenges, we need to systematically integrate AI-enabled tools into how drug development is delivered and expand access for all patients in need. Many reports state that the average cost of bringing a new medicine to market increased from $1.2 billion to almost $2.0 billion within the last decade. In contrast, the average sales per asset fell from $820 million to a low of $380 million in the same period of time (Working Group on Digital and AI in Health, 2020).

The Current State of Clinical Trials

Today’s clinical trials require close collaboration between various professions across the healthcare system, including representatives from business, universities, government, noncommercial and patient groups, clinicians, patients, and investigators. Each of them uses a set of tools to support their components of clinical trials, and together these stakeholders constitute the infrastructure that currently supports the conduct of clinical trials across the globe. Some resources, like time, money, and medical personnel, are hard to share. On the other hand, supportive IT systems and research facility infrastructure have to be built in advance to encourage sponsors and researchers to make investment decisions. Many observers lament that most clinical trials are conducted in a “unique” way: energy and money are spent on gathering different resources each time a new scientific challenge arises. However, many experts suggest that efficiency can be achieved by adjusting clinical trial methods with the use of technology, including AI, ML, and NLP, so that those tackling new research questions can quickly reuse existing data and processing methods without having to reinvent the workings of each trial (Institute of Medicine (US) Forum on Drug Discovery, Development, and Translation, 2010).

Modern Challenges in Clinical Trials

Identifying appropriate patients would increase the pace and efficiency of clinical trials, resulting in faster approval of and access to new drugs. However, statistics suggest that only a limited percentage of qualifying patients enroll in clinical trials; only two to nine percent of adult cancer patients do so.

According to a 2019 meta-analysis of 13 studies covering almost 9,000 participants, only 8.1% of prospective patients enrolled in clinical trials; of those who did not, 55.6% did not have a study available where they were being treated, 21.5% were considered ineligible, and 14.8% chose not to take part in an available trial.

The success rate of clinical trials is heavily dependent on disease type. A systematic analysis that examined 186,000 unique trials between 2000 and 2015 found that the average chance of success was 13.8%. Oncology trials, on the other hand, had a much lower probability of success (POS) of 3.4%. This low POS is troublesome since many biopharma firms increasingly focus on oncology as their preferred therapeutic area.

Site Identification, Recruitment, and Enrolment Challenges

With an increase in the number and complexity of clinical trials, especially in oncology, there is also an increase in competition for suitable trial participants and locations.

Finding the right trial and the right patient is a time-consuming and difficult task for both the clinical research staff and the patient.

Key components undermining the effectiveness of clinical trials are patient and site selection along with patient recruitment and retention processes, as well as the lack of suitable infrastructure to manage the complexity of running a clinical trial.

This is particularly troublesome in the later stages, where effective and accurate adherence monitoring, endpoint tracking, and patient monitoring systems are required. AI, specifically deep learning (DL), machine learning (ML), and natural language processing (NLP), when paired with an efficient automated platform, has the ability to increase approval rates, lower development costs, and bring medicinal drugs to patients more quickly.

All major biopharma firms are investing in AI and its applications. Novartis, for example, used AI to integrate clinical trial data from several internal sources to forecast and track trial cost, enrollment, and safety. As a result, the company announced that patient enrollment times in pilot trials were reduced by 10-15% (CB Insights, 2021).

Benefits of AI in clinical trials

Adaptive Clinical Trials

The Covid-19 pandemic has accelerated the adoption of technology that can improve clinical trial performance and reduce cost.

While standard trials fix their main endpoints and dosing regimens before moving on to the next stage, an adaptive design allows researchers to change such parameters as the trial advances.

A study testing an antibody therapy for Covid-19 patients, conducted by Regeneron Pharmaceuticals and Sanofi, used an adaptive design. The World Health Organization’s SOLIDARITY trial, which prioritizes speed, has followed this approach, using a randomized but non-double-blind design (CB Insights, 2021).

Pharmaceutical Companies Applying AI

Artificial Intelligence (AI) and Machine Learning (ML) have been unparalleled growth boosters in the pharmaceutical industry across the value chain. Numerous AI technologies are gaining traction in terms of how businesses drive creativity and develop market strategies. Over the years, AI has streamlined and impacted the healthcare industry in various areas, from developing innovative and improved treatments to treating rare diseases.

How Silicon Valley Companies Unsettle the Healthcare Industry

The Big Five companies (namely Amazon, Apple, Facebook, Google, and Microsoft) use medical data and artificial intelligence to create a slew of technologies aimed at transforming healthcare delivery. They are stepping up their exploration of the healthcare market in the hopes of reinventing the $3 trillion US healthcare system by designing and partnering on innovative tools that can benefit patients, medical providers, and insurers (CB Insights, n.d.).

Alphabet in Healthcare

Alphabet uses its experience in artificial intelligence and data storage to further accelerate the industry-wide movement toward predictive analytics, precision medicine, and interoperability. Alphabet aims to improve consumer well-being while lowering healthcare prices.

In November 2019, Alphabet announced its $2 billion purchase of wearable giant Fitbit, opening up a world of new opportunities for the tech giant in terms of health-tracking and employee benefits.

Alphabet is using its leadership in data collection and analytics to address interoperability issues and streamline clinical testing. By tackling problems with electronic health record (EHR) interoperability and insufficient computing infrastructure, the organization is relying on its cloud network and artificial intelligence (AI) to secure strategic hospital relationships (Business Insider, 2021).

Google Disrupting Electronic Health Record (EHR)

Google opened up its Cloud Healthcare API to health services in April 2020, and top medical centers such as the Mayo Clinic soon signed on. These efforts respond to Google’s commitment in 2018 to promote healthcare interoperability and data-sharing standards (also signed by Amazon, IBM, Microsoft, and Salesforce).

Google is now collaborating with EHR providers such as Meditech to migrate their applications and data to the cloud. One possible outcome of these collaborations is a two-way data flow in which EHR vendors are incentivized to incorporate patient-generated data into Google apps (CB Insights, n.d.).

Google’s AI Acquisitions

Alphabet’s three healthcare-focused subsidiaries are Verily, DeepMind, and Calico.

Verily is where Alphabet does most of its healthcare work. The subsidiary is dedicated to using data to improve healthcare through analytics software, interventions, testing, and other initiatives. It has primarily focused on collaborating with established healthcare organizations to identify areas where AI can be applied, especially through its Study Watch, a wearable device that collects biometric data. The Study Watch, which is currently pending FDA clearance, has been the focal point of various research projects.

Google completed one of the biggest European acquisitions to date when it acquired DeepMind Technologies, a London-based artificial intelligence company specializing in machine learning, computational algorithms, and systems neuroscience. One of its primary efforts is to investigate how AI can be applied to healthcare, and the company collaborates directly with organizations affiliated with the National Health Service.

Calico’s mission is to understand, and then fight, aging and age-related diseases. The subsidiary uses AI to make sense of vast databases and to streamline lab operations. Calico is led by a former Genentech CEO (CB Insights, n.d.).


Apple in Healthcare

Apple is keen to expand its wellness division and realize its ambition of making health Apple’s greatest contribution to humanity. Apple is attempting to turn its consumer devices into mobile patient wellness centers and useful clinical testing instruments. Each version of the Apple Watch has been enhanced with new health functionalities, and Apple is relying on the iPhone’s Health Records feature to connect with provider organizations searching for opportunities to improve communication across patients’ multiple points of treatment.

Since 2015, Apple has been developing a clinical research ecosystem centered on the iPhone and Apple Watch, both of which can capture real-time health data. Its open-source applications, ResearchKit and CareKit, assist clinical trials in recruiting patients and remotely monitoring their wellbeing (Business Insider, 2021).


Facebook in Healthcare

The appointment of a Stanford University professor and cardiologist was the clearest indication of Facebook’s interest in healthcare. The scientist saw potential in the social network to influence the social facets of a person’s life, leading to improved health outcomes.

Facebook has made a few other notable health-related investments. The company’s Artificial Intelligence Research group (FAIR) created technologies that could significantly reduce the time required for an MRI.

Meanwhile, Facebook Reality Labs is developing a noninvasive brain-computer interface that will enable imagined commands to be translated to smartphones. The headset helps measure blood oxygen saturation and infer neuronal activity using near-infrared light, which is essentially a kind of health monitoring. The technology is still in its early stages, but it can eventually allow you to think of the words “end game” when playing virtual reality and have the game end.

In late 2019, Facebook released the Preventive Health tool. Given the breadth of personal data collected by Facebook and its self-organizing groups centered on health problems, this may be the first step in developing a clinical trial recruiting solution (Fast Company, 2021).


As the field of drug discovery and development matures, it becomes more important to find new ways to explore vast chemical spaces that are beyond our current experimental approaches. AI-based solutions can help us do this by augmenting traditional techniques with a new layer of computational intelligence. But professional expertise from physicians will always be needed as well for therapeutic applications where human intuition is required to make sense out of complex data sets or provide personalized care tailored to individual patients’ needs.

How could you use AI or machine learning technologies? What type of company would benefit most from adopting these tools? Let us know! We have experts ready and waiting to advise on how we can apply artificial intelligence to your organization. In the meantime, head over to our AI in Pharma and Life Sciences page for more information on the services and solutions nexocode provides for the pharmaceutical industry.


References:

- Kraljevic et al. (2004). Accelerating Drug Discovery.
- Chilukuri et al. (2017). Digital in R&D: The $100 billion opportunity.
- Paul et al. (2021). AI in drug discovery and development.

About the author

Dariusz Jacoszek

Life Sciences Expert


Dariusz is a qualified pharmacist who has completed an Executive MBA at the Polish-American School of Business. He has worked in top pharma companies, involved in activities focused on the operational aspects of drug development and digital health, mainly in the oncology space. Dariusz's experience spans building central services in Pharmacovigilance (PV), Trial Master File (TMF), and Transformation Management Office (TMO) with an end-user focus, optimization, and a data-driven approach.
As an enthusiast of emerging technologies, he’s keen to find new ways of implementing AI solutions in pharma and to recommend digital solutions addressing customer needs at nexocode.

This article is a part of the AI in Pharma series (14 articles).

AI in Pharma

The pharmaceutical industry is one of the most regulated industries in the world. It's also one of the most expensive and challenging industries to work in. Pharma companies, like all other businesses, are looking for ways to reduce costs while improving quality and efficiency. This is where artificial intelligence comes into play!

Follow our article series to find out what are the benefits of AI in pharma and why this tech could be considered a game changer for the pharmaceutical sector.

check it out

Pharma & Life Sciences


