CORTEX AI Library of Solutions


Cortex Logic solves core business problems in a practical, cost-effective way, using all available structured and unstructured data together with smart technology. Solutions are implemented via the CORTEX AI Engine in an end-to-end, full-stack, integrated, scalable, and secure manner. Cortex Logic operationalizes Data Science and AI by following international data science standards and implementing automated analytics within a champion-challenger approach, which is maintained and kept up to date throughout the deployment and operational phases. The CORTEX AI Engine is used to operationalize solutions such as strategic business transformation & optimization, human capital valuation & employee profiling, intelligent virtual assistants, robo-advisors, process optimization, predictive maintenance, fraud detection, churn prediction, advanced risk scoring, machine learning-based trading, real-time customer insights, smart recommendations and purchase prediction, personalized search, cyber security, medical risk prediction, and precision medicine. See below for more details about the solutions in the CORTEX AI Library.
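The champion-challenger approach mentioned above can be sketched in a few lines of Python. This is a minimal illustrative sketch, not part of the CORTEX AI Engine: the deployed "champion" model keeps serving while "challenger" models are scored on the same held-out data, and a challenger that outperforms the champion is promoted. All model names and data below are hypothetical.

```python
def accuracy(model, data):
    """Fraction of examples the model labels correctly."""
    return sum(model(x) == y for x, y in data) / len(data)

def select_champion(champion, challengers, holdout):
    """Return the best-performing model among the champion and its challengers."""
    best, best_score = champion, accuracy(champion, holdout)
    for challenger in challengers:
        score = accuracy(challenger, holdout)
        if score > best_score:
            best, best_score = challenger, score
    return best, best_score

# Toy held-out set: the true label is "high" exactly when x > 5.
holdout = [(x, "high" if x > 5 else "low") for x in range(10)]
champion = lambda x: "low"                         # naive incumbent model
challenger = lambda x: "high" if x > 5 else "low"  # a better candidate rule
best, score = select_champion(champion, [challenger], holdout)
print(score)  # the challenger is promoted with accuracy 1.0
```

In a production setting the same comparison would run on fresh scored data at regular intervals, so the deployed model is continuously re-validated rather than trained once and frozen.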

Thriving Business in the Smart Technology Era

Optimized Business

  • Strategic AI-First Business Transformation & Optimization
  • AI-Based Trading
  • Smart Risk Scoring for Financial Services
  • AI-Based Supply Chain Optimization

Satisfied Customers

  • Artificially Intelligent Virtual Assistants & Robo-advisors
  • Smart Recommendation, Cross-sell, Up-sell & Purchase Prediction
  • Smart Churn Prediction & Mitigation
  • AI-Based Advertising
  • Real-time Customer Insights, Segmentation & Social Network Analysis

Productive Employees

  • Talent Valuation, Intelligence & Profiling
  • Artificially Intelligent Virtual Assistants & Robo-advisors
  • AI Learning Assistant & Knowledge Management
  • Personalised AI Search

Smart Systems

  • Real-time AI Solutions for Process & Equipment Performance
  • Real-time Fraud Detection & Prevention
  • AI for Cyber Security

Relevant industries for CORTEX AI Library of Solutions



Mapping of CORTEX AI Library of Solutions to relevant Industries


Application Stack for CORTEX AI Solutions

Big Data & Analytics and Data Science Methodology

The Big Data & Analytics and Data Science methodology combines the sequential execution of tasks in some phases with highly iterative steps in others. Because of the scale of a Big Data & Analytics system, designers must take a pragmatic approach, modifying and expanding their processes gradually across several activities, rather than designing the system once and for all with the end state in mind. The main phases are typically as follows:

  • Analyze and evaluate business use case
  • Develop the business hypothesis
  • Develop analytics approach
  • Build and prepare data sets
  • Select and prepare the analytical models
  • Build the production ready system (scale and performance)
  • Measure and monitor

This methodology also corresponds to the Data Science methodologies that Cortex Logic implements in operationalizing Data Science and developing AI-based solutions.

  • Cross-Industry Standard Process for Data Mining (CRISP-DM)
  • Analytics Solutions Unified Method for Data Mining/Predictive Analytics (ASUM-DM)

CRISP-DM and ASUM-DM methodologies for operationalizing Data Science & AI



The sequence of the CRISP-DM phases is not strict and moving back and forth between different phases is always required. The arrows in the process diagram indicate the most important and frequent dependencies between phases. The outer circle in the diagram symbolizes the cyclic nature of data mining itself. A data mining process continues after a solution has been deployed. The lessons learned during the process can trigger new, often more focused business questions and subsequent data mining processes will benefit from the experiences of previous ones.

Business Understanding

This initial phase focuses on understanding the project objectives and requirements from a business perspective, then converting this knowledge into a data mining problem definition and a preliminary plan designed to achieve the objectives. A decision model, especially one built using the Decision Model and Notation (DMN) standard, can be used.

Data Understanding

The data understanding phase starts with an initial data collection and proceeds with activities to become familiar with the data, identify data quality problems, discover first insights into the data, and detect interesting subsets that can form hypotheses about hidden information.
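As a minimal sketch of this kind of initial profiling, the snippet below counts missing values and value frequencies per field; the records and field names are invented for illustration.

```python
from collections import Counter

records = [
    {"age": 34, "segment": "retail"},
    {"age": None, "segment": "retail"},
    {"age": 51, "segment": "corporate"},
    {"age": 29, "segment": None},
]

def profile(records):
    """Per-field missing-value counts and distinct-value frequencies."""
    report = {}
    for field in records[0].keys():
        values = [r[field] for r in records]
        report[field] = {
            "missing": sum(v is None for v in values),
            "counts": Counter(v for v in values if v is not None),
        }
    return report

report = profile(records)
print(report["age"]["missing"])     # 1 record is missing an age
print(report["segment"]["counts"])  # Counter({'retail': 2, 'corporate': 1})
```

A report like this is typically the starting point for the data quality discussion with the business, before any hypotheses are formed.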

Data Preparation

The data preparation phase covers all activities to construct the final dataset (the data that will be fed into the modeling tool(s)) from the initial raw data. Data preparation tasks are likely to be performed multiple times, and not in any prescribed order. Tasks include table, record, and attribute selection, as well as the transformation and cleaning of data for modeling tools.
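A small sketch of typical preparation steps on hypothetical raw records: attribute selection, imputing a missing value with the mean, and encoding a label. The field names and values are made up for illustration.

```python
raw = [
    {"id": 1, "income": "52000", "churned": "yes"},
    {"id": 2, "income": "",      "churned": "no"},
    {"id": 3, "income": "78000", "churned": "no"},
]

def prepare(rows):
    """Select attributes, impute missing income with the mean, encode the label."""
    incomes = [int(r["income"]) for r in rows if r["income"]]
    mean_income = sum(incomes) / len(incomes)
    prepared = []
    for r in rows:
        prepared.append({
            "income": int(r["income"]) if r["income"] else mean_income,
            "churned": 1 if r["churned"] == "yes" else 0,
        })
    return prepared

dataset = prepare(raw)
print(dataset[1])  # missing income imputed with the mean of the known values
```

Because modeling often reveals new data requirements, a preparation step like this is normally revisited several times during a project.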


Modeling

In this phase, various modeling techniques are selected and applied, and their parameters are calibrated to optimal values. Typically, there are several techniques for the same data mining problem type. Some techniques have specific requirements on the form of data; therefore, stepping back to the data preparation phase is often needed.
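The parameter-calibration step can be illustrated with a deliberately trivial threshold "model"; the data and rule below are invented and stand in for a real learner and its hyperparameter search.

```python
# Toy training set: the true label is 1 exactly when x >= 6.
train = [(x, 1 if x >= 6 else 0) for x in range(12)]

def threshold_model(threshold):
    """A one-parameter 'model': predict 1 when x is at or above the threshold."""
    return lambda x: 1 if x >= threshold else 0

def error(model, data):
    """Fraction of examples the model labels incorrectly."""
    return sum(model(x) != y for x, y in data) / len(data)

# Calibrate the threshold parameter by searching for the value
# that minimizes the error, as a stand-in for hyperparameter tuning.
best_threshold = min(range(12), key=lambda t: error(threshold_model(t), train))
print(best_threshold)  # 6: the threshold that separates the two classes exactly
```

Real projects would compare several genuinely different techniques on the same prepared dataset in the same way, keeping the search procedure identical so that results are comparable.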


Evaluation

At this stage in the project, you have built a model (or models) that appears to have high quality from a data analysis perspective. Before proceeding to final deployment of the model, it is important to evaluate it more thoroughly and review the steps executed to construct it, to be certain it properly achieves the business objectives. A key objective is to determine whether there is some important business issue that has not been sufficiently considered. At the end of this phase, a decision on the use of the data mining results should be reached.
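A small illustration of why evaluation must check the business objective rather than a single headline score; the fraud example below is invented. A model that always predicts "not fraud" looks accurate on imbalanced data yet fails the business objective entirely.

```python
# 5 fraud cases (label 1) among 100 transactions (labels are invented).
holdout = [(x, 1) for x in range(5)] + [(x, 0) for x in range(5, 100)]
model = lambda x: 0  # degenerate model: always predicts "not fraud"

accuracy = sum(model(x) == y for x, y in holdout) / len(holdout)
recall = (sum(model(x) == 1 and y == 1 for x, y in holdout)
          / sum(y == 1 for _, y in holdout))
print(accuracy)  # 0.95: looks excellent from a data analysis perspective
print(recall)    # 0.0:  catches no fraud at all, failing the business objective
```

This is the kind of "important business issue that has not been sufficiently considered" the evaluation phase exists to catch before deployment.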


Deployment

Creation of the model is generally not the end of the project. Even if the purpose of the model is to increase knowledge of the data, the knowledge gained will need to be organized and presented in a way that is useful to the customer. Depending on the requirements, the deployment phase can be as simple as generating a report or as complex as implementing a repeatable data scoring (e.g. segment allocation) or data mining process. In many cases, it will be the customer, not the data analyst, who carries out the deployment steps. Even if the analyst deploys the model, it is important for the customer to understand up front which actions will need to be carried out in order to actually make use of the created models.
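A minimal sketch of a repeatable scoring step of the kind mentioned above, assuming a hypothetical persisted model and record format; the threshold, field names, and segment labels are all invented.

```python
import json

model = {"threshold": 60000}  # stand-in for a persisted trained model

def score(record, model):
    """Allocate a record to a segment using the deployed model."""
    return "premium" if record["income"] >= model["threshold"] else "standard"

# A batch scoring run: read records, emit segment allocations.
batch = [{"id": 1, "income": 72000}, {"id": 2, "income": 41000}]
results = [{"id": r["id"], "segment": score(r, model)} for r in batch]
print(json.dumps(results))
```

Packaging the scoring logic separately from training is what makes the process repeatable: the customer can rerun it on new data without involving the analyst.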


Analytics spectrum utilized in CORTEX AI Solutions

A sample of cutting-edge Smart Technologies being used in CORTEX AI Solutions