Expanding Horizons of Knowledge
WHAT WE DO AI for CardioMetabolic Domain

OUR VISION Data driven innovation

PREDICTIVE MODELS DeepBiomics platform
Our DeepBiomics platform enables reliable and fast analysis of complex biomedical datasets. It can combine multiple lines of evidence from heterogeneous data sources. Explore our portfolio and use cases.
MICROBIOME ANALYSIS Type 2 Diabetes and DeepBiomics

EPIGENETICS ANALYSIS Inflammatory bowel disease
MULTI-OMICS IN MEDICINE With a Contribution by E. Levin, PhD

WHAT WE DID Scientific Contributions
Cardiovascular risk
Microbiome and ethnicity
CID permutation importance
Metabolic syndrome
Methylation and microbiome
Graph space embedding
ANALYSE ALL YOUR DATA. SMARTER AND FASTER From developing personalised treatment strategies to solving complex business problems, we believe that the world's most challenging issues can be solved with data.
You have a lot of data, in many different formats and in many different locations. Some of it is in the cloud, some of it is in legacy databases, and some of it is in spreadsheets on your desktop. You've got big data, and you would like to analyse it and gain business insights from it. That's where HORAIZON comes in. Our solutions help you integrate, analyse and gain insight from all your data. Smarter and faster.
METHODOLOGY
Deep Learning
We use state-of-the-art Deep Learning techniques to help make sense of data such as images, sound, and text. These algorithms, inspired by biological neural networks, allow us to build highly predictive models for structured and heterogeneous data.
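As an illustration only, and not the DeepBiomics implementation, the sketch below trains a small feed-forward neural network on synthetic tabular data with scikit-learn; the dataset, layer sizes, and other settings are assumptions made for the example.

```python
# Illustrative sketch: a small feed-forward neural network on synthetic
# tabular data. Real pipelines would use architectures suited to each
# data type (images, sound, text, omics).
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Synthetic dataset: 500 samples, 30 features (values are assumptions).
X, y = make_classification(n_samples=500, n_features=30, n_informative=10,
                           random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Two hidden layers; the sizes are arbitrary choices for this sketch.
model = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=1000,
                      random_state=0)
model.fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))
```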
Intelligible Models
Our sparse models are particularly useful in applications where the main objective is to discover predictive patterns in data, enhancing our understanding of underlying biological processes beyond merely building accurate 'black-box' predictors.
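A minimal sketch of the idea behind sparse, intelligible models, assuming an L1-penalised (lasso-style) logistic regression as a stand-in; the data and parameter values are illustrative, not taken from our pipelines.

```python
# Illustrative sketch: an L1-penalised logistic regression as a sparse,
# interpretable model. The non-zero coefficients point to predictive
# features rather than acting as a black box.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=300, n_features=50, n_informative=5,
                           random_state=0)

# The L1 penalty drives most coefficients exactly to zero (sparsity);
# C controls the strength of the penalty (assumed value).
sparse_model = LogisticRegression(penalty="l1", solver="liblinear", C=0.1)
sparse_model.fit(X, y)

selected = np.flatnonzero(sparse_model.coef_[0])
print("features retained by the sparse model:", selected)
```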
Multi-View Methods
Our custom-made algorithms, built on the kernel methods framework, allow modelling and analysis of non-linear interactions in the data. These methods are fully optimised for large-scale computations on thousands of processors.
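A hedged sketch of a multi-view kernel approach, not our production algorithm: two synthetic data views each get an RBF kernel, the kernels are averaged, and an SVM is trained on the combined kernel. The views, kernel choice, and equal weights are assumptions made for illustration.

```python
# Illustrative sketch: combining two data views via kernels. Each view
# (here just two synthetic feature blocks) gets its own RBF kernel; the
# kernels are averaged and passed to an SVM with a precomputed kernel.
from sklearn.datasets import make_classification
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.svm import SVC

X, y = make_classification(n_samples=200, n_features=40, random_state=0)
view_a, view_b = X[:, :20], X[:, 20:]          # two synthetic data views

# Equal-weight combination of the per-view kernels (weights are assumed).
K = 0.5 * rbf_kernel(view_a) + 0.5 * rbf_kernel(view_b)

clf = SVC(kernel="precomputed")
clf.fit(K, y)
print("training accuracy on the combined kernel:", clf.score(K, y))
```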