Q & A

TECHNOLOGY

What does Adaptix do?

Adaptix is the Swiss Army knife of analytics. It’s an all-in-one solution for a wide range of analysis types – predictive, prescriptive, anomaly detection, profiling, trend and root-cause analysis and more – all in real time.

Moreover, these different analyses all run on top of the same knowledge base and representation stack. This keeps Adaptix consistent, stable and computationally sustainable, and makes it extremely easy to combine individual primitive queries into more complex, personalised ones.

How does Adaptix fit into an existing infrastructure?

Adaptix has a portable architecture based on Docker technology. It can be deployed to almost any platform or cloud system and can exchange data with third-party databases and sensor data stores through a REST API or message queues. Integration with any existing client-side applications that may benefit from Adaptix analytics is straightforward.
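As a rough illustration, here is a minimal sketch of feeding sensor readings into such a REST interface from Python. The host, endpoint path and payload shape are hypothetical placeholders for this example, not the documented Adaptix API.

```python
# Minimal sketch of pushing one sensor sample over REST.
# ADAPTIX_URL and the payload fields are assumptions for illustration.
import requests

ADAPTIX_URL = "http://localhost:8080/api/streams/temperature/samples"  # hypothetical endpoint

def push_sample(timestamp: str, value: float) -> None:
    """Send a single sensor reading to an Adaptix stream."""
    payload = {"timestamp": timestamp, "value": value}
    response = requests.post(ADAPTIX_URL, json=payload, timeout=5)
    response.raise_for_status()  # surface transport or server errors

push_sample("2024-01-01T12:00:00Z", 21.7)
```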

Does Adaptix have a user interface?

Yes. Adaptix has a GUI for managing streams, devices, channels and patterns and, when running as a dedicated instance, an administration environment for managing platform users and their access rights. In addition, analysis results – such as predictions, detected anomalies and pattern-search matches – are displayed through customisable visuals.

How does Adaptix perform as data streams grow?

Adaptix delivers extremely high-quality analysis in real time – even when the data stream becomes extremely long. Complex analysis queries can be executed in a sub-second timeframe while maximising the precision of the computation.

Conventionally, time-modelling quality depends on the computational resources and time available for executing the analysis. Suppose, for instance, you needed to predict the next x samples of a time series of sensor data. Traditional machine learning prediction methods assume that what happens next depends strongly on what has occurred in the recent past (the short-term memory window). The longer the memory window, the more accurate the prediction – but the more time is needed to train the system and execute the prediction query. Thus, to achieve a quasi-real-time response, the memory window has to be kept short, at the cost of overall precision and quality of results.
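To make this conventional trade-off concrete, here is a minimal sketch of a windowed autoregressive predictor – a generic textbook method, not Adaptix’s technique. The design matrix grows with the window length, so a longer memory window directly inflates the training cost.

```python
# Sketch of the conventional approach: an autoregressive model whose
# features are the last `window` samples. A wider window tends to help
# accuracy, but enlarges the matrix that must be refit each time.
import numpy as np

def fit_ar_model(series: np.ndarray, window: int) -> np.ndarray:
    """Fit linear AR coefficients by least squares over a lag window."""
    X = np.array([series[i : i + window] for i in range(len(series) - window)])
    y = series[window:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

def predict_next(series: np.ndarray, coeffs: np.ndarray) -> float:
    """Predict the next sample from the most recent `window` samples."""
    return float(series[-len(coeffs):] @ coeffs)

# Toy usage: a noisy sine wave; each wider window costs more to fit.
t = np.arange(2000)
series = np.sin(0.05 * t) + 0.1 * np.random.default_rng(0).standard_normal(2000)
for window in (5, 50, 500):
    coeffs = fit_ar_model(series, window)
    print(window, predict_next(series, coeffs))
```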

In contrast, Adaptix’s unique data representation scheme maximises the memory window and captures short-, mid- and long-term memory nuances with no significant computational overhead. This effectively dissolves the trade-off between analysis performance and quality.

Does Adaptix work out of the box?

Sure it does. Adaptix doesn’t need any initial training, because modelling starts automatically from the moment it’s deployed. The system learns without any supervision, directly from the streamed data as they’re received. It’s able to respond to queries from day one and progressively improves its accuracy over time.

With Adaptix, there’s no need for time-consuming initial training on perfectly balanced datasets. Instead, on-site modelling ensures the system automatically adapts to the environment where the sensors are deployed, improving accuracy. Training is continuous and fully incremental: the model is updated each time new samples are ingested, so it always reflects the latest knowledge. This learning process occurs silently in the background, on the fly, and accounts for every tiny event without any loss of generalisation.
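As a sketch of what continuous, incremental training looks like in code – using scikit-learn’s SGDRegressor as a generic stand-in, since Adaptix’s own learner isn’t public – each incoming sample triggers one cheap in-place model update, and the model stays queryable throughout:

```python
# Incremental (online) learning sketch: one gradient step per sample,
# no offline retraining pass. SGDRegressor is a stand-in learner;
# `window` is a hypothetical lag-feature width chosen for illustration.
import numpy as np
from sklearn.linear_model import SGDRegressor

window = 10
model = SGDRegressor(learning_rate="constant", eta0=0.01)
history = []

def ingest(sample: float) -> None:
    """Update the model with one new sensor sample as it arrives."""
    if len(history) >= window:
        X = np.array(history[-window:]).reshape(1, -1)
        y = np.array([sample])
        model.partial_fit(X, y)  # cheap in-place update, no full refit
    history.append(sample)

# Stream a toy signal; the model is usable (and improving) from day one.
for t in range(1000):
    ingest(float(np.sin(0.05 * t)))
```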

This novel approach upends conventional machine learning practice, where training traditionally happens in occasional batch runs and demands a fair amount of resources. Such systems quickly become outdated and lose accuracy between training runs.

In fact, our approach eliminates any loss of precision whenever there’s a sudden change of context or environment. Adaptix quickly grasps the new status quo: it progressively discards knowledge from the previous context and adapts to the new conditions – continually maximising result accuracy without intervention or downtime.
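Here is a minimal sketch of this forgetting behaviour, using an exponentially weighted running estimate as a stand-in for the proprietary Adaptix model: old context decays geometrically, so after an abrupt regime change the estimate converges to the new conditions on its own.

```python
# Forgetting-factor sketch: weight on old knowledge decays geometrically,
# so a sudden change of context is absorbed without any retraining.
class ForgettingEstimate:
    def __init__(self, decay: float = 0.95):
        self.decay = decay   # closer to 1.0 = longer memory of old context
        self.value = None

    def update(self, sample: float) -> float:
        if self.value is None:
            self.value = sample
        else:
            # Blend: mostly accumulated knowledge, a little new evidence.
            self.value = self.decay * self.value + (1 - self.decay) * sample
        return self.value

est = ForgettingEstimate(decay=0.95)
stream = [10.0] * 200 + [50.0] * 200   # abrupt change of context at t=200
for x in stream:
    est.update(x)
print(round(est.value, 2))  # ~50.0: the old regime has been forgotten
```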