Multi-sensor Visual Analytics Supported by Machine-Learning Models

Video: In this video we focus on the time-series visualization. The visualization queries an LSTM recurrent neural network trained to identify patterns across a long temporal input data source. The visualization offers an intuitive way of communicating similar patterns in a time-series view. Users can either type in a pattern or specify it visually; in this demonstration we focus on how users can input a pattern in our platform.
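To make the pattern-query idea concrete, here is a minimal, hypothetical sketch. It replaces the trained LSTM with a classical stand-in: the user's query pattern is slid over the series and windows are ranked by z-normalized Euclidean distance, so similarly shaped segments rank highest regardless of offset and scale. The function name `find_similar_windows` and the synthetic data are illustrative only, not part of the actual platform.

```python
import numpy as np

def znorm(x):
    """Z-normalize so matching is invariant to offset and amplitude."""
    s = x.std()
    return (x - x.mean()) / s if s > 0 else x - x.mean()

def find_similar_windows(series, query, top_k=3):
    """Slide the query over the series; return the start indices of the
    top_k windows with the smallest z-normalized Euclidean distance."""
    s = np.asarray(series, dtype=float)
    q = znorm(np.asarray(query, dtype=float))
    m = len(q)
    scored = []
    for i in range(len(s) - m + 1):
        scored.append((np.linalg.norm(znorm(s[i:i + m]) - q), i))
    scored.sort()
    return [i for _, i in scored[:top_k]]

# Toy example: a sine-shaped pattern planted at offsets 0 and 60
# in an otherwise noisy series.
rng = np.random.default_rng(0)
pattern = np.sin(np.linspace(0, 2 * np.pi, 20))
series = rng.normal(0.0, 0.05, 100)
series[0:20] += pattern
series[60:80] += pattern
hits = find_similar_windows(series, pattern, top_k=2)
print(sorted(hits))  # the two planted offsets
```

A learned model such as the LSTM mentioned above would replace the raw distance with a similarity computed on learned representations, but the query/rank interaction pattern is the same.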

About the project: Machines, such as engines, vehicles, or even aircraft, go through extensive controlled trials during their development. Each machine is typically instrumented with hundreds of sensors that produce voluminous time-series data. Engineers analyze such data to improve their understanding of how machines are used in practice, which in turn helps them make design decisions. Most often they study operational profiles of various sensors for a given day of operation using histograms, or examine time series from multiple sensors together. However, when confronted with data from dozens of sensors over many years of operation, they are challenged by the large number of histograms to analyze and the sheer length of the time series to explore. Traditional approaches such as hierarchical histograms and semantic zooming of time series often cannot cope with the volume of data encountered in practice. We augment basic data visualizations such as histograms, heat maps, and time-series views with machine-learning models that aid in summarizing, querying, searching, and interactively linking visualizations derived from large volumes of multi-sensor data. In this paper we describe our machine-learning-augmented approach to visual analytics in the context of its actual use in practice for answering questions of interest to engineers analyzing large-scale multi-sensor data.
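As a concrete, hypothetical illustration of the summarization step, consider reducing many per-day histograms of one sensor to a single representative day. The sketch below uses a simple model-free baseline, the medoid (the day whose histogram is closest in total L1 distance to all others), as a stand-in for the learned summarization models described above; the sensor data and function name are invented for illustration.

```python
import numpy as np

def histogram_medoid(hists):
    """Return the index of the histogram with the smallest total
    L1 distance to all others -- one 'representative day'."""
    h = np.asarray(hists, dtype=float)
    pairwise = np.abs(h[:, None, :] - h[None, :, :]).sum(axis=2)
    return int(pairwise.sum(axis=1).argmin())

# Five days of a hypothetical engine-speed sensor: four typical days
# centered near 40% load, one anomalous day spent mostly at high load.
rng = np.random.default_rng(1)
bins = np.linspace(0, 100, 11)
days = [rng.normal(40, 10, 1000) for _ in range(4)]
days.append(rng.normal(85, 5, 1000))
hists = [np.histogram(d, bins=bins)[0] / len(d) for d in days]
rep = histogram_medoid(hists)
print(rep)  # index of one of the four typical days
```

The same distance matrix also supports the linking and search interactions: clicking a histogram can highlight the days whose operational profiles are nearest to it.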