Digital Transformation Strategy

Data Virtualisation

The first stage in the digital transformation is primarily concerned with identifying, auditing and integrating relevant data sources into a common, interoperable schema. Data is integrated using real-time feeds, batch importers or data proxies, depending on the access methods available and on data redundancy and data volatility requirements. Data is loaded into the TDX (Trusted Data Exchange), which is optimized for high-volume, high-speed analytics. Where necessary, data is securely partitioned according to authentication and authorization requirements.
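As an illustration of how heterogeneous sources might be mapped onto a common schema, the sketch below defines a minimal adapter layer. The TDXClient class, the source names and the field mappings are hypothetical stand-ins, not the actual TDX API; the choice between a streaming feed, a batch import or a proxy is driven by the volatility and redundancy hints described above.

```python
from dataclasses import dataclass
from typing import Callable, Dict, Iterable, List


@dataclass
class SourceConfig:
    """Describes one external data source and how it should be virtualised."""
    name: str
    volatility: str              # "high" -> stream, "low" -> batch
    redundant_copy_needed: bool  # False -> leave data in place and proxy queries
    field_map: Dict[str, str]    # source field -> common schema field


class TDXClient:
    """Hypothetical stand-in for the Trusted Data Exchange ingestion API."""

    def __init__(self):
        self.tables: Dict[str, List[dict]] = {}

    def upsert(self, table: str, records: Iterable[dict]) -> None:
        self.tables.setdefault(table, []).extend(records)


def to_common_schema(record: dict, field_map: Dict[str, str]) -> dict:
    """Rename source fields into the shared, interoperable schema."""
    return {target: record[source] for source, target in field_map.items() if source in record}


def choose_strategy(cfg: SourceConfig) -> str:
    """Pick an integration strategy from volatility / redundancy requirements."""
    if not cfg.redundant_copy_needed:
        return "proxy"           # query the source in place
    return "stream" if cfg.volatility == "high" else "batch"


def ingest(tdx: TDXClient, cfg: SourceConfig, fetch: Callable[[], Iterable[dict]]) -> str:
    strategy = choose_strategy(cfg)
    if strategy in ("batch", "stream"):
        records = (to_common_schema(r, cfg.field_map) for r in fetch())
        tdx.upsert(cfg.name, records)
    return strategy


if __name__ == "__main__":
    tdx = TDXClient()
    sensors = SourceConfig(
        name="street_sensors",
        volatility="high",
        redundant_copy_needed=True,
        field_map={"sensor_id": "asset_id", "reading": "value", "ts": "observed_at"},
    )
    print(ingest(tdx, sensors, lambda: [{"sensor_id": 7, "reading": 3.2, "ts": "2024-01-01T00:00:00Z"}]))
    print(tdx.tables["street_sensors"])
```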

Benchmarking

Before any analytics or process improvement is adopted, the problem is benchmarked. Critical KPIs are identified that baseline both the current data set and the efficacy of current processes. Where necessary, benchmarking analytic code is written and simple dashboard visualizations are created to summarize the data. Benchmarking provides evidence of improvements as well as an empirical feedback loop to power machine learning algorithms.
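A minimal sketch of what a benchmarking step might look like is shown below, assuming a pandas DataFrame of historical inspection records and two illustrative KPIs (mean time to resolution and defect detection rate). The column names and KPI definitions are assumptions for illustration, not part of the TDX schema.

```python
import pandas as pd

# Illustrative historical records; in practice these would come from the TDX.
records = pd.DataFrame({
    "opened":   pd.to_datetime(["2024-01-01", "2024-01-03", "2024-01-05"]),
    "resolved": pd.to_datetime(["2024-01-04", "2024-01-06", "2024-01-11"]),
    "defect_found": [True, False, True],
})


def baseline_kpis(df: pd.DataFrame) -> dict:
    """Compute simple baseline KPIs against which later improvements are measured."""
    time_to_resolution = (df["resolved"] - df["opened"]).dt.days
    return {
        "mean_days_to_resolution": float(time_to_resolution.mean()),
        "defect_detection_rate": float(df["defect_found"].mean()),
        "sample_size": int(len(df)),
    }


if __name__ == "__main__":
    kpis = baseline_kpis(records)
    # A dashboard would render these; here we just print the summary.
    for name, value in kpis.items():
        print(f"{name:28s} {value}")
```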

Analytics

The analytics stage consists of a complex chain of analytics processes. These processes can be either batched (suited to historical analysis) or streaming (suited to real-time analysis). Individual analytics steps are built up and chained together, allowing the entire system to be tuned and tweaked based on arbitrary parameters. Frequently used intermediate data sets are stored to improve performance.
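One way to picture such a composable chain is the sketch below: each step is a named function, chains are assembled from arbitrary parameters, and frequently used intermediate results are memoised. The step names and the in-memory cache are illustrative only; the same composition works over a full batch or over micro-batches of a stream.

```python
from typing import Callable, Dict, List

Record = dict
Step = Callable[[List[Record]], List[Record]]

# Cache for intermediate data sets that several chains reuse.
_intermediate_cache: Dict[str, List[Record]] = {}


def cached(name: str, step: Step) -> Step:
    """Wrap a step so its output is stored for reuse by other chains."""
    def run(records: List[Record]) -> List[Record]:
        if name not in _intermediate_cache:
            _intermediate_cache[name] = step(records)
        return _intermediate_cache[name]
    return run


def chain(*steps: Step) -> Step:
    """Compose steps into a single analytics pipeline."""
    def run(records: List[Record]) -> List[Record]:
        for step in steps:
            records = step(records)
        return records
    return run


# Illustrative steps, tunable via parameters.
def filter_above(threshold: float) -> Step:
    return lambda recs: [r for r in recs if r["value"] > threshold]


def score(weight: float) -> Step:
    return lambda recs: [{**r, "score": r["value"] * weight} for r in recs]


if __name__ == "__main__":
    pipeline = chain(cached("cleaned", filter_above(1.0)), score(weight=0.5))
    print(pipeline([{"asset_id": 1, "value": 0.4}, {"asset_id": 2, "value": 3.0}]))
```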

Workflow

Effective analytics must result in “action in the real world”: people going out and doing things. For that reason the TDX tightly integrates workflow with analytics. Not only do analytics prioritise and help triage real-world activities, but the result of every inspection undertaken is essentially a new training data set which, if captured quickly, can immediately help refine the training algorithms.
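The sketch below illustrates that feedback loop in outline: analytics scores triage the work queue, and the outcome of each inspection is captured straight away as a labelled training example. All names here (WorkItem, record_inspection, the feature and label fields) are hypothetical, not part of the TDX.

```python
from dataclasses import dataclass, field
from typing import List, Tuple


@dataclass(order=True)
class WorkItem:
    priority: float                                   # analytics-derived score; higher means inspect sooner
    asset_id: int = field(compare=False)
    features: dict = field(compare=False, default_factory=dict)


training_examples: List[Tuple[dict, bool]] = []       # (features, defect_found) pairs


def triage(items: List[WorkItem]) -> List[WorkItem]:
    """Order real-world activities by the analytics priority score."""
    return sorted(items, reverse=True)


def record_inspection(item: WorkItem, defect_found: bool) -> None:
    """Capture the inspection outcome immediately as a new training example."""
    training_examples.append((item.features, defect_found))


if __name__ == "__main__":
    queue = triage([
        WorkItem(priority=0.2, asset_id=101, features={"age_years": 3}),
        WorkItem(priority=0.9, asset_id=202, features={"age_years": 17}),
    ])
    record_inspection(queue[0], defect_found=True)    # inspector visits the highest-priority asset first
    print([w.asset_id for w in queue], training_examples)
```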