In the latest edition of Perspectives, Copper Consultancy sat down with Geoff McGrath, Managing Director of CKDelta, to discuss how data science is shaping decision-making and the opportunities this presents.
Throughout your career you have skirted the boundary between data engineering and mechanical engineering. Is there a common thread between the two in terms of how we use data to make decisions, predict outcomes, and model scenarios?
After multiple careers you start to see the commonality in approaches – the common thread starts with design. At McLaren Applied Technologies I had the good fortune of working with designers who knew how to ask the right questions. They were divergent in thought, whereas engineers are convergent, trying to get to a solution quickly. Any good engagement starts with design thinking, trying to empathise with the customer and building an understanding of what performance looks like for your business. Then it comes down to the engineering approach: measuring performance and, ultimately, prediction – which is what CKDelta’s diverse data sets and sector expertise make us uniquely positioned to do.
What is driving the shift from industrial data models to societal data models?
Data availability and compute power. The ability to capture data has evolved fast. Twenty years ago there was no public data, and we became heavily reliant on surveys. The difference is that we still rely on surveys, despite mobile devices being capable of acquiring the same – if not more – GDPR-compliant information. On the other hand, computing power and computing cost (equally important) have caught up with data availability. Historically, we often knew how to solve a problem but lacked the computing power to do so; today, artificial intelligence has made this possible.
People are simultaneously excited and worried about artificial intelligence – is there a distinction between having greater compute power and more data to decide which questions to ask, versus the point at which the computer can ask questions itself?
I like to stick to the mantra, ‘keep the human in the loop’. Catalysed by COVID-19, CKDelta helped organisations make predictions by running forecasts while leaving room for an element of human judgement. An automated process wouldn’t have taken account of people’s emotional intuition or cultural influences, which models cannot be trained to see or account for. By capturing insights from lots of data sources, however, we enable humans to make the best judgement call. The best use of AI and data modelling is to help the human in the loop make better decisions more often.
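The ‘human in the loop’ idea can be sketched in a few lines: an automated model produces a baseline forecast, and a human adjustment is recorded alongside it rather than hidden inside the model. This is a minimal illustrative sketch – the function names, the naive moving-average baseline, and the adjustment mechanism are all hypothetical, not CKDelta’s actual pipeline.

```python
# Hypothetical sketch of a human-in-the-loop forecast adjustment step.
# The model gives a baseline; the human's override is applied explicitly
# and stays visible in the output, with a note explaining the judgement.

def model_forecast(history):
    """Naive baseline: project the mean of the last four observations forward."""
    recent = history[-4:]
    return sum(recent) / len(recent)

def human_in_the_loop(history, analyst_adjustment=0.0, analyst_note=""):
    """Combine an automated forecast with an explicit human judgement call."""
    baseline = model_forecast(history)
    final = baseline + analyst_adjustment
    return {"baseline": baseline, "final": final, "note": analyst_note}

result = human_in_the_loop(
    [100, 110, 105, 120, 115, 125],
    analyst_adjustment=-10.0,
    analyst_note="Holiday week: demand historically dips; model has no calendar signal.",
)
```

Keeping the adjustment as a separate, logged field is the point: the machine’s view and the human’s correction both remain auditable.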
What sectors are behind where they could or should be using data to help make decisions, and which sectors are likely to grow in the data business?
Firstly, the early adopters were in advanced manufacturing. They always had a data-driven approach and were the first to use fully immersive simulation environments to design products. Today this is referred to as the digital twin – a model copy of the real world in which virtual changes can be made. Industries such as telecoms, utilities and transport systems are just catching up, and creating resilient and reactive digital twin systems will take 5-10 years. On the other hand, retail and tourism have not even thought about it yet, and arguably government hadn’t until COVID-19 forced its hand.
What innovative uses for data and insights do you think we will be using in the near future to improve towns and cities?
I am still a firm believer in the Internet of Things (IoT). It has taken longer than I expected to roll out, which may be linked to the delayed arrival of 5G and to people not being aware of its huge potential. But once we have predictive intelligence built into infrastructure, you can start to anticipate things like traffic jams and surges in electricity demand, and react and make interventions based on knowledge from simulations. If you go down the path of decision support and predictive and adaptive intelligence, a smart environment would be one that adapts to its context.
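The “anticipate, then intervene” pattern described above can be illustrated with a toy example: extrapolate the recent trend in sensor readings and flag an intervention before demand reaches capacity. Everything here – the function names, the linear extrapolation, and the figures – is a hypothetical sketch, not a description of any deployed system.

```python
# Hypothetical sketch: anticipating an electricity demand surge from a short
# rolling window of sensor readings, so an intervention can be triggered
# before capacity is actually reached.

def predict_next(readings, window=3):
    """Linear extrapolation from the last `window` readings."""
    recent = readings[-window:]
    slope = (recent[-1] - recent[0]) / (window - 1)  # average step per interval
    return recent[-1] + slope

def needs_intervention(readings, capacity):
    """Flag an intervention if the *predicted* next reading exceeds capacity."""
    return predict_next(readings) > capacity

demand = [70, 74, 80, 88]                 # e.g. MW readings, rising trend
print(needs_intervention(demand, 94))     # predicted 95 MW > 94 MW -> True
```

A real system would use richer models and simulation outputs, but the shape is the same: prediction first, then a decision rule that triggers the intervention ahead of the event.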
How can data be enabled to help simplify people’s choices?
Design should strip out unnecessary components to deliver the right function. The same goes for data design: just because the data is there does not mean you should measure it. I think that’s why Big Data has not lived up to the hype – it focusses too heavily on the volume, velocity, and variety of the data stream. Successful designers will reduce the computing power required to yield the same quality of insight, which is why CKDelta focusses on extracting maximum insight from the minimal amount of anonymised data.
To find out more about CKDelta, click here.