Data confidence: Ensuring your intel isn’t fake news

February 8, 2018


By Bill Hull

People say we’re living in a post-truth society, but modern businesses know there’s still such a thing as objective truth: it’s called good data.

Whereas fake news and disinformation erode trust, reliable data can increase a business’s operational competence, promote efficiencies across the enterprise, and give leaders the confidence to make better decisions – especially in today’s era of the digital factory. The closer industrial products (IP) companies move to align their operations with digital, “Industry 4.0” technologies – such as artificial intelligence, robots, cobots, 3-D printing, nanotechnology, and Internet-of-Things-enabled manufacturing systems – the greater the need to ensure the integrity of the data that flows from, to, and between those systems.

Think about the landscape of data collected in today’s digital factories and how it connects a manufacturer’s value chain both vertically and horizontally. Within a company’s four walls, data flows vertically from a rapidly expanding network of sensors, connected devices, and embedded systems; and it washes continuously across the company’s manufacturing, product development, production planning, raw materials procurement, services, maintenance, warehousing, inventory, and logistics – all of which leverage the data in pursuit of greater operational effectiveness and efficiency. Data might also flow horizontally – beyond company walls – by connecting upstream suppliers and downstream customers. As companies become increasingly dependent on sharing and leveraging data across that full value chain, stakeholders will require assurances that the data they’re relying on is demonstrably complete, consistent, accurate, and timely – and that it has been subjected to effective governance procedures.
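To make those assurance dimensions concrete, here is a minimal sketch of what automated completeness, consistency, and timeliness checks on an incoming sensor feed might look like. The column names, plausibility range, and staleness window are illustrative assumptions, not details from this article.

```python
import pandas as pd

def check_sensor_feed(df: pd.DataFrame, max_staleness_minutes: int = 15) -> dict:
    """Simple completeness, consistency, and timeliness checks on a
    hypothetical sensor feed with columns: machine_id, timestamp, temperature_c."""
    results = {}

    # Completeness: no missing machine IDs or readings.
    results["complete"] = not df[["machine_id", "temperature_c"]].isna().any().any()

    # Consistency: readings fall within a physically plausible range.
    results["consistent"] = bool(df["temperature_c"].between(-40, 150).all())

    # Timeliness: the newest record is no older than the allowed staleness window.
    latest = pd.to_datetime(df["timestamp"], utc=True).max()
    age_minutes = (pd.Timestamp.now(tz="UTC") - latest).total_seconds() / 60
    results["timely"] = age_minutes <= max_staleness_minutes

    return results
```

Checks like these would sit at the point where data enters the platform, so that downstream consumers inherit the assurance rather than re-verifying the feed themselves.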

Those kinds of assurances are common objectives among auditors who examine financial statements, but in digital manufacturing, the goal is to close the virtuous loop of industrial products data analytics, which goes like this:

  • The digital factory produces data.
  • That data informs more efficient and more effective processes.
  • Those processes become critical to how the company operates.
  • That dependence requires the highest level of integrity from new input data.

Data from sensors and networked devices can reveal insights across operations. For example, an analysis of data from the factory floor in conjunction with an analysis of weather data might reveal that beneficial adjustments could be made to specific machinery when temperatures or relative humidities exceed certain levels.
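A minimal sketch of that kind of cross-analysis, assuming hypothetical hourly tables of defect rates and weather readings (the column names and the 70% relative-humidity threshold are illustrative, not drawn from the article):

```python
import pandas as pd

# Hypothetical hourly tables: defect rates from the factory floor and local weather readings.
floor = pd.DataFrame({
    "hour": pd.date_range("2018-02-01", periods=6, freq="h"),
    "defect_rate": [0.010, 0.011, 0.013, 0.024, 0.027, 0.012],
})
weather = pd.DataFrame({
    "hour": pd.date_range("2018-02-01", periods=6, freq="h"),
    "humidity_pct": [55, 58, 62, 78, 81, 60],
})

merged = floor.merge(weather, on="hour")

# Compare defect rates above and below an illustrative 70% relative-humidity threshold.
humid = merged["humidity_pct"] > 70
print("Mean defect rate, humid hours: ", merged.loc[humid, "defect_rate"].mean())
print("Mean defect rate, normal hours:", merged.loc[~humid, "defect_rate"].mean())
print("Correlation (humidity vs. defects):", merged["humidity_pct"].corr(merged["defect_rate"]))
```

If the humid hours consistently show worse outcomes, that is the kind of finding that would justify adjusting specific machinery whenever conditions cross the threshold.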

The need for data confidence is especially acute when it comes to data-intensive efficiency strategies such as predictive maintenance, smart supply chains, and connected field services. Let’s use predictive maintenance as an example because it’s one of the most widely adopted Industry 4.0 strategies. The premise of predictive maintenance is simple: Rather than conducting maintenance preventively or reactively – that is, scheduling upkeep based on history and guesswork about how long a machine or component can run before breaking down, or simply waiting until the breakdown occurs – predictive maintenance analyzes equipment and contextual data to identify patterns of wear, flagging anomalies and scheduling maintenance based on all of the variables that can hasten the end of a machine’s or component’s usable life.
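As a hedged sketch of the anomaly-flagging step described above (the vibration readings, window size, and three-sigma rule are illustrative assumptions, not the method of any particular system):

```python
import numpy as np
import pandas as pd

def flag_anomalies(readings: pd.Series, window: int = 24, sigma: float = 3.0) -> pd.Series:
    """Flag readings that drift more than `sigma` standard deviations from a
    rolling baseline - a simple stand-in for the pattern-of-wear detection
    a predictive-maintenance system performs on equipment data."""
    baseline = readings.rolling(window, min_periods=window).mean()
    spread = readings.rolling(window, min_periods=window).std()
    return (readings - baseline).abs() > sigma * spread

# Hypothetical vibration signal with a gradual fault developing near the end.
rng = np.random.default_rng(0)
vibration = pd.Series(1.0 + 0.05 * rng.standard_normal(200))
vibration.iloc[180:] += np.linspace(0, 0.8, 20)  # emerging wear pattern

alerts = flag_anomalies(vibration)
print("First flagged reading:", alerts.idxmax() if alerts.any() else "none")
```

The flagged readings are what get routed into the maintenance schedule, ideally before the wear pattern becomes a breakdown.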

Input data for predictive maintenance systems can flow from both internal and external sources – in the form of (1) sensor data from factories, infrastructure networks, and transportation fleets; (2) maintenance and failure history records, including contextual history; (3) machinery features such as model, age, and location; (4) environmental data, including temperature, humidity, altitude, and wind speeds; and (5) profiles of equipment operators. Both structured data sets (e.g., spreadsheets or relational databases) and unstructured data sets (e.g., maintenance logs or thermal images, unlocked via text mining and pattern recognition software) are analyzed using artificial intelligence and machine-learning algorithms, which construct their own models and refine them continuously based on subsequent data inputs, thereby improving their predictive power. Data from all of those sources must be integrated and transformed into a uniform view on a suitable platform, where it can be coordinated, analyzed, and used to continuously improve the system’s ability to predict maintenance needs.
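A compressed, purely illustrative sketch of that pipeline, assuming the joined feature table and failure labels already exist; the feature names and the scikit-learn model choice are assumptions, not details from the article:

```python
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

# Hypothetical uniform view: one row per machine, built by joining sensor aggregates,
# maintenance/failure history, machine features, and environmental data.
features = pd.DataFrame({
    "avg_vibration": [0.9, 1.1, 1.0, 1.6, 1.7, 0.8, 1.5, 1.0],
    "machine_age_years": [2, 7, 3, 9, 10, 1, 8, 4],
    "days_since_maintenance": [30, 120, 45, 200, 210, 10, 180, 60],
    "ambient_humidity_pct": [55, 70, 60, 85, 80, 50, 75, 65],
})
failed_within_30_days = [0, 0, 0, 1, 1, 0, 1, 0]

X_train, X_test, y_train, y_test = train_test_split(
    features, failed_within_30_days, test_size=0.25, random_state=0
)

# The model would be retrained as new labeled data arrives, refining its predictions over time.
model = GradientBoostingClassifier(random_state=0).fit(X_train, y_train)
print("Holdout accuracy:", model.score(X_test, y_test))
```

The point of the sketch is the dependency it exposes: every upstream feed that populates the feature table has to meet the same completeness, consistency, accuracy, and timeliness bar, or the model’s predictions inherit the weakness.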

As Industry 4.0 technologies become more ingrained across industrial products companies’ operations, risk and audit professionals face the challenge of adapting the tools, techniques, and approaches they use to provide assurance around financial statement accounts so that they can provide operational assurance over data and algorithms – the lifeblood of the digital factory. Industry 4.0 data may not sit within the company’s enterprise resource planning system or be accounted for in financial statements, but it is becoming an ever more critical driver of the operational decision making that keeps a business running at maximum efficiency and effectiveness. That’s why – in the digital factory – reliable data really is the bottom line.

©2018 PwC. All rights reserved. PwC refers to the US member firm or one of its subsidiaries or affiliates, and may sometimes refer to the PwC network. Each member firm is a separate legal entity. Please see www.pwc.com/structure for further details. This content is for general information purposes only, and should not be used as a substitute for consultation with professional advisors.
