Neil Ballinger, head of EMEA at automation parts supplier EU Automation, offers an overview of the pros and cons of predictive and preventive maintenance strategies.

According to a new report published by maintenance specialist Senseye, manufacturers experience an average of 27 hours of downtime a month because of equipment failure, resulting in multi-million losses in revenue by the end of the year.
With manufacturers striving to cope with rising competition, rapidly changing consumer trends and unprecedented pressure to deliver high-quality products quickly, these losses can seriously compromise businesses’ bottom lines.
Historical vs real-time data
Preventive — or preventative — and predictive maintenance are often used interchangeably to refer to maintenance strategies that allow manufacturers to act before equipment fails. Both methods are vastly superior to reactive maintenance, where equipment is simply run until it fails and emergency repairs are needed.
However, preventive and predictive are not the same thing. Preventive maintenance involves carrying out checks at regular intervals, regardless of the equipment’s condition. It relies on best practice guidelines and historical data to give plant managers the best chance of keeping machines in good repair, but it requires cyclical planned downtime.
On the other hand, predictive maintenance occurs only when needed, relying on real-time data from IIoT-connected equipment to identify potential threats before it’s too late. In this way, repairs address an actual problem and are more targeted, meaning that downtime, when required, is generally shorter than with other maintenance methods.
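To make the contrast concrete, the minimal sketch below compares the two triggers. The 90-day service interval, the vibration alarm level and the sample readings are illustrative assumptions for the example, not figures from the article.

```python
from datetime import datetime, timedelta

# Minimal sketch: a fixed-interval (preventive) trigger versus a
# condition-based (predictive) trigger. Interval, threshold and readings
# are hypothetical values chosen for illustration.

PREVENTIVE_INTERVAL = timedelta(days=90)   # service every 90 days, regardless of condition
VIBRATION_ALARM_MM_S = 7.1                 # assumed vibration alarm level (mm/s RMS)

def preventive_due(last_service: datetime, now: datetime) -> bool:
    """Schedule-based: maintenance is due once the interval has elapsed."""
    return now - last_service >= PREVENTIVE_INTERVAL

def predictive_due(latest_vibration_mm_s: float) -> bool:
    """Condition-based: maintenance is triggered only when live data crosses a threshold."""
    return latest_vibration_mm_s >= VIBRATION_ALARM_MM_S

if __name__ == "__main__":
    now = datetime(2022, 6, 1)
    last_service = datetime(2022, 3, 1)
    latest_reading = 3.4  # mm/s, well below the alarm level

    print("Preventive check due:", preventive_due(last_service, now))  # True: interval has elapsed
    print("Predictive check due:", predictive_due(latest_reading))     # False: machine still healthy
```

The schedule-based check fires whether or not the machine needs attention, while the condition-based check fires only when the live reading says it does.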
The problem with big data
Predictive maintenance has clear advantages, but it relies on technologies for continuously monitoring equipment and gathering valuable data — an approach also known as condition-based maintenance. While the devices that collect the data can be relatively inexpensive and easy to set up, processing that data to extract relevant insights into machine health is the challenging part.
IBM estimates that about 90 per cent of all data generated by sensors never gets used. This means that manufacturers miss opportunities to make informed decisions about their equipment, while still paying to collect and store unused data. Data that is collected but never processed or used in any way is known as dark data, and it represents a huge challenge for the industry.
Another issue with big data is the presence of data silos, where data is processed and relevant patterns are discovered, but the resulting insights are not shared among the different departments of an organisation. This can happen because the business doesn’t have the necessary technology in place for data visibility: for example, it might lack a unified integrated data management (IDM) tool, leaving every team to rely on a different platform.
For example, data suggesting that an electric motor is overheating might be available, but if the C-suite don’t share it with the maintenance team on the production floor in a timely manner, the motor might be doomed to failure.
Technology that helps
Having sensors to gather significant information on your equipment is the first step towards incorporating predictive techniques into your maintenance strategy. Newer machines normally come with a variety of options for real-time data acquisition, but legacy equipment can also be retrofitted with inexpensive add-on sensors. In fact, predictive maintenance can be a huge help when dealing with ageing assets, which require careful planning when sourcing obsolete spare parts.
However, the sheer amount of dark data in the industry, coupled with the pervasive issue of data silos, shows that gathering data is not enough. To predict equipment failure effectively, manufacturers should implement technologies that facilitate real-time data processing and that allow all relevant personnel to access the resulting insights.
In this sense, edge computing can be a valuable solution. One of the problems with data from industrial equipment is that the older it gets, the less relevant and accurate it becomes. By analysing data as close as possible to the source, rather than sending it all to the cloud, edge computing minimises latency and supports real-time decision making.
Edge computing can also help deal with another issue that comes with connecting more equipment to the IIoT — cybersecurity. When data travels back and forth from the cloud, there is an increased risk that it might be compromised. Processing data closer to the source reduces this risk, offering the advantages of increased digitalisation without opening up more potential attack surfaces. This doesn’t mean that processing or storing data in the cloud should be avoided at all costs, but simply that the two options can go hand in hand to maximise results.
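As an illustration of what edge-side processing might look like, the sketch below reduces a window of raw temperature samples to a compact summary before anything would be sent on to the cloud. The temperature limit, sample values and payload fields are hypothetical assumptions, not part of any particular product or platform.

```python
import statistics

# Minimal sketch of edge-side pre-processing for one asset. Raw samples stay
# on the edge device; only a small summary (and any alert flag) would be
# forwarded to the cloud. All values here are assumed for illustration.

TEMP_ALERT_C = 85.0  # illustrative over-temperature limit

def summarise_window(samples_c: list[float]) -> dict:
    """Reduce a window of raw temperature samples to a compact summary payload."""
    return {
        "count": len(samples_c),
        "mean_c": round(statistics.mean(samples_c), 2),
        "max_c": max(samples_c),
        "alert": max(samples_c) >= TEMP_ALERT_C,
    }

if __name__ == "__main__":
    window = [71.2, 72.0, 74.8, 86.3, 79.5]  # hypothetical one-minute window of readings
    print(summarise_window(window))  # {'count': 5, 'mean_c': 76.76, 'max_c': 86.3, 'alert': True}
```

Only the summary leaves the device, so less data travels over the network and decisions on the alert can be made locally, with minimal latency.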
Another priority should be the convergence of information technology (IT) and operational technology (OT). These used to be managed by separate teams with different skillsets, but the increasing digitalisation of manufacturing processes, including maintenance, means that the two areas now need to be brought together. OT collects raw data from PLCs, motors, sensors and other key equipment, while IT gives the data meaning by uncovering relevant patterns. However, for this to work, both equipment and teams must communicate and collaborate proactively.
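The sketch below illustrates that handoff in miniature, with an in-process queue standing in for a real message broker or plant historian; the tag names and readings are hypothetical. The OT side publishes raw values, and the IT side consumes them to flag a simple pattern, in this case a steadily rising motor temperature.

```python
from queue import Queue

# Minimal sketch of an OT-to-IT handoff. A shared in-process queue stands in
# for a real broker or historian; tag names and values are hypothetical.

def ot_publish(q: Queue, readings: list[tuple[str, float]]) -> None:
    """OT side: push raw (tag, value) readings from PLCs and sensors."""
    for tag, value in readings:
        q.put((tag, value))

def it_detect_trend(q: Queue) -> list[str]:
    """IT side: drain the queue and flag tags whose values are strictly rising."""
    history: dict[str, list[float]] = {}
    while not q.empty():
        tag, value = q.get()
        history.setdefault(tag, []).append(value)
    return [tag for tag, vals in history.items()
            if len(vals) > 2 and all(b > a for a, b in zip(vals, vals[1:]))]

if __name__ == "__main__":
    q: Queue = Queue()
    ot_publish(q, [("motor1.temp", 61.0), ("motor1.temp", 64.5), ("motor1.temp", 69.2),
                   ("pump3.flow", 12.1), ("pump3.flow", 11.8)])
    print("Tags trending upwards:", it_detect_trend(q))  # ['motor1.temp']
```

Neither side is useful on its own: the raw readings mean little until the analysis runs, and the analysis has nothing to work on unless the operational data actually reaches it.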
Implementing technology for data acquisition and processing can require a sizeable initial investment and, most importantly, a radical cultural shift in manufacturing plants. However, the results in terms of reduced downtime and increased efficiency will give manufacturers the competitive edge they need to prosper in an increasingly digitalised world.