Effective Predictive Maintenance needs strategic automation and human insight

New-age technologies like Artificial Intelligence (AI), Machine Learning (ML), the Internet of Things (IoT), and predictive analytics are automating processes and augmenting human capabilities. Together, they set the stage for innovation across sectors. Manufacturing is leveraging Predictive Maintenance (PdM), which takes preventive maintenance several notches higher.

PdM changes the approach from reactive to proactive maintenance, empowering enterprises to anticipate changes in the system and preemptively manage them. In other words, it helps enterprises predict and avoid machine failure and resultant downtimes. These analytics-led predictions optimize maintenance efforts and facilitate frictionless interdependence.

According to Deloitte, PdM increases equipment uptime by 10-20% and reduces overall maintenance costs by 5-10% and maintenance planning time by 20-50%. With a CAGR of 25.2%, the global predictive maintenance market is set to grow from USD 4.0 billion in 2020 to USD 12.3 billion by 2025, fueled by the continued demand for reducing maintenance costs and downtime.
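
As a quick sanity check on that projection: compounding USD 4.0 billion at 25.2% a year for five years does land at roughly USD 12.3 billion. A minimal sketch, using only the figures from the market report cited above:

```python
# CAGR sanity check: future = present * (1 + rate) ** years
present = 4.0   # USD billion, 2020 (from the market report)
cagr = 0.252    # 25.2% per year
years = 5       # 2020 -> 2025

future = present * (1 + cagr) ** years
print(f"Projected 2025 market size: USD {future:.1f} billion")  # -> USD 12.3 billion
```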

In the current Industry 5.0 environment, the role of maintenance has evolved from merely preventing downtimes of individual assets to predicting failures and creating synchrony between people, processes, and technologies. Predictive maintenance plays its part well, though it does bring along certain challenges that necessitate human intervention.

The PdM advantage

As mentioned earlier, predictive maintenance helps eliminate unplanned downtime and related costs. In an IoT-driven world where sensors, devices, and systems are connected, McKinsey believes the linking of the physical and digital worlds could generate up to $11.1 trillion a year in economic value by 2025.

Maximized runtime also means better profits, happier customers, and greater trust. Predictive maintenance can ease logistics by choosing maintenance slots outside production hours or when maintenance personnel are available. It contributes to supply chain resilience, material cost savings, and increased machine lifespan.

However, PdM is only as good as the data it relies upon. In an IoT setup, data comes from many different sources and must be duly analyzed before it can be harnessed to make predictions.

The PdM limitations

We need to consider several elements to translate the information PdM provides into positive outcomes. For instance, depending on usage and maintenance history, it may advise you to replace a certain part or component. But this information can lead to further questions: you may need help deciding which brand and vendor to consider, whether replacing the component is a good option, or whether it would make better sense to replace the equipment entirely.

The forecast is often prescriptive and based on statistical models. While optimizing the operational efficiency of a particular line of business, PdM often fails to consider how it impacts other lines. For instance, when it suggests a particular piece of equipment is due for maintenance, it may not be able to advise where production or processing should be shifted while that equipment is down. The value it offers will therefore be shaped by how decision-makers respond to predictive data.

Data quality and coverage are critical to making predictive maintenance work for the organization. For data to be suitably collected, integrated, interpreted, and transformed, we need dashboards, notification systems, and other supporting infrastructure. Considerable research and planning must go into implementation before PdM starts providing the insights we need.

The key lies in the way you respond

Decision-makers typically respond to predictive data with either hypothesis-driven or data-driven responses. The former stems from past business experiences and determines the plan based on a limited scope of response actions. Data-driven responses, on the other hand, aim to find solutions based on real-time business realities and consider several optimization scenarios to determine the way forward.

In contrast to hypothesis-driven decision-making, optimization ensures that all possible paths are explored and evaluated, relevant constraints are taken into consideration, and cross-functional interdependencies are examined. The result is a workable scenario grounded in business realities, with no room for purely intuitive responses.
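
To make the contrast concrete, here is a minimal sketch of what an optimization-style, data-driven response could look like: enumerate candidate maintenance windows, filter by constraints, and pick the cheapest feasible one. All slot names, costs, and constraints are hypothetical, invented purely for illustration:

```python
# Hypothetical scenario evaluation: pick the cheapest feasible maintenance window.
# Slots, costs, and constraints are illustrative, not from any real system.
candidate_windows = [
    {"slot": "Mon 02:00-06:00", "in_production_hours": False, "crew_available": True,  "downtime_cost": 1_000},
    {"slot": "Tue 10:00-14:00", "in_production_hours": True,  "crew_available": True,  "downtime_cost": 9_000},
    {"slot": "Sat 08:00-12:00", "in_production_hours": False, "crew_available": False, "downtime_cost": 1_500},
]

def feasible(window):
    # Constraints: avoid production hours and require an available maintenance crew.
    return not window["in_production_hours"] and window["crew_available"]

best = min((w for w in candidate_windows if feasible(w)), key=lambda w: w["downtime_cost"])
print(f"Schedule maintenance at {best['slot']} (estimated downtime cost: {best['downtime_cost']})")
```

A hypothesis-driven response, by contrast, would jump straight to a familiar slot without evaluating the alternatives.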

Despite analytics-driven insights, predictive maintenance is incomplete without human judgment. Smart decisions come from the ability to visualize the physical and financial outcomes before enforcing them. High-risk situations can arise, and these are best left to human discretion.

A predictive maintenance model for Industry 5.0

Manufacturers need clarity on several variables to understand the implications of failure. A false alarm triggered by an inaccurate prediction can cause unwarranted chaos and anxiety, while a missed detection can prove a costlier error, sometimes resulting in loss of life and property. Manufacturers therefore first need to know how often such variable behaviors occur on the factory floor. Strong domain knowledge, along with solid data on previous failures and scenarios, is the key to understanding a machine.

Prediction accuracy improves when we have adequate data on the behavior of machines as they approach failure. Only skilled personnel can determine this, and some data sets, though hard to collect, are critical for decision-making.

If we need data on a machine that breaks down just once in a year or two, we need to work closely with machine makers, who already possess a large pool of relevant data. Alternatively, we can build a digital or simulation model to generate relevant data sets. The most expensive failures are usually the ones we never expect, so testing for different scenarios should also be considered.
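
Where a real failure is too rare to observe, a simulation model can generate synthetic run-to-failure data. The sketch below uses a deliberately crude degradation model (linear wear plus noise against an arbitrary failure threshold); a real digital model would be calibrated against the machine maker's data:

```python
import random

# Crude synthetic run-to-failure trace: linear wear plus noise.
# Wear rate, noise level, and failure threshold are arbitrary illustrative values.
def simulate_run_to_failure(wear_rate=0.01, noise=0.05, threshold=1.0, seed=42):
    rng = random.Random(seed)
    health, trace = 0.0, []
    while health < threshold:
        health += wear_rate * (1 + rng.gauss(0, noise))
        trace.append(round(health, 4))
    return trace  # sensor-like degradation readings up to the point of failure

trace = simulate_run_to_failure()
print(f"Simulated failure after {len(trace)} cycles; last readings: {trace[-3:]}")
```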

Looking ahead

The way forward into Industry 5.0 is to create a predictive model that uses analytics, machine learning, and Artificial Intelligence (AI) in conjunction with human insights.

Manufacturers are now relying on predictive models to facilitate smart manufacturing, as they struggle with quality issues more often than machine failures. Unusual temperatures and random vibrations are telltale signs that a machine may be in dire need of maintenance. Simple data sets can be a good starting point as we scale up with the right predictive maintenance solution. But, in the end, it's human insight that gives predictive maintenance its winning streak.
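
As an example of starting with simple data sets, even a rolling z-score over temperature or vibration readings can surface the telltale signs mentioned above. A minimal sketch, with invented readings and an arbitrary threshold:

```python
from statistics import mean, stdev

# Flag readings that deviate sharply from the recent rolling baseline.
def flag_anomalies(readings, window=5, z_threshold=3.0):
    anomalies = []
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(readings[i] - mu) / sigma > z_threshold:
            anomalies.append((i, readings[i]))
    return anomalies

# Hypothetical bearing temperatures (deg C); the spike at the end is the telltale sign.
temps = [70.1, 70.3, 69.9, 70.2, 70.0, 70.1, 70.2, 70.0, 78.5]
print(flag_anomalies(temps))  # -> [(8, 78.5)]
```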

Predict business success with Trigent

At Trigent, we are helping organizations benefit from Industry 5.0. We help them build value with predictive analytics and rise above maintenance challenges. With the right guidance, we help them foster man-machine symbiosis to harness new levels of operational efficiency.

Call us today for a consultation. We’d be happy to help with insights, solutions, and the right approach to predict better business outcomes.

Why Advanced Analytics is the Future of Healthcare Organizations

Research and Markets reports that the global market for advanced analytics totaled $207.4 billion in 2015 and should reach nearly $219.3 billion by 2020, a five-year compound annual growth rate (CAGR) of 1.1%. According to the firm, the advanced analytics market comprises applications for the following industries: banking and financial services, telecommunications and IT, healthcare, government and defense, transportation and logistics, and consumer goods and retail.

Focusing on the healthcare industry, its biggest problem today is the need to provide value to patients while remaining cost-effective and competitive. Providers need to move from volume-based to value-based services by offering more for less and becoming more patient-centric. But how is this possible when medical professionals are often overworked due to staff shortages, and when complex illnesses, longevity, and lack of knowledge upset the industry's equilibrium?

Superimpose on this the Internet era, where patients have more access to information and higher expectations of their healthcare providers. They demand more accountability from doctors, nurses, and even their health plans, and you begin to grasp the magnitude of the industry's woes.

Even if healthcare organizations manage all these problems, they still need ways to differentiate themselves from the competition to attract and retain people. Where are the resources, the time, and the people to achieve all this in a fast-moving scenario?


Maybe, then, analytics can be a solution: it provides better insights into treatments and technologies, helps improve efficiency, reduces risk, and offers a means to gather and decipher critical data to deliver better services.

Seeing the potential of information technology, the industry has witnessed a proliferation of clinical research systems, electronic health records, and devices over the last five years or so. These systems have produced an abundance of data, but the result is more clutter than intelligence. Sifting through this information to find real value is an added dilemma for health organizations. Already overworked and understaffed, they find data daunting rather than determining.

Luckily, the trend is already changing: analytics in healthcare is paving the way for predictive intelligence, where healthcare organizations can use data to make intelligent predictions.

Healthcare analytics is not a destination but a journey that is never completed. Retrospective analytics is the most common form: a hospital looks at its records to see how many patients were admitted, the causes of admission, and so on. Predictive analysis takes this data and looks for common trends to forecast the future; optimizing the results to save costs and provide greater value to patients completes the cycle.
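
To make the retrospective-to-predictive step concrete, here is a minimal sketch that fits a classifier on invented historical admission records to estimate readmission risk. The features, data, and choice of model are all illustrative assumptions, not a production approach:

```python
from sklearn.linear_model import LogisticRegression

# Hypothetical historical records: [age, prior_admissions, days_in_hospital]
X = [
    [45, 0, 2], [67, 3, 8], [52, 1, 3], [74, 4, 10],
    [39, 0, 1], [81, 5, 12], [58, 2, 5], [49, 1, 2],
]
y = [0, 1, 0, 1, 0, 1, 1, 0]  # 1 = readmitted within 30 days (invented labels)

model = LogisticRegression().fit(X, y)

# Estimate readmission risk for a new (hypothetical) patient.
new_patient = [[70, 3, 9]]
risk = model.predict_proba(new_patient)[0][1]
print(f"Estimated 30-day readmission risk: {risk:.0%}")
```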

Advanced analytics requires the help of a software company with deep domain knowledge. While most healthcare companies, fearing security breaches and fraud, may believe that managing data is an in-house task, the fact remains that converting data into intelligence requires in-depth technical and domain knowledge.

Infographic on Business Intelligence for Manufacturers

“A picture is worth a thousand words” – I would have been lucky had that struck me before drafting a 1,000+ word post. But then I realized: why not convey it through an infographic? As they say, an enlightening idea arrives only after you have put in all your effort, and I seem to have experienced a similar fate. Anyway, all’s well that ends well. So, here’s my infographic depicting a typical scenario in manufacturing.

If you are among those readers planning BI, particularly in the manufacturing sector on the Microsoft stack, this infographic can help you make a sound choice.

BI infographic for manufacturing companies

Excel vs BI Tools – Towards a Data-Driven Culture!

The Great BI vs Excel Debate

As business intelligence vendors slog it out to flex their muscles on the enterprise stage, with experts talking at length about BI, one thing often left out is the real protein, aka Excel, from which the BI concept was ripped and which now flaunts its curves on enterprise dashboards.

One cannot deny that the DNA of “business analytics”, the core of business intelligence, remains deeply entrenched in the analytical capabilities of Excel, which has long been, and still is, the mainstay of enterprises’ analytics needs. At present, however, the real power of any tool or application is judged by its ease of use and intuitive features, which help even non-tech-savvy people get relevant insights without relying too much on IT staff.

So, let’s uncover the areas where Excel needs to flex its muscles in its run-up to the business intelligence race.

Types of Users and Familiarity

Proponents of Excel are the power users who can perform almost every Excel analytical stunt. But for less tech-savvy frontline executives and other personnel, mastering Excel is a tough pursuit. Here’s where modern BI tools make it easy for even non-techie employees to slice and dice data and do data mash-ups with intuitive visualization features.

Timely reconfiguration and processing

One area where BI tools score over Excel is real-time data, which is outside Excel’s purview. The ability to connect directly to databases, backed by heavy under-the-hood plumbing, makes real-time monitoring easy and actionable. Updating and reconfiguring data in Excel also consumes a lot of time, and Excel sometimes runs out of memory, making it difficult to work with. Downloading data into Excel sheets and performing tons of manipulation to extract relevant insights from disparate applications is a daunting task.
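
To illustrate the direct-to-database point, here is a minimal sketch that computes a grouped KPI with a single query where the data lives, instead of downloading rows into a spreadsheet first. It uses SQLite and an invented orders table purely for illustration:

```python
import sqlite3

# Invented demo data: in a real BI setup this would be a live production database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("North", 1200.0), ("South", 800.0), ("North", 450.0)])

# The KPI is computed where the data lives; no manual download/manipulation step.
for region, total in conn.execute(
        "SELECT region, SUM(amount) FROM orders GROUP BY region"):
    print(f"{region}: {total:.2f}")
```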

Visualization and Web access

Modern BI tools provide rich interactive data visualization and web access to analytics, both of which remain a far cry for Excel.

Error Probability

Mismanagement of data, corrupt files, and inadequate security make Excel more vulnerable to errors. Even power users who build their own macros run the risk of hidden errors. BI tools integrated with databases help with master data management and, being rigorously tested, come as a saving grace for many users.

Complex Decision Making

When complex decision-making requires a manager to access information from applications like SAP, CRM, and accounting systems, downloading it all to Excel sheets and performing the analysis becomes a complex procedure. BI tools, on the other hand, are far more sophisticated and can be easily integrated with cross-functional applications to provide meaningful insights without banging heads against the Excel walls.

Complex Marketing Research and Statistical Analysis

For complex research and statistical analysis, Excel fails to leave a mark compared to the likes of SPSS and SAS. These tools provide domain-driven data mash-up capabilities and are far easier to use than Excel. Managing unstructured data is also difficult in Excel.

Expense

BI tools are more expensive than Excel; this is one area where Excel holds sway over BI tools. For smaller organizations, Excel can solve many of these BI challenges cost-effectively. But in big organizations, Excel doesn’t stand a chance against mighty BI providers like Cognos, QlikView, Tableau, and the like.

Size of data

Though Microsoft has made significant changes to Excel to let power users manage large amounts of data, Excel still falls short when handling very large data sets. This is a major problem compared with BI tools, which come with the storage capacity to handle large data sets.

Computed Measures, KPIs, Etc.

Though Excel can perform complex analysis, when it comes to measuring KPIs across disparate data sources, BI tools are miles ahead.

Compliance

BI tools offer better compliance management capabilities than Excel. They provide more reliable data for auditing against various standards, with the quantification of process and information flow streamlined and dependable.

Operational BI at Banks

Embedding operational business intelligence (BI) tools into banking operations can help operational managers get actionable insights on operational bottlenecks, historical data, and more, for managing, monitoring, and controlling contingencies.

With actionable, real-time insights into operational bottlenecks, managers can take corrective measures to eliminate operational flaws. Let’s consider the simple example of a cheque clearance process at banks and see how operational BI can deliver valuable insights to fix functional inefficiencies.

Operational BI for banking and financial operations

‘A’ deposits a cheque in favor of ‘B’s account on day one. B’s bank processes the cheque, which includes validating its credentials, and passes it on to A’s bank. Validation involves a series of steps: checking for fraud, confirming sufficient funds in A’s account, verifying signatures and credentials, and catching other issues that may obstruct the fund transfer. The settlement process ends when A’s bank clears the payment to B’s bank.

The cheque clearance process might seem simple at first sight, but it involves several intricacies. First, there are many instances of fraud and tampering, such as counterfeits, forgeries, or alterations. Second, there are unpaid instances, such as insufficient funds in the customer’s account, or fields entered incorrectly, such as wrong dates or signatures. These cases would be easy to manage with a small number of transactions, but consider how difficult it becomes when millions of cheques are deposited and cleared every day. It is also difficult to answer customers who ask why their cheque bounced, or who have other queries about their settlement process.

Where does operational BI fit in?

Since the settlement process is complex and takes days to clear a cheque, wouldn’t it be nice if fraud could be detected in real time to reduce operational risk? Here’s how: business rules can be derived by analyzing the history of frauds and applied during daily operations to flag potential fraud and reduce its incidence. Operational BI can also help customers get real-time answers to their queries, reducing reaction time for any issue, and it can be integrated with functional processes and applications for actionable insights. In the cheque clearing process, operational BI can tell managers how many cases of fraud appeared per day, cases of insufficient funds, cases of halts, or the number of cheques that were forged or tampered with.
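
Here is a minimal sketch of the business-rules idea: each rule encodes a pattern drawn from fraud history, and incoming cheques are flagged when a rule fires. The cheque fields and rules below are invented for illustration:

```python
# Hypothetical cheque records and business rules derived from fraud history.
cheques = [
    {"id": 101, "amount": 9_500, "account_balance": 12_000, "signature_match": True,  "altered_fields": False},
    {"id": 102, "amount": 4_000, "account_balance": 1_200,  "signature_match": True,  "altered_fields": False},
    {"id": 103, "amount": 7_800, "account_balance": 20_000, "signature_match": False, "altered_fields": True},
]

rules = {
    "insufficient_funds": lambda c: c["amount"] > c["account_balance"],
    "signature_mismatch": lambda c: not c["signature_match"],
    "possible_tampering": lambda c: c["altered_fields"],
}

# Flag each cheque with every rule that fires, for the operations dashboard.
for cheque in cheques:
    fired = [name for name, rule in rules.items() if rule(cheque)]
    if fired:
        print(f"Cheque {cheque['id']} flagged: {', '.join(fired)}")
```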

Interesting Video by Logi Analytics Themed Around “The BI Chocolate Cake Problem”

Here is a very interesting video by Logi Analytics. It metaphorically depicts enterprise end-users as kids and toddlers demanding a variety of chocolate cakes (figuratively, data reports) and their increasing reliance on IT staff to get personalized reports on time. It also captures the IT staff’s pathos: flooded with demands for a variety of cakes (aka reports), they are unable to deliver reports on time on top of their routine workday jobs.

So watch it for yourself before I spill the beans in this piece.

Click on the image to play

‘The BI Chocolate Cake Problem’