
Maintaining Control In Uncertain Markets

Today's stark red headlines show that maintaining control during uncertain markets is now a mission-critical capability for every organization. But where does this begin? How do you move from a wall of red uncertainty to proactive, protected operations? To start, there are two dimensions to consider: run-time control and change control. For run-time control, operational resilience is key, a fact recognized by many financial services regulators. Integrated supply chains, fragmented processes, and a heavy reliance on technology and legacy infrastructure mean that resilience is not easy to achieve, hence the regulatory interest. From a change control perspective, the ability to design and implement fast, flexible processes that can be adapted quickly has been the goal of many organizations' digital transformations – transformations that have been underway in large banks for over a decade, with several more years still to go.

Trend-based Change of the Past

But the turbulence of the environment we are operating in today is very different from the dynamic environment we have known for the last 20 years. While the first two decades of this century brought significant change, the known trends followed relatively identifiable trajectories with relatively consistent patterns: interconnected global supply chains, the role of big data and technology in general, the growing economic and geopolitical importance of China. The COVID-19 pandemic presented significant challenges and exposed a number of weaknesses in our operational resilience, but by and large, organizations were able to adapt.

Change is No Longer Predictable

Today, however, the market and trade turbulence we are experiencing is upending many of the norms that made the previous trajectory of change relatively predictable. When the rules change on a daily basis, running an Agile change program is not enough: the requirements can change several times in the course of a 2–4-week sprint. What is required is the ability to assess the impact of any proposed change rapidly, propose options, and then execute them – within minutes or hours, not days or weeks. In many cases the change may be a band-aid solution, in place until the dust settles, so the ability to compare “before” and “after” and to assess the impact of a permanent fix becomes a critical aspect of change management.
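The “before” and “after” comparison described above can be computed directly from process event data. A minimal sketch, using a hypothetical event log of (case, activity, timestamp) records and an assumed go-live date for the change – all names and values here are illustrative, not from any particular system:

```python
from datetime import datetime
from statistics import median

# Hypothetical event log: (case_id, activity, ISO timestamp) tuples,
# as typically exported from a workflow system. Values are illustrative.
event_log = [
    ("c1", "Receive", "2025-04-01T09:00"), ("c1", "Approve", "2025-04-01T15:00"),
    ("c2", "Receive", "2025-04-02T09:00"), ("c2", "Approve", "2025-04-03T09:00"),
    ("c3", "Receive", "2025-04-10T09:00"), ("c3", "Approve", "2025-04-10T11:00"),
    ("c4", "Receive", "2025-04-11T09:00"), ("c4", "Approve", "2025-04-11T12:00"),
]

# Assumed date the band-aid change went live.
CHANGE_DATE = datetime.fromisoformat("2025-04-05T00:00")

def cycle_times(log):
    """Return (case start, hours from first to last event) per case."""
    cases = {}
    for case, _, ts in log:
        t = datetime.fromisoformat(ts)
        lo, hi = cases.get(case, (t, t))
        cases[case] = (min(lo, t), max(hi, t))
    return [(lo, (hi - lo).total_seconds() / 3600) for lo, hi in cases.values()]

# Split cases by whether they started before or after the change.
before = [h for start, h in cycle_times(event_log) if start < CHANGE_DATE]
after = [h for start, h in cycle_times(event_log) if start >= CHANGE_DATE]
print(f"median cycle time: before={median(before):.1f}h, after={median(after):.1f}h")
```

The same split-and-compare pattern extends to any per-case metric – rework counts, handoffs, control breaches – which is what makes the before/after evaluation of a permanent fix tractable at speed.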

Execution Risk has Risen Exponentially

Execution risk has risen exponentially. Changing mature, established processes this rapidly and this frequently tends, by default, to concentrate attention on the “happy path”. But it is the outliers that tend to create the risk, and it is the outliers that are forgotten when using traditional process change tools.

Managing execution risk during turbulent times requires a different kind of toolset: one predicated on analysing all relevant event data from a process in a way that is visual, fast, and easy for users to interpret. It also requires a dynamic “what if” capability that can be applied in near real time and that is linked to the process’s risks and controls, so that the true impact on the customer, the shareholder, and the regulator can be evaluated quickly. Moreover, it requires precision when planning the change. Too often a change is described at a high level of abstraction, and then, over a period of days and weeks, analysts drill down to specify the requirements more precisely. In a turbulent environment there isn’t time to gradually peel back the layers of the requirements onion; precision is needed almost immediately to maintain control.

Process Intelligence Is The Answer

Process intelligence applications provide the suite of tools necessary to navigate turbulent times. Their ability to discover a process from millions of event records within minutes, and to view it at different levels of abstraction, means that the outliers are visible and the analysis can zoom in on precisely the area under review, knowing that the description of the process is based on the most recent data. Mapping risks and controls to the process helps to quickly pinpoint the key vulnerabilities and threats that may arise from the changed environment. Simulating the process with parameters based on real process data, not aggregate data, means that a wide range of options can be considered quickly and their impact on cost, customer experience, and risk can be evaluated holistically.
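The difference between simulating from real data and simulating from averages can be sketched with a small Monte Carlo what-if. The example below resamples each step's duration from observed values (so outliers stay in the picture) and compares a baseline against a hypothetical scenario; the activity names, durations, and the "second reviewer" scenario are all illustrative assumptions, not Apromore's implementation:

```python
import random
from statistics import mean, quantiles

random.seed(42)  # fixed seed so the illustration is reproducible

# Hypothetical per-activity processing times in minutes, taken straight
# from event data rather than collapsed into one aggregate average.
observed = {
    "triage":  [5, 6, 5, 40, 7, 6],       # one outlier: a 40-minute case
    "review":  [20, 25, 18, 22, 90, 21],  # heavy tail from escalations
    "approve": [3, 4, 3, 5, 4, 3],
}

def simulate(times_by_activity, runs=10_000):
    """Monte Carlo what-if: resample each step's duration from real data."""
    return [
        sum(random.choice(durations) for durations in times_by_activity.values())
        for _ in range(runs)
    ]

baseline = simulate(observed)
# What-if scenario: an extra reviewer halves review time for every case.
scenario = simulate({**observed, "review": [t / 2 for t in observed["review"]]})

for name, totals in [("baseline", baseline), ("faster review", scenario)]:
    p95 = quantiles(totals, n=20)[-1]  # 95th percentile, where outliers live
    print(f"{name}: mean={mean(totals):.1f} min, p95={p95:.1f} min")
```

Because durations are resampled rather than averaged, the tail of the distribution (the p95) shifts realistically under each scenario – which is exactly the outlier-level view that average-based planning misses.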

Doing all this in near real time, with key stakeholders and decision makers in the same room, working with a “no code” solution, is one pathway to maintaining control during turbulent times.

Take the first steps towards operational excellence!


Ready to thrive, not just survive, the market volatility? Request a demo below and subscribe to our newsletter, where we explore case studies, industry trends, and best practices to keep you ahead of the chaos.



Nigel Adams
Senior Advisor at Apromore
Nigel is a thought leader in service operations excellence, with deep experience in the banking sector. He has nearly 25 years of experience focused on creating enterprise value from operational improvement, risk management and performance optimization. Nigel is known for driving transformational change at pace while leading large teams in complex delivery networks. In addition to a consulting career at KPMG, he has brought his skills to bear for leading banks, including NAB and ANZ, focusing on global payments and cash operations, financial crime, and business performance.  


Sign up to receive the latest Apromore content