Institutional investors are buckling under the operational burden of processing hundreds of data streams from unstructured sources such as email, PDF documents, and spreadsheets. These formats bury employees in low-value ‘copy-paste’ workflows and prevent firms from capturing valuable data. This article demonstrates how machine learning (ML), paired with a better operational workflow, can enable firms to extract insights more quickly for informed decision-making and capture the full value of their data.
According to McKinsey, the average professional spends 28% of the workday reading and answering an average of 120 emails – on top of the 19% spent searching for and processing data. The issue is even more pronounced in information-intensive industries such as financial services, where valuable employees must also spend needless hours every day processing and synthesizing unstructured data. Transformational change, however, is finally on the horizon. Gartner research estimates that by 2022, one in five workers engaged in mostly non-routine tasks will rely on artificial intelligence (AI) to do their jobs. Embracing ML will be a necessity for the digital transformation demanded by both the market and the changing expectations of the workforce.
For institutional investors operating in an environment of ongoing volatility, tighter competition, and economic uncertainty, using ML to transform operations and back-office processes offers a unique opportunity. In fact, institutional investors can capture 15-30% efficiency gains by applying ML and intelligent process automation in operations (Boston Consulting Group, 2019), which in turn creates ‘operational alpha’ through improved customer service and agile, front-to-back process redesign.
Operationalizing machine learning workflows
ML has finally matured to the point where it can deliver on these promises. AI has been developing for decades, but the deep learning breakthroughs of the last decade have played a major role in the current AI boom. When it comes to understanding and processing unstructured data, deep learning solutions provide much higher levels of potential automation than traditional machine learning or rule-based solutions. Rapid advances in open source ML frameworks and tools – including natural language processing (NLP) and computer vision – have made ML solutions more widely available for data extraction.
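To make the contrast concrete, the sketch below shows what a rule-based extractor for an unstructured trade-confirmation email might look like. The field names, patterns, and sample text are illustrative assumptions, not a real system; the brittleness of these hand-written patterns is exactly what a learned NLP extraction model would replace.

```python
import re

def extract_trade_fields(text: str) -> dict:
    """Rule-based sketch: pull a few structured fields out of an
    unstructured trade-confirmation email. Patterns like these break
    whenever wording shifts, which is where a deep learning NLP model
    offers higher automation."""
    # Hypothetical field patterns -- real documents vary far more.
    patterns = {
        "isin": r"\b([A-Z]{2}[A-Z0-9]{9}\d)\b",
        "quantity": r"quantity[:\s]+([\d,]+)",
        "price": r"price[:\s]+\$?([\d.]+)",
    }
    fields = {}
    for name, pattern in patterns.items():
        match = re.search(pattern, text, re.IGNORECASE)
        fields[name] = match.group(1) if match else None
    return fields

email = "Confirming purchase. ISIN: US0378331005, quantity: 1,500 at price: $182.50"
print(extract_trade_fields(email))
# → {'isin': 'US0378331005', 'quantity': '1,500', 'price': '182.50'}
```

A learned model slots into the same workflow: it consumes the raw text and emits the same structured record, but generalizes across document layouts instead of relying on fixed patterns.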
However, the first wave of ML investments frequently disappointed, with many early adopters yet to reap the rewards. A recent survey of more than 2,500 senior executives, conducted by MIT Sloan Management Review and BCG, highlights the challenges of deploying ML solutions into production. More than 7 in 10 executives found that ML had not delivered the expected business results, and 40% of organizations making significant investments in ML have yet to report business gains from ML.
The critical gap has been in planning how to operationalize ML for specific workflows. ML solutions should be designed collaboratively with business and process owners, and should target narrow, well-defined use cases that can successfully be put into production. To paraphrase BCG, successful ML deployments are 10% about algorithms, 20% about technology, and 70% about business application. ML used to automate manual workflows in operations must automate the task end-to-end, with a ‘human-in-the-loop’ design that routes lower-confidence exceptions to employees. This routing generates a critical feedback loop, so models are constantly improving. The user experience (UX) should give employees an intuitive way to accelerate their specific workflow, with transparency, visibility, and reporting at every step along the way.
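The human-in-the-loop routing described above can be sketched in a few lines. Everything here is an illustrative assumption – the threshold value, the queue structure, and the document names – but it shows the core mechanic: high-confidence predictions flow straight through, low-confidence ones go to a person, and the person's corrections become labeled examples for retraining.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

# Assumed threshold; in practice this is tuned per workflow against
# the cost of a wrong auto-decision versus the cost of human review.
CONFIDENCE_THRESHOLD = 0.90

@dataclass
class ReviewQueue:
    """Holds low-confidence predictions for human review; corrections
    are collected as labeled examples that feed model retraining."""
    pending: List[Tuple[str, str]] = field(default_factory=list)
    training_feedback: List[Tuple[str, str]] = field(default_factory=list)

    def submit_correction(self, doc: str, corrected_label: str) -> None:
        # The employee's fix closes the feedback loop for retraining.
        self.training_feedback.append((doc, corrected_label))

def route_prediction(doc: str, label: str, confidence: float,
                     queue: ReviewQueue) -> str:
    """Auto-accept confident predictions; route the rest to employees."""
    if confidence >= CONFIDENCE_THRESHOLD:
        return "auto-processed"
    queue.pending.append((doc, label))
    return "routed-to-human"

queue = ReviewQueue()
print(route_prediction("invoice_001.pdf", "invoice", 0.97, queue))
# → auto-processed
print(route_prediction("scan_042.pdf", "statement", 0.61, queue))
# → routed-to-human
queue.submit_correction("scan_042.pdf", "remittance-advice")
```

The design choice that matters is the second list: without capturing corrections as training data, the exception queue is just manual work in a new interface rather than a loop that shrinks over time.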