Rise of the Machines?

Thoughts on AI and Due Diligence

It’s not often that a throwaway comment gives you pause for thought. Yet that’s exactly what happened to me at a recent client event. I’d been talking on the sidelines with an in-house buyer of due-diligence services: someone who’s been using firms like ours for decades and who long ago stopped believing that membership of the right club somehow bestowed better access to information. No, this is a sophisticated buyer: they know what they want, what it should look like and what they should be paying for it.

The topic of conversation was turnaround times – the time it takes firms like ours to deliver the results of our enquiries back to our clients. And the buzzword seemed to be “compression.” Compressed internal deadlines, compressed deal fuses, compressed deal margins. All leading, of course, to a conclusion that due diligence turnaround times must be “compressed” too.

Now here’s the thing: as a sophisticated buyer, they know full well that basic online reports can be machine-generated just by hitting the Enter key and letting the computer do its thing. But they also know that tabling a report like that at a risk committee is a non-starter. So here was the dilemma: they wanted the quality of an in-depth report produced by a professional analyst, and they were happy to pay for the human overlay; they just wanted the delivery to be — using the buzzword of the day — compressed. And here’s where their throwaway line came in: “why not use ChatGPT to do the analysis?”

Not for a second do I believe they really meant it. OpenAI’s ChatGPT was only launched in November 2022 and, while now in its fourth iteration, it’s still essentially a nascent application – you need only ask it for a profile of your own name to see its current limitations. But the key word here is current. Less than a year after its global release, it’s already writing essays, authoring poetry and producing computer code. Better than a suitably talented human? No. But that’s not the point. The large language models that underpin it are trained on ever-greater amounts of text from across the internet, and its capabilities deepen accordingly. Just imagine what Version 12 will be capable of.

Global financial organizations can see the potential already, and they’re on a hiring spree to build out their own proprietary AI systems as soon as they can. Quoted in a recent Bloomberg article, Alexandra Mousavizadeh, Chief Executive Officer and co-founder of the consultancy Evident, likened the drive for talent to an “AI arms race.” Bloomberg went on to note: “the potential prize for businesses is that everyday tasks will be handled more efficiently and effectively while complex analysis and risk modeling are made easier and faster. That’s particularly tempting in banking, where reams of data underpin increasingly complex investment decisions, despite uncertainties about AI’s eventual capabilities and concerns about how to regulate it.”

AI’s biggest opportunity lies in its power to sift through massive amounts of data, distill it and present the information that matters. But how much is too much? Large language models are, by their nature, unwieldy beasts; and while machine learning algorithms are advancing at a rapid pace, those making the heaviest investments in AI are doing so with a much more defined use case in mind, one tailored specifically to their sector and the business outcomes they want to achieve. For most, that will inevitably include market efficiencies and, of course, headcount savings. There’s little doubt that in the not-too-distant future many of the white-collar roles we take for granted today will simply disappear, in the same way that weavers’ jobs did with the onset of the Industrial Revolution. Only this time there won’t be a factory to migrate to, because the machines inside will know how to operate themselves.

So how does all this bring us back to the topic of due diligence, and will our analysts go the same way as the weavers? Investing in AI is something we are all doing; the drive for ‘compression’ almost mandates it, and in any case most analysts are only too glad to hand over the task of wading through online registry data, litigation records and hundreds of websites in favor of adding the real value that a human brings to the process: their intuition.

Time and again, it’s this intuition that’s led to bullets being dodged. Not just the intuitive feeling that something looks strange or out of place, but the intuition to know that a line of enquiry is worth pursuing because the result is likely to be important to the client. To illustrate: a couple of months ago we were looking into a UK-based subject on behalf of a London client. We’d noticed from our initial scoping that this person seemed to be affiliated with a significant number of companies aside from their main business. While it’s quite normal for entrepreneurs to hold directorships in several companies at the same time, the sheer number in this case stuck out.

We’d pointed this out to the client at the outset and suggested that we take a close look at each of these affiliates, particularly for litigation. At first pass all seemed fine, except for one matter in which an affiliate company was named as a party in a dispute over land. From the judgment, it became clear that our subject had behaved in a less than reputable manner, to the extent that they were singled out for criticism by the presiding judge. The thing is, the court record had their name wrong. Not by much, but enough for them not to show up on a meta name search. If we hadn’t looked at the affiliates as closely as we did, it’s quite possible this important finding would have been missed. And why did we look? Analyst intuition that something was wrong.

So, in our space, until AI is capable of replicating those most fundamental human characteristics of curiosity and intuition, it will continue to play a highly valued supporting role: doing the heavy lifting in partnership with humans to help them achieve those compressed deadlines, but not instead of them.

Well, at least not yet.

Jul 27, 2023 | Thought Leadership