
AI is now powerful enough to automate the back office

But that doesn't mean it's as easy as flicking a switch

This article originally appeared in issue 28 of IT Pro 20/20.

The business world is fascinated with artificial intelligence (AI), with most of the excitement centred around customer-facing services. The technologies that bring a Tesla to a stop, or that power the soothing voice of your virtual assistant, have grabbed most of the headlines, but what about AI’s role in supporting relatively unglamorous back-office functions?

According to industry figures, the use of AI in supporting the HR, IT, legal and financial departments within organisations is not only well underway but maturing. In the last few years, these automations have become increasingly widespread, with some organisations now assessing whether it’s viable to automate vast swathes of the business.

The automation industry is growing fast, with the market expected to expand from $140 billion in 2021 to $234 billion by 2028, according to The Insight Partners. Gartner, meanwhile, estimates the robotic process automation (RPA) software market, a subsection of automation used primarily in the back office, could reach $3 billion in the next two years.

Even as automation expands, though, some aspects of a business lag behind. The most automated department within an organisation tends to be finance, due to the huge potential for cost and time savings. Legal departments, meanwhile, face barriers due to legacy attitudes as well as the complexity of legal processes, while HR teams are the most likely to still need people given the sensitivity of the issues. Recruitment, on the other hand, is ripe for automation, with many businesses incorporating AI, psychometrics and digital twins into the hiring process.

Instigating such programmes of change, however, isn’t as easy as flicking a switch. There are myriad factors to consider, including what kind of technology is needed, whether it’s financially viable, and to what extent it's desirable to eliminate the human touch. 


Lost in translation

The first question organisations must ask themselves often centres on clarifying the terminology itself, according to the president of financial automation firm Beanworks, Karim Ben-Jaafar. “What are you looking for, machine learning or AI? Do you know the difference? Most companies don't. They think machine learning and AI are just synonyms – but they're not.”

For his fintech company, machine learning is an entirely passive – and far more expensive – tool, while AI allows processes to be taught to the system. He believes machine learning is falsely seen as a perfect solution that requires minimal effort to implement. In fact, choosing it for tasks it’s not fit for is prohibitively expensive and akin, he says, to being an early adopter of laser eye surgery (LASIK). “That's monstrously expensive right now,” he says. “That's like getting LASIK when it first came out at $100,000 an eye. The good news is the price is going down to the point where you're going to look at it like it's $500.”

For Jay DeWalt, chief operating officer of Arria NLG, meanwhile, the three key terms that frame his conversations about AI are natural language understanding (NLU), natural language processing (NLP) and natural language generation (NLG). He uses the analogy of children learning to speak to frame the technology his company uses across a wide array of industries and departments. First they learn to understand commands from their parents, then the sentiment behind those commands, and then their meaning; only later do they articulate words of their own in a meaningful and insightful way.

Despite criticisms, DeWalt is betting on AI, and NLG specifically, as a world-changing tool. “Someday, I'm going to just talk to my machines and they're going to talk back to me and give me the information I'm seeking. I think NLG is a revolutionary technology that's going to change the world and how we interact with systems, how we get information from our data.”
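As a rough illustration of what NLG means in practice, the sketch below (a hypothetical example, not Arria NLG's actual technology) turns structured numbers into a readable sentence using a simple template:

```python
# Hypothetical, minimal natural language generation (NLG) sketch:
# convert structured data into a sentence a person can act on.

def generate_report(region: str, revenue: float, prior: float) -> str:
    """Template-based NLG: render a revenue figure as prose."""
    change = (revenue - prior) / prior * 100
    direction = "rose" if change >= 0 else "fell"
    return (f"Revenue in {region} {direction} {abs(change):.1f}% "
            f"to ${revenue:,.0f}.")

print(generate_report("EMEA", 1_250_000, 1_100_000))
# → Revenue in EMEA rose 13.6% to $1,250,000.
```

Commercial NLG systems go far beyond fixed templates, varying their phrasing and choosing which insights to surface, but the basic data-to-sentence flow is the same.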

Automation can be a runaway train

In the case of large-scale enterprises like PricewaterhouseCoopers (PwC), it’s not just about what AI can do for their clients but how AI can help them scale their own business. Suneet Dua, products and technology chief growth officer at PwC US, says the process that led to the 7,000 automations now at work within PwC started in earnest four years ago and accelerated due to COVID-19. He adds that the biggest hurdle in any business is getting executives to understand that these automations can reduce routine tasks and efficiently redeploy the workforce.

“I use this analogy where [the] automation train is going at like 100mph, human skills are going at like 10mph, and those two trains need to eventually converge. The problem that's holding back human skills is executives at the respective companies. They're not investing in human skills to upskill the future.”

While the most commonly cited indicator of AI’s value in automating the back office is the number of hours saved, Dua says the core measure should be an improved environment for employees. He adds that training and education around the use of AI should emphasise how automating simple tasks frees employees for more fulfilling, skills-based work, even if the six-million-hour reduction in work due to automations is profit-positive.

“What happens then, is when you hire a tax person, or a finance person, or an HR person, you hire them for the top of the tech stack skills,” Dua says. “You don’t hire them to do rudimentary, mundane tasks.”

Augmenting, not eliminating, the human touch

Brian Green, the director of technology ethics at the Markkula Center for Applied Ethics at Santa Clara University, says a major concern in implementing such projects centres on the data being fed into these systems. Central to this are the biases created by humans. “The main issues that are hitting right now have to do with bias and fairness, and whether the AI is just automating human biases and prejudices, which is certainly something that we've seen,” he explains.

Green adds that while the data may not seem biased initially, AI tools tend to locate proxy variables that can heavily skew the data towards areas that are traditionally affluent or otherwise homogenous. This is in addition to the human-based concerns, like what workers are supposed to do when their job ceases to exist. “The goal of that kind of software's pretty much to eliminate people. And so we really are faced with the question of how we approach these sorts of problems and try to fix them.”
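Green's point about proxy variables can be shown with a small synthetic example (hypothetical data, purely illustrative): even when a sensitive attribute is excluded from the training data, a correlated column such as a postcode can carry the same signal, so a system reproduces the original bias anyway.

```python
# Illustrative only: synthetic hiring data in which 'postcode' acts as
# a near-perfect proxy for a sensitive group attribute that has been
# deliberately left out of the records.
import random

random.seed(0)

rows = []
for _ in range(1000):
    group = random.choice(["A", "B"])          # sensitive attribute (dropped below)
    postcode = "N1" if group == "A" else "S9"  # geographic proxy for group
    hired = random.random() < (0.7 if group == "A" else 0.3)  # biased outcome
    rows.append({"postcode": postcode, "hired": hired})       # group omitted

def hire_rate(postcode: str) -> float:
    """Outcome rate a model could learn from the postcode alone."""
    subset = [r for r in rows if r["postcode"] == postcode]
    return sum(r["hired"] for r in subset) / len(subset)

print(f"N1 hire rate: {hire_rate('N1'):.2f}")  # roughly 0.70
print(f"S9 hire rate: {hire_rate('S9'):.2f}")  # roughly 0.30
```

Auditing for such proxies, rather than simply deleting sensitive columns, is the kind of fairness work Green describes.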

While the prospect of removing humans from the workforce is a concern, industry figures tell IT Pro they’re looking to augment jobs rather than eliminate them. For Jerry Levine, chief evangelist and co-general counsel at ContractPodAi, a company working on automating legal contracts, removing the people is a non-starter.

“We’re an augmentation tool, not a replacement for the human being,” he explains. “It's not possible right now [to completely remove humans], I don't know if it'll ever really be possible, and, to me, I want to work with humans as a lawyer.”

He likens a lawyer’s work to that of an engineer or developer. “When I talk with a lot of engineers and technical [staff], and especially developers, I always point out that just as your job as a developer is to write code that tells computers how to operate, the job of a lawyer is almost to write code that tells human beings, organisations, entities, how to interact and how to operate in their relationship.”

Meanwhile, PwC’s AI cognition tool maintains a list of company holidays and can organise an out-of-office reply for employees who ask. The tool, Dua says, tells employees where their coworkers are sitting in the office on any given day and suggests possible times for lunch. Still, there are aspects of the business Dua has no interest in automating: ethical or disciplinary matters, for instance. Issues normally handled by HR teams, in particular, are highly delicate. “To me,” he says, “those are the ones that need to have some sort of high touch environment to solve for them.”
