In 2013, two University of Oxford professors published a study analyzing 702 different occupations. Of those, they determined that the role of chief executive fell within the 10% they deemed "not computerizable."
There's reason to think twice about that. Here's why.
Broadly speaking, the push toward democratization is arguably one of the most potent social, technological, and economic forces today—one of the few, in fact, that runs powerfully through each of those fields. And while the motives and manifestations of this trend necessarily vary, examples abound.
Consider, for example, 42, a tuition-free coding school originating in France and now in Silicon Valley. 42 is essentially a university without instructors where students learn through what the founder calls "collaborative education." Then there’s the U.S. Army’s recent science and technology Futures Project, "aimed at leveraging the collective wisdom and ability of the American public," to help influence how the Army will use research and development investments to prepare defense forces for the world of 2040.
Another example is the "new breed of human organization" that the blockchain-governed Decentralized Autonomous Organization, or DAO, model propounds. In these organizations, token holders vote on which proposals to accept from contractors that build products and services on the DAO's behalf. No elite decision-makers required.
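The token-weighted voting that DAOs rely on can be sketched in a few lines. This is a hypothetical, simplified model (the class name, quorum rule, and balances here are invented for illustration); real DAOs encode this logic in on-chain smart contracts rather than ordinary application code:

```python
# Illustrative sketch of DAO-style governance: token holders vote on
# contractor proposals, with votes weighted by token balance, so no
# single executive decides. All names and thresholds are hypothetical.

class MiniDAO:
    def __init__(self, balances, quorum=0.5):
        self.balances = dict(balances)            # holder -> token count
        self.quorum = quorum                      # fraction of supply needed to pass
        self.total_supply = sum(self.balances.values())

    def tally(self, votes):
        """votes maps holder -> True/False; returns whether the proposal passes."""
        yes_tokens = sum(self.balances[h] for h, v in votes.items() if v)
        return yes_tokens / self.total_supply > self.quorum

dao = MiniDAO({"alice": 60, "bob": 30, "carol": 10})
print(dao.tally({"alice": True, "bob": False, "carol": True}))   # 70% yes -> True
print(dao.tally({"alice": False, "bob": True, "carol": True}))   # 40% yes -> False
```

The design choice worth noting is that authority flows from token holdings rather than from a role or title, which is exactly what makes the model "anti-hierarchical."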
It's far from clear whether these ventures, in all their diversity, are exceptions to the rule or harbingers of more anti-hierarchical approaches to business, innovation, and governance. Certainly, from last week's Brexit vote to the passions roused by the Trump and Sanders campaigns in the U.S., Western governments are facing waves of anti-elitist sentiment that may not prove a passing fad. And with a recent Hewlett Packard Enterprise study finding that 79% of teenagers today want to lead a company by the time they reach 29, it's worth wondering what kinds of C-suite roles will be left for them by then.
After all, the belief that certain problems required "an executive type of mind" was itself the product of a distinct historical moment. In the early 20th century, the management theorist William Henry Leffingwell applied his day's Taylorist obsession with efficiency to the practice of office management.
Workforce automation, wrote Leffingwell, would enable the "scientific manager" to earn a salary by freeing up time for handling only exceptional cases, rather than constantly sorting out routine, day-to-day processes. And despite a lively and growing debate around automation, few still seem willing to question, in any practical sense, how much better the "executive mind" really is at even those higher-order challenges Leffingwell alluded to.
Indeed, researchers at McKinsey Global Institute have estimated that "the percentage of a chief executive’s time that could be automated by adapting currently demonstrated technology" is 25%. And while the responsibilities of any given CEO vary considerably from one company to the next, there are at least three key areas where removing human decision-making might actually make sense.
Cognitive scientist Daniel Dennett once described the human brain as an "anticipation machine." Yet accurate, long-range foresight is pretty rare because most of us find it hard to imagine a future radically different from the past or present (one likely reason why we're so entranced with innovation and breakthrough thinking).
That’s especially true for those considered "experts" based on occupational status, or who’ve operated in the same business for many years. Psychologists call this the "curse of expertise": The narrower our focus, the more we think we know, which leads more to overconfidence than to enhanced competence. As AOL cofounder Steve Case writes in his book The Third Wave, Don Logan, the former chairman of AOL Time Warner’s Media and Communications Group, was "someone who didn’t believe the Internet had much of a future."
Since artificial intelligence is already being used for predictive modeling and data analysis, it isn't difficult to imagine how the visioning role of the CEO could be next.
Financial decision-making and risk management are two C-level tasks where hubris abounds. Stanford Business School and Wharton researchers report that "top corporate decision-makers persistently overestimate their own skills relative to others," a finding that's been echoed elsewhere. Daniel Kahneman highlighted the illusion of financial skill in his best-selling book Thinking, Fast and Slow. After visiting a financial services firm and analyzing the investment outcomes of 25 wealth managers over eight consecutive years, Kahneman concluded that "the results resembled what you would expect from a dice-rolling contest, not a game of skill."
These are far from the only studies showing that humans generally act irrationally and make poor financial decisions, especially in complex situations. You only need to consider the criticism of leaders like Edward Lampert of Sears, himself a former banker, to wonder whether it might be a smart idea for future CFOs to be algorithms.
When it comes to directing a company's talent-acquisition strategy, often a key C-suite responsibility, computers may prove better here, too. One of the goals of Goldman Sachs's "robo-recruiting" initiative, for instance, is to help diversify the business's workforce to a degree that humans at Goldman haven't shown themselves capable of doing on their own.
The business benefits of opening opportunities to more qualified candidates than the usual channels tend to surface are already as clear as the need to tap into them. In a recent PwC study, only 8% of senior executives were found to have the skills it takes to lead organization-wide transformations. Not only might using technology to root out unconscious bias help close that skills gap at the leadership level, it could also further the wider trend toward democratization.
Ultimately, that doesn't mean we should expect to find the C-suite completely vacant in the future. But what it does mean is that we should probably temper our widely held conviction that CEOs' most critical attributes are things they've shown time and again not to be all that good at.
This way, we can start prizing the sorts of qualities robots seem less likely to take over but that democratic-minded organizations are hungrier for than ever. One in particular stands out, which the business leaders PwC polled last year unwisely ranked the least critical of all: collaboration.
Liz Alexander is a futurist and cofounder of Leading Thought, which passionately helps prepare human beings (only) for remarkable futures and collaborates with companies that want to be future-smart. Connect with Liz on LinkedIn or follow her on Twitter at @DrLizAlexander.