Roughly two years ago, a post on workforce question-and-answer site Workplace StackExchange generated a lively debate. The post, which has since been viewed nearly a half-million times, was by a user named Etherable, who had written an algorithm that compressed a 40-hour workweek into roughly two hours of actual work. The poster was using the remaining time to tend to personal issues and spend more time with their son. Etherable asked, “Is it unethical for me to not tell my employer I’ve automated my job?”
The question seems like a straightforward yes-or-no matter. But the responses were often qualified. Some thought that the lack of disclosure was unethical but warranted, given the worry that the poster would lose their job. Others thought that the answer hinged on whether the company was paying the poster for hours or for results. And at least a few gave pats on the back: One of technology’s great promises was that it would free people from rote tasks and give them back time that could be devoted to more meaningful pursuits. This was just an illustration of that promise.
Future of work issues, now
While the employment contract or company policies should be the deciding factor in the situation described, the post also illustrates some of the more complex ethical issues emerging as technology advances, says John Hooker, professor of business ethics and social responsibility at Carnegie Mellon University and author of Taking Ethics Seriously: Why Ethics Is an Essential Tool for the Modern Workplace.
“Rather than a transfer of tasks from humans to automation, we’re going to see a fusion of intelligent agents, computers, [and] algorithms working together with humans,” he says. That fusion is going to shine a light on some ethical and other problem areas companies and workers already struggle to navigate.
Part of the question relates to metrics, says “computer psychologist” Tim Lynch, president of gaming computer company Psychsoftpc. If the individual is being paid by output rather than hours, then they are fulfilling their agreement.
However, other factors complicate matters. Based on the individual’s employment status and agreement, the company may own the algorithm because it owns the employee’s work product, Lynch says. Also, Etherable built in “bugs” to make the work imperfect, as it would be if a human did it. That indicates an effort to mislead the employer. Plus, if the employer is relying on algorithms of which it is unaware and did not vet, the employee may be leaving the company open to security breaches or other liability.
“If you don’t know that Susie over there in the corner wrote this program that’s smashing out 10 million emails a day and has got some bug in it that’s sending the wrong thing to the wrong people, then it’s going to be pretty hard to track that thing and stop it,” says Ryan Duguid, chief evangelist for process automation platform Nintex.
Progress and its impact on trust
In the original post, Etherable’s primary concern about disclosure was that it may lead to the company simply replacing them with the program. It’s a valid concern, because that’s what many companies might do. However, this viewpoint shows a fundamental lack of trust between employer and employee, and that’s a problem, Duguid says.
The fear that people are disposable if their productivity can be bested by technology may lead to a crisis of trust as well as unpleasant and unintended consequences. High-pressure environments with a focus on results rather than people can lead to workplace cheating and unethical behavior, according to a 2017 report from the University of Georgia. And according to the Ethics & Compliance Initiative’s “2018 Ethics and Compliance in the Workplace” report, more employees than ever are experiencing pressure to cut corners and violate ethical standards.
Companies serious about fostering greater trust need to focus on transparency and investment in training for new skills, says Hooker. “There are ways to manage moving to automation,” he says. “They don’t necessarily displace employees. You can reassign people for the job, retrain them for the job.”
But without seriously addressing the trust and culture issues fueled by technology and automation, companies will incur costs of which they may not even be aware. Duguid points to the opportunity cost Etherable’s employer is incurring because of lack of trust. What other problems could this talented employee be solving? How else could similar algorithms be applied? As automation changes jobs, people who can find creative technology solutions are going to be the ones who win, he says.
“I think two things are going on,” he says. “One is a lack of trust in the employer. And two is the employees themselves, not having the foresight to realize that they’ll likely be the ones that get recognized. Like you turn up to your employer and say, ‘Check out what I’ve done.’ Most places I know, you’d be getting a promotion off the back of that.” As long as they got rid of the bugs, of course.