This article is part of the New Rules of Business.
How do we ensure that technology is used for good? That was the question presented to an esteemed group of technologists at a meeting of the Fast Company Impact Council, an invitation-only group of business leaders from across industries, which was held at the end of June.
In this roundtable discussion, led by deputy tech editor Katharine Schwab, seven tech leaders discussed what a manifesto for better tech might look like. Participants in this session included Arthur Filip, head of sales transformation for HCL Technologies; Jeff Hennion, cofounder and managing partner at Woodside Ventures; Kathy Hibbs, chief legal officer for 23andMe; Michael Kanaan, author of T-Minus AI and director of operations for the MIT Artificial Intelligence Accelerator at the U.S. Air Force; Matthew Putman, the CEO of Nanotronics; Karen Silverman, cofounder of The Cantellus Group, a consultancy helping businesses use AI; and Michelle Zatlyn, cofounder and chief operating officer at Cloudflare.
In the following conversation, these luminaries debated whether tech companies can write their own rules for serving the public good or whether they need guidance from regulators. Excerpts have been edited for length and clarity.
Karen Silverman: Everyone in this room and every other room is very tech forward. We live and breathe this stuff. I take the perspective that all companies are tech companies right now. It’s not enough to cabin this conversation in these sorts of rooms. We have to find a way to think about this as a broader skill set. A car is neither good nor bad, tech is neither good nor bad. But we need to figure out how to service them and how to operate them. We have to make sure that people know how to drive.
Arthur Filip: I think to be competitive and to be a leader going forward there has to be the IQ side and building great technology, but there’s also the EQ side. We have to be connecting the technology company and the technology to the people and getting involved in economic and geopolitical issues.
Kathy Hibbs: I have to go and click through the privacy settings on every single thing I read and look at—I know that that’s driven by California law. Can you get to the content you want without subscribing to something? A lot of people who are a lot less savvy than [you or me] are just clicking okay to get to whatever they need to get to and aren’t being informed about how their data is being used. I don’t think it’s particularly ethical to put your [information] in a data consent form that way. Each one is different. And a lot of companies are only going to put up the California Consumer Privacy Act requirements if you’re in California. The federal government is not going to put up a national policy. This is a basic thing right now that could be made better and probably wouldn’t be a loss to business.
Karen Silverman: I think we need to elevate this to a board-level conversation about stakeholder capitalism. The benefits and opportunities are big enough and the risks profound enough.
Even if you get an aspirational artificial intelligence policy or General Data Protection Regulation, it’s still not going to answer for a business what it should do in particular. What does it mean to be trustworthy as an individual company? Businesses are making decisions about how they’re making technologies, but even when there is consensus around it, it’s not going to be that instructive.
I think the focus on data privacy is right.
Arthur Filip: Doctors take the Hippocratic Oath, but I think in the tech industry there is a need for that kind of . . . Do the Right Thing. And Do The Right Thing for all the communities that we live and work in. The whole industry gets mischaracterized as people from Silicon Valley. Like we’re all getting our first Ferrari at 24. That is not a large portion of the tech industry. Do the Right Thing could be something that could be embraced across the entire industry.
Michael Kanaan: Codifying culture is a hard thing to do. We need to bring it back to the technology and the data piece.
Kathy Hibbs: One of the advantages that 23andMe has because we do health is we have the Institutional Review Board. What they’re looking at is: Is this a reasonable purpose and are the subjects protected? Now of course, the reason medicine has that is because of the Tuskegee studies. In that instance, scientists, even with their Hippocratic Oath, still didn’t have the ability to look at the studies they were doing from the perspective of the subjects.
Matthew Putman: A review board would be good, but we have to be realistic about the complexity of [artificial intelligence] models.
Michelle Zatlyn: If a company doesn’t do the right thing, then what happens?
Karen Silverman: It would be useful to have a certification mark or something that lets consumers (business and end users) easily assess certain ethical standards in their products.
Michelle Zatlyn: We could think of it like an accounting standard—that’s another place where I’ve seen it proposed.