Increasingly, technology companies have been focusing on their corporate social responsibility as well as their products, as customers demand both good tech and good conduct.
The “State of the Connected Customer Report,” an annual report from Salesforce, includes insights from 8,000 consumers and business buyers worldwide. Four data points, plus many others, spell out the importance of ethics in customer relationships and buying decisions. According to the report:
- 73 percent of respondents said trust in companies matters more than it did a year ago
- 80 percent are more loyal to companies with good ethics
- 68 percent won’t buy from companies with poor ethics
- 88 percent believe that companies are responsible for ensuring they use technology ethically
Good ethics is good business
Companies are responding to these needs in different ways. Hewlett Packard Enterprise, for example, focuses on the entire value chain, from product design to manufacturing and beyond—starting with creating diverse teams to design and build products that avoid bias in artificial intelligence. In addition, its “Living Progress Report” details the company’s work to combat climate change, inequality, and human rights violations. Intel’s four focus areas are environmental sustainability, supply chain responsibility, diversity and inclusion, and social impact. VMware’s goals, under its overall Force for Good program, include becoming carbon neutral (which it achieved in 2018).
At VMworld, panelists from VMware, the University of Ottawa, Salesforce, and Omidyar Network, all of whom deal with different aspects of the ethical use of tech, presented their areas of expertise before joining in a discussion about responsible use of technology. The panel was moderated by Ryan Jenkins, a senior associate at Principia Advisory.
Integrating a sense of responsibility into company culture
VMware’s focus is Force for Good—or Engineering for Good, said Ray O’Farrell, executive vice president at VMware, whose office manages the Force for Good initiative. But unfortunately, one person’s “good” may not be another’s. Despite this, technologists have a responsibility, he said, since everything they do will end up affecting some industry or some group of people in fundamental ways.
“One of the key phrases that appears when we think about this comes from our chief research officer, [David Tennenhouse],” said O’Farrell. “And one of the things he said, which is quite interesting, was that we as technologists, we get the opportunity to play with a lot of cool stuff.” But that privilege comes with a price: “That bargain is that we get to do all of this stuff, but we need to begin to think more and more about the responsibility associated with those things. What can we do to start making sure that sense of responsibility, a sense of force for good, is integrated into the culture?”
Connecting ethics with engineering practices
Jason Millar, assistant professor and Canada research chair in ethical engineering at the University of Ottawa, said he faces the same challenge as he tries to teach engineering students about ethics in robotics and AI. He became interested in the subject while working as an engineer and returned to school, earning a doctorate in ethics and governance of robotics and AI. Millar has spent the past five or six years, first in postdoctoral research and then as a professor, delving into ethics in robotics and AI, including bias, explainability, fairness, transparency, and privacy.
Millar said he found that, as a philosopher, when he spoke to engineers at academic events, he’d get a lot of questions, but his answers didn’t lead to action.
“When I asked them, ‘What are you going to do to change the way that you do your work?’ they would say, ‘Well, you know, really nothing. I don’t see how this really connects to the stuff that I do day to day.’”
Millar’s current job includes raising awareness of ethics topics and their importance in engineering and computer science, with the goal of making real changes in engineering practice. “Something has to happen here, because there’s some sort of disconnect,” he said. His current research is on how to connect ethics with engineering practices.
Ethics is not outcome; it is process.
Showing moral math
Making that connection is something that’s also important to ethical investment firm Omidyar Network. The company had a wake-up moment about technology and its role in society in 2016, when it started thinking about the unintended consequences of tech.
“We’ve all come to that awakening that there are real considerations when building and developing technology, and thinking about not just the potential power and the force for good but also about the potential downsides,” said Yoav Schlesinger, a principal in the company’s tech and society solutions lab. Since then, Omidyar has been considering how it can shift its culture to a position of responsibility and accountability in its innovative practices.
“Culture means a lot of things,” Schlesinger continued. “Culture in the broadest terms—in terms of tools, processes, norms, narratives—is bringing all of the things that ladder up to creating the kind of organization that is needed to then build the kind of products, features, and tools that society can benefit from.”
Today, we’re at the point with ethics in technology that we were with automobiles in 1966, he noted, after Ralph Nader’s 1965 book, “Unsafe at Any Speed,” exposed and heightened awareness around the dangerous engineering practices involved in building cars at the time, resulting in new safety initiatives.
“We’ve all awakened, and it’s kind of unique that we’re even having this conversation at a mainstream tech conference,” Schlesinger said. “We’re at the beginning stages of this evolution toward that kind of informed, just, rewarding culture in tech that holds itself accountable for the kinds of things we want to build. And ultimately, that is about showing our moral math.”
However, Paula Goldman, chief ethical and humane use officer at Salesforce, argued that we’re not in 1966 but the early 1900s, an era of successive waves of innovation and new norms. Those changes were contentious and required action from both companies and government, but in the end they resulted in a safer society, she said.
What do we mean by ethics?
Schlesinger said he finds the term ethics unhelpful, though he uses it often as a placeholder. “I will try to reframe it: It is to talk about what’s good for people, society, communities. Yes, that is relatively defined, and that’s what’s good about ethics,” he said. “Because ethics is not outcome; it is process. So that conversation of weighing the trade-offs between your values and mine, that conversation that ensues when you actually are forced to contend with the things that matter to you and to assess how to do the thing of upholding what is good for people and society, that is ethics.”
“Good ethics is good business,” added Goldman. “The reason we’re having this conversation gets back to, we’re in this moment where everyone is all of a sudden waking up and realizing this isn’t a ‘nice to have’ for the tech industry. This is deeply impacting our brands, it’s deeply impacting how our customers see us, it’s impacting customer decisions, it’s impacting regulatory conversations. And I think folks have understood that in the last year or two and are just starting to grapple with what that means and what is the role of a company in that process.”
She added, “If we don’t do this, and we don’t lead, it’s going to kill our bottom line.”
You can also build competitive advantage for yourself, as Volvo did by being known as the safest car on the road, Schlesinger noted. “It’s not just risk mitigation,” he said.
But, Millar added, “there’s a problem with disconnecting talk about ethics from talk about engineers’ work. If you want to connect with engineers, you have to talk about their technology.”
Ethics as a process
However, O’Farrell said that many cloud applications are general purpose; as an engineer, you don’t influence how the applications are used. You’re just building a cloud, so it becomes a business problem.
That’s where policy comes in, Goldman said. It’s about setting guidelines around how the technology should be used to help customers navigate products and use them responsibly. And vendors can discourage various use cases through product design, Millar added.
Ethics as process is a hot topic today, Schlesinger noted. There are ethical premortems that look at what could go wrong and how it could be prevented or mitigated, as well as ethical postmortems that look at what went wrong and why. Ethics as a process involves embedding ethical requirements in design documents and can be coupled with an ethical review. If certain criteria aren’t met in the review, the product can get bounced back for rework.
“All of these are attempts to take existing process within companies and layer responsible considerations on top of it,” Schlesinger said.
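The review gate Schlesinger describes, where a design document that fails certain ethical criteria gets bounced back for rework, can be sketched in a few lines. This is purely illustrative: the panel names no specific tool, and every identifier and criterion below is a hypothetical example.

```python
# Hypothetical sketch of an ethics review gate layered onto a design
# review, as described in the panel. Criteria names are invented examples.

REQUIRED_CRITERIA = {"bias_assessment", "privacy_review", "misuse_premortem"}

def ethics_gate(design_doc: dict) -> tuple[bool, set]:
    """Check a design document's ethics section against required criteria.

    `design_doc` is assumed to carry an "ethics" key listing the criteria
    the team has addressed. Returns (passed, missing); any missing
    criterion bounces the product back for rework.
    """
    addressed = set(design_doc.get("ethics", []))
    missing = REQUIRED_CRITERIA - addressed
    return (not missing, missing)

# Example: a document that skipped the misuse premortem fails the gate.
doc = {"ethics": ["bias_assessment", "privacy_review"]}
passed, missing = ethics_gate(doc)
```

The point of the sketch is the design choice Schlesinger highlights: the ethical check rides on an existing process (the design review) rather than standing apart from engineers’ day-to-day work.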
Millar concluded, “Just like you saw the emergence of environmental engineering as an area of specialization, or safety engineering, I think that you’re going to see the emergence of a new domain of engineering that is focused on understanding these issues a little more and figuring out ways to bring them into practice.”
Useful link:
Trust and responsibility in the digital age
This article/content was written by the individual writer identified and does not necessarily reflect the view of Hewlett Packard Enterprise Company.