AI arms race highlights the urgent need for responsible innovation


by Marcel O'Gorman - March 21, 2023, 1:03 am

The recent frenzy over language processing tools such as ChatGPT has sent organisations scrambling to provide guidelines for responsible usage. The online publishing platform Medium, for example, has released a statement on AI-generated writing that promotes “transparency” and “disclosure.” My own institution has established an FAQ page about generative AI that calls on educators to make “wise and ethical use” of AI and chatbots. These ethical measures seem quaint, given this week’s release of the more powerful GPT-4, which runs the risk of becoming a disinformation and propaganda machine.

Unchecked innovation
ChatGPT is powered by a supercomputer and a powerful cloud computing platform, both of which were funded and created by Microsoft. Perhaps not coincidentally, GPT-4 was released less than two months after Microsoft laid off an ethics and society team. Frustrated team members said the decision was driven by pressure from Microsoft’s C-suite, which stressed the need to move AI products “into customers’ hands at a very high speed.” It seems appropriate, then, to ask what responsible innovation means as this high-speed, high-profit game of unchecked innovation rages on.

Responsible innovation
When I asked ChatGPT what responsible innovation is, it wrote: “The process of developing and implementing new technologies, processes, or products in a way that addresses ethical, social and environmental concerns. It involves taking into account the potential impacts and risks of innovation on various stakeholders, including customers, employees, communities, and the environment.” ChatGPT’s definition is accurate, but bereft of context. Whose ideas are these, and how are they being implemented? Put otherwise, who is responsible for responsible innovation?
Google founded a responsible innovation team in 2018 to leverage “experts in ethics, human rights, user research, and racial justice.” The most notable output of this team has been Google’s responsible AI principles. But the company’s ethical profile beyond this is questionable.
These lingering issues, along with Google’s parent company’s recent antitrust indictment, demonstrate that a focus on responsible AI is not enough to keep large tech companies from being “evil.”

Ethics-washing
The Association for Computing Machinery’s Code of Ethics and Professional Conduct states that tech professionals have a responsibility to uphold the public good as they innovate. But without support from their superiors, guidance from ethics experts and regulation from government agencies, what motivates tech professionals to be “good”? Can tech companies be trusted to self-audit? Another issue related to self-auditing is ethics-washing, where companies only pay lip service to ethics. Meta’s responsible innovation efforts are a good case study of this.
This shift from responsible innovation to social innovation is an ethics-washing tactic that obfuscates unethical behaviour by changing the subject to philanthropy.
Responsible innovation vs. profit
Unsurprisingly, the most sophisticated calls for responsible innovation have come from outside corporate culture. The ICTC’s principles call for tech developers to go beyond the mitigation of negative consequences and work to reverse social power imbalances.
One might ask how these principles apply to the recent developments in generative AI. When OpenAI claims to be “developing technologies that empower everyone,” who is included in the term “everyone?” And in what context will this “power” be wielded?

Value tensions
There is a persistent tension between financial valuation and moral values in the tech industry. Responsible innovation initiatives were established to massage these tensions, but recently such efforts have been swept aside. The future of responsible innovation may depend on how so-called “common sense business practices” can be influenced by so-called “woke” issues like ethical, social and environmental concerns. If ethics can be washed away by dismissing them as “woke,” the future of responsible innovation is about as promising as that of the CD-ROM. (The Conversation).
Marcel O’Gorman, University of Waterloo, Canada.