Ethical technology requires new approaches to education, research and inclusion.
For the first few decades of the computing age, computers were monolithic machines housed in large institutions, out of reach for most of the general public.
When personal computers came along, everything changed, according to CU Boulder's Bobby Schnabel. Since then, the growing ubiquity of computing has compounded both the number of devices and the ethical issues inherent in their development and use.
"When people started being able to interact with those computers, they became two-way devices," said Schnabel, external chair of the Department of Computer Science and former CEO of the Association for Computing Machinery. "All sorts of things have arisen that impact people's lives."
Today, the field is grappling with many of those impacts, like bias in machine learning algorithms and social media networks that are easily manipulated.
"As a discipline, we need to take ownership of that and go fix it," said department Chair Ken Anderson. "Computer science has to mature as a discipline and start to say, 'How do we bake in discussions of what's important first before the technology starts to roll out?'"
'Biases as bad as ours'
At CU Boulder, some of those discussions are happening at the research stage.
Assistant Professor Chris Heckman works with advanced autonomous systems as director of the Autonomous Robotics & Perception Group. Though he sees great promise in technology as an augmenter of human ability, he is concerned by the use of AI to make moral decisions.
"I can't say that humans are beyond reproach when it comes to this decision-making, and our autonomous systems that we build will have biases as bad as ours, if not worse," Heckman said.
For technologists, dual-use concerns are often at the forefront. A system designed to connect can isolate. A system built with good intentions can be weaponized. Unfortunately, human ingenuity makes designing meaningful technology that could never be used in a dangerous manner next to impossible.
Technologists can, Heckman argues, choose which systems they do or do not work on and choose whether to partner with certain entities, but once the technology floats further downstream, it becomes the responsibility of managers and end users.
"It is an organizational process that needs to ensure that autonomous systems are actually behaving according to the values and the mission that we have as a society, and that means a much more robust education for organizations and end-users," he said.
Educational opportunities
But what about technologists like Elon Musk, Jeff Bezos and Mark Zuckerberg, who are engineers-turned-business-leaders? When you create a technology and also implement it, how do you develop that ethical foundation?
Since 1989, CU Boulder has been answering that question with a program that educates engineers in both ethics and technology, the Herbst Program for Engineering, Ethics & Society. The program introduces the "great books" of Western civilization, which have been used in the humanities for centuries to spark inquiry into ethics.
From the Herbst program tradition also came the Engineering Leadership Program, led today by Shilo Brooks. Brooks believes that in the modern era, engineers often become leaders in business, and that examining classical ethical dilemmas helps them make better decisions in the future.
"The best way to equip these future leaders is to think through some of these problems. It gives a foundation of curiosity and an intellectual agility that provides a map for how they ought to think through problems confronting them," Brooks said.
The value of varied perspectives
As valuable as the age-old struggle for moral excellence is, it is also important to consider which viewpoints have been left out that could provide valuable context for the difficult ethical dilemmas we face today.
For Shaz Zamore (they/them), head of science, technology, engineering, art and math (STEAM) outreach at the ATLAS Institute, the greatest ethical question today is how to increase space for different, equally valued perspectives.
"When you're all working together in an equitable system with parity, with everyone's background, experience and knowledge valued equally, that is where you're going to see truly genius developments and life-changing knowledge come about," Zamore said.
Zamore thinks about ethics in relation to who has access. Who can make technology? Who can use it? Who learns about it, and how?
"When it comes to outreach and engagement, one of the biggest barriers with underrepresented and severely underserved populations is that they are not told what their options are," they said. "They don't know that you can ask questions and do experiments and get paid to do it."
If students with different backgrounds are continually left out of the tech pipeline, their valuable insights are minimized, and the technologies that get built will not be as robust, Zamore said.
Anderson agrees, saying that's why the department has invested so heavily in diversity efforts, like creating the Bachelor of Arts in Computer Science, building logic and ethics courses into its curriculum, and partnering with groups like ATLAS and the National Center for Women & Information Technology.
"It's all intertwined," Anderson said. "The diversity programs that we started are going to help us change these things over time so that the systems, as they're being designed, have more diverse thinking behind them. We're going through this phase in which the exclusionary practices that made this a white man's world, people are now working to try to dismantle those as best they can."