People of ACM - Don Gotterbarn
August 22, 2013
Don Gotterbarn, a Professor Emeritus at East Tennessee State University, is a leading author of the Software Engineering Code of Ethics and Professional Practice, which promotes ethics among software engineers. Active in professional computer ethics for over 20 years, Gotterbarn received the 2005 Outstanding Contribution to ACM Award for his leadership as both an educator and practitioner, and for promoting the ethical behavior of computing professionals and organizations. He received the ACM SIGCAS Making a Difference Award in 2002 for his research and work regarding computer and software engineering ethics. Gotterbarn is also an ACM Distinguished Speaker and chairs the ACM Committee on Professional Ethics. In the mid-1970s, he left a career teaching philosophy and entered the computing field as a consultant for clients that included the US Navy and the Saudi Arabian Navy. He has also worked on the certification of software for vote counting machines and missile defense systems.
How do you react to news reports about computer hacking in light of the ethical standards you have developed through research, implementation, and advocacy of ACM's Codes of Ethics on an international scale?
I am optimistic about computer practitioners and their basic motivations. Hacking covers a wide range of activities, motivations, and consequences, and that colors my reactions. Hacking into an open-source tic-tac-toe program seems innocent enough, unless it is done by my employee who is being paid to test the avionics of an airliner. The University of Michigan group that hacked into the D.C. Board of Elections Internet voting test did not violate the Codes of Ethics and helped improve Internet voting.
When the hacker's intention is cruelty, as in the destruction of a traffic signal system, I am disappointed by an attitude that spawns such nastiness. The existence of people who do not value other human beings is unfortunate and adds to our security concerns about safety-critical devices like pacemakers and insulin pumps. This malicious hacking is also unethical and inconsistent with any code of practice in computing.
Codes of Professional Practice, acting as the conscience of a profession, shed a clear light on malicious hacking. When morally good software professionals hack without thinking about the consequences, I feel I should have done more to promote these Codes. When we lose sight of the contexts of our systems and concentrate only on time-to-develop and cost, we risk producing a product that serves society poorly even though it may meet its technical specifications.
When people are aware of the ethics of a situation and make a conscious choice not to follow the standards of the profession, it is disappointing. But I am encouraged by the broad international agreement on how to be an ethical professional. After ACM and IEEE-CS officially adopted the Software Engineering Code of Ethics ("Software Engineering Code of Ethics Is Approved," CACM 42, 10), I received translations from computing societies around the world that wanted to use the code. I heard from companies that had adopted it as a standard of practice.
When did you realize that the products developed by software engineers, and the way they are developed, embody ethical and social values that impact society?
Computing professionals are trained that divide and conquer is a good way to simplify and solve complex problems. And in many contexts it is. We compartmentalize, modularize, use state diagrams, address a particular use case, and identify the single most space-efficient algorithm. In school and in the field, the competent completion of each project—getting the computer to do what was specified in the customer's requirements—is considered a personal success. But the notion that simply meeting the technical requirements is sufficient for a computing professional is a mistake. This "competent completion" trap prevents the true professional from seeing the broader picture of consequences and responsibilities, as I saw in my own work many years ago.
Last century (1976), as a young man working on programs to catch Medicare cheats, I introduced a simple date of birth (dob) scan, rejecting as invalid any claim with a future birth date. Unfortunately, my simple scan generated many false positives. Because birth years used only the last two digits, anyone who was 77 or older had their claim rejected: the 77 of 1877 or the 78 of 1878 is greater than the 76 of 1976 (an early instance of the Y2K problem). The common date edit was wrong for this context and hurt many people whose refunds and necessary medical services were delayed. It was then that I read Joseph Weizenbaum's Computer Power and Human Reason: From Judgment to Calculation, which gave clear expression to the need for responsible judgment in software development.
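As a toy illustration (not the original Medicare program), the sketch below shows how a two-digit year comparison flags a perfectly valid 1877 birth date as "in the future" in 1976, while a four-digit comparison does not. The function names are my own, purely for demonstration.

```python
def dob_is_valid_two_digit(birth_year_yy: int, current_year_yy: int) -> bool:
    """Flawed era-of-two-digit-years check: a birth year 'in the future' is
    treated as invalid. With only two digits, 1877 becomes 77, which compares
    as greater than 76 (for 1976), so the claim is wrongly rejected."""
    return birth_year_yy <= current_year_yy

def dob_is_valid(birth_year: int, current_year: int) -> bool:
    """The same rule with four-digit years: only genuinely future dates fail."""
    return birth_year <= current_year

# A 99-year-old claimant in 1976, born in 1877:
print(dob_is_valid_two_digit(77, 76))   # False -> claim wrongly rejected
print(dob_is_valid(1877, 1976))         # True  -> claim correctly accepted
```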
The question "When did you realize…" implies a single one-time realization about your work's impact. It is true for me, and may be true for many computing professionals that we are often distracted by the complexity of our work, and are tempted to narrow its focus to the "technical" issues. We forget the contexts of our systems and box out broader concerns. Psychologists even have a name for this: "bounded ethicality." (See Bazerman, M. and Tenbrusel, A., Blind Spots: Why we fail to do what is right and what to do about it, Princeton University Press, 2011.)
In the early 1980s I was helping a bank develop its own ATM system and was worried about deadlock and communications issues. After tests showed it met the specified functionality, we rolled it out in a retirement community and then found that the system timeouts were too short for people with limited dexterity. The failure to consider both the dexterity and the visual acuity of the primary users was a mistake of technical tunnel vision.
The Windows XP operating system has an accessibility aid for the visually challenged. The software fits a competent-creation model: when used as directed, it will increase the font size. Unfortunately, normal vision is required to locate this functionality. Moreover, after clicking to select a font size you can read, the rest of the instructions are written in a small font. Competent creation was not enough; thinking of the users in a broader context is also required.
Software embodies the values of its developers. When we set the time allowed for entering a phone number and decide what to do when that time is exceeded, we are making socio-technical decisions. (Gotterbarn, D. and Miller, K., "The Public Is the Priority: Making Decisions Using the Software Engineering Code of Ethics," IEEE Computer 42, 6, June 2009.)
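As a hypothetical sketch of the kind of socio-technical parameter described above (EntryTimeoutPolicy, seconds_allowed, and on_timeout are illustrative names, not drawn from any real system), the snippet below shows how a single timeout value and its fallback behavior encode assumptions about who the users are.

```python
from dataclasses import dataclass

@dataclass
class EntryTimeoutPolicy:
    """Illustrative only: the timeout and the fallback behavior are
    value-laden choices, not purely technical ones."""
    seconds_allowed: float   # how long a user may take to enter a number
    on_timeout: str          # e.g. "cancel_session" or "warn_and_extend"

# Tuned only for average users: fast, but excludes people with limited dexterity.
lab_tested = EntryTimeoutPolicy(seconds_allowed=10.0, on_timeout="cancel_session")

# Tuned with the actual user population in mind.
inclusive = EntryTimeoutPolicy(seconds_allowed=45.0, on_timeout="warn_and_extend")

print(lab_tested)
print(inclusive)
```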
What is entailed in your new approach to the identification of software risk to reduce the rate of system failures in software development as it applies to e-voting?
Simon Rogerson (UK) and I developed the Software Development Impact Statement (SoDIS) to proactively analyze potential negative effects of software outside a narrow technical box. The SoDIS process (including an automated tool) combines project tasks, ethical issues, and stakeholders to form questions for the risk analyst, such as: "Might reducing dial completion time restrict access by senior citizens?" Answering these questions helps identify potential concerns that can be addressed before the software is implemented. The process has been used on several projects, including an analysis for the UK Deputy Prime Minister's electronic voting plans, where we identified numerous ethical concerns that led to modified requirements obliging bidders to address them.
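A minimal sketch of the idea behind SoDIS-style questioning, not the actual SoDIS tool: crossing project tasks, stakeholders, and ethical concerns (all hypothetical values below) to generate questions for the risk analyst to answer before implementation.

```python
from itertools import product

# Hypothetical inputs; the real SoDIS tool and its question templates differ.
project_tasks = ["reduce dial completion time", "store cast ballots remotely"]
stakeholders = ["senior citizens", "voters with disabilities", "election officials"]
ethical_concerns = ["restricted access", "loss of privacy"]

def sodis_style_questions(tasks, people, concerns):
    """Cross task x stakeholder x concern into questions for the risk analyst."""
    for task, who, concern in product(tasks, people, concerns):
        yield f"Might '{task}' lead to {concern} for {who}?"

for question in sodis_style_questions(project_tasks, stakeholders, ethical_concerns):
    print(question)
```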
What advice would you give to budding technologists who seek and exploit weaknesses in a computer system or computer network for profit, protest, or challenge?
To technologists who want to rob, harm, or be cruel, my advice is simply not to do it. Unfortunately, they are unlikely to listen to me or anyone else. In general, the exploitation of system weaknesses or the violation of professional standards is not done for good purposes. I do think there are cases of hacktivism that have overall positive consequences and some that have largely negative consequences.
Most of us want to do what is right and be proud of our work. Doing this is not a simple technical problem. A review of ACM's general Code of Ethics and the Software Engineering Code of Ethics helps identify issues that are frequently missed. Every time we undertake a task, we need to relentlessly try to identify the potential impacts of our work and how they are integrated into society. To ignore the social and ethical effects of our work is to do only half a job.