Computing Cases
Why a Socio-Technical System?

You will find us using the phrase "socio-technical system" a great deal in this web site. It is not just because we like big words (though we do). The idea of a socio-technical system (abbreviated as STS) is an intellectual tool to help us recognize patterns in the way technology is used and produced. Identification of these patterns will help us to analyze the ethical issues associated with the technology-and-its-social-system.

It is by now a truism to say that any single technology can be used in multiple, and sometimes unexpected, ways. But we need to add to this observation that, in each different use, the technology is embedded in a complex set of other technologies, physical surroundings, people, procedures, etc. that together make up the socio-technical system. It is only by understanding this system that we can parse out the ethical issues.

Let’s take as an example a relatively simple technology: a set of 10 microcomputers connected by a network. The social and ethical issues associated with these networked computers will change dramatically depending upon the socio-technical system in which they are embedded. For instance, are the networked computers:

  • part of the intake unit of an emergency room
  • in a small, public lab at a university
  • in the computing lab of an elementary school
  • in a risk analysis office of an insurance firm
  • at a military supplier, testing manufactured parts

The networked computers in each of these different circumstances are part of different socio-technical systems. The "ethical issues in computing" arise because of the nature of specific socio-technical systems, not because of the computers in isolation. Many of these ethical issues are intimately related, however, to the technology: issues of reliability of the system in the emergency room, data privacy in the insurance company, free speech and misuse in the public university lab. These are not just social systems; they are socio-technical systems, and the ethical issues associated with them are based in the particular combination of technology and social system. It is the technology, embedded in the social system, that shapes the ethical issues.


What is a socio-technical system?

You have divined by now that a socio-technical system is a mixture of people and technology. It is, in fact, a much more complex mixture. Below, we outline many of the items that may be found in an STS. In the notes, we will make the case that many of the individual items of a socio-technical system are difficult to distinguish from each other because of their close inter-relationships.

Socio-technical systems include:

  • Hardware: Mainframes, workstations, peripherals, connecting networks. This is the classic meaning of technology. It is hard to imagine a socio-technical system without some hardware component (though we welcome suggestions). In our above examples, the hardware is the microcomputers and their connecting wires, hubs, routers, etc.
  • Software: Operating systems, utilities, application programs, specialized code. It is getting increasingly hard to tell the difference between software and hardware, but we expect that software is likely to be an integral part of any socio-technical system. Software (and by implication, hardware too) often incorporates social rules and organizational procedures as part of its design (e.g. optimize these parameters, ask for these data, store the data in these formats, etc.). Thus, software can serve as a stand-in for some of the factors listed below, and the incorporation of social rules into the technology can make these rules harder to see and harder to change. In the examples above, much of the software is likely to change from the emergency room to the elementary school. The software that does not change (e.g. the operating system) may have been designed with one socio-technical system in mind (e.g. Unix was designed with an academic socio-technical system in mind). The re-use of this software in a different socio-technical system may cause problems of mismatch.
  • Physical surroundings: Buildings also influence and embody social rules, and their design can affect the ways that a technology is used. The manager's office that is protected by a secretary's office is one example; the large office suite with no walls is another. The physical environments of the military supplier and the elementary school are likely to be quite different, and some security issues may be handled by this physical environment rather than by the technology. Moving a technology that assumes one physical environment into a different one may cause mismatch problems.
  • People: Individuals, groups, roles (support, training, management, line personnel, engineer, etc.), agencies. Note that we list here not just people (e.g. Mr. Jones) but roles (Mr. Jones, head of quality assurance), groups (management staff in Quality Assurance), and agencies (the Department of Defense). In addition to his role as head of quality assurance, Mr. Jones may also have other roles (e.g. teacher, professional electrical engineer, etc.). The person in charge of the microcomputers in our example above may have very different roles in the different socio-technical systems, and these different roles will bring with them different responsibilities and ethical issues. Software and hardware designed assuming the kind of support one would find in a university environment may not match well with an elementary school or emergency room environment.
  • Procedures: Both official and actual; management models, reporting relationships, documentation requirements, data flow, rules & norms. Procedures describe the way things are done in an organization (or at least the official line regarding how they ought to be done). Both the official rules and their actual implementation are important in understanding a socio-technical system. In addition, there are norms about how things are done that allow organizations to work. These norms may not be specified (indeed, it might be counter-productive to specify them). But those who understand them know how to, for instance, make complaints, get a questionable part passed, and find answers to technical questions. Procedures are prime candidates to be encoded in software design.
  • Laws and regulations: These are also procedures, like those above, but they carry special societal sanctions if the violators are caught. They might be laws regarding the protection of privacy, or regulations about the testing of chips for military use. These societal laws and regulations might be in conflict with internal procedures and rules. For instance, some companies have implicit expectations that employees will share (and probably copy) commercial software. Obviously these illegal expectations cannot be made explicit, but they can be made known.
  • Data and data structures: What data are collected, how they are archived, to whom they are made available, and the formats in which they are stored are all decisions that go into the design of a socio-technical system. Data archiving in an emergency room will be quite different from that in an insurance company, and will be subject to different ethical issues too.
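The component list above can be made concrete with a small sketch. The following is a minimal, purely illustrative model (all class, field, and example names are our own assumptions, not part of any real system) showing how the same hardware, embedded in two different socio-technical systems, sits alongside very different people, procedures, and laws:

```python
from dataclasses import dataclass, field

@dataclass
class SocioTechnicalSystem:
    """A minimal, illustrative model of the STS components listed above."""
    hardware: list = field(default_factory=list)      # machines, networks
    software: list = field(default_factory=list)      # OS, applications
    surroundings: list = field(default_factory=list)  # buildings, rooms
    people: dict = field(default_factory=dict)        # person -> role
    procedures: list = field(default_factory=list)    # official and actual
    laws: list = field(default_factory=list)          # externally sanctioned rules
    data: dict = field(default_factory=dict)          # dataset -> handling policy

# The same ten networked computers embedded in two different systems:
er_intake = SocioTechnicalSystem(
    hardware=["10 networked microcomputers"],
    software=["patient-intake application"],
    people={"Ms. Smith": "triage nurse"},
    procedures=["log every admission immediately"],
    laws=["patient-privacy regulations"],
    data={"patient records": "restricted access"},
)

school_lab = SocioTechnicalSystem(
    hardware=["10 networked microcomputers"],
    software=["educational software"],
    people={"Mr. Jones": "lab supervisor"},
    procedures=["students use the lab only during class"],
    data={"student work": "local storage"},
)

# Identical hardware, different systems: hence different ethical issues.
assert er_intake.hardware == school_lab.hardware
assert er_intake.laws != school_lab.laws
```

The point of the sketch is simply that the hardware field alone cannot distinguish the two systems; the ethical differences live in the other fields.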


Socio-Technical Systems change over time

So far, we have been talking about differences between different socio-technical systems. In this section we address the changes that can occur over time within any particular socio-technical system.

An STS is configurable in all its elements, and this allows for change over time. By configurable, we mean both that the mix of items in an STS can change over time and that the configuration of any individual element may change. For instance, the particular mix of hardware and software within an elementary school’s computing lab may change as the school gets access to the internet, or as more teachers begin to use the lab for their classes. But this change might also be reflected in changes in procedures (e.g. rules about access to certain sites), people (someone may need to take on the role of censor in approving or disapproving sites), and data (downloaded software, music, cookies, etc. on the machines’ hard drives).
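The site-access example illustrates how a procedure and a role migrate into software when the system's configuration changes. The sketch below is hypothetical (the class, site names, and rules are our own, invented for illustration): it shows a social procedure, "a designated person approves sites," becoming an explicit, configurable rule in code.

```python
class AccessPolicy:
    """Hypothetical sketch: a site-access procedure encoded in software."""

    def __init__(self):
        self.approved = set()  # sites someone in the "censor" role has approved
        self.blocked = set()   # sites explicitly disapproved

    def approve(self, site):
        self.approved.add(site)

    def block(self, site):
        self.blocked.add(site)

    def allowed(self, site):
        # The social rule is now a visible, changeable configuration:
        # nothing is reachable until a person has approved it.
        return site in self.approved and site not in self.blocked

lab = AccessPolicy()
lab.approve("encyclopedia.example.org")
lab.block("games.example.com")

assert lab.allowed("encyclopedia.example.org")
assert not lab.allowed("games.example.com")
assert not lab.allowed("unreviewed.example.net")  # default-deny until approved
```

Note how the design choice (default-deny) is itself a social decision; a default-allow policy would encode a quite different procedure in the same software.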

Change in an STS has a trajectory.

As the above example indicates, the change from a stand-alone computer lab to a lab connected to the internet may produce a coordinated set of changes within the socio-technical system. This coordinated series of changes in an STS is called a trajectory. These changes can occur at the level of the STS itself, as in the internet example, or they can occur at the level of the individual parts of the system. For example, a different (but overlapping) socio-technical system supports the rapid evolution of microcomputers and their regular obsolescence. Elementary schools that do not keep up with this trajectory of the hardware in their system will find themselves quickly beset with problems.

These trajectories are most influenced by those with social power.

Since these trajectories are coordinated, who coordinates them? Research by psychologists, sociologists, and anthropologists in social informatics has led to the conclusion that trajectories are most influenced by, and usually support, those with social power. A few minutes’ reflection will make this statement seem self-evident. Social power is measured by money, influence, and other forces available to actors to help them influence change in a way that is in line with their goals. So, saying that trajectories are most influenced by those with social power is saying, in essence, that those with social power have power to influence trajectories. Not too surprising.

But the point is more than this. Trajectories usually support the status quo: those who already have power in a system. These are the individuals who have the most influence in the construction of the technical specifications of a technology, who pay for its implementation, and who guide its use so that it serves their purposes.

There is still an ongoing debate among those who study such things about whether social power always wins in the contest to influence technological trajectories. There is, for instance, clear evidence that struggle groups, groups with much less political power than governments, have been able to effectively use computing technology (specifically the internet) to augment their power. On the other hand, many repressive governments use technology in finely crafted ways to control the information their populations may access.

Research on the use of technology in organizations has not supported earlier claims that the technology itself will produce a "leveling effect" in organizations (Attewell & Rule). The idea, appealing at first, was that since technology enables easy communication among all levels of an organization, it will have the inevitable effect of "flattening out" the hierarchy in organizations that adopt it. By and large, this has not turned out to be true. Organizations can adopt computing technology with the intent of flattening their hierarchy, and it can help do this. But organizations can also adopt computing technology with the intent of strengthening the hierarchy (by, for example, installing keystroke-level work monitoring on all computers). Again, it is the socio-technical system that produces the effects and structures the ethical problems, rather than the technology alone.


Trajectories are not value neutral.

A moment's reflection should lead you to the conclusion that trajectories are rarely value-neutral. Trajectories have consequences and these consequences may be good or ill (or good AND ill) for any of the stakeholders in a socio-technical system. This is why ethical reflection should be a part of thinking about socio-technical systems.

Socio-technical systems and our ethical cases

Why should we use the language and approach of socio-technical systems in analyzing our cases? There are really two questions here:

  • What does socio-technical analysis add to the standard software engineering approach? Standard software engineering approaches certainly focus on hardware, software, and the procedures and rules that should be incorporated into the software. To the extent that they concentrate on people and roles, they are mostly interested in the explicit interaction a person has with the technology and in the place in the hierarchy the person occupies as a result of their role. The concentration here is most clearly on the visible and the documented. A socio-technical analysis adds to this picture those aspects of work that are implicit, not documented, and based in informal political systems and actual (rather than ideal or documented) work practices. For the purpose of designing systems, a socio-technical analysis adds, to the standard concern with efficiency, concerns about skill development and social context. For the purpose of ethical analysis, a socio-technical analysis adds a concern for discovering the hidden practices and possible undocumented effects of a technological implementation.
  • How does socio-technical system analysis differ from standard analysis in ethics? Standard stakeholder analysis in ethics spends most of its time looking (surprise) for stakeholders. This is really only one element of what we have identified as a complex socio-technical system. Procedures, physical surroundings, laws, data and data structures, etc. all interact to structure the particular ethical issues that will be relevant to a particular socio-technical system. Thus, a socio-technical analysis provides a more comprehensive and system-oriented approach to identifying ethical issues in a particular implementation of technology.
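The contrast between the two analyses can be sketched in a few lines. The functions, element names, and question wording below are all illustrative assumptions of ours, not an established method; the sketch simply shows that walking the full STS element list surfaces candidate issues a stakeholder-only pass would never generate.

```python
# Hypothetical sketch: stakeholder-only analysis vs. a socio-technical checklist.
STS_ELEMENTS = [
    "hardware", "software", "physical surroundings",
    "people", "procedures", "laws and regulations", "data",
]

def stakeholder_analysis(system):
    # Standard ethics analysis: look only for affected people.
    return [f"How is '{p}' affected?" for p in system.get("people", [])]

def sts_analysis(system):
    # Socio-technical analysis: every element can structure an ethical issue.
    questions = []
    for element in STS_ELEMENTS:
        for item in system.get(element, []):
            questions.append(f"What ethical issues does '{item}' ({element}) raise?")
    return questions

insurance_office = {
    "people": ["risk analysts", "policyholders"],
    "data": ["claims records"],
    "laws and regulations": ["data-privacy rules"],
}

# The STS checklist surfaces issues (archiving, regulation) that the
# stakeholder list alone would miss.
assert len(sts_analysis(insurance_office)) > len(stakeholder_analysis(insurance_office))
```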
