Dr David Douglas in Delft: defining ethical responsibilities in new technologies  

June 25th, 2023

Presenting at the 2023 forum for Philosophy, Engineering, and Technology, DR DAVID DOUGLAS shared insights from current research defining ethical responsibilities and risks in relation to new technologies.

Dr David Douglas presents his current research in Delft.

In April 2023, Dr David Douglas represented the Responsible Innovation Future Science Platform at the 2023 forum for Philosophy, Engineering, and Technology (fPET 2023), held in Delft, the Netherlands. His presentation focussed on a current research collaboration with Dr Justine Lacey, which aims to define the concept of ‘ethical risk’ for new technologies.

Dr Douglas argued that ethical risk needs to be clearly distinguished from other forms of risk, such as legal or environmental risk. His account focuses on how the ethical risks of new technologies are interconnected with the responsibilities of different people who create and use them.

These might be individuals or groups who develop or use a technology, such as developers, engineers, drivers, and doctors. Or, they could be people who are affected by others using a technology, such as passengers, pedestrians, and patients. The technical risks of a new technology – that is, characteristics that may result in unwanted outcomes – can create ethical risks if they result in someone failing to meet their ethical responsibilities.

Defining ethical responsibilities

Defining what counts as an ethical responsibility is another challenge Dr Douglas is addressing with his research. These responsibilities can be identified by considering each person’s role in relation to a new technology. For example, a doctor has a responsibility to provide the best possible care for their patients. If they decide to use a new technology, they need to ensure that it is the most effective means available for treating the patient, and that any risks of harm are comparable to those of other treatment options.

Similarly, an engineer has a responsibility to ensure their solutions are safe and robust. They can do this by mitigating technical risks and by making sure their work complies with accepted engineering standards. Regulators and institutions also have responsibilities to establish and maintain standards for the risks associated with new technologies.

The power of decision-makers

Another way of identifying ethical responsibilities could be to examine how a technology is used within a larger process, and how one person’s decisions and actions can affect others. For example, the decisions a radiologist makes in preparing medical scans of a patient will affect both the patient and the doctor treating them. This is because the patient is affected by the medical scan itself, and by the doctor’s decisions for treating them based on how they interpret these scans. 

Examining who makes decisions about how acceptable a technology’s technical risks are – and who is affected by these decisions – can also help us define ethical responsibilities. If a person making decisions about technical risks is not affected by them, but others will be, the decision-maker has an ethical responsibility to the people who will be impacted. For example, a developer creating an AI system to evaluate job applicants in another industry has an ethical responsibility to those applicants to ensure the AI does not make decisions biased by their race or gender.

Surgical robots and AI design

Previous research, led by a team including Dr Douglas, has also highlighted the importance of context when determining technical and ethical risks. That research focused on the use of AI-designed, 3D-printed robotic tools for surgery. The researchers interviewed people who would be involved in creating and using these tools, including surgeons, radiologists, and patients, and found that people’s responsibilities extended beyond the specific stages of the process they were directly involved in.

For example, the creators of the AI system designing the tool were responsible for clearly defining the requirements for the medical scans that the AI would use to design the tool, and for ensuring that the tool designs created by the AI would be safe for surgeons to use.

The forum for Philosophy, Engineering, and Technology (fPET) is a conference where philosophers, engineers, and other scholars come together to discuss philosophical and ethical issues in engineering and technology. Topics discussed at the conference included moral responsibility in engineering, autonomous vehicles, AI ethics, and the philosophy of technology.