Digital Ethics Of Autonomous Vehicles

Are corporations rushing to win the autonomous vehicle race at the expense of their users’ security and privacy?




In 2017, forty-nine percent of the global population was connected online, and an estimated eight and a half billion connected things were in use worldwide (Gartner, 2017). These connected devices make up the Internet of Things (IoT) and are becoming increasingly interwoven into our daily lives. As the sprawling environment of the Internet of Things continues to grow (PEW, 2014), it presents us with distinct ethical and moral issues that need to be considered. Nowhere is this more apparent than in the development and future of connected and autonomous vehicles (CAVs). This report will consider the implications of CAVs for our society and the risks to our privacy, security and safety. CAVs are as vulnerable to privacy intrusions as any other device connected to the internet. Privacy can be explained through theories such as Tavani’s, Moore’s and Westin’s, and utilitarian and deontological approaches can develop these arguments further.

“CAVs are vulnerable to the possibility of privacy intrusions as much as any other device connected to the internet.”

In 2015 a connected Jeep (Wired, 2015), and in 2016 a semi-autonomous Tesla (Wired, 2016), were hijacked by hackers who succeeded in taking control of a moving vehicle by remotely breaking into the car’s internal computer network. Risks such as these pose a serious threat to the immediate safety of individuals using CAVs, and also increase the risk of large-scale cyber-attacks. Poorly secured IoT products have already been exploited in Distributed Denial of Service (DDoS) attacks.

Concept design of interior of self-driving cars.

In 2016, Mirai, a strain of DDoS malware, was discovered that used publicly and easily available exploits of unsecured IoT devices (Guardian, 2016). This shows that internet-connected vehicles need to be constantly updated to ensure better security, data protection and privacy. Yet if we look at computers and mobile phones, a survey found that fourteen percent of people never update their phone’s operating system, and forty-two percent only do so when it is convenient, despite these updates sometimes containing urgent security fixes (PEW, 2017). CAVs must not fall into the same trap: updates need to be maintained, either automatically or through a new system of insurance that invalidates claims made without the latest update installed. The general public seemingly gets frustrated with updates, and Monahan’s (2001) theory applies here: a lack of education within society likely exacerbates such problems. Additionally, the duration for which updates will be maintained needs consideration. When a newer model enters the market, those who cannot afford to upgrade may be disadvantaged. Manufacturers may take the utilitarian approach here, but this may benefit the rich over the poor, perpetuate the digital divide further and reinforce Monahan’s argument.

The Internet of Things

Moreover, individuals within the computing profession may need further education in order to regulate the security of these systems. If the BCS is ‘committed to making IT good for society’ (BCS, 2017), it needs to ensure that the IT professionals it represents are doing their best to prevent the hacking of these vehicles. In addition, accessibility and informational privacy (Tavani, 2008) can be related to hacking, in which the personal space of an individual – in this case, a car – is intruded upon, and their personal information is transferred and exchanged. Consider what is done with the information obtained through hacking: where it is sold to third parties, a utilitarian approach (BBC, 2014a) would deem this unethical because of the way the information was collected (without permission) and how it is being used. This is a problem that needs to be considered with fully automated cars.

Fortune (2017) estimates that by 2040, ninety-five percent of new vehicles sold will be fully autonomous. These cars will likely have no manual driving controls, except as a means to instruct the vehicle about the desired destination. Vehicles such as the self-driving Google car, currently in development, aim to take the driver out of the loop and free them to do something other than drive. But free to do what exactly? Lipson and Kurman (2016) envision cars with built-in entertainment systems and even beds that “could offer a comfortable new viewing experience for fans of pornography”.

Waymo Google self-driving cars. Two cars and one HGV parked in a line.
Google self-driving fleet.

IoT devices are often marketed as giving you ‘more time’, but is this just a pretence for corporations to collect our data and use it as a means to sell their products? Do they value profit over privacy, or vice versa (The Conversation, 2018)? It is not hard to imagine corporate surveillance becoming more common, whether the user is aware of it or not. Smart TVs have already been found to watch you (Federal Trade Commission, 2017) and personal assistants could be listening (Guardian, 2018), so what is stopping similar companies from doing this in a car? There they could have access to conversations, or gain an understanding of what you like to listen to, watch or read in your ‘free time’. This sort of invasion of privacy can be aligned with the idea and growing commonality of surveillance (Tavani, 2008), and raises the question of whether it is something people will simply have to adjust to as it becomes more common in society. Deontological ethics (BBC, 2014b) would argue that if these companies were not following the rules and were listening without permission, then their behaviour is unethical and therefore an invasion of privacy.

Telenav (2018) is a company already developing in-car advertisement software with plans to monetise your data. But not all drivers may understand what privacy rights they are signing away. A US Government Accountability Office report (2017) found that none of the 13 carmakers in the study that collected data from connected vehicles had easy-to-read privacy notices, and most did not explain their data sharing and use practices. This raises the question of user consent regarding the use of personal information. It also completely disregards the principle in the recently introduced EU GDPR legislation (2016) that information must be processed in a transparent way, along with the possibility of riders and drivers unknowingly renouncing one of their basic human rights: the right to a private life (Human Rights Act, 1998).

Telenav UI

Although it may appear that we are years away from a mass network of fully autonomous cars on our roads, the race to be first, and therefore the market leader, has already begun. As with any new technology, it raises the question: who will benefit most? Network effects support the entrenchment of digital sovereigns that maintain control of our data and therefore hold pivotal points of control over our virtual lives (MacKinnon, 2012). Big decisions about people’s lives are increasingly made by software systems and algorithms; the more users these digital sovereigns retain, the more power they appear to acquire. But can they be trusted?


VW recently damaged its reputation when it emerged that programmers had inserted code into its cars’ central computers that could detect when a car was being tested for emissions in a laboratory (Guardian, 2015). Its cars were found to be emitting up to twenty times the permitted limit. Here, then, is evidence of software knowingly developed by a company that has an impact on health and the environment in the physical world. Although computers are digital entities, human beings write the code behind them, and this scandal provides a possible insight into the future of CAVs: tighter regulations may need to be introduced. Independent bodies like the ACM could ‘enable professional development; and promote policies and research that benefit society’ (ACM, 2018). These bodies should carry out proper checks and regulation of the individuals developing these systems, and could promote internal whistleblowing for those asked to work on unlawful practices. As the infrastructure and standards for IoT develop, car manufacturers need to work through ethical issues before CAVs become mainstream on our roads. The problem is that these standards are often not applied and there are no defined laws.

Bird’s-eye view of a motorway intersection.
Photo by Denys Nevozhai

As technology outpaces law, computer professionals have the option to design secure and private systems. It may appear that machines make the decisions, especially in the trolley scenario (Quartz, 2018), but the reality is that humans program such systems to ‘think’ in a defined way. If these systems are to be designed around corporate interests, how will they promote a socially responsible deployment? Codes of conduct that outline the importance of securing IoT devices and CAVs are currently voluntary.

This year, the UK government’s Department for Digital, Culture, Media and Sport (2018) released guidelines providing ‘recommendations in draft’ that shift the burden from consumers to ‘secure by design’ IoT products. These are merely advisory suggestions, but they could prove useful groundwork for future legislation. Professional conduct by its very nature requires professionals to act collectively; Johnson (2008), for example, presents three examples: life (in medicine), safety (in engineering), and accuracy (in auditing). Tougher regulation could be one way to address these concerns, allowing professionals to refuse to do anything inconsistent with the standards and values outlined. Some argue that there is still time for manufacturers to work through the ethical and moral issues before fully autonomous vehicles are deployed in the mainstream (Kirkpatrick, 2015); moreover, predictions vary widely as to when this will happen (ibid.).


Despite the UK government’s attempts to be seen as a leader in secure IoT development (Department for Digital, Culture, Media and Sport, 2018), other state plans send conflicting and problematic messages. Theresa May’s ‘snoopers’ charter’ would effectively reduce the security of IoT and CAVs by forcing companies to build a ‘backdoor’ into systems that allows authorities to surveil users (Wired, 2017). Such backdoors hold profound interest for criminals, who could exploit the weaknesses they create. Lawful interception by police as a means to pull over vehicles could also be detrimental to people living under dictatorships or unstable and controversial governments. China, for instance, favours ‘internet sovereignty’, in which states govern the internet through international law and organisations (Fidler, 2016). Bashar al-Assad’s secret police, or radical groups that control regions, are other potential threats to consider before including such capabilities. You could argue this is too much control, but is it any different from the control already available to corporations? Looking at the possibility of state access through the deontological lens, the government has a duty to keep its people safe and to prosecute those who are unlawful and pose a threat to society.

Silver truck driving along a road. The driver is using a tablet computer.
Daimler self-driving truck.

The driverless cars of the future are likely to outperform humans, with better reaction times and the inability to become distracted or tired, and therefore create a safer experience. Many sources, such as McKinsey & Co. (2015), claim that automation will reduce accidents by as much as 90%. It is likely manufacturers will take a utilitarian stance, arguing that machine control benefits the larger population by maximising their safety. As these vehicles become more capable, however, practical knowledge and skillsets among a proportion of society may be devalued.

“driverless cars of the future are likely to outperform humans, with better reaction times and the ability to not get distracted or tired”

In 2015, Daimler’s self-driving truck was revealed as the first to be approved for use on roads in the United States (Rutkin, 2015). The freight industry is not alone: Uber and Lyft are currently working on automated taxi infrastructure (Medium, 2018). Driving careers would certainly be affected, but to counter this, demand for technicians, mechanics and customer-service teams would increase as the implementation of commercial AVs expands. In the eighties and nineties there were similar concerns that the invention of automatic teller machines (ATMs) would take the place of human tellers (The Atlantic, 2015). Although individual branches had fewer tellers, the number of branches, and with it the total number of tellers, grew. The replacement of freight drivers, however, could be economically significant. In the US alone, there are about 1.7 million heavy truck drivers (BLS, 2018a) and 1.4 million delivery drivers (BLS, 2018b). There will therefore need to be low-skilled jobs available, not just high-skilled software and engineering opportunities. Without this range in place, the digital divide could widen between different strands of society.


To conclude, Wolmar (LabourList, 2017) argues this technology is building pace forcefully as manufacturers compete to be the first success in the market, while established companies do not want to be left behind in the development of what seems a certain future. CAVs, and eventually fully autonomous vehicles, will shift the infrastructure, environment and culture of society, but the preparations for such a shift seem murky. As IoT devices become ever more present in our lives, privacy and security concerns are likely to increase. Although the advantages of CAVs seem appealing, it will be the professionals who design these systems who need to push for better regulation and consider the ethical issues around users’ data and the right to a private life. Both deontological and utilitarian ethical approaches have been applied here, along with Tavani’s theories of privacy.


  1. ACM. (2018) What is ACM? Available from: [Accessed 25 November 2018].
  2. BBC. (2014a) Consequentialism. Available from: [Accessed 24 November 2018].
  3. BBC. (2014b) Duty-based Ethics. Available from: [Accessed 24 November 2018].
  4. BCS. (2017) About Us. Available from: [Accessed 24 November 2018].
  5. BLS (2018a) Occupational Employment and Wages, May 2017. Available from: [Accessed 12 December 2018].
  6. BLS (2018b) Delivery Truck Drivers and Driver/Sales Workers. Available from: [Accessed 12 December 2018].
  7. Department for Digital, Culture, Media and Sport (2018) Secure by Design: Improving the cyber security of consumer Internet of Things. Report.
  8. European Parliament and Council Regulation 2016/679 of 27 April 2016 (General Data Protection Regulation) OJ L 119/1.
  9. Federal Trade Commission. (2017) What Vizio was doing behind the TV screen. Available from: [Accessed 25 November 2018].
  10. Fidler, D P. (2016) Just & Unjust War, Uses of Force & Coercion: An Ethical Inquiry with Cyber Illustrations. Daedalus, Journal of the American Academy of Arts & Sciences. 145 (4), pp.37-49.
  11. Fortune. (2017) Here’s When Having a Self-Driving Car Will Be a Normal Thing. Available from: [Accessed 23 November 2018].
  12. Gartner. (2017) Gartner says 8 billion connected things will be in use in 2017, up 31 percent from 2016. Available from: [Accessed 23 November 2018].
  13. Guardian. (2015) The Volkswagen emissions scandal explained. Available from: [Accessed 24 November 2018].
  14. Guardian. (2016) DDoS attack that disrupted internet was largest of its kind in history, experts say. Available from: [Accessed 26 November 2018].
  15. Guardian. (2018) Shhh… Alexa might be listening. Available from: [Accessed 25 November 2018].
  16. Human Rights Act 1998 [online]. Chapter 42. (1998) Westlaw UK. Available from: [Accessed 27 November 2018].
  17. Johnson, D G. (2008) Computing ethics: Computer experts: guns-for-hire or professionals? Communications of the ACM. 51 (10), pp.24-26.
  18. Kirkpatrick, K. (2015) The Moral Challenges of Driverless Cars. Communications of the ACM. 58 (08), pp.19-20.
  19. LabourList (2017) Christian Wolmar: Driverless cars not the solution to political and social problems. Available from: [Accessed 12 December 2018].
  20. Lipson, H. and Kurman, M. (2016) Driverless: Intelligent Cars and the Road Ahead. MIT Press.
  21. MacKinnon, R. (2012) Consent of the Networked: The Worldwide Struggle For Internet Freedom. New York: Basic Books Inc.
  22. McKinsey & Co. (2015) Ten ways autonomous driving could redefine the automotive world. Available from: [Accessed December 12 2018].
  23. Medium. (2018) Forget self-driving cars: Uber and Lyft have bigger ambitions. Available from: [Accessed December 2018].
  24. Mepham, T. (2005) Bioethics for the biosciences: an introduction. Oxford University Press, Oxford.
  25. Monahan, T. (2001) The analog divide: technology practices in public education. ACM SIGCAS Computers and Society ACM SIGCAS Computers and Society. 31 (3), pp.22-31.
  26. PEW Research Center (2014) The Internet of Things will thrive by 2025. Washington D.C.
  27. PEW Research Center (2017) Americans and Cybersecurity. Washington D.C.
  28. Quartz (2018) Philosophers are building ethical algorithms to help control self-driving cars. Available from: [Accessed 12 December 2018].
  29. Rutkin, A. (2015) Long road to autonomy. New Scientist. 226 (3021) pp.22.
  30. Subcommittee on Research and Technology, Committee on Science, Space, and Technology, House of Representatives (2017) Vehicle Data Privacy. Washington D.C: United States Government Accountability Office (GAO-17-656).
  31. Tavani, H T. (2008) Informational Privacy: Concepts, Theories and Controversies. In: Himma, K.E. and Tavani, H.T., eds. (2008) The Handbook of Information and Computer Ethics [online]. Chichester; Hoboken, NJ: Wiley. pp.135-139. [Accessed 24 November 2018].
  32. Telenav (2018) Products. Available from: [Accessed 25 November 2018].
  33. The Atlantic. (2015) Scarce Skills not Scarce Jobs. Available from: [Accessed December 2018].
  34. The Conversation (2018) Cambridge Analytica used our secrets for profit – the same data could be used for public good. Available from: [Accessed 28 November 2018].
  35. Wired. (2015) Hackers remotely kill a Jeep on the highway – with me in it. Available from: [Accessed November 2018].
  36. Wired (2016) Tesla responds to Chinese hack with a major security upgrade. Available from: [Accessed November 2018].
  37. Wired (2017) What is the IP act and how will it affect you? Available from: [Accessed December 2018].

Tags: Digital Ethics, Privacy and Security