Would it be acceptable to use Siri as a surveillance device?

Group 10

Introduction

Statistics show that around 50% of the UK’s population owns an iPhone [1]. These smartphones account for roughly half of the revenue of Apple, the most valuable company in the world, which built its striking reputation in part on protecting the privacy of its users. In 2011, Apple launched Siri [2], a digital assistant that makes interacting with Apple devices easier. Siri can perform tasks such as making calls, providing weather forecasts, setting alarms and answering miscellaneous questions. Despite Siri’s wide scope and diverse applications, would using it as a surveillance device be ethical, and would consumers object to a more pervasive level of surveillance from their smartphones?

Argument against

Today, we live in a society where everyone’s data is less and less private. In this context, even Apple, which claims that “what happens on your iPhone, stays on your iPhone” [3], seems to be no exception to the rule.

According to an article in The Guardian [4], recordings captured by Siri, including confidential information, have been listened to in order to enhance response quality.

This argument was further supported by a whistleblower working for Apple, who claimed that numerous recordings of private discussions, conversations between doctors and patients, criminal dealings, business deals and other highly sensitive details have been gathered, both accidentally and deliberately. Siri is prone to accidental activations, which can be triggered by negligible stimuli such as the sound of a zip [5].
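To illustrate how such accidental activations arise, below is a minimal, purely illustrative sketch of a threshold-based wake-word detector. It is not Apple’s implementation: the scoring function, the threshold value and all names are assumptions made for illustration.

```python
# Minimal sketch of a threshold-based wake-word detector.
# NOT Apple's implementation: the scoring model, the 0.70 threshold
# and all names here are invented for illustration only.

import random


def wake_word_score(audio_frame: bytes) -> float:
    """Stand-in for an acoustic model that scores how closely an audio
    frame matches the wake phrase (0.0 = no match, 1.0 = exact match).
    A real detector would use a trained neural network."""
    return random.random()  # placeholder score


WAKE_THRESHOLD = 0.70  # assumed trade-off between missed and false wakes


def should_start_recording(audio_frame: bytes) -> bool:
    # Any sound whose score clears the threshold starts a recording,
    # which is how acoustically similar noise (a zip, for instance)
    # can trigger the device even though nobody said the wake phrase.
    return wake_word_score(audio_frame) >= WAKE_THRESHOLD
```

Lowering the threshold catches more genuine activations but records more bystander audio; raising it does the reverse, which is why some rate of accidental recording is hard to eliminate entirely.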

According to the Federal Trade Commission, which promotes consumer protection, it is unethical to tell consumers that their information is protected while secretly eavesdropping on them, even when the individual remains anonymous [6]. This conflicts with care ethics, as it damages the relationship of trust between the two parties. From a utilitarian perspective, and considering normative judgement, eavesdropping is immoral because the majority affected, the users whose conversations are recorded, would be unhappy knowing that their information is shared with others.

In a more recent incident, in 2022, users’ conversations were recorded even though they had opted out via the Siri & Dictation settings. The bug was fixed on most iPhones and the company claims that the data collected has been deleted [7]. A year earlier, Apple was sued, with plaintiffs claiming that unauthorised recordings were shared with advertisers. Users in the lawsuit declared that they had been receiving targeted ads based on their private conversations; these advertisements included branded treatments, sunglasses and sneakers [8]. The company is violating virtue ethics by not living up to its promises: its published commitment to protecting clients’ privacy [9] does not match its actions. Grading responses should not be seen as an excuse to invade people’s privacy, and users should have transparency about the actions taken.

After these revelations, Apple apologised. The earlier lack of transparency is not in accordance with care ethics. Following the apology, the company decided to change Siri’s data policy, which shows that it is aware that monitoring its customers could lead to a loss of their trust [10]. Moreover, the initial absence of disclosure may indicate a fear of telling customers the truth.

While digital surveillance might have many advantages, the population is still not ready to accept it, as it invades their privacy and undermines their freedom and rights over their own devices. This is supported by the articles criticising the company’s actions, the lawsuits filed and the ethical principles breached.

[Image: Siri, Apple’s voice-activated digital assistant, prompting an iPhone user with the text “Go ahead, I’m listening” (Bangkok, Thailand, Jul 30, 2019).]

Argument for

Jeremy Bentham’s Panopticon, a conceptual prison in which all inmates can be observed from a single surveillance tower, is an early example of the unfeeling nature of utilitarianism. Broadly recognised today as unethical and psychologically detrimental to inmates, who were unable to determine when they were being watched, the Panopticon nonetheless offers a template for a contemporary, and more liberal, means of mass surveillance [11].

Siri, a virtual assistant native to all Apple devices, could provide these means. Whereas Bentham’s Panopticon was conceived to discipline inmates primarily through fear, users of Siri, who must purchase Apple products to access it, give implicit consent to being listened to. This places the use of Siri as a means of surveillance squarely within the scope of deontological thinking: should prospective users wish not to be surveilled themselves, or consider the surveillance of others unethical, they need not purchase Apple devices.

Siri overcomes certain human limitations of mass surveillance systems. Modern, programmable technology inherits the biases of its creators [12] but can, through proofing, scaled trial implementation and even state intervention [13], be refined to operate independently of human prejudice. Where once a prison guard would make wholly human judgements, similar judgements can now be made, instantaneously, by AI. AI is also a far more suitable subject for virtue ethics than humans are. Whilst it is reductive to think that a virtuous person is capable only of performing virtuous acts, an AI programmed exclusively to collect audio data for public protection can be considered to operate in a way compatible with virtue ethics. Ultimately, Siri, if able to listen to its surroundings, could alert authorities to cases of domestic violence, suspected terrorist activity and medical emergencies, all the while collecting a comprehensive portfolio of evidence to be securely transferred to the appropriate authorities should the need arise.
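To make the proposal concrete, the following is a toy sketch of the pipeline this paragraph imagines. Every name, category label and threshold here is hypothetical; Siri has no such capability, and this is merely one way such a system could be structured.

```python
# Toy sketch of the hypothetical alert pipeline described above.
# All labels, thresholds and the alert hand-off are invented for
# illustration; no such system exists in Siri.

from dataclasses import dataclass, field

# Hypothetical categories the essay suggests the assistant could flag.
ALERT_CATEGORIES = {"domestic_violence", "terrorism", "medical_emergency"}


@dataclass
class EvidenceLog:
    """Portfolio of flagged clips to hand to authorities if needed."""
    clips: list = field(default_factory=list)

    def add(self, label: str, clip: bytes) -> None:
        self.clips.append((label, clip))


def classify(clip: bytes) -> tuple[str, float]:
    """Stand-in for an audio classifier returning (label, confidence)."""
    return "none", 0.0  # placeholder: a real model would analyse the clip


def notify_authorities(label: str) -> None:
    print(f"ALERT: suspected {label} reported to the appropriate authority")


def monitor(clip: bytes, log: EvidenceLog) -> None:
    label, confidence = classify(clip)
    # Only act on high-confidence detections of the flagged categories.
    if label in ALERT_CATEGORIES and confidence > 0.95:
        log.add(label, clip)
        notify_authorities(label)
```

The design choice doing the ethical work here is the confidence threshold: it determines how often innocent conversations would be logged and reported, which is precisely the trade-off the against case objects to.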

On the surface, such intimate surveillance may seem unethical. In reality, if the general public believes that domestic abusers and potential terrorists should be comprehensively surveilled, then, by Kantian deontology, it is ethical to employ such methods of surveillance.

Siri-based surveillance could also relieve select individuals of gruelling, high-pressure jobs. Where Apple has a duty of care to its customers, state surveillance institutions have a duty of care to their employees. Removing a portion of the responsibility for identifying bomb threats, known to be highly unforeseeable events [14], from human hands would be in line with such a duty of care.

Even if Siri-based surveillance proves an imperfect solution – which it almost certainly will [15] – the nature of AI dictates that improvements to such systems will be rapid and robust. Whilst humans are limited to the constraints of Darwinian evolution, technological progress – specifically in AI – is Lamarckian in nature [16]. As in many other fields, AI will be dominant within the surveillance sector in the not-so-distant future.

Initial Decision

We are against using Siri as a surveillance device.

References

[1]: Harry Brown, “How many people own a phone in the UK”, Gadget Cover, Apr 26, 2019, Accessed Mar 23, 2022, https://www.gadget-cover.com/blog/how-many-people-own-a-phone-in-the-uk#:~:text=If%20you%20guessed%20Apple%2C%20you,followed%20by%20Huawei%20(7.3%25).

[2]: Jennifer Allen, “10 years of Siri: the history of Apple’s voice assistant”, TechRadar, Oct 04, 2021, Accessed Mar 23, 2022, https://www.techradar.com/uk/news/siri-10-year-anniversary

Arguments against:

[3]: Gary Ng, “Apple’s Las Vegas Ad Touts ‘What Happens on your iPhone, Stays on your iPhone’”, iPhone in Canada, Jan 05, 2019, Accessed Mar 26, 2022, https://www.iphoneincanada.ca/news/apple-las-vegas-iphone/#google_vignette

[4]: Alex Hern, “Apple overhauls Siri to address privacy concerns and improve performance”, The Guardian, Jun 07, 2021, Accessed Mar 26, 2022, https://www.theguardian.com/technology/2021/jun/07/apple-overhauls-siri-to-address-privacy-concerns-and-improve-performance

[5]: Alex Hern, “Apple contractors ‘regularly hear confidential details’ on Siri recordings”, The Guardian, Jul 26, 2019, Accessed Mar 26, 2022, https://www.theguardian.com/technology/2019/jul/26/apple-contractors-regularly-hear-confidential-details-on-siri-recordings

[6]: Ted Greenberg, “We need a full investigation into Siri’s secret surveillance campaign”, The Guardian, Aug 14, 2020, Accessed Mar 26, 2022, https://www.theguardian.com/commentisfree/2020/aug/14/apple-siri-secret-surveillance-campaign-investigation

[7]: Emma Roth, “Apple says a ‘small portion’ of iPhones recorded interactions with Siri even if you opted out”, The Verge, Feb 09, 2022, Accessed Mar 26, 2022, https://www.theverge.com/2022/2/8/22924225/apple-ios-15-bug-recorded-interactions-siri

[8]: Jonathan Stempel, “Apple must face Siri voice assistant privacy lawsuit – U.S. judge”, Reuters, Sep 02, 2021, Accessed Mar 26, 2022, https://www.reuters.com/technology/apple-must-face-siri-voice-assistant-privacy-lawsuit-us-judge-2021-09-02/

[9]: Apple, “Improve Siri and Dictation & Privacy”, Apple, n.d., Accessed Mar 23, 2022, https://www.apple.com/legal/privacy/data/en/improve-siri-dictation/

[10]: Alex Hern, “Apple apologises for allowing workers to listen to Siri recordings”, The Guardian, Aug 29, 2019, Accessed Mar 26, 2022, https://www.theguardian.com/technology/2019/aug/29/apple-apologises-listen-siri-recordings

Arguments for:

[11]: The Ethics Centre, “Ethics Explainer: The Panopticon”, Jul 18, 2017, Accessed Mar 30, 2022.

[12]: Gideon Mann and Cathy O’Neil, “Hiring Algorithms Are Not Neutral”, Harvard Business Review, Dec 09, 2016, Accessed Mar 30, 2022, https://hbr.org/2016/12/hiring-algorithms-are-not-neutral

[13]: Laura Hudson, “Technology Is Biased Too. How Do We Fix it?”, FiveThirtyEight, Jul 20, 2017, Accessed Mar 30, 2022, https://fivethirtyeight.com/features/technology-is-biased-too-how-do-we-fix-it/amp/

[14]: Timme Bisgaard Munk, “100,000 false positives for every real terrorist: Why anti-terror algorithms don’t work”, First Monday, Aug 05, 2017, Accessed Mar 30, 2022, https://firstmonday.org/ojs/index.php/fm/article/download/7126/6522

[15]: Ben Lovejoy, “Siri better at responding to medical & personal emergencies, say researchers, but more work needed”, 9to5Mac, Dec 28, 2016, Accessed Mar 30, 2022, https://9to5mac.com/2016/12/28/siri-medical-emergency-personal-crisis/amp/

[16]: Milan Jelisavcic, Kyrre Glette, Evert Haasdijk and A. E. Eiben, “Lamarckian Evolution of Simulated Modular Robots”, Frontiers in Robotics and AI, Feb 18, 2019, Accessed Mar 30, 2022, https://www.frontiersin.org/articles/10.3389/frobt.2019.00009/full

2 thoughts on “Would it be acceptable to use Siri as a surveillance device?”

  1. It is an interesting topic and poses a good question concerning privacy and how AI deals with our information today. I think the utilitarian argument in the against case could have been made clearer by saying which majority is being affected. I like the arguments for using Siri as a surveillance tool. However, I guess the question now is how we would moderate using Siri to stop these threats while respecting and protecting privacy in the real world. It may be hard to reach a win-win solution but I believe one should exist.

  2. Feedback
    1. Clarity of problem/dilemma
    I wasn’t too clear on the dilemma as you didn’t give too much detail on the ‘more pervasive level of surveillance’ – is this Siri’s ‘always on’ function?

    2. Use of ethical theories in the For Case
    The first paragraph seems to support the ‘Against’ argument. Am I mis-reading it? I recall from talking to you an example where Siri detected someone had fallen and alerted a care-giver – this seemed like a good use of Siri’s surveillance ability. Could you expand the ethical reasoning here as it doesn’t seem as strong as the ‘Against’ case.

    3. Use of Ethical theories in the Against case
    Effective reasoning, which makes the case against strong. Well done.

    4. Advice on Assignment Two
    a. Identifying stakeholders
    b. Courses of action
    There are lots of beneficial ways Siri can be used, so looking for a win-win may be the best course to take BUT Black/White could also be used in terms of removing the surveillance function.
    Beyond Apple, its customers and regulatory authorities, I can’t think of any other stakeholders. Maybe the wider public in the case of terrorism prevention?

    5. Personal remarks
    It’s an interesting topic, but it does seem like there’s a stronger argument against than for, please check this so that the nature of the dilemma is clearer.
