Group 42

In this new age of digitalisation, Smart Home Assistant (SHA) technology such as Alexa, Google Home and the Echo Dot brings voice command to a whole new level. Want to make a phone call? Want to play some music? All can be done with just a word to an SHA. It is predicted that 1.8 billion people will be using digital assistants by 2021 and that £10.8 billion will be spent on smart home devices in 2019 alone. But although SHAs ease our lives, are we sure they align with our ethics?

(Smart) Home (Assistants) Is Where The Heart Is

The Intergovernmental Panel on Climate Change (IPCC)’s 2018 report stated that we have only 12 years to prevent disastrous and irreversible climate change. From a utilitarian perspective, it is paramount to work towards the greatest benefit for society, and in this case, that means protecting the world we live in. Clearly, we must act now, and the first step begins in our homes.

A 2018 study by techUK found that 75% of people considered the smart energy aspect of connected homes appealing. This indicates a growing awareness of environmental responsibility, with more people playing their part in reducing carbon footprints by integrating SHAs into their homes. A household’s heating and cooling system accounts for 42% of total utility costs, with electricity and natural gas fuelling the demand. Novel solutions such as zoned HVAC systems direct heated or cooled air only to the areas that need it, while smart thermostats learn a household’s heating habits and adjust the temperature accordingly. Incorporating these measures rewards homeowners with lower utility bills and, more significantly, reduced energy consumption. With the added advantage of remote monitoring, homeowners can optimise the system to save energy even while they are away.

According to the World Health Organization, about 350 million people around the world are affected by depression at some point in their lives. Of that number, nearly 50 percent are also diagnosed with an anxiety disorder. These statistics show a worrying increase every year, and SHAs could be one way to keep the numbers down.

One way SHAs could help people with depression is by giving patients an outlet to share their feelings. They can be honest with an SHA without the burden of shame, which helps them open up about their problems. This theory is backed by Jonathan Gratch, a computer scientist and psychologist at the University of Southern California’s Institute for Creative Technologies. He states that patients engaging with an SHA tend to reveal intimate feelings, as virtual assistants hold no prejudgments or personal feelings. One can simply say “I feel depressed” to their SHA and the device will respond with a series of suggestions. Alexa, for example, reacts with calming words and provides a support hotline for further assistance whenever needed.

The ethics of care underpins this idea of helping those in need improve their social lives and mental health. By assisting them, SHAs can keep affected people from going into a downward spiral and help them lead better, happier and healthier lives, in line with the principles of care ethics.

Or Maybe It’s Not…?

SHA systems gather voice data, so unauthorized entities could access a system to identify users, or extract data to produce voice artefacts from those individuals for impersonation. In 2016, DDoS attacks against Dyn LLC exploited compromised home-embedded devices such as DVRs and webcams. Similar problems can affect SHA systems, since the gateways used in these devices are also embedded systems.

A 2017 study on privacy threats outlined two major threats in the use of SHAs. As shown in the figure below, case 1 illustrates the vulnerability of constant voice recording. An SHA records the user’s voice when it catches the ‘wake-up word’ and transmits the audio to cloud storage. If the device is compromised by a cyber-attack, it can act as a spy, allowing all sounds to be recorded and sent to an attacker in real time. Case 2 illustrates the danger of malicious voice commands. It is difficult for SHAs to perfectly recognize a user’s voice, tone and accent even when a voice-training feature is provided. Hence, an SHA is susceptible to commands from unauthorized parties with malicious intent: if an attacker is within the device’s range, they can easily trick the system into thinking the real owner is speaking. These privacy issues do not align with Kantian theory, which holds that there are moral laws all rational beings are bound to, simply because they are rational beings. In this case, the law at stake is privacy, a vital human right.

Figure 1 shows the two cases related to privacy

SHAs also have limited capabilities in understanding speech, which can prevent them from functioning perfectly. If we cannot rely on SHAs because of these limitations, this contradicts the care ethics mentioned earlier: the users’ needs are not taken care of even though it is the SHA’s responsibility to fulfil them. Voice is the only medium through which humans and SHAs interact, but a device’s understanding of human speech is limited by the way it is programmed. The way humans construct and understand speech is more complex than what an SHA can comprehend. Some SHAs cannot use contextual clues from an earlier conversation to understand the one that follows, nor can they understand the same instruction when it is phrased differently. They also have limited responses to certain kinds of questions, such as giving opinions or handling self-reflection. All these limitations can frustrate users who rely on an SHA to communicate and perform tasks efficiently.

Initial Decision

All in all, we believe that the integration of Smart Home Assistants into households will bring a greater good to our daily lives.

11 thoughts on “ALEXA, ARE YOU MY FRIEND?”

  1. Great article! I have heard of smart home assistants before but had never understood the extent of their functionality. I do agree with and relate to the concern about how vulnerable our data will be with the use of smart home assistants, especially for the more important aspects of our homes such as energy and security. Personally, I would not opt for smart home assistants for those aspects, as they are not entirely proven to be safe against unauthorised parties, and also because one is not a necessity in my life. I might consider one in the future if more evidence emerges that smart home assistants are indeed safe and our data is not compromised, and if the benefits of having one are greater than not having one.

    1. Hi, thanks for your comment! I agree, security issues are always a main concern when it comes to technology, to the point that it can lead to paranoia. On that note, I believe both developers and consumers carry responsibility for the data shared. Developers have an ethical duty to keep their consumers’ data safe, while consumers should keep sensitive information offline, although I understand how easily it can slip in. Comparatively, our smartphones are just as vulnerable, if not more so, in terms of security, yet we rely on them continuously. Considering SHAs only entered our lives 5 years ago, I do believe there is much room for improvement in reducing their vulnerability.

  2. While the concerns are not baseless, the chances of your SHA being hacked are low if you’re someone relatively unimportant. After all, your phone’s microphone is always on too, and it’s far more exposed to privacy threats because of its constant internet connection. Your banking data is most probably on it as well, so SHAs are not much different from smartphones in terms of vulnerability. In my opinion, there shouldn’t be a problem with using the device if you’re not in a high-risk group.

  3. This is a great article and it really presents strong opinions on the usage of SHAs. The question here is to what extent SHAs could help people with depression. I believe that not all people with depression have the same problem: some might be struggling because of their studies, some because of marriage problems, and some because of a love relationship. As mentioned in your article, SHAs have limitations in voice recognition and in the questions stored in the system. Because of that, I worry about how this group of people might react if the device does not give an accurate response. And if you are saying the device is only set up as a listener, then how will it help them? To put this device forward as an alternative for helping people with depression or for counselling purposes, I guess a lot of research still needs to be done.
    So yeah, are you still sure that Alexa can be your friend?

    1. Hello! It is true that depression varies from one person to another, and in this context we are talking about how SHAs might help by at least offering some sort of temporary relief. Rather than being just a listener, which some people may find really helpful in itself, the focus here is to detect the problem rather than to cure it. As mentioned, many sufferers choose not to talk about their problems with family and friends due to the burden of shame. There are a few programmes on the market that integrate with SHAs. For example, “DepressionAI” aims to help people with these problems perform daily activities, detect suicidal intentions (and provide the number of the national suicide prevention hotline), and recommend a local therapist based on the user’s location.

      However, helping people with depression, especially via SHAs, is still at an early stage, and I agree that more improvements and research are needed before SHAs can really solve the problem. Thank you for your comment!

  4. A nice article about the pros and cons of SHAs. As usual, nothing in the world can fulfil everyone’s needs and desires, SHAs included. The privacy breaches by Amazon through its Echo devices may be counterproductive in helping anxiety patients. A few months back, news that audio recordings of people using the Alexa assistant were being reviewed by Amazon employees without consent caused an uproar among its customer base. If anxiety patients learn about this, they will become more anxious, and it will be harder to gain their trust.

    The employees reviewing the recordings are supposedly not doing anything bad; they are improving the voice recognition system. But without the users’ permission, should they be allowed to do so? What if one employee were to use the recordings with ill will? On the other hand, without this progress in voice recognition, the technology will become obsolete sooner. In my opinion, there should be a balance between how much privacy can be sacrificed and the improvement in technology. What is your view on this?

    1. Thank you for your thoughts! You have raised fair points on the usage of SHAs. Although the potential is promising, I agree that there is a lot more to explore in the avenue of anxiety treatment. Paranoia could trigger anxiety attacks, which is something we wish to avoid. Developers could tap into this market and create specific plans and programmes that cater to patients with mild or recovering mental health issues.
      As for the recording issue, it should be made explicit that developers wish to keep voice recordings for improvement purposes. Sure, it will be unappealing and will probably deter potential buyers, but perhaps a compensation scheme of sorts would attract users to voluntarily share their data. Honesty is always the best policy, right?

  5. I totally agree that smart home assistants can generally bring more benefit than harm to humankind in the present day. Beyond being a means of tackling depression, the development of AI in general would also benefit us as an extension of ourselves, although the further debates surrounding AI development are another story.

  6. Interesting article. I look forward to SHAs helping me manage my finances by regulating the heating and cooling systems of my home, since energy bills are usually rather expensive. However, I doubt whether SHAs can actually cure depression because, as mentioned in another comment, the cause of depression varies from person to person. While I agree SHAs can help mental health patients during emergencies, actually treating the illness requires human interaction, in my opinion. It takes more than spoken words to understand a person’s inner heart and to comfort someone. A human is still needed to fulfill the emotional needs of these patients, which call for so much unconditional love and support.

  7. Interesting topic indeed!
    It’s interesting to learn how SHAs could help people with depression. Some people might feel more comfortable and confident expressing their feelings to a virtual assistant without worrying about being judged by other people. Of course, as a human, I strongly believe that people with depression still need full support and help from experts such as psychologists, but sometimes sharing their feelings with the device whenever they need to could hopefully lessen their pain. Thus, I agree with your point that SHAs could help people with depression, although it may also depend on the person, as depression has many stages.
