In this new age of digitalisation, Smart Home Assistant (SHA) technology such as Alexa, Google Home and the Echo Dot brings voice commands to a whole new level. Want to make a phone call? Want to play some music? All of it can be done with just a word to an SHA. It is predicted that 1.8 billion people will be using digital assistants by 2021 and that £10.8 billion will be spent on smart home devices in 2019 alone. But although SHAs ease our lives, are we sure they align with our ethics?
(Smart) Home (Assistants) Is Where The Heart Is
The Intergovernmental Panel on Climate Change (IPCC)’s 2018 report stated that we have only 12 years to prevent disastrous and irreversible climate change. From a utilitarian perspective, it is paramount that we work towards the greatest benefit for society, which in this case means protecting the world we live in. Clearly, we must act now, and the first step begins at home.
A 2018 study by techUK found that 75% of people found the smart energy aspect of connected homes appealing. This indicates a growing awareness of environmental responsibility, with more people playing their part in reducing their carbon footprint by integrating SHAs into their homes. A household’s heating and cooling system accounts for 42% of total utility costs, with electricity and natural gas fuelling the demand. Novel solutions such as zoned HVAC systems direct heated or cooled air only to the areas that need it, while smart thermostats learn a household’s heating habits and adjust the temperature accordingly. Incorporating these measures rewards homeowners with lower utility bills and, more significantly, a reduction in energy consumption. With the added advantage of remote monitoring, homeowners can optimise the system to save energy even while they are away.
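To make the "learning" idea concrete, here is a minimal sketch of how a smart thermostat might learn a household's habits: it averages the setpoints the user chooses at each hour of the day and replays that schedule, dropping to an energy-saving setback temperature while everyone is away. The class, its methods and the temperatures are our own illustrative assumptions, not any vendor's actual algorithm or API.

```python
# Illustrative sketch of a habit-learning thermostat (assumed design, not a real product).
from collections import defaultdict

class SimpleThermostat:
    """Averages the user's manual setpoints per hour of day, then replays
    that schedule; falls back to an energy-saving setback when away."""

    def __init__(self, setback_temp=16.0):
        self.setback_temp = setback_temp   # eco temperature while away
        self.history = defaultdict(list)   # hour of day -> observed setpoints

    def record_setpoint(self, hour, temp):
        """Log a manual adjustment so the schedule can be learned from it."""
        self.history[hour].append(temp)

    def target(self, hour, occupied=True):
        """Learned setpoint for this hour, or the setback when away/unknown."""
        if not occupied or hour not in self.history:
            return self.setback_temp
        temps = self.history[hour]
        return sum(temps) / len(temps)

# Example: the household likes 21C at 7am and 19C at 11pm.
t = SimpleThermostat()
t.record_setpoint(7, 21.0)
t.record_setpoint(7, 21.0)
t.record_setpoint(23, 19.0)
print(t.target(7))                  # learned morning temperature: 21.0
print(t.target(7, occupied=False))  # remote "away" mode saves energy: 16.0
```

The energy saving comes entirely from the `occupied` flag and the setback default: remote monitoring simply tells the device nobody is home, so it never heats an empty house.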
According to the World Health Organization, about 350 million people around the world are affected by depression at some point in their lives. Of that number, nearly 50 percent are also diagnosed with an anxiety disorder. These statistics show a worrying increase every year, and SHAs could be part of the solution that keeps these numbers down.
One way SHAs could help people with depression is by giving patients an outlet to share their feelings. They can be honest with an SHA without the burden of shame, which helps them open up about their problems. This theory is backed by Jonathan Gratch, a computer scientist and psychologist at the University of Southern California’s Institute for Creative Technologies. He states that when engaging with an SHA, patients tend to reveal intimate feelings, because virtual assistants hold no prejudgments or personal feelings. One can simply say “I feel depressed” to an SHA and the device will respond with a series of suggestions. Alexa, for example, reacts with calming words and provides a support hotline for further assistance whenever needed.
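At its simplest, this behaviour is a matter of mapping a sensitive phrase to a supportive response. The sketch below is our own naive stand-in for how such a "wellbeing" intent could work (real assistants use far more sophisticated intent recognition, and the trigger phrases and wording here are purely illustrative; the helpline number is left as a placeholder).

```python
# Hypothetical sketch of a "wellbeing" intent, not Alexa's actual implementation.
def respond(utterance: str) -> str:
    # assumed trigger phrases for the wellbeing intent
    triggers = ("i feel depressed", "i am depressed", "i feel sad")
    text = utterance.lower().strip()
    if any(t in text for t in triggers):
        return ("I'm sorry you're feeling this way. You are not alone. "
                "You can reach a support hotline at [local helpline number].")
    return "Sorry, I can't help with that yet."

print(respond("I feel depressed"))  # calming words plus a hotline pointer
```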
The ethics of care lies behind the idea of helping those in need to improve their social lives and mental health. By assisting them, SHAs can keep affected people from entering a downward spiral and help ensure a better, happier and healthier life, supporting the principle of care ethics.
Or Maybe It’s Not…?
SHA systems gather voice data, so unauthorised entities that access the system can identify users or extract data to produce voice artefacts from these individuals for impersonation. In 2016, the DDoS attack against Dyn LLC was carried out through compromised home embedded devices such as DVRs and webcams. Similar problems can affect SHA systems, as the gateways used in these devices are also embedded systems.
A 2017 study on privacy threats outlined two major threats in the use of SHAs. In the figure below, case 1 illustrates the vulnerability of constant voice recording. SHAs record the user’s voice once they catch the ‘wake-up word’ and transmit the audio to cloud storage. If the SHA device is compromised by a cyber-attack, it can act as a spy, allowing all sounds to be recorded and sent to an attacker in real time. Case 2 illustrates the danger of malicious voice commands. It is difficult for SHAs to perfectly recognise a user’s voice, tone and accent even when a voice-training feature is provided, so an SHA is susceptible to commands from unauthorised parties with malicious intent. If an attacker is within the device’s range, they can easily trick the system into thinking the real owner is speaking. These privacy issues conflict with Kantian theory, which states that there are moral laws all rational beings are bound to simply because they are rational beings; in this case, the law concerned is privacy, a vital human right.
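The wake-word "gate" in case 1, and what its removal means, can be sketched in a few lines. This is a simplified simulation under our own assumptions (a device streams discrete audio frames, the wake word is a plain substring match, and one command frame follows each wake word), not any real SHA's firmware.

```python
# Simplified simulation of the case-1 threat: a compromised device bypasses
# the wake-word gate and streams everything. Assumed design, not real firmware.
WAKE_WORD = "alexa"

def frames_sent(audio_frames, compromised=False):
    """Return the audio frames that leave the device for cloud processing."""
    sent, listening = [], False
    for frame in audio_frames:
        if compromised:          # spyware: no gate, everything is exfiltrated
            sent.append(frame)
            continue
        if listening:            # normal: only the command after the wake word
            sent.append(frame)
            listening = False
        elif WAKE_WORD in frame.lower():
            listening = True     # wake word opens the gate for the next frame

    return sent

household_audio = ["private chat", "Alexa", "play music", "more private chat"]
print(frames_sent(household_audio))                    # ['play music']
print(frames_sent(household_audio, compromised=True))  # all four frames leak
```

The point of the sketch is that the privacy guarantee lives in one software check; once an attacker controls the device, nothing physical stops continuous recording.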
SHAs also have limited capabilities in understanding speech, which can prevent them from functioning perfectly. If we cannot rely on SHAs because of these limitations, this contradicts the care ethics mentioned earlier: users’ needs go unmet even though it is the SHA’s responsibility to fulfil them. Voice is the only medium through which humans and SHAs interact, but a device’s understanding of human speech is limited by the way it is programmed. The way humans construct and understand speech is more complex than what an SHA can comprehend. Some SHAs cannot pick up contextual clues from earlier conversation to understand what follows, and they cannot understand the same instruction when it is phrased differently. They also tend to give limited responses to questions put to them, such as offering opinions or handling self-reflection. All of these limitations can frustrate users who rely on an SHA to communicate and perform tasks efficiently.
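The paraphrase problem is easy to demonstrate with a deliberately naive stand-in for a real natural-language model: a lookup table that recognises one wording of a command but not an equivalent one. The command phrases and intent names below are our own illustrative assumptions.

```python
# Naive exact-phrase command table, used here only to illustrate why rigid
# matching breaks on paraphrases; real SHAs use statistical NLU models.
COMMANDS = {
    "turn on the lights": "lights_on",
    "set a timer for ten minutes": "timer_10m",
}

def parse(utterance: str) -> str:
    """Map an utterance to an intent; unknown wordings fall through."""
    return COMMANDS.get(utterance.lower().strip(), "unknown_intent")

print(parse("Turn on the lights"))    # lights_on
print(parse("Switch the lights on"))  # unknown_intent -- same request, new wording
```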
All in all, despite these concerns, we believe that the integration of Smart Home Assistants into households will do more good than harm in our daily lives.