Digital assistants no longer operate only at the acoustic level. Where speech alone is not enough and something has to be shown, smart displays can also present images, which is quite practical for a cooking recipe, for example. Other assistants help identify actors in a film in response to the question “Who is that?”, or display suitable shopping offers.
But their practical use still raises many open questions. For example, how do we prevent the makers of a smart assistant from becoming a kind of “gatekeeper” that presents or reads out only selected results, rather than the wide variety of results a search engine typically delivers?
How is our speech data archived, and above all, who owns it? One thing is clear: whoever decides to talk to an intelligent voice assistant reveals a great deal about themselves. All voice commands are stored on the provider’s servers, making it easy for the provider to retain and evaluate everything that is said. This in turn allows personal and location profiles of the users to be built.
That’s why privacy protection is so important. Every spoken command results in the processing of personal data, which generally requires the user’s consent, and users must also be informed about any further use of their data. However, German and European data protection law quickly reaches its limits when providers store the data in Asia or the USA.
Digital voice assistants don’t always do what they are supposed to, as the consumer advice center of North Rhine-Westphalia found out. In a test of the devices’ reactions, it criticized that users cannot be sure the assistant records and sends audio to the service provider only when they actually intend it to.
Voice control is now also showing up where users might not expect it: it is finding its way into children’s rooms in the form of smart toys. The “Hello Barbie” doll, for example, has a microphone in its head and sends audio recordings of everything it hears to a server in the USA. Once a week, parents receive a short recording of what their children have told the doll. That may be interesting or amusing for the parents, but it is a serious breach of the child’s privacy.
Another doll, “My friend Cayla”, has already been banned in Germany. It recorded everything, without even the parents’ knowledge or consent, and sent the data to the provider via WiFi. The toy also allowed an unsecured Bluetooth connection and was classified in a legal opinion as a prohibited transmitting device. The German Federal Network Agency then banned the doll.
All of these technical developments in the digital world thus come at a price. How digital society will deal with this remains to be seen; with regard to data and consumer protection, it will certainly be a long process. It also means that users themselves must think about how they want to handle their private and personal data, and with whom they are willing to share it.