Category Archives: Privacy

– It’s All Too Creepy

As concern about privacy and use of personal data grows, solutions are starting to emerge.

This week I attended an excellent symposium on ‘The Digital Person’ at Wolfson College, Cambridge, organised by HATLAB.

The HATLAB consortium have developed a platform where users can store their personal data securely. They can then license others to use selected parts of it (e.g. for website registration, identity verification or social media) on terms that they, the users, control.

The Digital Person
This turns the tables on organisations like Facebook and Google, which have given users little choice about the rights over their own data or how it might be used or passed on to third parties. GDPR is changing this through regulation. HATLAB promises to change it by giving users full legal rights to their data – an approach that aligns closely with the trend towards decentralisation and the empowerment of individuals. The HATLAB consortium, led by Irene Ng, is doing a brilliant job of teasing out the various issues and finding ways of putting users back in control of their own data.
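
To make the licensing idea concrete, here is a minimal, hypothetical sketch in Python of what a user-issued data licence might look like. The class name, fields and checks are my own illustrative assumptions, not HATLAB's actual data model or API; the point is simply that each grant is scoped to a named recipient, specific data items and a single purpose, and that the user can revoke it at any time.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class DataLicence:
    """A hypothetical, user-issued licence over part of a personal data store."""
    grantee: str           # who may use the data, e.g. "example-shop.com"
    fields: list[str]      # which items of personal data are shared
    purpose: str           # the sole purpose they may be used for
    expires: datetime      # the grant lapses automatically
    revoked: bool = False  # the user can withdraw consent at any time

    def permits(self, requester: str, field_name: str, purpose: str) -> bool:
        """True only if this specific request falls inside the user's grant."""
        return (
            not self.revoked
            and datetime.utcnow() < self.expires
            and requester == self.grantee
            and purpose == self.purpose
            and field_name in self.fields
        )

# The user, not the data-holding organisation, issues and can revoke each licence.
licence = DataLicence(
    grantee="example-shop.com",
    fields=["email", "delivery_address"],
    purpose="order fulfilment",
    expires=datetime.utcnow() + timedelta(days=30),
)
assert licence.permits("example-shop.com", "email", "order fulfilment")
assert not licence.permits("ad-broker.example", "email", "profiling")
```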

Highlights

Every talk at this symposium was interesting and informative. Some highlights include:


  • Misinformation and Business Models: Professor Jon Crowcroft
  • Taking back control of Personal Data: Professor Max van Kleek
  • Ethics-Theatre in Machine Learning: Professor John Naughton
  • Stop being creepy: Getting Personalisation and Recommendation right: Irene Ng

There was also some excellent discussion amongst the delegates, who were well informed about the issues.

See the Slides

Fortunately I don’t have to go into great detail about these talks because, thanks to the good organisation of the event, the speakers’ slide sets are all available at:

https://www.hat-lab.org/wolfsonhat-symposium-2019

I would highly recommend taking a look at them and supporting the HATLAB project in any way you can.

– Ethics of Eavesdropping

It has recently been reported (see, for example, Bloomberg News) that the likes of Amazon, Google and Apple employ people to listen to sample recordings made by the Amazon Echo, Google Home and Siri, respectively. They do this to improve the speech recognition capabilities of these devices.

Ethical Issues

What are the ethical issues here? The problem is not that these companies use people to assist in training machine-learning algorithms in order to improve the capabilities of the devices. However, there are issues with the following:


  • While information like names and addresses may not accompany the speech clips being listened to, it seems quite possible that other identifiers would enable tracing back to this information. This seems unnecessary for the purpose of training the speech recognition algorithms.

  • It has been reported that, in some companies, employees performing this function have been required to sign agreements that they will not disclose what they are doing. To my mind this seems wrong. If the function is necessary and innocent, then companies should be open about it.

  • These companies do not always make it clear to purchasers of the devices that they may be recorded, and listened to, by people. This should be made clear to users in all advertising and documentation.

  • The most contentious ethical issue is what to do if an employee of one of these companies hears a crime being committed or planned. Another situation arises if an employee overhears something that is clearly private, like bank details, or information that, although legal, could be used for blackmail. In the first situation, are these companies to be regarded as having the same status as a priest in a confessional, or any other person who might hear sensitive information? A possible approach is that whatever law applies to individuals should also apply to the employees and to companies like Amazon, Google and Apple. In the UK, for example, some workers (such as social workers and teachers) who are likely occasionally to hear sensitive information relating to potential harm to minors are required to report it. In the second case, companies could be legally liable for losses arising from the information being revealed or used against the user.

It seems likely that companies are reluctant to admit publicly that interactions with these devices may be listened to by people because it might affect sales. That does not seem a good enough reason.