Threatening our Privacy
19. Is Personal Data Protection Sufficient?
We are all data for sale.
Customer data is digital gold. Data brokers are amassing trillions of data points worldwide and are creating massive personalized digital profiles of each of us. The harvesting of our personal details goes far beyond what many of us could imagine. Our personal details are sold for advertising, influencing, tracking or surveillance purposes.
- A major reason for the success of companies like Google, Uber and Amazon is that they have embraced the idea of “data as an asset”.
- Facebook and Instagram are gathering data from under-18s by using software that tracks users’ web browsing activity.
- The European Commission has presented several legislative proposals as part of its digital and data strategies that will facilitate the use and sharing of (personal) data between more public and private parties. But according to the European Data Protection Board this will “significantly impact the protection of the fundamental rights to privacy and the protection of personal data”.
Giants as targets for hackers.
Our privacy is compromised. Our personal data are not safe from data leaks and security breaches, even when they are stored by large companies.
- The personal data of over 500 million Facebook users have been posted in a low-level hacking forum (2021). It includes phone numbers, full names, locations, email addresses, and biographical information. Security researchers say hackers could use the data to impersonate people and commit fraud.
- Personal data of 700 million LinkedIn users, harvested between 2020 and 2021, were put up for sale online.
- A German insurance company was the target of a criminal cyber attack in July 2021. The perpetrators succeeded in overcoming its high security standards and, among other things, copied the bank details of insurance customers and business partners and published them on the so-called Darknet.
Your privacy is compromised.
The introduction of smart cities, buildings and devices promises comfort and security, but it will cost us our privacy and will immensely increase surveillance.
- A new report commissioned by the Greens in the European Parliament, “Biometric and Behavioural Mass Surveillance in EU Member States”, states that: “Private and public actors are increasingly deploying ‘smart surveillance’ solutions including remote biometric identification technologies which, if left unchecked, could become biometric mass surveillance.”
- H&M paid a fine of €35,258,707.95 for violating GDPR rules (2020). After sick leave, employees were required to attend a return-to-work meeting, which was recorded. Senior H&M staff gained “a broad knowledge of their employees’ private lives… ranging from rather harmless details to family issues and religious beliefs”. This “detailed profile” was used to help evaluate employees’ performance and make decisions about their employment.
20. Is Wireless Data Transmission Secure?
No. Wireless transmissions from credit cards, mobile phones, smart watches, fitness trackers, pacemakers and personal wearables pose a risk of data misuse.
We have particular concerns about private banking data and data covered by medical confidentiality. Wireless medical devices face serious threats to their confidentiality, and hacking a medical device could create a backdoor into hospital networks.
The wearable industry is also booming. Users themselves contribute to privacy and device security breaches because they are not aware of the various threats and vulnerabilities of their devices.
Minimization of data transmitted wirelessly is crucial for the security of medical patients. The principle of data minimization involves limiting data transmission and storage to only what is required to fulfil specific purposes.
- 465,000 pacemakers recalled due to cybersecurity vulnerabilities.
- The wireless communication between medical devices or other smart devices is not secure.
- Fitness data is an attractive target for health insurance companies.
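The data-minimization principle described above can be sketched as a filter that strips every field not required for a stated purpose before anything is transmitted. The following is a minimal illustration; the field names, purposes and record values are hypothetical, not taken from any real device protocol:

```python
# Minimal sketch of GDPR-style data minimization: before a record leaves
# the device wirelessly, keep only the fields needed for a stated purpose.
# All field and purpose names here are hypothetical illustrations.

ALLOWED_FIELDS = {
    "heart_rate_alert": {"device_id", "heart_rate", "timestamp"},
    "firmware_update": {"device_id", "firmware_version"},
}

def minimize(record: dict, purpose: str) -> dict:
    """Return a copy of `record` limited to the fields needed for `purpose`."""
    allowed = ALLOWED_FIELDS[purpose]
    return {k: v for k, v in record.items() if k in allowed}

full_record = {
    "device_id": "pm-001",
    "heart_rate": 131,
    "timestamp": "2021-07-01T12:00:00Z",
    "patient_name": "Jane Doe",    # never needed on the wire
    "home_address": "(redacted)",  # never needed on the wire
}

print(minimize(full_record, "heart_rate_alert"))
# {'device_id': 'pm-001', 'heart_rate': 131, 'timestamp': '2021-07-01T12:00:00Z'}
```

The point of the sketch is that the patient's identity never travels over the radio link at all, so an intercepted transmission reveals far less.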
21. But we do have Data Protection Authorities, don't we?
Data protection authorities do not provide assessments of discrimination and digital rights violations.
There are two levels of supervisory authorities for the protection of personal data:
— the national data protection supervisory authorities
— the European Data Protection Board (EDPB)
However, they are not sufficiently independent of states and companies, do not have sufficient resources of their own, and their scope of control remains limited.
Current data protection is insufficient in an Internet of Bodies and Things scenario in which all devices are collecting our data 24/7, to be processed as Big Data by artificial intelligence which has been proven to reproduce and aggravate discrimination.
The GDPR clearly mentions cases of discrimination as falling within the competence of supervisory authorities. However, these authorities have made little use of this competence, and to date do not provide any regular assessment of such discrimination, neither at national nor at European level.
22. Is Artificial Intelligence supporting us?
Artificial Intelligence is a good servant but a bad master. Artificial intelligence reproduces and aggravates discrimination.
Every second, millions of predictions of human behavior are made by corporations on the basis of the data collected. In an Internet of Bodies and Things scenario all devices are collecting our data 24/7, to be processed as Big Data by artificial intelligence (AI).
Various studies have shown that AI automatically reproduces previous discrimination in machine-learning processes. The machine “learns” from the choices already made in the past by systematising them. AI can then multiply these prejudices and opinions and amplify extreme political positions.
This is not science fiction, AI has already been proven to reproduce and aggravate discrimination.
Examples of how AI could reinforce social, racial and economic inequalities:
- Facebook allows housing advertisements not to be shown to certain categories of users, such as mothers of high school kids, people interested in wheelchair ramps, Jews.
- A software tool for the police claims to predict who is likely to break the rules, which leads to unjust arrests and the harassment of minorities.
- An AI recruiting tool was found to be biased against women.
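The feedback loop described above can be made concrete with a toy model. In this deliberately simplified sketch, a “model” merely learns the historical approval rate per group; the data are fabricated for illustration, but the mechanism is the one the studies point to: biased past decisions become biased future predictions.

```python
# Toy sketch of how training on biased historical decisions reproduces
# that bias. The data and group labels are fabricated for illustration.
from collections import defaultdict

# Historical hiring decisions: candidates are equally qualified, but past
# reviewers approved group "A" far more often than group "B".
history = [("A", 1)] * 90 + [("A", 0)] * 10 + [("B", 1)] * 30 + [("B", 0)] * 70

# "Training": the model simply learns the historical approval rate per group.
rates = defaultdict(list)
for group, decision in history:
    rates[group].append(decision)
learned = {g: sum(d) / len(d) for g, d in rates.items()}

# "Prediction": new, equally qualified candidates inherit the old disparity.
print(learned)  # {'A': 0.9, 'B': 0.3}
```

Real machine-learning systems are vastly more complex, but when group membership correlates with features in the training data, the same systematisation of past choices takes place.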
23. Debates on Digitalization are Supplanted by Advertising
Digital innovations are promoted uncritically by politicians and advertisements. There is no public discussion about potential problems.
Digital technologies, especially 5G in conjunction with the Internet of Things and of Bodies, may lead to discrimination and disrespect of human dignity.
The European people have not been asked whether they want their lives to be totally digitalized and run by algorithms. In particular, they were not involved in weighing the pros and cons of the wireless rollout. At present, the 5G roll-out is in the hands of politicians and lobbyists without relevant expertise who have not engaged in any dialogue with the EU public.
So far there has been no unbiased discussion in the EU community.
Organise public debates, led by scientists with biomedical expertise and free of conflicts of interest, to determine whether, or to what extent, digital innovations may be authorised: appoint a new ethics committee or extend the activities of the European Group on Ethics in Science and New Technologies (EGE).