The Data Protection Law Is Not for Children
Pop the champagne corks. In a few weeks, the GDPR will take effect throughout Europe. And at the last moment, some member states have decided to lower the digital age of consent to 13 – under the radar, without information or debate. It is a blank cheque for tech giants such as Facebook and Google. Have we learned nothing from Facebook-gate?
By Katrine K Petersen and Mie Oehlenschläger
When is your child old enough to be monitored, systematised, and to have their information collected, registered, used and disclosed by commercial companies?
Article 8 of the GDPR, which contains specific requirements concerning children and consent, recommends that the age of consent be set at 16. Nevertheless, at the last minute, 10 Member States – Belgium, Estonia, Finland, Latvia, Poland, Portugal, Spain, Sweden, Denmark and the UK – have chosen to take advantage of special rules that make it possible to set the age of consent lower than recommended. In Denmark, the lower age will be passed as part of the Data Protection Act, which from May 2018 will supplement the EU Data Protection Regulation (GDPR) with particular Danish rules for data protection. According to the Danish state, it is in the interest of the child that the age of consent be set as low as 13. The argument is the freedom of the child: the freedom to express themselves and to have free access to so-called “information society services” (tech companies’ platforms, such as Google/YouTube, Facebook/Instagram and many others). Those who believe the consent age should be 13 argue that a child who is refused access by their parents is thereby deprived of their freedom.
Freedom to sell
The question is whether the opposite is in fact the case. The internet was born as a democratic tool for connecting people globally and providing them with free access to knowledge and information. But with new technologies, big data and machine learning, the opportunity to act and express yourself without being monitored hardly exists any more. It is not primarily people who have free access to knowledge and information, but rather powerful tech monopolies that have free access to people. The result is – unsurprisingly – a restriction of the individual’s freedom.
When we swipe, post and share, we become a permanent part of the data economy. The company with the greatest and best knowledge of the individual’s mind wins access. Companies collect data about people, and machine learning is used to predict and influence behaviour. It is thus now possible for tech giants to use machine-learning algorithms to track mental conditions, such as incipient mania, depression and suicidal thoughts and tendencies.
In May 2017, it was revealed by The Australian that Facebook had been monitoring adolescents’ mood swings. This information, which Facebook called “sentiment analysis”, could be used by advertisers to capture young people in situations where they are at their most vulnerable and, thus, also particularly susceptible to advertising. According to the newspaper, Facebook could see if adolescents as young as 14 felt “defeated”, “overwhelmed”, “stressed”, “anxious”, “nervous”, “stupid”, “silly”, “useless” or a “failure”. In light of this, the child-freedom argument for lowering the digital age of consent sounds hollow.
The sensitive information that is collected can also be used to influence policy decisions. Facebook carried out an experiment in 2010, and again in 2012, involving 61 million people in the USA. According to The New York Times, Facebook was able to mobilise 370,000 more voters in 2010 and 270,000 in 2012 by using small messages. The US election in 2016 was reportedly determined by 100,000 voters. It has been shown that certain sections of the population were exposed to political messages based on psychological profiling through the company Cambridge Analytica.
Hidden algorithms are fundamental for manipulation and discrimination
How the big tech companies treat data, and what they intend to use it for, happens behind closed doors. This is a problem, giving rise to hidden manipulation and discrimination. For instance, the American media outlet ProPublica documented that, via Facebook’s self-service ad-buying platform, campaigns could be targeted to “people who hate Jews”. Following the publication, Facebook chose to remove the categories for anti-Semitism. But the same mechanism exists at Google. According to techno-sociologist Zeynep Tufekci, Donald Trump’s social media manager used Facebook’s so-called “dark posts” to convince African-American men in cities such as Philadelphia (key for the elections) not to vote.
During the Cambridge Analytica hearing, Zuckerberg was asked about the protection of children’s rights in relation to Facebook’s app Messenger Kids, which, according to MIT Technology Review, “should freak parents out”. It was Joe Barton who asked on the second day of the hearing: “Is there any reason that we couldn’t have just a no-data-sharing policy, period, until you’re 18? Nobody gets to scrap it; nobody gets to access it. It’s absolutely, totally private . . . What’s wrong with that?” In his reply, Zuckerberg also cited the freedom argument: “Children and young people want to share their opinions publicly”. And that makes complete sense from a business and consumer perspective, where the purpose is consumption, efficiency and data. But from a democratic perspective, it is not sufficient to give children “freedom to” – there is also a need for “freedom from” surveillance, profiling, manipulation and discrimination. That dimension seems to have been omitted from the decision on the age of consent, and that should raise concern.
Massive lobbying for a low digital age of consent
“Does the Senate that questioned Zuckerberg understand what the internet is at all?” asked The Guardian in conjunction with the Cambridge Analytica hearing. Several media outlets – including Danish ones – pointed out that the very bodies responsible for regulation and control are ignorant of the fundamental mechanisms of the business model behind companies such as Facebook and the other “Big Five”. And it is not a problem confined to the USA. After the Cambridge Analytica scandal, DI Digital’s new digital director, Lars Frelle-Petersen, told the Danish newspaper Politiken: “We have had a naïve relationship with the internet, and Facebook is a wake-up call”. But is it really? How are we supposed to react when, in Denmark, we apparently – and uncritically – follow arguments that strengthen organisations such as The American Chamber of Commerce, which represents more than 700 American companies, including tech giants such as Facebook, Google and Twitter?
In February 2017, The Times revealed that AmCham had targeted its lobbying at the Irish government to set the digital age of consent at 13 years of age. It caused an outcry in Ireland, and Professor Barry O’Sullivan, Director of the Insight Centre for Data Analytics, and forensic and cyber psychologist Dr Mary Aiken launched an awareness campaign, which appears to be swaying Ireland towards a higher age of consent. In Denmark, by contrast, there is resounding silence – no information, no debate.
Children should be able to enjoy their leisure time free from hidden manipulation
Pedro Domingos, Professor of Computer Science at the University of Washington, points out that “you cannot control what you don’t understand”. We cannot let algorithms remain in a “black box” kept by the big tech companies. As yet, no algorithm declarations exist to clarify what intentions and purposes they contain – they remain hidden. And it is a serious democratic issue if we overlook or ignore the fact that algorithms carry political as well as economic and social power. Technological development has brought great opportunities, but also challenges, and we have to approach the possibilities of development with a qualified, forward-looking view rather than from a naïve and reactionary perspective.
If children are to enjoy freedom, we cannot avoid regulating tech companies. And when, hopefully, we one day have online platforms that guarantee democratic rights and data security – not just freedom to, but also freedom from – then the age of consent can be lowered and children can be given the freedom to move freely. The Cambridge Analytica case should have taught us something. Now we would like to see what that something is. And we can start by discussing digital freedom and the digital age of consent.