Guest Article for EDRi-gram: CEO Coalition To Make The Internet A Better Place For Kids

http://www.edri.org/edrigram/number10.5/ceo-coalition-freedom-of-speech

Following an invitation by Commissioner Kroes in the summer of 2011, and founded on 1 December 2011, the CEO "Coalition to make the Internet a better place for kids" covers the whole industry value chain. Its members include Apple, BSkyB, BT, Dailymotion, Deutsche Telekom, Facebook, France Telecom - Orange, Google, Hyves, KPN, Liberty Global, LG Electronics, Mediaset, Microsoft, Netlog, Nintendo, Nokia, Opera Software, Research In Motion, RTL Group, Samsung, Skyrock, Stardoll, Sulake, Telefonica, TeliaSonera, Telecom Italia, Telenor Group, Tuenti, Vivendi and Vodafone.

Its statement of purpose and working plan set out five working areas: simple tools for users to report harmful content and contact, age-appropriate privacy settings, wider use of content classification, wider availability and use of parental controls, and effective takedown of child abuse material.

The CEO Coalition has recently opened itself to interested third parties, and during the last two weeks consultations have been held on all five working areas. From the viewpoint of civil society, having such a large part of industry working on child online safety appears laudable, but there are some significant points where this endeavour could fail.

At the moment, the Commission is putting a lot of pressure on the industry to come up with "results". For example, in the session on the effective takedown of child abuse material (CAM), a Commission representative went as far as to complain about the resistance he perceived, saying there was too much talk about civil rights and too little about what could be done. The only problem with the Commission's demand for "effective" takedown of child abuse material is that it has failed dismally - and confirmed as much in response to a parliamentary question - to provide any evidence whatsoever that takedown is not already functioning effectively. This failure is all the more abject considering that the Commission has paid for statistics to be prepared.

The Commission representative also felt it necessary to point out that, while Member States cannot force access providers to use deep packet inspection (DPI), access providers could of course do so "voluntarily". It seems long overdue for the Commission's legal service to assess the appropriateness of promoting the "voluntary" adoption of measures that, if implemented in law, would contravene the Charter of Fundamental Rights and the ECJ rulings in the Scarlet/Sabam and Netlog/Sabam cases.

As a side note, the Commission is now seeking to create new meanings for "takedown" and "removal" of illegal or allegedly illegal online content, with "removal" meaning the definitive removal of specific content from all locations on the Internet - even though this interpretation was never discussed during the preparation of the recently adopted child abuse Directive, where this issue is regulated.

The "effective takedown" emphasis also overlooks the fact that takedown removes only a symptom: unconfirmed reports from the US suggest that as many as 80% of takedowns of allegedly criminal child abuse websites are not followed up by a police investigation, and the EU does not collect statistics on this point. So instead of fighting the abuse that is the source of such images, this policy, on its own, serves only to hide the representation of the abuse. This mirrors what tends to happen in families where abuse occurs: everybody prefers to look away rather than act, putting all their energy into denial instead of into helping the child victim.

The whole process is also burdened by political baggage that pre-dates its launch. From the outside, the Commission's Safer Internet Unit appears to be under pressure both to resolve the quite deep problems that developed before its current management took over and to produce "something" before the end of the current Commission's term of office. The more obvious approach would have been to collect the experiences of different countries with the problems identified and with the outcomes of the various options that have so far been tried.

At the same time, the Coalition appears to lack focus on the specific, known problems that might need to be solved. Instead, each discussion appears to start from scratch, as if no prior experience existed. For example, the action on reporting tools led to discussions about the style and placement of reporting buttons, but not about how the reports - especially those concerning harmful content and bullying - are to be dealt with. In light of recent revelations about how Facebook deals with reports of potentially harmful content, this is a very serious matter, regardless of the Commission's unwillingness to talk about civil liberties issues.

The pressure to deliver "something" risks putting the CEO Coalition into a mode where it just wants to deliver anything. This is the setting in which frankly idiotic proposals come up, such as scanning all Windows computers for criminal content (child abuse material, in this case) via the automatic update process, or whitelisting the whole of the European web. Unfortunately, the Iranian and Chinese governments were not asked to send delegates to the meeting to explain how this can be done most effectively.

One could also get the impression that part of the Coalition's membership is still trying to find out what this exercise is all about, especially as many see it as a reiteration of the several consultation processes that have already taken place within the Safer Internet Programme. Additionally, some players appear to see it as a chance to gain a competitive advantage over other industry stakeholders.

The discussions around the reporting and removal of illegal or (potentially) harmful content - two very different categories - deserve particular attention.

1. The implementation of parental controls* is being pushed. This includes building them into the network (by the access providers) as a one-size-fits-all solution - fitting all religions, all ages, all families. This hardly seems optimal: it will always be easier and more precise for a consumer to configure his or her own devices than for an access provider to configure its network in a way that suits the needs of every family connected to it.

But quite apart from this fundamental problem, implementing parental controls in the network will lead to "solutions" that violate net neutrality, come with serious privacy issues and endanger freedom of speech. For this reason, there is an overwhelming need to properly include other parts of the European Commission in the discussions, such as the Directorates-General responsible for Justice and for Consumer Affairs.

Additionally, any network-level restrictions (very much like the DPI mentioned in the context of child abuse material) will certainly attract the copyright industry, which has a strong interest in the deployment of this kind of technology and the potential to completely swallow up the initiative if allowed to. The Coalition seems to be aware of this, but does not appear to take the danger seriously. Then again, reading some Coalition members' responses to the consultation on the review of the Directive on the enforcement of intellectual property rights, perhaps this is not seen as a danger at all.

This leaves the European Commission's DG Information Society in a position where, through the Safer Internet Unit, it applauds the mobile sector for interfering with traffic flows (probably in contravention of Article 52 of the Charter, as there is no evidence that this genuinely achieves an objective of general interest) while, at the same time, through the units responsible for ensuring a competitive network environment, it urges mobile operators not to interfere with traffic flows for their own business purposes.

2. There is interest in "age-appropriate privacy settings", or "privacy by default" as Commissioner Kroes put it in October 2011.

Most interestingly, this concern about children's privacy only seems to encompass data shared with other users of the platform, not data processed (and probably shared) by the platform itself. In the context of the Coalition, "Facebook" seems to be synonymous with "social networks", and Facebook leads Action 2, "Age-appropriate privacy settings". Facebook seems unwilling to talk about the data it collects about its users, or about the tracking of users through social network plugins embedded in third-party websites.
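To make that tracking mechanism concrete, the sketch below shows - in a purely illustrative way, with invented domain names, paths and cookie values rather than any real platform endpoint - what a browser hands over whenever it renders a page containing an embedded social plugin: the address of the page being read and the visitor's existing platform cookie.

```python
# Purely illustrative sketch of plugin-based tracking; all names are invented.
# A browser loading an embedded "like" button effectively issues a request
# like this to the plugin provider, without the user clicking anything.
import urllib.request

THIRD_PARTY_PAGE = "http://news.example.org/some-article"           # page the user is reading
PLUGIN_URL = "http://plugins.social-platform.example/like-button"   # hypothetical plugin endpoint

request = urllib.request.Request(PLUGIN_URL)
# The browser attaches these headers automatically; they are set by hand here
# only to make the information flow visible.
request.add_header("Referer", THIRD_PARTY_PAGE)    # tells the platform WHICH page is being visited
request.add_header("Cookie", "session_id=abc123")  # tells the platform WHO is visiting

# The plugin provider can log (session identifier, visited page) for every such
# page view, building a browsing profile that never involved its own site.
print(request.full_url)
print(request.header_items())
```

No interaction with the plugin is needed; simply loading the page is enough for this information to be transmitted.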

Additionally, there seems to be some competitive arm-wrestling over "age-appropriate privacy settings" versus "informed consent" or even "parental consent" for children in social networking services (SNS). Some industry stakeholders have gone further than Facebook and implemented schemes in which parental consent is actively sought. Facebook seems to wilfully ignore the fact that children lie about their age to get onto the platform, bypassing what Facebook considers "age-appropriate" settings and thereby exposing themselves to risks.

If the Coalition were to start taking privacy seriously, it would soon realise that other groups might also benefit from easier privacy settings or from the principle of informed consent - such as people with mental disabilities or people under legal guardianship - so it is not only children who stand to gain from getting this right.

All of these bad practices also leave the door open to those who argue that positive identification of every individual connected to the Internet is needed (to protect the children, of course).

3. Microsoft has taken the lead in the "takedown" working group, where it enthusiastically promotes the use of its PhotoDNA software. PhotoDNA, which identifies previously identified abuse images even when they have been cropped or otherwise distorted, clearly has some very positive applications - such as allowing hotlines to recognise known images immediately, minimising analysts' exposure to the content. Yet no effort (as usual) has been made to examine the potential side-effects of widespread use of the technology. What is the risk, for example, of creating a potentially lucrative market for new images if "known" images are removed very quickly?
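For readers unfamiliar with this class of tools: PhotoDNA itself is proprietary, but the general technique is robust ("perceptual") hashing - computing a compact fingerprint that changes little when an image is resized, slightly cropped or recompressed, and comparing it against a database of fingerprints of already-identified material. The following is a rough, purely illustrative sketch of that idea (a simple average hash, not Microsoft's algorithm; the file names and the matching threshold are invented):

```python
# Illustrative perceptual-hash matching - NOT PhotoDNA. Requires Pillow.
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Shrink to size x size, grey-scale, and set one bit per pixel that is
    brighter than the average. Minor edits barely change the result."""
    pixels = list(Image.open(path).convert("L").resize((size, size)).getdata())
    average = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > average else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Number of bits on which two fingerprints differ."""
    return bin(a ^ b).count("1")

def build_known_set(paths):
    """Fingerprints of previously identified images (hypothetical file names)."""
    return {average_hash(p) for p in paths}

def is_known(path: str, known_hashes, threshold: int = 10) -> bool:
    """Flag an image whose fingerprint is close to any known fingerprint."""
    h = average_hash(path)
    return any(hamming_distance(h, k) <= threshold for k in known_hashes)

# Hypothetical usage by a hotline:
# known = build_known_set(["reference_image_1.png", "reference_image_2.png"])
# print(is_known("newly_reported_image.jpg", known))
```

Crucially, a scheme of this kind can only recognise material that has already been identified and fingerprinted; it says nothing about new images, which is exactly why the question about side-effects matters.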

4. There is a discussion about automatic content classification, with what appears to be a strong push for pan-European age-classification schemes even for non-linear media such as websites. Generally, the Coalition could allow itself a little more room for pluralism: there is no "one size fits all". Other means of content guidance, for example descriptive (text) labels, seem to be neglected, and research pointing to higher acceptance of parental guidance systems than of age-dependent restrictions seems to be discounted too easily.

Perhaps we should remind the Coalition that its statement of purpose was not meant to be set in stone. Even though there would probably be some resistance, its goals can be amended or abandoned if they prove impractical or undesirable. This must happen if the Coalition truly wants to achieve effective, proportionate solutions that lead to a safer Internet for children.

The various working groups take input from civil society. Contact can be established through INFSO-SAFERINTERNETCOALITION.at.ec.europa.eu. If that does not work, feedback can also be sent via the author at help.me.get.the.ceo.coalition.on.track.at.mogis-verein.de

(*) On a personal side note, as a representative of a victims' advocacy group (victims of child sexual abuse): most children are abused by their parents or other close relatives. We want children and adolescents to have helpful resources available to them; these children need less parental control, not more. Perhaps we should also be talking about non-overridable whitelists, or about unlimited access to websites that label themselves as helpful resources for children (including information about family planning, STDs and sexual identity). Our own (MOGiS) website, for example, would be rated inappropriate or harmful even for adolescents because it deals with sexuality, violence and abuse - even though it might be a helpful resource for them, by putting their own suffering into a context that makes them feel less alone and shows them ways to cope.

Self regulation: responsible stakeholders for a safer Internet
http://ec.europa.eu/information_society/activities/sip/self_reg/index_...

Neelie Kroes' speech at the Safer Internet Forum - Luxembourg (20.10.2011)
http://europa.eu/rapid/pressReleasesAction.do?reference=SPEECH/11/703&...

Digital Agenda: Coalition of top tech & media companies to make internet better place for our kids (1.12.2011)
http://europa.eu/rapid/pressReleasesAction.do?reference=IP/11/1485&...

Coalition to make the Internet a better place for kids - Statement of purpose
http://ec.europa.eu/information_society/activities/sip/docs/ceo_coalit...

Safer Internet Programme :: Policy :: Public Consultation
http://ec.europa.eu/information_society/activities/sip/policy/consulta...

Inside Facebook's Outsourced Anti-Porn and Gore Brigade (16.02.2012)
http://gawker.com/5885714/inside-facebooks-outsourced-anti+porn-and-go...

Facebook in new row over sharing users' data with moderators (3.03.2012)
http://www.telegraph.co.uk/technology/facebook/9119090/Facebook-in-new...

Research on parental guidance (10.2011)
http://stakeholders.ofcom.org.uk/binaries/research/media-literacy/oct2... and
http://stakeholders.ofcom.org.uk/binaries/research/media-literacy/medi...

Summary of child exploitation Directive
http://www.europarl.europa.eu/oeil/popups/summary.do?id=1106483&t=...

Microsoft's response to the review of the IPR Enforcement Directive (31.03.2011)
http://bit.ly/x9AYvy


 
