Monday, July 16, 2007

European content regulation and the Audiovisual Media Services Directive

The European Union recently agreed on new rules for broadcasting and on-demand content. The catchily termed "Audiovisual Media Services Directive" – formerly known as the "Television Without Frontiers" Directive – will update regulations on broadcasting across Europe and introduce a new framework for content viewed on other platforms, like the internet or mobile phones.

This new set of rules will distinguish between two types of content: television ("linear") and television-like on-demand content ("non-linear"). Anything which looks or feels like the traditional programmes that you'd watch on your TV falls into the first category, and user-generated content, such as on YouTube and Google Video, may fall into the second category.

Online vs. Broadcast Content

When discussions on the directive began there was little distinction between the two types of content, meaning that YouTube and Google Video would have had to comply with a complex set of rules designed to control traditional broadcasting. Thankfully, this was changed in the final draft of the legislation, which must still be voted on by the European Parliament. We believe that on-demand content shouldn't be regulated in the same way as traditional broadcasting because the two are quite different: people choose and control the on-demand content they watch online, whereas television content is broadcast to everyone on a fixed schedule.

The directive explicitly states that it will not apply to search engines. We hope that it will not apply to the content created by users themselves – although the language is less clear on this point.

The directive contains important measures to protect users, particularly children, from harmful and illegal content. There are also new rules to make sure that on-demand content doesn't feature material inciting hatred based on sex, racial or ethnic origin, nationality, disability, religion or belief, age or sexual orientation. Google supports these rules, which reflect the rules we already have in place through our user agreements.

We have talked to politicians and decision makers about the importance of empowering people to use the net and other platforms safely, and to make informed decisions through the use of filtering and labeling systems. For example, there is a passionate debate at the moment regarding clips from award-winning films on the YouTube "EUTube" channel promoting European "cinematic heritage." We think there are better ways to safeguard users than introducing unnecessary regulation.

The new directive also includes an important reference to the "country of origin principle," which simply means that content will continue to be regulated by the rules of the country from which it originates. Each country within the European Union will have broadly similar rules but there may be subtle differences. Anyone supplying content to users would only have to worry about one set of rules, rather than differing laws across the EU's Member States.

Next Steps

We expect the European Parliament will vote on the proposed directive in the autumn. After the Parliament has made its final decision, the EU member states will have two years to implement the directive into their own national law.

We will be following this process closely. If we need to, we will step up our advocacy efforts to make sure that politicians and regulators don't impose unnecessary regulations which would stifle the fantastic growth of user-generated content.


pwned said...

Ironic that the Commission themselves publish a video on YouTube that could have contravened earlier versions of the proposed revision to TVWF!
I'd be interested to read further Google thoughts on TVWF/AVMS here.
Where does YouTube sit in relation to the eCommerce Directive and the incoming AVMS Directive?
Do you agree with the definition of 'editorial responsibility' within the Directive?
Do you believe the definition of 'programme' excludes user-generated content? When does UGC become merely content or 'professional' content? Do revenue-sharing agreements between platforms and video creators mean such videos would come under the Directive's aegis?
The use of self- and co-regulation - which bodies, in which territories will you be looking to join? How effective do you think such schemes are?
The obligation to contribute to (where practicable) the production of European works. Will Google incorporate this into some sort of CSR programme? Will it argue it would be impractical?
Has the country of origin principle been watered down? Is the procedure within eCommerce stronger or weaker than that within AVMS?
Lots of questions, hopefully we can get some answers going from your side!

Ash Rust said...

There is a serious issue with relying on the labeling and filtering of content as a safeguard: unfortunately, it is very easy for users to simply accept agreements, or lie, and be exposed to inappropriate content. Perhaps Google might use OpenID together with some form of age and location verification to ensure user safeguards are properly enforced. For example, suppose a content publisher in country A publishes a video clip containing mild nudity; in country A it is legal for sixteen-year-olds to view such content, but this is not the case in country B. A genuine age and location verification system would resolve this issue.

Initiatives in this vein would allay political fears about online content, promote an open security system, and go some way to appeasing the growing body of voices who feel Google is abusing its dominant position. Time and time again, Google has delivered technological solutions to problems; it should continue with this approach in politics.

euavmscomments said...

In reply to pwned's comments:

We appreciate that you have many questions, which we will try to answer in summary. As you may know, the proposed EU Audio Visual Media Services Directive has not yet been officially adopted by the European institutions, and it would be premature to comment specifically or speculate on how the proposed Directive may be implemented.

Currently, the EU E-Commerce Directive applies to YouTube as an information society service. We cannot comment on how the proposed EU AVMS Directive may apply to YouTube, as the draft legislation has not yet been adopted. We will carefully review the definitions of editorial responsibility and programming once the draft legislation has been adopted and implemented into national law. The proposed Directive does not include user-generated content, but provides examples of programmes such as feature-length films, sports events, situation comedy, documentary, children’s programmes and original drama. We also cannot comment specifically on the reference to contributing to the production of European works, given the current status of the proposed Directive.

Google strongly believes in a self-regulatory, and where appropriate, co-regulatory approach, and we are members of a number of leading industry and trade associations globally. For example, we are Board Members of the European Digital Media Association (EDiMA), which has taken an active role in discussions on the proposed EU AVMS Directive.

In addition, the proposed EU AVMS Directive refers directly to the EU E-Commerce Directive, and in particular, to the importance of the country of origin principle. We do not believe there is any discrepancy on this particular issue.

In response to ash rust's comments:

We agree that labelling and filtering of content are serious issues, and we have a variety of measures in place for both Google and YouTube to protect vulnerable audiences and to allow users to notify us when they find content inappropriate. On web search, Google filters content using our SafeSearch product, which offers moderate and strict filtering levels. For example, SafeSearch screens for sites that contain explicit sexual content and excludes them when a user prefers not to have adult sites included in web search results. Although no filter can be 100% accurate, SafeSearch aims to eliminate most inappropriate material. At present, Google is not exploring options for age and location verification as a combined solution, although we can forward your recommendations to the Product team for consideration.

YouTube is used by individuals of all ages from around the world, and we require users to agree to our Terms of Use, as well as our Community Guidelines, before they are able to upload a video. The Community Guidelines, purposefully written in easily understood language, provide advice on what content is acceptable to upload and also clearly state what is not allowed. We understand that not everyone will abide by the rules, and so we have enabled users to flag inappropriate material. If a video has been flagged as inappropriate, a member of our dedicated operations team will review it to determine whether it violates our policies. While flagged videos are not automatically taken down by the system, inappropriate content which violates our Terms of Use and/or Community Guidelines will be removed. We will then notify the user who uploaded the inappropriate content, and we will delete the user's account for repeated violations. We also offer users the opportunity to contact Google / YouTube directly via an easily accessible online form, which initiates a direct response from our operations team.

If you would like to review our Terms of Use or Community Guidelines, both are available on the YouTube site in various languages.
