Web 2.0 content can be categorized as user-generated content (UGC): anyone can say anything about anything. This decentralization of information publishing boosted an even more intensive spread of the WWW and attracted a huge number of people who generate millions of pages, records, images, videos, and other multimedia content every day. Along with obvious advantages (diverse and rich content, availability of discussion platforms, etc.), the social Web exposed some serious legal issues related to copyright infringement and the publishing of false and defamatory information about facts, individuals, or legal entities. However, it would not be true to say that legal issues on the Web appeared with the move to the UGC model. Plagiarism and libel existed long before the invention of the WWW, but due to the decentralized nature of Web 2.0 and the huge amount of generated content, they became less controllable and, thus, more common. Another issue with social content is that the majority of users do not consider online publishing to be something serious and feel less responsible for the web content they expose. For example, a survey held by YouGov in 2008 revealed that “three quarters of Internet users who comment online realize they could be breaking the libel law”.
When considering what legal issues our application could expose, we should take into account that “MovieIt” allows users to publish reviews of movies and leave comments about cinemas. Possible legal problems include:
- Accessibility problems (under the UK Equality Act 2010, inaccessible web content is against the law);
- Copyright infringement (if a user posts a review of a movie or cinema that was copied from another resource);
- Libel (if a user wittingly publishes false information about cinemas or movies).
Risks for the CyberTube team
- The law may treat the application owner as a publisher or editor of the illegal content;
- The owner of the application can be held liable for inaccessible content.
“If you edit or run a community group, even if you are not responsible for hosting the content (such as a Flickr group), then choosing not to exercise control and ignoring requests to remove defamatory or illegal content would not be a defence to any claims brought against you.”
— Associate at Winston & Strawn
Accessibility
- Use accessibility guidelines (WCAG) and specifications (WAI-ARIA) during the design process;
- Test the application’s accessibility using automated tools (the W3C HTML and CSS validators) and guides (IBM guidelines, Section 508);
- Test the application’s accessibility with a focus group of people with different kinds of disabilities.
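The automated checks mentioned above can be partially approximated in code. As a minimal sketch (not a substitute for a WCAG-conformant audit tool), the following uses only Python’s standard-library HTML parser to flag images that lack the `alt` attribute required by WCAG success criterion 1.1.1; the class name `AltTextChecker` is our own illustrative choice:

```python
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Flags <img> tags without an alt attribute (WCAG 1.1.1)."""

    def __init__(self):
        super().__init__()
        self.issues = []

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs for the current tag
        if tag == "img" and "alt" not in dict(attrs):
            self.issues.append("img missing alt attribute")

checker = AltTextChecker()
checker.feed('<div><img src="poster.jpg">'
             '<img src="logo.png" alt="MovieIt logo"></div>')
print(checker.issues)  # only the first <img> is reported
```

A real audit would cover far more (contrast, keyboard focus, ARIA roles), which is why the guidelines also recommend established tools and testing with users who have disabilities.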
Copyright Infringement and Libel
- Have a clear complaint policy and procedure;
- Remove any material that provokes a complaint upon request;
- Provide “terms and conditions of use” that state clearly what content is considered plagiarism or libel, and ensure that the user has been acquainted with them;
- Monitor the content.
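One way the complaint policy above could be realized in the application is a simple notice-and-takedown queue: material that provokes a complaint is hidden immediately and held for review. This is a hypothetical sketch; the names (`Review`, `ComplaintQueue`) are our own illustrations, not part of any agreed MovieIt design:

```python
from dataclasses import dataclass, field

@dataclass
class Review:
    author: str
    text: str
    visible: bool = True  # shown to other users by default

@dataclass
class ComplaintQueue:
    """Hypothetical notice-and-takedown queue: complained-about
    material is hidden pending review, per the policy above."""
    pending: list = field(default_factory=list)

    def file_complaint(self, review: Review, reason: str) -> None:
        review.visible = False  # take the material down on request
        self.pending.append((review, reason))

review = Review("alice", "This cinema knowingly sells fake tickets!")
queue = ComplaintQueue()
queue.file_complaint(review, "alleged libel")
print(review.visible)  # False: hidden until the complaint is resolved
```

Hiding on complaint rather than deleting outright keeps the material available for the review that decides whether the complaint is justified.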
As can be seen, there are different ways to protect the users and owners of the application from being sued for inappropriate content, ranging from removing provocative materials and monitoring content to milder actions such as making users aware of their responsibilities and the possible consequences. In choosing an approach for our application, we should find an optimal solution that minimizes risks while keeping users comfortable with the system (e.g., the system should not impede the free sharing of opinions and views). We strongly believe that a good user experience is one of the key features of our system; therefore, we prefer to make users aware of their responsibilities and we rule out content monitoring. However, we reserve the right to remove materials that provoke a complaint.
References
1. Dom Sparkes, “Is your social media marketing breaking the law?” (presentation).
2. Robert P. Latham, Jeremy T. Brown, and Carl C. Butzer, “Legal Implications of User Generated Content: YouTube, MySpace, Facebook”.