Regulation of Online Media: A Comparison
Assumptions about censorship?
What is censorship?
History of Censorship
Arguments for censorship
Arguments against censorship
What do we mean by media?
Australia
Statutory Regulation
Recent Examples
United Kingdom
Changes to the Official Secrets Act
● “The current maximum sentence does not reflect the relative ease with which individuals may, by digital means, disclose vast amounts of sensitive information”
● “We invite consultees’ views on whether information that relates to the economy ought to be brought within the scope of the legislation. One way to define this category is to specify that it only encompasses information that affects the economic wellbeing of the United Kingdom in so far as it relates to national security.”
● “We suggest remodelling the offences so that they focus not upon the consequences of the unauthorised disclosure, but upon whether the defendant knew or had reasonable cause to believe the disclosure was capable of causing damage. It is the culpability of the defendant in making an unauthorised disclosure being aware of the risk of damage that should be the core of the offence; not whether damage did or did not occur.”
● “Such a defence would allow someone to disclose information with potentially very damaging consequences. The person making the unauthorised disclosure is not best placed to make decisions about national security and the public interest; the person would not be guaranteed the defence if they did make the disclosure because the jury might subsequently disagree that it was in the public interest and by then the damage has been done.”
DSMA-Notices
R v Incedal
“We express grave concern as to the cumulative effects of holding a criminal trial in camera and anonymising the defendants. We find it difficult to conceive of a situation where both departures from open justice will be justified. Suffice to say, we are not persuaded of any such justification in the present case.”

“We handed over our telephones to the policeman that are then put in a metal box and locked up. We then collect our “secret” notebooks from a metal safe, and then we walk in front of the judge and the jury in total silence to our desk and start writing. After about 20 minutes we would do the whole thing in reverse.”
Super-Injunctions
“Today we identify the footballer whose name has been linked to a court super-injunction by thousands of postings on Twitter. Why? Because we believe it is unsustainable that the law can be used to prevent newspapers from publishing information that readers can access on the internet at the click of a mouse.”

“We should point out immediately that we are not accusing the footballer concerned of any misdeed. Whether the allegations against him are true or not has no relevance to this debate. The issue is one of freedom of information and of a growing argument in favour of more restrictive privacy laws.”
Canada
“There is also a very real concern about the effectiveness of any ban that I might impose. This trial will be conducted in a public courtroom. The court has no real means to police the broadcasting of information, whether innocently or deliberately, by those who attend the trial. Given the proliferation of twitter, facebook postings, blogs and the general posting of information on internet sites, a publication ban might well serve only to restrict responsible reporting by established media and do nothing to control the general dissemination of information.”
Justice Molloy, R. v. G.C., 2009 CanLII 89067
Canada’s Digital Charter
Recommendation 7: Providing a Civil Remedy
That the Government of Canada develop a working group comprised of relevant stakeholders to establish a civil remedy for those who assert that their human rights have been violated under the Canadian Human Rights Act, irrespective of whether that violation happens online, in person, or in traditional print format. This remedy could take the form of reinstating the former section 13 of the Canadian Human Rights Act, or implementing a provision analogous to the previous section 13 within the Canadian Human Rights Act, which accounts for the prevalence of hatred on social media.
S. 13 of the CHRA (repealed)
13 (1) It is a discriminatory practice for a person or a group of persons acting in concert to communicate telephonically or to cause to be so communicated, repeatedly, in whole or in part by means of the facilities of a telecommunication undertaking within the legislative authority of Parliament, any matter that is likely to expose a person or persons to hatred or contempt by reason of the fact that that person or those persons are identifiable on the basis of a prohibited ground of discrimination.
(2) For greater certainty, subsection (1) applies in respect of a matter that is communicated by means of a computer or a group of interconnected or related computers, including the Internet, or any similar means of communication, but does not apply in respect of a matter that is communicated in whole or in part by means of the facilities of a broadcasting undertaking.
(3) For the purposes of this section, no owner or operator of a telecommunication undertaking communicates or causes to be communicated any matter described in subsection (1) by reason only that the facilities of a telecommunication undertaking owned or operated by that person are used by other persons for the transmission of that matter.
Recommendation 8: Establishing Requirements for Online Platforms and ISPs
That the Government of Canada establish requirements for online platforms and Internet service providers with regards to how they monitor and address incidents of hate speech, and the need to remove all posts that would constitute online hatred in a timely manner.
● These requirements should set common standards with regards to making reporting mechanisms on social media platforms more readily accessible and visible to users, by ensuring that these mechanisms are simple and transparent.
● Online platforms must have a duty to report regularly to users on data regarding online hate incidents (how many incidents were reported, what actions were taken/what content was removed, and how quickly the action was taken). Failure to properly report on online hate must lead to significant monetary penalties for the online platform.
● Furthermore, online platforms must make it simple for users to flag problematic content and provide timely feedback to them relevant to such action.
“What if we held the platforms accountable for every time they posted something hateful online; for every view, $25,000 fine. Don't you think they would move quickly? ... If we could have heavy fines to the ISPs.” - Randy Boissonnault, Standing Committee on Justice and Human Rights, May 16th, 2019
Who Regulates?
Recommendations
More Recommendations