http://dbpedia.org/ontology/abstract
|
In 2018, the European Commission convened representatives of major technology firms and the online advertising industry to develop a voluntary framework of industry self-regulation to fight disinformation. In the aftermath of the Facebook-Cambridge Analytica scandal and Russian interference in the 2016 U.S. presidential election, the Commission expressed concern that "mass online disinformation campaigns" were being "widely used by a range of domestic and foreign actors to sow distrust and create societal tensions." Moreover, the online platforms where these campaigns take place, according to the Commission, had "failed to act proportionately, falling short of the challenge posed by disinformation and the manipulative use of platforms' infrastructures." The Code sets out a definition of disinformation as well as five broad commitments for industry signatories. Signatories commit to prepare annual self-assessment reports for review by the European Commission.
|
http://dbpedia.org/ontology/wikiPageID
|
70695590
|
http://dbpedia.org/ontology/wikiPageLength
|
5236
|
http://dbpedia.org/ontology/wikiPageRevisionID
|
1117357413
|
http://dbpedia.org/ontology/wikiPageWikiLink
|
http://dbpedia.org/resource/Category:Disinformation +
, http://dbpedia.org/resource/Mozilla +
, http://dbpedia.org/resource/TikTok +
, http://dbpedia.org/resource/Industry_self-regulation +
, http://dbpedia.org/resource/Category:European_Commission_projects +
, http://dbpedia.org/resource/Facebook%E2%80%93Cambridge_Analytica_data_scandal +
, http://dbpedia.org/resource/Facebook +
, http://dbpedia.org/resource/Microsoft +
, http://dbpedia.org/resource/European_Commission +
, http://dbpedia.org/resource/Twitter +
, http://dbpedia.org/resource/Russian_interference_in_the_2016_United_States_elections +
, http://dbpedia.org/resource/Google +
|
http://dbpedia.org/property/wikiPageUsesTemplate
|
http://dbpedia.org/resource/Template:Orphan +
, http://dbpedia.org/resource/Template:Empty_section +
, http://dbpedia.org/resource/Template:Primary_sources +
|
http://purl.org/dc/terms/subject
|
http://dbpedia.org/resource/Category:Disinformation +
, http://dbpedia.org/resource/Category:European_Commission_projects +
|
http://www.w3.org/ns/prov#wasDerivedFrom
|
http://en.wikipedia.org/wiki/EU_Code_of_Practice_on_Disinformation?oldid=1117357413&ns=0 +
|
http://xmlns.com/foaf/0.1/isPrimaryTopicOf
|
http://en.wikipedia.org/wiki/EU_Code_of_Practice_on_Disinformation +
|
owl:sameAs |
https://global.dbpedia.org/id/GWrWo +
, http://www.wikidata.org/entity/Q112061741 +
, http://dbpedia.org/resource/EU_Code_of_Practice_on_Disinformation +
|
rdfs:comment |
In 2018, the European Commission convened representatives of major technology firms and the online advertising industry to develop a voluntary framework of industry self-regulation to fight disinformation. In the aftermath of the Facebook-Cambridge Analytica scandal and Russian interference in the 2016 U.S. presidential election, the Commission expressed concern that "mass online disinformation campaigns" were being "widely used by a range of domestic and foreign actors to sow distrust and create societal tensions." Moreover, the online platforms where these campaigns take place, according to the Commission, had "failed to act proportionately, falling short of the challenge posed by disinformation and the manipulative use of platforms' infrastructures."
|
rdfs:label |
EU Code of Practice on Disinformation
|