Kariņš: European-wide regulation for social media should be considered
18 June 2019

Prime Minister Krišjānis Kariņš, during his speech at “The Riga StratCom Dialogue 2019”, stressed the need to introduce European-level regulation for social media platforms.

“Social media platforms are used throughout the world in ways not intended by their founders. On the one hand, you can find groups of people who think like you and maybe share your hobbies, while on the other hand, social media can be used in ways that undermine our society as such. There are states that are in the business of doing their best to undermine our societies – to create discord among people and unrest in society,” said K. Kariņš.

Although the EU regulates many areas, the Prime Minister acknowledged that social media remains unregulated.

“I believe that there is a fine line between freedom of speech, free information flow and responsibility for Internet content and social media platforms. While avoiding censorship, we have to consider clever, European-wide legislation to bring the social media platforms under the umbrella of responsibility for the sorts of information they are disseminating. We have to introduce it in order to defend democracy, our values and way of life. New technologies require new thinking,” stressed the Prime Minister.

Source: https://mk.gov.lv/en/aktualitates/karins-european-wide-regulation-social-media-should-be-considered

SMARTeD survey results presented at a media literacy event
12 April 2019

As part of the European Media Literacy Week, a conversation on media literacy took place on 21 March at the Trubar Literature House in Ljubljana. The event was organized by the Slovene Association of Journalists and Časoris – Slovenia’s newspaper for kids.

Participants from the media, government, education and non-governmental organizations addressed the importance of media literacy in a society in which fake news and other forms of disinformation influence election results and encourage hate and intolerance.

During the conversation, Simon Delakorda from the Institute for Electronic Participation presented the SMARTeD project survey results. On average, 46% of the population in the Czech Republic, Estonia, France, Greece, Latvia and Slovenia cannot identify disinformation. Anonymous social media accounts and politicians, followed by political parties, are the agents most likely to create and disseminate disinformation. The survey results suggest promoting media and information literacy and encouraging critical thinking about the origin of information on the internet.

Invited speakers highlighted that more focus should be given to educating young people to understand how digital technologies function and how they affect daily life. The participants also stressed that in a world of information overabundance, at a time when disinformation and fake news are eroding people’s trust in the media and other public institutions, the ability to find and use credible information is of crucial importance.

At the end of the event, the Slovene translation of the online game Bad News (https://getbadnews.si/droggame_book/junior/#intro) was presented by Časoris editor Dr Sonja Merljak Zdovc.

Statement on the Code of Practice against disinformation: Commission asks online platforms to provide more details on progress made
1 March 2019

European Commission – Statement

Brussels, 28 February 2019

The European Commission published reports by Facebook, Google and Twitter covering the progress made in January 2019 on their commitments to fight disinformation. These three online platforms are signatories of the Code of Practice against disinformation and have been asked to report monthly on their actions ahead of the European Parliament elections in May 2019.

More specifically, the Commission asked to receive detailed information to monitor progress on the scrutiny of ad placement, transparency of political advertising, closure of fake accounts and marking systems for automated bots. Vice-President for the Digital Single Market Andrus Ansip, Commissioner for Justice, Consumers and Gender Equality Věra Jourová, Commissioner for the Security Union Julian King, and Commissioner for the Digital Economy and Society Mariya Gabriel said in a joint statement:

“The online platforms, which signed the Code of Practice, are rolling out their policies in Europe to support the integrity of elections. This includes better scrutiny of advertisement placements, transparency tools for political advertising, and measures to identify and block inauthentic behaviour on their services.

However, we need to see more progress on the commitments made by online platforms to fight disinformation. Platforms have not provided enough details showing that new policies and tools are being deployed in a timely manner and with sufficient resources across all EU Member States. The reports provide too little information on the actual results of the measures already taken.

Finally, the platforms have failed to identify specific benchmarks that would enable the tracking and measurement of progress in the EU. The quality of the information provided varies from one signatory of the Code to another depending on the commitment areas covered by each report. This clearly shows that there is room for improvement for all signatories.

The electoral campaigns ahead of the European elections will start in earnest in March. We encourage the platforms to accelerate their efforts, as we are concerned by the situation. We urge Facebook, Google and Twitter to do more across all Member States to help ensure the integrity of the European Parliament elections in May 2019.

We also encourage platforms to strengthen their cooperation with fact-checkers and academic researchers to detect disinformation campaigns and make fact-checked content more visible and widespread.”

Main outcomes of the signatories’ reports:

  • Facebook has not reported on results of the activities undertaken in January with respect to scrutiny of ad placements. It had earlier announced that a pan-EU archive for political and issue advertising would be available in March 2019. The report provides an update on cases of interference from third countries in EU Member States, but does not report on the number of fake accounts removed due to malicious activities targeting specifically the European Union.
  • Google provided data on actions taken during January to improve scrutiny of ad placements in the EU, divided per Member State. However, the metrics supplied are not specific enough and do not clarify the extent to which the actions were taken to address disinformation or for other reasons (e.g. misleading advertising). Google published a new policy for ‘election ads’ on 29 January, and will start publishing a Political Ads Transparency Report as soon as advertisers begin to run such ads. Google has not provided evidence of concrete implementation of its policies on integrity of services for the month of January.
  • Twitter did not provide any metrics on its commitments to improve the scrutiny of ad placements. On political ads transparency, contrary to what was announced in the implementation report in January, Twitter postponed the decision until the February report. On integrity of services, Twitter added five new account sets, comprising numerous accounts in third countries, to its Archive of Potential Foreign Operations, which is publicly available and searchable, but did not report on metrics to measure progress.

Next steps

Today’s reports cover measures taken by online companies in January 2019. The next monthly report, covering the activities carried out in February, will be published in March 2019. This will allow the Commission to verify that effective policies to ensure the integrity of electoral processes are in place before the European elections in May 2019.

By the end of 2019, the Commission will carry out a comprehensive assessment of the Code’s initial 12-month period. Should the results prove unsatisfactory, the Commission may propose further actions, including of a regulatory nature.

Background

The monitoring of the Code of Practice is part of the Action Plan against disinformation that the European Union adopted last December to build up capabilities and strengthen cooperation between Member States and EU institutions to proactively address the threats posed by disinformation.

The reporting signatories committed to the Code of Practice in October 2018 on a voluntary basis. In January 2019 the European Commission published the first reports submitted by signatories of the Code of Practice against disinformation. The Code aims at achieving the objectives set out by the Commission’s Communication presented in April 2018 by setting a wide range of commitments articulated around five areas:

  • Disrupt advertising revenue for accounts and websites misrepresenting information and provide advertisers with adequate safety tools and information about websites purveying disinformation.
  • Enable public disclosure of political advertising and make efforts towards disclosing issue-based advertising.
  • Have a clear and publicly available policy on identity and online bots and take measures to close fake accounts.
  • Offer information and tools to help people make informed decisions, and facilitate access to diverse perspectives about topics of public interest, while giving prominence to reliable sources.
  • Provide privacy-compliant access to data to researchers to track and better understand the spread and impact of disinformation.

Between January and May 2019, the Commission is carrying out a targeted Monthly Intermediate Monitoring of the platform signatories’ actions to implement Code commitments that are the most relevant and urgent to ensure the integrity of elections. Namely: scrutiny of ad placements (Commitment 1); political and issue-based advertising (Commitments 2 to 4); and integrity of services (Commitments 5 & 6).

The Code of Practice also goes hand in hand with the Recommendation included in the election package announced by President Juncker in his 2018 State of the Union Address to ensure free, fair and secure European Parliament elections. The measures include greater transparency in online political advertisements and the possibility to impose sanctions for the illegal use of personal data to deliberately influence the outcome of the European elections. As a result, Member States have set up a national election cooperation network of relevant authorities – such as electoral, cybersecurity, data protection and law enforcement authorities – and appointed a contact point to participate in a European-level election cooperation network. The first meeting of this network took place on 21 January 2019 and a second one on 27 February 2019.

How can you participate? Join the national workshops!
12 February 2019

The Riga conference “Disinformation and fake news challenge to democracy”, as well as the international survey on disinformation, was only a kick-off activity for the year-long project “Smart E-democracy Against Fake News (SMARTeD)”.

On 30 November, eight organisations from seven EU countries came together to develop a methodology for workshops that will be put into practice across Europe next year. The workshops aim to raise citizens’ practical awareness of skilful participation through eDemocracy tools.

🔔 Follow our activities and stay engaged!

“Fake news is not the problem – people are.”
8 January 2019

On 29 November 2018, the organisation ManaBalss hosted the international conference “Disinformation and fake news challenge to democracy”. It was a great event, with a lot of knowledgeable speakers, fresh ideas and an even better audience!


During the conference, the preliminary results of the survey on disinformation and fake news (carried out in six EU member states – the Czech Republic, Estonia, France, Latvia, Slovenia and Greece) were also presented. According to the report, the highest percentage of people who cannot tell the difference between fake and real news is in Greece – 56.1%.

Conference participants pointed out that, while disinformation is visibly present in the information space around us, the way to determine the “truth” and to expect informed and reasoned action from society is to raise awareness that the problem exists.

Self-regulated quality mechanisms used by content makers are another effective way to deal with this challenge. They imply taking stronger responsibility for published content, as well as the collective knowledge and ability to monitor and control the presence of disinformation in publicly available information.
