


The EU will adopt legislation "to ensure greater transparency in the area of sponsored content in a political context". This broader scope is better than merely focusing on political 'ads' and can tackle paid-for influence content such as https://www.nytimes.com/2020/02/13/style/michael-bloomberg-memes-jerry-media.html
The EU wants to ensure that users can see the difference between such content and genuinely organic content. Which types of sponsored content will fall within the scope of this legislation is still TBD. Also note that these rules will complement new rules for ALL ads in the DSA!
So we'll have general rules that apply to all ads in the DSA and more specific, lex specialis rules for 'political ads'. That could mean, for instance, more mandatory data points in ad registries for 'political ads' as opposed to commercial ads.
This focus on the need to develop 'risk management tools' to address 'systemic risks' is very much in line with what I argued here on pp. 22-23 https://osf.io/vnswz/ Although the effectiveness of this approach will depend very much on the audit/oversight mechanism in the DSA
Interestingly, the DSA will establish a "co-regulatory backstop" for the measures in the new Code of Practice on Disinformation. This can be a major development and provide real teeth to the Code - if done right. This is peak EU-lingo so let me try to unpack it:
It means that compliance with the Code could be seen as part of the risk mitigating measures in the context of the DSA, which can be taken into account to assess whether a platform lives up to its obligations to address 'systemic risks' in the DSA.
So the Code is a stopgap to address some of these systemic risks related to disinformation until the DSA is implemented. Makes sense, bc we can't wait for the DSA's implementation to address this. We indeed move from self-regulation to co-regulation here, as @VeraJourova said
Finally, one of my pet peeves: access to data. The Code will ensure an "effective data disclosure for research on disinformation". The EDAP says that the EDMO can facilitate access to such data, which would include personal data. In practice, this could mean working towards
.. a Code of Conduct on Access to Platform Data under Article 40 of the GDPR. What that process could look like, I explain here: https://twitter.com/mathver/status/1333333137145933825?s=20
Importantly, the Commission says clearly: "the GDPR does not a priori and across the board prohibit the sharing of personal data by platforms with researchers". A busy 2021 is ahead of us. Read the full proposal here: https://ec.europa.eu/transparency/regdoc/rep/1/2020/EN/COM-2020-790-F1-EN-MAIN-PART-1.PDF
Good overview from @mscott and @LauKaya here https://pro.politico.eu/news/europe-democracy-disinformation-media, includes the same analysis of the 'co-regulatory backstop'