The Lab “Platform Governance, Media, and Technology” (PGMT) does research, teaching and knowledge transfer at the intersection of governance, digital communication and new technologies. We are particularly interested in platforms and their governance, as well as the discursive, political and technological construction of “artificial intelligence” (AI). This website showcases a selection of our activities. Please also visit our official page at our home institution, the Centre for Media, Communication and Information Research (ZeMKI), University of Bremen.
Research
Online Talk Series
We host a monthly online conversation on empirical platform governance research. Each session consists of a short input talk by a scholar, followed by ample time for discussion and exchange.
Platform Governance Archive
We host the Platform Governance Archive (PGA) – a longitudinal data repository and interface for social media platforms’ policies and content moderation guidelines.
Meta redefines glorification of dangerous entities
Meta · Community Guidelines · January 11, 2024
On January 11, Meta updated the section of its Community Guidelines that outlines the principles for the treatment of dangerous organizations and individuals. Specifically, the platform changed the description of the policy rationale as well as the definition of the glorification of dangerous entities that included…
End of Year Frenzy at Meta: Multiple changes to Facebook’s and Instagram’s Community Guidelines
In the final weeks of 2023, major social media platforms made a range of changes to their Community Guidelines. This spike represents the highest level of policy-change activity registered by the Platform Governance Archive in the past year. The following blog post gives an overview of the changes that were picked up…
One Day in Content Moderation: Analyzing 24 h of Social Media Platforms’ Content Decisions through the DSA Transparency Database
We have examined how social media platforms in the EU moderated user content over a single day. The report analyzes whether decisions were automated or manual, which visibility measures were applied, and which content categories were most subject to moderation by specific platforms on that…
Interdisciplinary Erasmus+ blended-intensive programme a success
More than 20 students and ten instructors from universities across Europe spent a week in beautiful Padova for the final phase and highlight of their Erasmus+ “blended-intensive programme” (BIP). The in-person component of the BIP “Digital Constitutionalism and Platform Governance” took place in late September at the Department of Political Science, Law and International Studies…
Platforms overwhelmingly use automated content moderation, first DSA transparency reports show
November 6, 2023 was the first deadline for platforms categorised by the EU as “Very Large Online Platforms” (VLOPs) to deliver their transparency reports. The EU’s new Digital Services Act (DSA) requires VLOPs to publish these reports twice a year, containing standardized information about user numbers and aggregated figures on content moderation decisions taken…
X (formerly Twitter) softens its violent speech policy
On October 25, 2023, the social media platform X, formerly known as Twitter, made changes to its global Community Guidelines. According to the Platform Governance Archive, a data repository that automatically tracks policy changes across 18 platforms, X significantly softened its violent speech policy. In 2023, violent speech is defined by X (and…
Meta changes its policy on sadistic imagery and violence as the Israeli–Palestinian conflict escalates
Meta · Community Guidelines · October 17, 2023
*The following post may contain explicit language and content that could be distressing to some readers.
On October 17, Instagram and Facebook changed their Community Guidelines to specifically prohibit sadistic posts containing imagery of deceased babies. Previously, the corresponding policy section forbade the posting of “sadistic…