Project Details

NSF-DFG: SaTC: CORE: Small: Requirements, Metrics, and Tools for Effective and Interpretable Content Moderation Transparency

Subject Area Security and Dependability, Operating, Communication and Distributed Systems
Image and Language Processing, Computer Graphics and Visualisation, Human Computer Interaction, Ubiquitous and Wearable Computing
Term since 2024
Project identifier Deutsche Forschungsgemeinschaft (DFG) - Project number 538873437
 
The PIs propose a fundamental research agenda aimed at improving the utility of content moderation transparency for multiple stakeholders. Building on recent work by the research team, the research is driven by a case study of moderating image-based sexual abuse (IBSA), a form of technology-facilitated violence in which someone's intimate images are shared without their consent.

1. The PIs will survey existing content moderation transparency mechanisms and how they may fall short for different audiences. From a user's perspective, the PIs will conduct a user study to determine how targets of IBSA, and the organizations that support them, attempt to have these images removed from platforms; to what extent existing transparency mechanisms help them track the progress of the removal process; and what novel transparency tools they might need. From a researcher's perspective, the PIs will analyze platforms' transparency reports to discover whether platforms provide sufficient context to explain trends observed in the data, and whether the chosen metrics and level of granularity suffice to evaluate a platform's content moderation performance as well as the experience of users on the platform (e.g., whether the report explains and reflects on users' difficulties in having non-consensually shared intimate content removed).

2. To fill the gaps left by platforms' existing practices, the PIs will develop methodologies for providing third-party transparency. In collaboration with IBSA victim support organizations, the PIs will build a system that automatically tracks the outcome of IBSA takedown requests submitted by users and support organizations, so that they are not re-traumatized by having to manually check whether the platform has complied with their request. Building on preliminary work estimating platforms' delays in content moderation, the PIs will also compute a more comprehensive range of content moderation transparency metrics based on data collected independently of the platforms.

3. The PIs will create frameworks for expanding transparency by utilizing new German and European Union data access rights for researchers. This will enable the investigation of new transparency metrics based on internal platform data (and therefore not currently publicly understood), such as how IBSA violations are handled at the account level for repeat offenders.
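The proposal does not specify how the takedown-outcome tracker would be implemented. As an illustration only, its core could be periodic polling of reported URLs and classifying the HTTP responses, so that complainants never have to revisit the content themselves; all function names below are hypothetical, and a real system would need per-platform handling (soft 404s, geo-blocking, authentication walls):

```python
import urllib.error
import urllib.request


def classify_status(http_status):
    """Map an HTTP status code to a coarse takedown outcome."""
    if http_status in (404, 410):   # not found / gone: content appears removed
        return "removed"
    if http_status == 200:          # still served: takedown not (yet) honored
        return "online"
    return "unknown"                # redirects, rate limits, server errors: inconclusive


def check_removal(url, timeout=10):
    """Issue a HEAD request for a reported URL and classify the result."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return classify_status(resp.status)
    except urllib.error.HTTPError as e:
        return classify_status(e.code)
    except urllib.error.URLError:
        return "unknown"            # DNS failure, timeout, refused connection


def track_requests(requests):
    """Given a dict of request id -> reported URL, return id -> current status."""
    return {rid: check_removal(url) for rid, url in requests.items()}
```

In practice, a plain status code is an unreliable removal signal (platforms often return 200 with a placeholder page), so such a sketch would be a starting point rather than the measurement methodology itself.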
DFG Programme Research Grants
International Connection USA
 
 
