Risk Assessment for Illegal Content
From the Decidim community chat: https://matrix.to/#/!WbkoamENosPmKRcBfm:matrix.org/$5GoT-G2Xs2iPZKapeBgxhEGnCMVnmi6937a0sbqWqHA?via=matrix.org&via=gitter.im&via=privchat.eu
This applies to any community (user-to-user service) with UK users: https://www.ofcom.org.uk/online-safety/illegal-and-harmful-content/guide-for-services/. If you don't serve under-18s and don't host pornography, then broadly what you need to do is carry out a risk assessment and have a record keeping system, provided the systems you use for user-to-user communication have appropriate mechanisms in place for reporting illegal/harmful content (at least, that is my understanding). The important point is that you must start the risk assessment soon, by mid-March 2025.
On Decidim's side, it would be important to be able to flag content as harmful or illegal and to have some record keeping around that, along the lines of the sketch below.
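As a rough illustration of the kind of record keeping meant here, this is a minimal sketch of a hypothetical flag log entry. It is not Decidim's actual data model or API; the `ContentFlag` class, its fields, and the reason categories are assumptions about what such a record might capture (who reported what, why, when, and how it was resolved).

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum
from typing import Optional


class FlagReason(Enum):
    # Illustrative categories only; a real taxonomy would follow
    # the kinds of illegal/harmful content listed in the Ofcom guidance.
    ILLEGAL = "illegal"
    HARMFUL = "harmful"
    OTHER = "other"


@dataclass
class ContentFlag:
    """One record in a hypothetical flag log (not Decidim's real schema)."""
    content_id: str    # identifier of the flagged user content
    reporter_id: str   # who raised the flag
    reason: FlagReason # why it was flagged
    details: str = ""  # free-text description from the reporter
    created_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )
    resolved_at: Optional[datetime] = None  # when a moderator acted
    resolution: Optional[str] = None        # e.g. "removed", "dismissed"


def resolve_flag(flag: ContentFlag, outcome: str) -> ContentFlag:
    """Mark a flag as handled while keeping the record for audit purposes."""
    flag.resolved_at = datetime.now(timezone.utc)
    flag.resolution = outcome
    return flag


# Example: a user reports a comment, and a moderator later removes it.
flag = ContentFlag(content_id="comment-42", reporter_id="user-7",
                   reason=FlagReason.ILLEGAL, details="scam link")
resolve_flag(flag, "removed")
```

The point of keeping resolved flags rather than deleting them is that the log itself is the record-keeping evidence the guidance asks for.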
From what I understand, it is more important to do the risk assessment than to change the product right away, since changing the product would be a mitigation of a risk identified by that assessment. It would be useful to have a living document to iterate on these assessments.