Compliance, security and reputational risks are substantial and ever-growing in unpredictable ways. Though largely hypothetical, these risks were judged serious enough to exercise management at the highest levels. Those people have better things to do than to clean up our mess.

There weren't any security vulnerabilities, but there could be? So, like any software?

Take, for instance, the UK Investigatory Powers Act 2016. Diligent people have spent years figuring out how its imprecise wordings apply to media organisations. Do these same conclusions hold for a sort-of-but-not-really decentralised silo of user generated content?

Do Mastodon server owners wear any responsibility for their users’ defamations? It’s unlikely but, because libel involves judges, not impossible. Again, the value in finding out is outweighed by the cost of finding out.

lol @ all UK cucks.

Mastodon administrators have access to everyone’s direct messages by default. FTAV has no interest in sliding uninvited into anyone’s DMs and the best way to prove it is to remove all opportunity.

People that own the database have access to the database :marseyshook:


![](https://i.rdrama.net/images/17187151446911044.webp) ![](https://i.rdrama.net/images/17093267613293715.webp) ![](https://i.rdrama.net/images/17177781034384797.webp)


What about GDPR? Digital Millennium Copyright Act takedowns? Electronic Commerce Regulations? CAN-SPAM? FTAV treats user data with a combination of disinterest and uninterest, but that's not enough to guarantee compliance with all relevant global laws and regulations.

:marseyemojirofl:

These well-intentioned regulations discourage small, community-run instances. I'm so torn about this... :merdesey:

![](/images/16746817433408995.webp)

>an amendment aimed at prosecuting social media bosses who fail to protect under-18s online.

Ohhhh, that's why. Naughty mastodon....

:marseyglancing: :marseypedo:

>designed to protect under-18s from harmful content and remove illegal content online, with Ofcom, the communications regulator, able to issue large fines to tech giants that do not comply with the law.

Criminal liability for senior managers has the backing of child safety charities and campaigners, including Ian Russell, whose daughter Molly took her own life after viewing harmful content on sites including Instagram.

Love how the UK is cucking its online presence. Just post "keep yourself safe" or :marseygasp: CP, and you can take down companies.

:marseywholesome:

