It’s not surprising that these videos make news. People make them because they work. Using public attention to push a big platform to fix something has proved an effective strategy over the years. TikTok, Twitter, and Facebook make it easy for users to report abuse and rule violations by other users. But when these companies themselves seem to be breaking their own policies, people often find that the best way forward is simply to post about it on the platform, in the hope of going viral and attracting the kind of attention that leads to some sort of resolution. Tyler’s two videos about the Marketplace bios, for example, each have more than 1 million views.

“Their content is being flagged because they are from a marginalized group talking about their experiences with racism. Talking about racism and hate speech can look a lot alike to an algorithm.”

Casey Fiesler, University of Colorado, Boulder

“I probably flag something about once a week,” says Casey Fiesler, an assistant professor at the University of Colorado, Boulder, who studies technology ethics and online communities. She is active on TikTok, with more than 1 million followers, and while not everything she sees seems like a legitimate concern, she says the app’s regular parade of glitches is real. TikTok has had several such bugs in the past few months, all of which disproportionately affected marginalized groups on the platform.

MIT Technology Review asked TikTok about each of these recent examples, and the responses were similar: after investigating, TikTok says, it found that the issue was created in error; it stresses that the blocked content in question did not violate its policies; and it points to the company’s support for such groups.

The question is whether that cycle, in which a technical or policy error triggers a viral response and then an apology, can be changed.

Fixing issues before they arise

“These are the two types of algorithmic content moderation harms that people are observing,” says Fiesler. “One is false negatives. People are like, ‘Why is there hate speech on this platform and why hasn’t it been taken down?’”

The other is false positives. “Their content is being flagged because they are from a marginalized group talking about their experiences with racism,” she says. “Talking about racism and hate speech can look a lot alike to an algorithm.”

She notes that both of these categories hurt the same people: those who are disproportionately targeted for abuse end up being algorithmically censored for speaking out about it.
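To make the false-positive problem Fiesler describes concrete, here is a minimal sketch of a naive keyword-based filter. TikTok has not disclosed how its moderation models actually work, so everything below, including the blocklist and the function name, is a hypothetical illustration, not the platform’s system.

```python
# Illustrative sketch only: a toy keyword-based flagger, not TikTok's
# actual moderation pipeline (which is not public). It shows why a
# classifier keyed to surface features can conflate hate speech with
# speech *about* hate.

FLAGGED_TERMS = {"racist", "hate", "slur"}  # hypothetical blocklist

def naive_flag(post: str) -> bool:
    """Flag a post if it contains any blocklisted term, regardless of intent."""
    words = {w.strip(".,!?\"'").lower() for w in post.split()}
    return not FLAGGED_TERMS.isdisjoint(words)

# A hateful post and a post describing an experience of racism both trip
# the same filter, because keyword matching carries no notion of who is
# speaking or why.
print(naive_flag("They deserve hate because of their race"))    # True
print(naive_flag("A stranger was racist to me at work today"))  # True
print(naive_flag("Here is my new dance video"))                 # False
```

Production systems use machine-learned classifiers rather than literal blocklists, but the failure mode is the same in spirit: without modeling context and speaker, a system trained on the surface features of hate speech will also fire on testimony about it.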

TikTok’s mysterious recommendation algorithms are part of its success, but its vague and constantly changing boundaries are already having a chilling effect on some users. Fiesler notes that many TikTok creators self-censor words on the platform so as not to trigger a review. And although she’s not sure exactly how much this tactic accomplishes, Fiesler has started doing it too, just in case. Account bans, algorithmic mysteries, and bizarre moderation decisions are a constant part of the conversation on the app.