Tim Korso

Despite ongoing debates regarding toxicity and healthy conversations online, it seems the latest solution proposed by Twitter does not have many fans. A recently rolled-out feature in which Twitter warns users that they are about to enter a "heated conversation" was a poorly calculated move on the part of the social media giant, akin to Playboy's decision to stop "showing boobies", Fox Business host Lisa Kennedy Montgomery suggested during a recent "The Five" panel.
Kennedy went on to call the feature, which is still being tested, a "really bad marketing strategy" for Twitter, pointing out that heated conversations are one of the main reasons people use the platform. The Fox host added that apart from Playboy, which removed nude pictures from its magazine in 2015 and called the move a mistake in 2017, there were other examples of removing key content hurting a product.

"And like, when Tumblr was like, no, we're done with porn. No one uses Tumblr anymore. It's like, that's what people go to Twitter for. They go because it's abusive, it's filthy", Kennedy said.

Her colleague on "The Five" panel, Greg Gutfeld, also chipped in on the issue, suggesting there are better ways of making the platform less toxic than labeling heated conversations. Specifically, he brought up Twitter trends, which he compared to "irrational mob attacks", alleging that by underscoring an issue's popularity they foster attacks on the individuals at its centre.

"So you go on, you look, oh look, Kenny Loggins is trending! And you go, you find out, oh my God, he did something. Twitter trends is one of the most destructive things in social media because it directs people to somebody who is in hot water. And everybody likes to go watch the frog boil in that bucket of hate", Gutfeld noted.

The new label, which so far appears only for mobile Twitter users, was introduced hot on the heels of a scandal involving Twitter's rival, Facebook. A former Facebook employee leaked what were alleged to be the company's internal evaluations of the content its algorithm was surfacing in users' feeds. They reportedly showed that toxic content, fearmongering and conspiracy theories generated the greatest user activity, which pushed such posts higher in feeds.
The whistleblower claimed that upon learning this, Facebook did nothing to change the algorithm and prevent such content from going viral, purportedly because doing so would reduce user engagement and the company's profits. Nor did it allegedly change the algorithms of another of its platforms, Instagram, after learning that the app caused mental health problems for 20% of the teens using it.