Where does social media go from here? The leaked documents known as the Facebook Papers hammered home the fact, if there was any doubt remaining, that even the world's most sophisticated content moderation systems can't keep pace with human misbehavior at the billions-of-users scale, or with the damage generated by algorithms designed to maximize engagement.

Jeff Allen spent four years working as a data scientist at Facebook, including two years on its integrity team, before leaving in late 2019. In October, along with former colleague Sahar Massachi, he launched the Integrity Institute, an organization devoted to bringing together current and former integrity researchers to develop ideas and best practices for an industry that sorely needs them. We spoke earlier this week about how companies like Facebook should rethink their approach to platform design, including taking a page from the media industry at the turn of the 20th century.

WIRED: We've all heard a lot about how Facebook optimizes for engagement. But you could argue that people just like to click on the scandalous, the naughty, the provocative, the whatever. Are we blaming algorithms for user behavior?

Jeff Allen: I got really into this because how crime is discussed online is very problematic. There are things that happen in the media industry that are also very problematic, and "If it bleeds, it leads" evening news is that kind of thing. Interestingly, when that type of TV evening news emerged in the late '80s and early '90s, it was very metrics-driven. There were new measurement systems that allowed the TV industry to realize that whether people who watch the program before the evening news stick around for the actual news broadcast has a whole lot to do with the first five minutes.

WIRED: So what you're saying is that the whole TV news practice of leading with salacious crimes, which gives viewers a warped perception of the risks of violent crime, came from a new ability to focus on engagement as a metric.

Jeff Allen: That was true in the newspapers on the sidewalks in the 1800s, it was true in evening news in the '90s, and it's true when you're optimizing click-through rate and the like. So yes, I do believe that if you're just giving people what they want in that moment, it is going to skew toward sensational stuff.