My feed showed me a video of a protest I was in, but it cut out the part where the cops shoved us
I was scrolling last Tuesday when an app suggested a clip from a rally I attended in Chicago. The video was sharp, but it only showed protesters yelling, not the moment three officers shoved a guy right next to me. The algorithm had picked the most 'engaging' angry faces. It hit me then: I've been treating these feeds like news, but they're just showing me whatever keeps me glued, not what actually happened. I felt sick realizing I'd probably been getting a skewed view of other events for months, trusting a system built for clicks, not truth. So which is the bigger problem: the biased editing itself, or the fact that the algorithm rewards it by pushing it to more people? Where do we even start to fix that?