7
c/ethical-frontiers • riverw17 • 1mo ago

A talk with my sister about her new job monitoring AI content

My sister just started a job in Austin where she has to review and label harmful content flagged by an AI. She said the worst part is the AI keeps showing her borderline stuff, like a video of a fight where it can't tell if it's a real assault or a movie scene. She has to make the final call hundreds of times a day. It made me think about the people behind these systems and the mental toll. How do we build this tech without burning out the humans training it?
3 comments

avery_ross • 1mo ago
Mandatory therapy breaks and a strict cap on daily reviews saved my sanity doing that work.
5
susanb34 • 1mo ago
It's wild how many jobs now need those kinds of built-in rules just to be doable long-term. I see it with friends in teaching and nursing too, where they have to schedule hard stops or they'd just burn out. Your point about capping the daily reviews is key, because the work will always expand to fill the time you give it.
9
angela_knight3
"Mandatory therapy breaks" seems like a bit much for just watching videos all day. It's not like she's a first responder; it's a desk job. People deal with way worse.
3