SAN JOSE, Calif. (KGO) -- Even though Facebook and other platforms say they're on the lookout for objectionable material, the live stream from New Zealand got past them. Facebook tweeted that police had alerted them. By that time, the video had run 17 minutes, and it went viral.
"If it takes police contacting them to let them know, they have no idea what's going on over the platform," said Dr. Don Heider, a former TV reporter and a veteran journalism educator.
Heider is Executive Director of the Markkula Center for Applied Ethics at Santa Clara University. He says social platforms need to incorporate ethics into their culture and speed up users' ability to report violent and objectionable feeds.
"If on your Facebook page you started seeing something disturbing or violent that someone was live streaming, there could be a panic button. There could be a stop button. There could be a 911 button," said Heider.
Experience often helps news organizations recognize when a situation could become graphic. When a UPS truck was commandeered in San Jose recently, we stopped SKY7's live feed when it appeared police might shoot the suspect, which indeed happened.
That's an example of how human intervention, simply pressing a button, can keep that kind of video from going out. But is there a better solution?
Social platforms say they're using a combination of humans and artificial intelligence, or AI. However, AI reliability is far from perfect. Ryan Welsh, founder & CEO of AI company Kyndi, says it's unclear what AI is seeing to predict a need for action.
"We really don't know what it's actually looking at," he said. "There's been some research around accurately being able to predict cars, but is it actually triggering on the car, or is it triggering on the shadow on the road? We don't know."
People can trick AI and humans by cropping or manipulating video to avoid detection. For that reason, Facebook says it's monitoring closely in case someone tries to repost the New Zealand live stream.
While it missed that video, Facebook says it removed over a quarter-million violent videos just in the last three months of last year.