Instagram Vixen


Roberts’ article on Commercial Content Moderation makes the point that there is so much content online that reviewers cannot sift through all of it to uncover inappropriate material. They rely largely on flagged content: material is posted, flagged, and only then removed. This means that only content someone else has flagged and reported is likely to be taken down.

The flagging model assumes a user will report objectionable content. But communities of like-minded people converge on shared ideas and will not consider their own content inappropriate. What do we do about content that is inappropriate but never flagged? When we rely on users to flag content, we are relying on social norms to dictate what is appropriate and what isn’t.

Photo from Taz’s Angels’ Instagram

The image above is a snippet from the Instagram page of a group of women who call themselves Taz’s Angels. They frequently post images of themselves half naked or performing inappropriate acts. So why haven’t tons of people flagged their images and videos as inappropriate and borderline pornographic? Society has accepted these women and the content of their pages. Why are pages like these so openly accepted while pornography sites remain so taboo?

The people who visit these pages most likely go there specifically to see these types of images and will not report them, while those who would report them are not in the circles where they would encounter such pages. Therefore, pages and sites with inappropriate content are visited mostly by people who sought them out and are unlikely to report what they find. How do we create new policies and systems that address this gap?

Posted from Course Blogs by Shayla B.