Abstract: It is difficult for machines to simulate the experience of ordinary users, to understand the meaning behind the content, and to make accurate “human judgment”.

Giiso Information, founded in 2013, is a leading technology provider in the field of "artificial intelligence + information" in China, with top technologies in big data mining, intelligent semantics, knowledge graphs and other fields. Its products include an information robot, an editing robot, a writing robot and other artificial intelligence products. On the strength of its technology, the company received angel-round investment shortly after its founding, and in August 2015 raised a $5 million pre-A round from GSR Venture Capital.

Recently, Facebook CEO Mark Zuckerberg announced that the company would add 3,000 moderators to review content on the platform, including videos of murder, suicide and rape that have recently surfaced online.

It’s not unusual for tech giants to hire large numbers of people to review content. Google, Baidu, Tencent, Sina and Toutiao have all set up similar posts. Content review is especially important for content platforms; handled badly, it can threaten the prospects of an entire company. With technology so advanced today, why can’t machines completely replace traditional manual review? Do the giants really still need armies of human moderators?

The Achilles Heel of content platforms

In April, a Thai man killed himself after he broadcast the murder of his daughter live on Facebook. The video was up on Facebook for nearly 24 hours, with scores of users watching the baby being killed. One of the videos has been viewed more than 250,000 times and has been uploaded to other video sites by other users.

Facebook has been embroiled in controversy over sexual, violent and criminal content. In March, a 15-year-old American girl was sexually assaulted by several men while the attack was streamed live on Facebook, watched by at least 400,000 people. One viewer said the images were so frightening and disturbing that they “couldn’t sleep at night”.

After the incident, the Facebook team realized how serious the problem was. Facebook CEO Mark Zuckerberg has publicly stated that Facebook will do everything in its power to prevent such content from ever appearing again, and that Facebook still has a lot of work to do to become a healthy and safe online community.

Facebook is expanding its recruitment of content moderators. The company already has 4,500 employees around the world reviewing content, and this expansion is meant to make the process more timely and accurate. These employees are responsible for handling objectionable content that is not caught and removed immediately.

In the age of social media, huge numbers of users produce content every day. For platforms, this means spending not only on subsidies for content producers but also on the growing cost of content review. Even so, platforms are willing to commit the manpower and money that thorough review requires. On the one hand, a flood of objectionable content disgusts users, who use the product less often or abandon it altogether; bad money drives out good, which undermines the platform’s long-term development. On the other hand, publishing objectionable content is likely to violate national laws, inviting regulatory control or even closure of the platform. Earlier this year, ZANK, a well-known gay community app, was shut down entirely because of pornography in its live broadcasts.

In China, the big platforms run by BAT (Baidu, Alibaba and Tencent) and their peers all employ content reviewers. Take Sina Weibo, one of the country’s biggest social platforms: its content review still relies mainly on manual moderation. Beyond pornography and violence, the platform also watches for sensitive content involving politics and religion. Even so, all kinds of borderline and objectionable content still slip through the net. Content review and gatekeeping is the Achilles heel of every content platform, and no one, from startups to tech giants, dares to relax on this front, unless a platform deliberately uses pornography to attract users, a tactic that certainly won’t last.

The desperate task of reviewing content

In 2013, more than 10 Internet companies, including Baidu, Tencent and Kingsoft, formed a “security alliance” and advertised for “chief pornography identification officers” at a high salary of 200,000 yuan. A job where you get paid handsomely to watch videos: netizens joked that it was “simply unreasonable”.

In fact, content review is not as simple as it might seem. Take Facebook as an example. Its moderators have to deal with a large volume of user-generated content (UGC) every day, reviewing objectionable material including pornography, violence and crime, and the judgment calls involved put them under great pressure. They have to tell the difference between an ordinary photo of a child and child pornography, and between a joke among friends and genuinely offensive material. They have to categorize carefully and weigh context. If the vetting goes wrong in the other direction, Facebook could be accused of violating citizens’ free-speech rights.
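Why context matters is easy to see in a toy example. The sketch below is purely illustrative and is not any platform’s actual system: a naive keyword filter, the simplest kind of automated review, flags text by word matching alone, so it cannot distinguish a joke among friends from a real threat.

```python
# Illustrative sketch only (hypothetical filter, not a real moderation system):
# a naive keyword-based filter that flags content without understanding context.
FLAGGED_TERMS = {"kill", "attack"}

def naive_filter(text: str) -> bool:
    """Return True if the text contains any flagged term."""
    words = set(text.lower().split())
    return bool(words & FLAGGED_TERMS)

# A harmless joke between friends gets flagged (false positive)...
print(naive_filter("I'll kill you at chess tonight"))   # True
# ...while harmful content phrased indirectly passes (false negative).
print(naive_filter("meet me there, bring the stuff"))   # False
```

Both failure modes in this sketch are exactly the judgment calls the article describes: telling intent and context apart is what still requires human reviewers.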

A reviewer’s work can take a heavy toll on personal life. After reviewing child pornography for long periods, some become paranoid about physical contact. Watching too much pornographic material can even affect their sex lives and relationships, as they become desensitized to it.


In January, members of Microsoft’s online safety team filed a lawsuit against the company. Forced to view images and videos of horror, pornography and murder every day, they say they suffer lasting psychological effects: “insomnia, nightmares, and images and videos in their heads.” Microsoft said it disagreed with the two former employees’ claims and that the company provided them with monthly psychological support.

The situation is much the same in China. Many companies outsource content review to external firms, which employ large numbers of people to review text, pictures, videos and other content; each reviewer handles hundreds of items or more every day. These reviewers earn 3,000 to 5,000 yuan a month and live mostly in China’s second- and third-tier cities. A Reuters report previously described the work at Sina Weibo as “high-pressure, hopeless work” (at least 3,000 posts per hour), with many people leaving because the stress is too great and they see no room for advancement.