Candid: The next attempt at information control

Facebook and other social networks are invaluable to intelligence agencies. The right text-analysis software, combined with teams of translators, opens the gates to torrents of information in users’ posts, replies, chats, and reactions, ranging from flirtations to political dissent. It is ludicrous to think the CIA would allow anything to obstruct any possible route to this data, which comes from more than one hundred countries. One way of controlling the flow and influence of content on social networking websites is smart censoring: enter Candid. Made by ex-Googlers, Candid is developing an AI to automatically recognize harmful content and bad users (as defined by the developers) by analyzing posts, replies, and exchanges. This AI might be sold and used for:


1) Automatically censoring unwanted content for specific users (a rough sketch of this kind of filtering follows the list).
2) Connecting like-minded users to increase their time on ad-sponsored websites.
3) Rewriting phrases and words known to annoy some users, with the substitutions tailored to the tastes of specific visitors.
4) Promoting popular posts and making them viewable to more users.
5) Promoting advertiser-friendly posts and hiding posts that drive advertisers away.
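
To make the first item concrete, here is a minimal sketch, in Python, of how threshold-based post hiding could work. Everything in it is an assumption: the keyword weights stand in for whatever classifier Candid actually trains, and the names, threshold, and example data are invented purely for illustration.

```python
# Purely hypothetical sketch: Candid's real models are not described in the post,
# so every name, term weight, and threshold below is invented for illustration.
from dataclasses import dataclass
from typing import List


@dataclass
class Post:
    author: str
    text: str


# Stand-in "harm" scorer: a production system would use a trained text classifier,
# but a tiny keyword table is enough to show the filtering logic.
FLAGGED_TERMS = {"scam": 0.6, "idiot": 0.4, "shill": 0.3}


def harm_score(text: str) -> float:
    """Sum the weights of flagged words in the post, capped at 1.0."""
    words = text.lower().split()
    return min(1.0, sum(FLAGGED_TERMS.get(w, 0.0) for w in words))


def filter_feed(posts: List[Post], threshold: float = 0.5) -> List[Post]:
    """Keep only posts whose harm score stays below the hiding threshold."""
    return [p for p in posts if harm_score(p.text) < threshold]


if __name__ == "__main__":
    feed = [
        Post("alice", "Great article, thanks for sharing"),
        Post("bob", "This is a scam and you are an idiot"),
    ]
    for post in filter_feed(feed):
        print(post.author, "->", post.text)
```

The only real design decision on display is the threshold: whoever sets it decides which posts a given user never sees.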

What is alarming is that if this AI succeeds in achieving these goals, big companies will begin implementing it, and social networking will become AI-supervised. For a detailed look at the AI and the aims of its developers, watch Harmful Opinions’ videos on the subject.
