Back in 2016, concern about Russian influence on social media focused on specific accounts that could be traced back to troll farms like Russia's Internet Research Agency, particularly high-karma accounts that posted large volumes of disinformation or propaganda.
But Reddit has long been ripe for a much subtler and harder-to-detect manipulation by any large organized group (like the Russians): there's a critical window in any post's life where it either dies in "new" or crosses a threshold that bumps it into "hot" or "trending", after which it propels itself upward. Studies of comments show similar behavior: almost all top comments were made in the first 5 minutes.
Only certain people (bored teenagers, mostly?) are willing to venture into "new" and filter the billions of pieces of mostly-garbage content. It wouldn't take much effort to manipulate that without being noticed.
So an organized entity like the Russian Internet Research Agency could generate tens of thousands of Reddit accounts over a few years and have them post and repost enough for each to build its own karma in uncoordinated ways. Then, when you want certain stories to start trending, you have a few dozen of them upvote and comment on a new post to push it past that critical early stage, or downvote ideas you don't want to trend. With enough different and seemingly uncoordinated accounts, you could do this under the radar.
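The math behind this is worth spelling out. Reddit's old open-sourced "hot" ranking (the formula below is a simplified sketch of it; the site's current ranking is private and has changed since) combines the log of a post's net score with its submission time, so the first few dozen votes are worth far more than the next few thousand. The numbers here are illustrative, not measurements:

```python
from datetime import datetime, timezone
from math import log10

# Simplified "hot" ranking, after the formula in reddit's old
# open-source codebase. 1134028003 is reddit's ranking epoch.
EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)

def epoch_seconds(date):
    return (date - EPOCH).total_seconds()

def hot(ups, downs, date):
    s = ups - downs
    order = log10(max(abs(s), 1))          # log scale: early votes dominate
    sign = 1 if s > 0 else -1 if s < 0 else 0
    seconds = epoch_seconds(date) - 1134028003
    return round(sign * order + seconds / 45000, 7)

now = datetime(2020, 1, 1, tzinfo=timezone.utc)
organic = hot(1, 0, now)    # a typical post dying in "new"
boosted = hot(31, 1, now)   # the same post after ~30 coordinated early upvotes

# Because of the log scale, those ~30 votes buy the post the same rank
# advantage as roughly 18 extra hours of freshness.
advantage_hours = (boosted - organic) * 45000 / 3600
print(f"boosted beats organic by the equivalent of {advantage_hours:.1f} hours")
```

In other words, a few dozen coordinated accounts acting in the first minutes can outrank thousands of organic votes arriving later, which is exactly why the "new" window is the cheap point of attack.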
Post details: posted 4 years ago · reddit.com/r/TheoryOfRed...
It seems like a lot of people do this, especially when you already know what all the top-voted comments are going to say.