Our bot is u/DrRonikBot.
We rely on scraping a few pages that are necessary for moderation but have no means of retrieval via the data API. Specifically, we read Social Links, which have never been available via the data API (the Devvit-only calls aren't useful: our bot and its dependencies are not under a compatible license, and we could not relicense the dependencies even if we spent the months or years it would take to rewrite the entire bot in TypeScript). During the API protests, we were assured that legitimate use cases like this would be whitelisted for our existing tools.
However, sometime last night we started being blocked by a redirect to anti-bot JavaScript meant to prevent scraping. This broke the majority of our moderation functions: Social Links are a widely used bypass for scammers targeting communities like ours, so we rely on being able to check those fields for prohibited content. Bad actors seem well aware of how limited bots are at reading these fields, and our method was the only one that remained sufficient, up until Reddit blocked it.
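Roughly, the check that just broke looks like the sketch below (a simplified Python stand-in, not our production code: the blocklist, user agent, and function name are illustrative placeholders):

```python
import requests

BLOCKED_DOMAINS = ["example-scam-site.com"]  # hypothetical blocklist
USER_AGENT = "DrRonikBot"  # our real UA also contains the bot's username

def has_prohibited_social_links(username: str) -> bool:
    """Fetch a user's public profile page and look for blocked domains.

    Social Links only appear in the rendered profile, so without a data API
    endpoint, scraping the profile is the only way for a bot to see them.
    """
    resp = requests.get(
        f"https://www.reddit.com/user/{username}",
        headers={"User-Agent": USER_AGENT},
        timeout=10,
    )
    # Since last night this request is redirected to an anti-bot JS challenge
    # instead of returning the profile HTML.
    resp.raise_for_status()
    page = resp.text.lower()
    return any(domain in page for domain in BLOCKED_DOMAINS)
```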
Additionally, our data API access appears to have been largely shut off as well, with most calls returning only a page complaining about "network policy" and terms of service violations.
What do we need to do to get whitelisted for both these functions, so we can reopen all of our communities?
Our bot's user agent contains the username of our bot (DrRonikBot). If more info is needed, I can provide it, though I have limited time to respond and would appreciate it if Reddit could simply whitelist our UA or provide some other means, such as adding a data API endpoint (we really only need read access to Social Links).
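For reference, the UA follows the usual `<platform>:<app ID>:<version> (by /u/<username>)` convention, so it should be straightforward to match on; the platform and version values in this sketch are placeholders:

```python
import requests

# Placeholder platform/version values; the important part is that every
# request we send carries "DrRonikBot" in the User-Agent.
session = requests.Session()
session.headers["User-Agent"] = "python:drronikbot:v1.0 (by /u/DrRonikBot)"
```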