Consider a set of height measurements, approximately normally distributed and accurate to 0.1 cm. Then consider how many of the people at each height have a heart attack.
Now, you as a researcher believe short people may be at risk of heart attacks (the theory being that they had intrauterine stress) and that tall people may also be at risk (they need to be hypertensive to perfuse their brains, and may also have connective tissue disorders).
But when there are few people at a given height, say five, any non-zero number of heart attacks gives an observed risk of at least 20%. So the sparsely populated extremes of the height distribution will show inflated rates even if the true risk is flat.
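A minimal simulation of the effect (all numbers are hypothetical: mean 170 cm, SD 10 cm, and a flat 1% true risk at every height) shows the sparse tail bins reporting rates of 20% or more purely from small denominators:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical heights (cm), binned to 0.1 cm, with a flat
# true heart-attack risk of 1% at every height.
heights = np.round(rng.normal(170, 10, 100_000), 1)
events = rng.random(heights.size) < 0.01

# Observed event rate per height bin.
bins, counts = np.unique(heights, return_counts=True)
rates = np.array([events[heights == b].mean() for b in bins])

# In sparse bins (n <= 5), any non-zero observed rate is
# necessarily >= 1/5 = 20%, so the extremes look risky even
# though the true risk is flat.
sparse = counts <= 5
assert np.all(rates[sparse][rates[sparse] > 0] >= 0.2)
```

The final assertion holds by arithmetic, not by chance: with at most five people in a bin, one event is already a 20% rate.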
Is there a name for this? If not I shall call it the keithreid-sfw bias. (Jokes)
Thanks.
For context, my actual problem is incidents of aggression on a ward, and we think that extreme temperatures increase their risk.
A scatter plot does indeed show that more extreme temperatures carry more risk of events, with a U-shaped curve. Most temperatures have some rate of events, and the increase at the extremes is larger than this bias alone would produce.
I am controlling for this by plotting three lines on the scatter graph: the data; a flat line representing the overall event rate; and a low U-shaped curve showing what the rate would be if there were just one event per temperature. The data sit much higher up the y-axis than that floor.
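The three-line comparison above can be sketched like this (the ward data are hypothetical: made-up temperature counts and a made-up U-shaped true risk, just to show the construction of the three series):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical temperatures (deg C), roughly normal, so
# extreme temperatures are observed on few occasions.
temps = np.round(rng.normal(20, 4, 2_000))
t_bins, n_per_temp = np.unique(temps, return_counts=True)

# Hypothetical event counts with a genuinely U-shaped risk.
true_risk = np.clip(0.05 + 0.002 * (t_bins - 20) ** 2, 0, 1)
events = rng.binomial(n_per_temp, true_risk)

# Line 1: the data -- observed rate per temperature.
observed_rate = events / n_per_temp
# Line 2: a flat line at the overall rate.
overall_rate = events.sum() / n_per_temp.sum()
# Line 3: the "one event per temperature" floor -- the
# U-shaped artefact that sparse bins alone would produce.
one_event_floor = 1.0 / n_per_temp

# The floor is highest where the data are sparsest (extremes).
assert one_event_floor[0] > one_event_floor[len(t_bins) // 2]
```

Plotting `observed_rate`, the constant `overall_rate`, and `one_event_floor` against `t_bins` gives the three lines; if the data line sits well above the floor at the extremes, the U shape is not just the small-denominator artefact.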
Post Details
- Posted 1 year ago
- Subreddit: reddit.com/r/AskStatisti...