I assume this is going to be a pretty basic question, but a detailed explanation would be of great help to me.
I'm struggling to understand whether my reasoning is correct.
Let's say I have a die and I throw it 100 times. I am aware that the throws are independent of each other, and therefore at any point during the sequence I always have a 1/6 chance of getting any specific face.
My consideration is that if I treat the hundred throws as a whole, then there is a certain probability P(sequence) of a particular sequence happening.
This is to say that there are certain sequences that are more likely to happen than others. For example, if I throw the die 99 times and get a one every single time, my theory is that I am better off betting that the last throw will not be a one, because the probability of 100 throws all being one is less than the probability of 99 ones followed by one other face.
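To make the comparison I have in mind explicit (assuming a fair die and independent throws), these are the two probabilities I am weighing against each other:

$$P(\text{100 ones}) = \left(\tfrac{1}{6}\right)^{100} \qquad \text{vs.} \qquad P(\text{99 ones, then any other face}) = \left(\tfrac{1}{6}\right)^{99}\cdot\tfrac{5}{6},$$

and the first is clearly the smaller of the two.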
I think what I may be getting wrong is the fact that I'm always better off betting that the result of the die isn't one specific face, simply because there are five other faces and clearly 1/6 < 5/6. But in a certain way, and I know that this next sentence is probably wrong, it feels like it should be better to bet on one of the faces from the set that doesn't include the one, because a sequence of 100 ones is less likely than 99 ones followed by any other face.
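As a way to test that feeling, here is a rough simulation sketch of the situation I am describing, scaled down to 4 throws instead of 100 (with 99 leading ones the conditioning event would essentially never show up in a simulation). It just collects the sequences whose first throws are all ones and tabulates how the last throw is distributed among them:

```python
import random
from collections import Counter

random.seed(0)

# Scaled-down version of the question: 4 throws instead of 100, because with
# 99 leading ones the conditioning event would essentially never be sampled.
N_TRIALS = 1_000_000
PREFIX_LEN = 3  # number of leading ones we condition on
last_throw_counts = Counter()

for _ in range(N_TRIALS):
    throws = [random.randint(1, 6) for _ in range(PREFIX_LEN + 1)]
    # Keep only the sequences whose first PREFIX_LEN throws are all ones.
    if all(t == 1 for t in throws[:PREFIX_LEN]):
        last_throw_counts[throws[-1]] += 1

total = sum(last_throw_counts.values())
print(f"sequences starting with {PREFIX_LEN} ones: {total}")
for face in range(1, 7):
    print(f"P(last throw = {face} | {PREFIX_LEN} ones first) ~ {last_throw_counts[face] / total:.3f}")
```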
I hope I explained this clearly, and I would be grateful for any help in understanding it better.