Hi all. Sorry for the mess of a title, I don't know the terms to properly describe what I'm trying to do. I have an equation:
Z = 10^((log(M)-6.09)/1.5)
where
M = 10^(1.5*W + 6.07)
and W is normally distributed, mean = 1.18, stdev = 0.746
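If I substitute the second equation into the first, taking log to mean log base 10 (an assumption on my part), the two collapse into a single function of W:

Z = 10^((1.5*W + 6.07 - 6.09)/1.5) = 10^(W - 0.0133)

so Z grows roughly tenfold for every unit increase in W.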
The nature of W is that it's an event that happens again and again, each occurrence with an associated Z, and those Zs can then be summed. So far I've approached this in a Monte Carlo fashion: I run the equations 10,000 times, giving me normally distributed W:
http://i.imgur.com/RsiOQZA.png
which results in 10,000 values for Z looking like this (W is on the x axis, Z on the y axis):
http://i.imgur.com/HdoCnWG.png
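For reference, this is roughly what my Monte Carlo step looks like in Python (a minimal sketch; treating log as log base 10 and the '+' in the exponent are my assumptions, so adjust if your formula differs):

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# 10,000 draws of W ~ Normal(mean=1.18, stdev=0.746)
W = rng.normal(loc=1.18, scale=0.746, size=10_000)

# M and Z as defined above (log taken as log base 10, which is an assumption)
M = 10 ** (1.5 * W + 6.07)
Z = 10 ** ((np.log10(M) - 6.09) / 1.5)
```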
Hopefully you can see that while high values of W give very high Z, there are far more values of W around 1, which would mean the many Zs from Ws near 1 could sum to more than the few Zs from Ws near 3 or 4.
I've created a bit of a hacky solution where I sort the Ws into bins and sum the Zs within each bin, resulting in a histogram like this:
http://i.imgur.com/Zmg4vas.png
and you can see that values of W between 2 and 3 make the greatest "contribution" to the sum of Z.
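For completeness, that bin-and-sum step can be done with a weighted histogram, roughly like this (a sketch continuing from the code above; the 0.25 bin width is an arbitrary choice):

```python
import numpy as np
import matplotlib.pyplot as plt

# Reusing W and Z from the sketch above.
# A weighted histogram sums the Z values falling in each W bin,
# which is exactly the "sort into bins and sum" step.
edges = np.arange(W.min(), W.max() + 0.25, 0.25)
z_sum_per_bin, _ = np.histogram(W, bins=edges, weights=Z)

centers = 0.5 * (edges[:-1] + edges[1:])
plt.bar(centers, z_sum_per_bin, width=np.diff(edges), edgecolor="black")
plt.xlabel("W")
plt.ylabel("sum of Z in bin")
plt.show()
```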
I'm sure it's possible to solve this algebraically, but I don't have the mathematical chops for it. I think it should be possible to draw a graph of W against the sum of Z directly.
Could someone point me in the right direction? Thanks!