I was never a huge fan of probability, and I was never particularly good at it. Now I actually need to solve a probability problem, but I can't figure out how.
I have a normal distribution with a given mean and standard deviation. It describes how a random number x is selected. This is mathematics, not computer science, so there are no limits on the number being selected or on its precision: x is a real number. Let's say P(x) is the probability density of x being selected.
Now, what I need to find out: given numbers y and n, what is the chance that after selecting n numbers x1, x2, ..., xn, each drawn with density P(x), their product x1*x2*...*xn equals y?
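One thing worth noting: since x is a continuous real number, the chance of the product being *exactly* y is zero; what you can ask for is the probability density of the product at y (or the chance of landing in a small interval around y). Closed-form densities for products of normals get complicated quickly, so if a numeric answer suffices, a Monte Carlo estimate is a straightforward fallback. Below is a minimal sketch of that idea; the function name, the window width `eps`, and the trial count are my own choices for illustration, not anything from the question.

```python
import random

def product_density_estimate(y, n, mu, sigma, trials=200_000, eps=0.05):
    """Monte Carlo estimate of the density at y of the product of
    n independent draws from a normal distribution N(mu, sigma).

    Simulates many products and counts the fraction that fall in
    the window [y - eps, y + eps], then divides by the window
    width 2*eps to turn that fraction into a density estimate.
    """
    hits = 0
    for _ in range(trials):
        prod = 1.0
        for _ in range(n):
            prod *= random.gauss(mu, sigma)
        if abs(prod - y) <= eps:
            hits += 1
    return hits / (trials * 2 * eps)

# Sanity check with n = 1: the product is just a single draw, so the
# estimate at y = mu should be close to the normal pdf's peak value,
# 1 / (sigma * sqrt(2*pi)) ≈ 0.3989 for the standard normal.
print(product_density_estimate(0.0, 1, 0.0, 1.0))
```

Shrinking `eps` reduces the smoothing bias of the window but increases the variance of the estimate, so for a fixed number of trials there is a trade-off between the two.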
I've got some calculations on paper, but I think I'm not even on the right track... Does anybody here have any idea how to go about this?