
I'm trying to generate lognormal samples.
When I do something like this:
double sample = LogNormal.Sample(aRandomGenerator, 0.000001, 0.000001);
I get numbers back close to 1, but I was expecting them to be distributed around my mean (0.000001).
What am I doing wrong?



It looks like I get the correct result if I do this:
double sample = Math.Log(LogNormal.Sample(aRandomGenerator, 0.000001, 0.000001));
Why's that?
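To illustrate what's going on here (a sketch in Python rather than C#, using the standard library's lognormal generator, which takes the same mu/sigma parameterization): a lognormal variate is exp(N) where N is normal with mean mu and standard deviation sigma, so with mu = sigma = 0.000001 the raw samples cluster near exp(mu) ≈ 1, and taking the log recovers a normal centered on mu.

```python
import math
import random

random.seed(42)
mu, sigma = 1e-6, 1e-6

# A lognormal variate is exp(N) where N ~ Normal(mu, sigma),
# so with tiny mu and sigma the samples sit near exp(0) = 1.
samples = [random.lognormvariate(mu, sigma) for _ in range(100_000)]
print(sum(samples) / len(samples))   # close to 1, not to 1e-6

# Taking the log recovers the underlying normal, centered on mu.
logs = [math.log(s) for s in samples]
print(sum(logs) / len(logs))         # close to 1e-6
```

That is why wrapping the sample in Math.Log appears to "fix" it: you are looking at the underlying normal variable, whose mean really is 0.000001.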


May 10, 2013 at 9:11 AM
Edited May 10, 2013 at 9:14 AM

From a first look it seems it is working correctly. The mu/sigma parameters do not specify the mean and standard deviation of the variable itself (those are exposed as the distribution's "Mean" and "StdDev" properties), but of the natural logarithm of the variable, which confirms your observation in your second post. This is consistent with Wikipedia, although admittedly quite confusing.
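If what you actually want is samples whose own mean and standard deviation match your targets, you can invert the standard lognormal moment formulas to get the mu/sigma to pass in. A sketch of the conversion (in Python for convenience; `lognormal_params` is a hypothetical helper name, and the formulas are the textbook ones, sigma² = ln(1 + v/m²) and mu = ln(m) − sigma²/2):

```python
import math

def lognormal_params(mean, stddev):
    """Convert the desired mean/stddev of the lognormal variable itself
    into the (mu, sigma) parameters of its natural logarithm."""
    var = stddev ** 2
    sigma2 = math.log(1.0 + var / mean ** 2)
    mu = math.log(mean) - sigma2 / 2.0
    return mu, math.sqrt(sigma2)

mu, sigma = lognormal_params(1e-6, 1e-6)

# Sanity check: the analytic lognormal mean exp(mu + sigma^2/2)
# recovers the target mean.
print(math.exp(mu + sigma ** 2 / 2))   # ≈ 1e-6
```

If I remember correctly, Math.NET Numerics also exposes a factory along the lines of LogNormal.WithMeanVariance that performs this conversion for you, but check the version you are using.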




