Second, if you have N random variables with a constraint, you really only have (N-1) random variables. If the sum of N items must add up to 100, you can only choose (N-1) of them freely, otherwise you'll end up breaking the constraint with - ehr - probability 1.
Of course, nobody is saying that you must always choose the same set of (N-1) "free" variables! For each iteration, I would suggest a two stage process, in which:

1. you randomly choose which of the N variables will be the "dependent" one for this iteration;
2. you generate the other (N-1) values at random and derive the remaining one as 100 minus their sum.
An iteration must be rejected if the (N-1) values do not allow the correct generation of the N-th, of course. This would happen, for example, if the (N-1) values add up to more than 100; the probability of this happening depends on the variance of the (N-1) variables. In your case, this rejection is easy to spot, because it shows up as a negative value for the N-th percentage.
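A minimal sketch of the idea, assuming some hypothetical means and a simple uniform spread around each of them (you'll want to plug in your actual distributions instead):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical model parameters, NOT from the thread: adjust to taste.
my @mean   = (30, 25, 20, 15, 10);   # means, conveniently summing to 100
my $spread = 5;                      # each value drawn in mean +/- spread

sub draw_percentages {
   my $n = @mean;
   while (1) {
      # stage 1: pick which variable will be the "dependent" one
      my $dep = int rand $n;

      # stage 2: generate the other (N-1) values freely
      my @value;
      my $sum = 0;
      for my $i (0 .. $n - 1) {
         next if $i == $dep;
         $value[$i] = $mean[$i] - $spread + rand(2 * $spread);
         $sum += $value[$i];
      }

      # derive the N-th so that the total is exactly 100...
      $value[$dep] = 100 - $sum;

      # ...rejecting the whole iteration if that breaks the constraint
      return @value if $value[$dep] >= 0;
   }
}

my @p = draw_percentages();
print join(', ', map { sprintf '%.2f', $_ } @p), "\n";
```

With these parameters the rejection branch almost never triggers, so the loop exits quickly; the wider the spread (i.e. the variance), the more iterations get thrown away.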
The above process should also address the issue that the mean values do not add up to 100, even if I would repeat my suggestion to verify your model on this point.
Flavio
perl -ple'$_=reverse' <<<ti.xittelop@oivalf
In reply to Re: Need technique for generating constrained random data sets by polettix, in thread Need technique for generating constrained random data sets by GrandFather