Hello,

I'm wondering how to approach a probability-simulation problem in C#.

Let's say we know that some task has a 5% probability of happening, and there are, say, 100 identical tasks. Theoretically 5 tasks should happen, but since the tasks are independent, each one individually has a 5% probability of happening. How can I simulate this case in code?

I presume I could generate 5 random numbers using the Random class (Random rand = new Random(); rand.Next(1, 100)), then generate another 100 numbers and check whether any of them falls in the first set. But I suspect this isn't truly random: generating the first 5 numbers may "use up" the random potential of those values, and yet they can still happen to be generated again.

Is there an accurate way of doing this?

Thanks

For each task, generate a random integer from 1 to 100 inclusive and check whether it is greater than 95 (note that Random.Next's upper bound is exclusive, so use rand.Next(1, 101)). Assuming the generator is uniformly distributed (which the documentation claims), exactly 5 of the 100 possible values (96 through 100) pass the check, so each task happens with 5% probability. Repeat this independently for every task.
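
A minimal sketch of that approach, assuming the 5%/100-task figures from the question (the class and variable names are just illustrative):

    using System;

    class TaskSimulation
    {
        static void Main()
        {
            const int taskCount = 100;   // 100 identical tasks from the question
            var rand = new Random();

            int happened = 0;
            for (int i = 0; i < taskCount; i++)
            {
                // Next(1, 101) returns an integer in [1, 100];
                // 96..100 are exactly 5 of the 100 outcomes, i.e. 5%.
                if (rand.Next(1, 101) > 95)
                    happened++;
            }

            Console.WriteLine($"{happened} of {taskCount} tasks happened");
        }
    }

Equivalently, rand.NextDouble() < 0.05 expresses the probability directly and sidesteps the off-by-one trap in the integer bounds.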