Probability Adjustment Method

Yonsei University

Jamie J Seol

Motivation

Toss a coin 10 times. You'd expect 5 heads.

Let's try it with a computer.

[Plot: 1,000,000 tosses with no adjustment; y-value = heads - tails, drifting between roughly -1000 and 1500]

Why not Horizon?

By Chebyshev's inequality, the proportion of heads should converge to 1/2 as n grows, so the plot should flatten out like a horizon.

But it did not, because of the computer's pseudo-random number generator.

The maximum heads-tails difference was 1554, which is only about a 0.15% error, so it's not that critical.

Still, we want to adjust for this error when using pseudo-random numbers.

Objective

We want n/2 heads out of n tosses.

Assume we can change the probability of each toss result.

If we get d tails in a row, then the probability of heads should be increased, depending on d.

Using an adjustment method, we can control how fast the actual outcome converges to the given probability.

Example

Let h be the number of heads and t the number of tails in n tosses.

Let d = t - h.

For d > 0, changing the probability of tails to (1/2)^d produces a stable result, as in the sketch below.
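As a rough sketch, a simulation of this (1/2)^d rule might look like the following. The slide only states the d > 0 case; the mirrored handling of d < 0 here is an assumption.

from random import random

n = 1_000_000
d = 0  # running difference: tails - heads
for _ in range(n):
    if d > 0:
        p_tail = 0.5 ** d            # many extra tails: strongly favor heads
    elif d < 0:
        p_tail = 1.0 - 0.5 ** (-d)   # assumed mirror rule for extra heads
    else:
        p_tail = 0.5                 # balanced: fair toss
    d += 1 if random() < p_tail else -1
print(d)  # final tails - heads difference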

Simulation of Previous Example

[Plot: 1,000,000 tosses with the (1/2)^d adjustment; y-value = heads - tails, staying within about ±20]

Result of (1/2)^d

The maximum difference was 5, which is only a 0.0005% error!

But this method converges to the horizon too fast.

Weighted Random

Changing the probability directly to an exponential value is too fragile.

How about weighting the probability instead?

Weighting Function

Using a weighting function as an exponent won't be a big problem.

We'll use a simple growth function, like the experience-level curves in typical MMORPGs.

Weighted Random Function

If the weighting function f is distributive, i.e.

f(xy) = f(x) f(y),

then for f defined on [0, 1] and a uniform random number r in [0, 1],

by defining w(r) = f⁻¹(r),

w(r) gives a weighted random value (its cumulative distribution is f itself).

Choosing Weighting Function

For easy calculation, we'll use a simple power function that can take d into account:

f(x, n) = x^(kn + 1), where n = |d| and k > 0 is a constant.

With this weighting function, higher values get more weight as n becomes higher, and of course it's distributive: (xy)^(kn + 1) = x^(kn + 1) y^(kn + 1).

Calculation of Weighted Random

With f(x, n) = x^(kn + 1),

the weighted random function w(x, n) will be

w(x, n) = x^(1/(kn + 1)), the inverse of f in x.

Therefore w(r, n) is likely to give a higher value as n becomes bigger.
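As a minimal sketch, assuming the form reconstructed above (it matches the appendix simulator), the weighted random draw can be written as:

from random import random

def weighted_random(n, k):
    # Draw a value in [0, 1]; larger n pushes the value toward 1.
    r = random()                      # uniform on [0, 1]
    return r ** (1.0 / (k * n + 1))   # inverse of f(x, n) = x^(kn + 1)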

Graph of Weighting Function

[Graphs of the weighting function for k = 1, n = 2 and for k = 1, n = 4]

Adjustment #1

With the functions above, we can adjust the probability precisely. For example, if d = 5, meaning the total number of tails is ahead of the total number of heads by 5, then we might want the next toss to come up heads with 95% probability.

Calculation
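A sketch of one possible calibration (it is an assumption that this is the criterion used here): choose k so that the expected weighted random value equals the target probability.

E[w(r, d)] is the integral of r^(1/(kd + 1)) over [0, 1], which equals (kd + 1) / (kd + 2).

Setting (kd + 1) / (kd + 2) = 0.95 gives kd = 18, so d = 5 yields k = 18/5 = 3.6, close to the k = 3.5997 used below.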

Simulation with Adjustment #1

[Plot: 1,000,000 tosses with k = 3.5997; y-value = heads - tails, staying within about ±20]

Result of Adjustment #1

Although the calculation was exact, meaning that d = 5 produces heads with 95% probability, it still converges too fast, even faster than directly changing the probability to an exponential value!

Adjustment #2

We'll repeat a similar adjustment method, but with more reasoning.

Let's set up the situation so that d = 100 produces tails with 95% probability.

In this case, k will be 0.180004.
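Under the same assumed calibration as in Adjustment #1, kd = 18 with d = 100 gives k = 0.18, matching the 0.180004 quoted here.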

Simulation with Adjustment #2

[Plot: 100,000 tosses with k = 0.180004; y-value = heads - tails, staying within about ±20]

Not Bad!

In this method, what we were actually doing was bounding the maximum difference, but the result gave too rigid a bound.

Why did this happen? The proportion of weighted random values greater than 0.5 with n = 10 and k = 0.180004 is about 68%.

Since the weighting function gives exponential weight, we need a more gradual one.
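A quick way to check such a proportion empirically, assuming the weighted random draw sketched earlier (the exact value depends on the precise weighting function), is a small Monte Carlo estimate:

from random import random

def weighted_random(n, k):
    # weighted random value in [0, 1], assuming w(r, n) = r^(1/(kn + 1))
    return random() ** (1.0 / (k * n + 1))

n, k = 10, 0.180004
samples = 1_000_000
above_half = sum(weighted_random(n, k) > 0.5 for _ in range(samples))
print(above_half / samples)   # estimated share of weighted values above 0.5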

Adjustment #3

Actually, the maximum difference in Adjustment #2 was 8, which is quite reasonable for 100,000 tosses. We'll just make another case with a more gradual version.

The new situation is d = 1000 producing tails with 95% probability.

In this case, k will be 0.0180004.

Simulation with Adjustment #3

[Plot: 100,000 tosses with k = 0.0180004; y-value = heads - tails, staying within about ±25]

Any idea?

Maybe we can use a weighting function that gives a minimum weight of 0.5 to every outcome, but calculating the weighted random with such a function is impossible in the general case.

Result

• This new method doesn't touch the probability directly.

• Although its restricting power is stronger than its set-up, we can still control the boundaries in a non-rigid way, with exact calculation.

• Now the coin will show 4~6 heads out of 10 tosses, if we desire.

Thank You!

Jamie J Seol

Yonsei University, Joyfl CSO

theeluwin@gmail.com

fb.me/theeluwin, @theeluwin

Appendix: Python Simulator Code

from random import random

t = 100000          # number of tosses
d = 0               # running difference between the two sides
k = 0.0180004       # adjustment constant from Adjustment #3

for i in range(t):
    # weighted random value, biased toward 1 as |d| grows
    r = random() ** (1.0 / (k * abs(d) + 1))
    if d == 0:
        d += -1 + 2 * round(r)    # unbiased step when balanced
    elif r > 0.5:
        d -= d // abs(d)          # step back toward zero
    else:
        d += d // abs(d)          # step farther from zero
