
Page 1: Lecture 14 Loop Optimization and Array Analysis

Carnegie Mellon

Lecture 14: Loop Optimization and Array Analysis

I. Motivation
II. Data dependence analysis

Chapter 11.1-11.1.4, 11.6

Dror E. Maydan CS243: Loop Optimization and Array Analysis 1

Page 2: Lecture 14 Loop Optimization and Array Analysis


Software Pipelining

for i = 1 to n
    A[x] = A[y]

• Can I software pipeline the loop?


LD[1]
loop N-1 times
    LD[i+1]
    ST[i]
ST[n]

Page 3: Lecture 14 Loop Optimization and Array Analysis


Software Pipelining

for i = 1 to n
    A[i] = A[i+n]

for i = 1 to n
    A[2*i] = A[2*i+1]

• Can I software pipeline the loop?


LD[1]
loop N-1 times
    LD[i+1]
    ST[i]
ST[n]

Page 4: Lecture 14 Loop Optimization and Array Analysis


Software Pipelining

for i = 1 to n
    A[i] = A[i-1]

for i = 1 to n
    A[i] = A[i+1]

• Can I software pipeline the loop?
  – We moved LD[i+1] ahead of ST[i]
• In case 1, the load of a[i] is moved ahead of the store of a[i]
• In case 2, the load of a[i+2] is moved ahead of the store of a[i]


LD[1]
loop N-1 times
    LD[i+1]
    ST[i]
ST[n]
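The question reduces to a memory-overlap question: the pipelined schedule above moves the load of iteration i+1 ahead of the store of iteration i, so it is only safe if those two accesses can never touch the same element. A minimal brute-force sketch of that check (my own helper with an assumed small trip count, not part of the lecture):

# Does the load issued for iteration i+1 ever alias the store of iteration i?
def pipelining_conflict(read_index, write_index, n=20):
    return any(read_index(i + 1) == write_index(i) for i in range(1, n))

# Case 1: A[i] = A[i-1]  ->  iteration i reads element i-1 and writes element i
print(pipelining_conflict(lambda i: i - 1, lambda i: i))   # True: the moved-up load aliases the store
# Case 2: A[i] = A[i+1]  ->  iteration i reads element i+1 and writes element i
print(pipelining_conflict(lambda i: i + 1, lambda i: i))   # False: the accesses never overlap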

Page 5: Lecture 14 Loop Optimization and Array Analysis


Loop Interchange

for i = 1 to n
    for j = 1 to n
        A[j][i] = A[j-1][i-1] * b[i]

• Should I interchange the two loops?
  – Stride-1 accesses are better for caches
  – But one more load in the inner loop
  – But one less register needed to hold the result of the loop


for j = 1 to n
    for i = 1 to n
        A[j][i] = A[j-1][i-1] * b[i]
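Whether this interchange is safe is itself a dependence question. As a sanity check (a brute-force simulation, not the analysis the lecture develops), running the nest in both loop orders on a small array shows the results match, which the (1, 1) dependence computed later guarantees:

def run(n, interchanged):
    # A[j][i], filled with arbitrary distinct values so any mismatch would show up
    A = [[float(10 * j + i) for i in range(n + 1)] for j in range(n + 1)]
    b = [float(i) for i in range(n + 1)]
    if interchanged:
        order = [(i, j) for j in range(1, n + 1) for i in range(1, n + 1)]  # j outer, i inner
    else:
        order = [(i, j) for i in range(1, n + 1) for j in range(1, n + 1)]  # i outer, j inner
    for i, j in order:
        A[j][i] = A[j - 1][i - 1] * b[i]
    return A

assert run(6, interchanged=False) == run(6, interchanged=True)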

Page 6: Lecture 14 Loop Optimization and Array Analysis


Loop Interchange

for i = 1 to n
    for j = 1 to n
        A[j][i] = A[j-1][i-1] * b[i]

Iteration Space

Dependence Vectors



Page 7: Lecture 14 Loop Optimization and Array Analysis


Data Dependences

• Do two array references, nested in some loop nest and perhaps some if statements, refer to overlapping memory locations?
  – As a first step, we check for overlap but don't care about the nature of the dependence
• Types of dependences
  – True dependence:    a[i] = a[i-1] …
  – Anti-dependence:    a[i] = a[i+1]
  – Output dependence:  a[i] = …
                        a[i] = …
• Note that we can't distinguish true from anti dependences just by checking for overlap


Page 8: Lecture 14 Loop Optimization and Array Analysis


Memory Disambiguation

• In general undecidable at compile time

read(n)
for i = …
    a[i] = a[n] + 3

if n > 2
  if a > 0
    if b > 0
      if c > 0
        x[a^n] = x[b^n + c^n]

Can the compiler solve Fermat’s last theorem?


Page 9: Lecture 14 Loop Optimization and Array Analysis


Affine Array Accesses and Bounds

• Common patterns of data accesses (i, j, k are loop indexes, m is a loop-invariant constant):
  A[i], A[j], A[i-1], A[0], A[i+j], A[2*i], A[2*i+1]
  A[i,j], A[i-1, j+1], A[i+m, j]
• Array indexes are affine expressions of the surrounding loop indexes and loop-invariant constants
  – Loop indexes: i_n, i_{n-1}, ..., i_1
  – Integer constants: c_x, c_{x-1}, ..., c_0
  – Loop-invariant constants: m_y, m_{y-1}, ..., m_1 (so x = n + y)
  – Array index: c_n·i_n + c_{n-1}·i_{n-1} + ... + c_1·i_1 + c_x·m_y + c_{x-1}·m_{y-1} + ... + c_{n+1}·m_1 + c_0
  – Affine expression: a linear expression plus a constant term (c_0)
• Loop bounds are affine expressions of the surrounding loop indexes and loop-invariant constants
  – for i
    • for (j = 2*i; j < 3*i+7; j += 2)
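A compiler typically represents such an access as a vector of integer coefficients plus a constant term. A minimal sketch (the class and field names are mine, not the lecture's):

class AffineIndex:
    """c_n*i_n + ... + c_1*i_1 + c_0, with loop-invariant terms folded into the constant."""
    def __init__(self, coeffs, const):
        self.coeffs = coeffs   # one integer coefficient per surrounding loop index
        self.const = const     # the constant term c_0

    def evaluate(self, iteration):
        return sum(c * i for c, i in zip(self.coeffs, iteration)) + self.const

# A[2*i + 1] inside loops (i, j): coefficient 2 for i, 0 for j, constant 1
print(AffineIndex([2, 0], 1).evaluate((3, 7)))   # element 7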


Page 10: Lecture 14 Loop Optimization and Array Analysis


Single Loop, Single Array Dimension

FOR i := 2 to 5 do
    A[i-2] = A[i] + 1;

• Between the read access A[i] and the write access A[i-2] there is a dependence if:
  – there exist two iterations ir and iw within the loop bounds, s.t.
  – iterations ir and iw read and write the same array element, respectively

∃ integers iw, ir:  2 ≤ iw, ir ≤ 5;  ir = iw − 2

• Between the write access A[i-2] and the write access A[i-2] there is a dependence if:
  ∃ integers iw, iv:  2 ≤ iw, iv ≤ 5;  iw − 2 = iv − 2
  – To rule out the case where the same instance of the same reference depends on itself:
    • add the constraint iw ≠ iv

• For now, just asking if there is overlap, not the nature of the overlap
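For bounds this small, the existence questions can even be settled by enumeration. A brute-force sketch of the two tests above (the compiler of course solves the constraints symbolically):

from itertools import product

def read_write_overlap(lo=2, hi=5):
    # read A[ir] and write A[iw-2] touch the same element iff ir = iw - 2
    return [(iw, ir) for iw, ir in product(range(lo, hi + 1), repeat=2) if ir == iw - 2]

def write_write_overlap(lo=2, hi=5):
    # distinct iterations iw != iv writing the same element: iw - 2 = iv - 2
    return [(iw, iv) for iw, iv in product(range(lo, hi + 1), repeat=2)
            if iw != iv and iw - 2 == iv - 2]

print(read_write_overlap())    # [(4, 2), (5, 3)]: iteration 4 writes A[2], which iteration 2 reads
print(write_write_overlap())   # []: every iteration writes a different element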


Page 11: Lecture 14 Loop Optimization and Array Analysis


Multiple Dimensions, Nested Loops

• Only use loop bounds and array indexes that are affine functions of loop variables and loop-invariant constants

for i = 1 to n
    for j = 2*i to 100
        a[i+2*j+3][4*i+2*j][i*i] = …
        … = a[1][2*i+1][j]

• Assume a data dependence between the read and write operations if there exist:
  – a read instance with indexes ir, jr and
  – a write instance with indexes iw, jw
  such that:

  ∃ integers ir, jr, iw, jw:
      1 ≤ iw, ir ≤ n
      2iw ≤ jw ≤ 100
      2ir ≤ jr ≤ 100
      iw + 2jw + 3 = 1
      4iw + 2jw = 2ir + 1

• Equate each dimension of the array accesses; ignore the non-affine ones (here the i*i dimension)
  – No solution ⇒ no data dependence
  – Solution ⇒ there may be a dependence
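Because this particular system has small coefficients, it can be checked by brute force once a concrete n is assumed (n = 10 here; the real test keeps n symbolic):

from itertools import product

def has_dependence(n=10):
    for iw, ir in product(range(1, n + 1), repeat=2):
        for jw in range(2 * iw, 101):
            for jr in range(2 * ir, 101):
                # equate the two affine dimensions; the non-affine i*i dimension is ignored
                if iw + 2 * jw + 3 == 1 and 4 * iw + 2 * jw == 2 * ir + 1:
                    return True
    return False

print(has_dependence())   # False: iw + 2*jw + 3 = 1 is impossible with iw >= 1 and jw >= 2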


Page 12: Lecture 14 Loop Optimization and Array Analysis


General Formulation of Data Dependence Analysis

• Equivalent to integer linear programming

∃ integer vector i:  A1·i ≤ b1,  A2·i = b2

• Integer linear programming is NP-hard
  – O(n^n) where n is the number of variables


For every pair of accesses (F1, f1) and (F2, f2), not necessarily distinct, where at least one is a write:
Let B1·i1 + b1 ≥ 0 and B2·i2 + b2 ≥ 0 be the corresponding loop bound constraints. There is a dependence if

∃ integer vectors i1, i2:  B1·i1 + b1 ≥ 0,  B2·i2 + b2 ≥ 0,  F1·i1 + f1 = F2·i2 + f2

If the accesses are not distinct, then add the constraint i1 ≠ i2

Page 13: Lecture 14 Loop Optimization and Array Analysis


Data Dependence Analysis Algorithm

• Typically solving many tiny, repeated problems

– Integer linear programming packages optimize for large problems

– Use memoization to remember the results of simple tests

• Apply a series of relatively simple tests
  – Extended GCD, …
• Backed up by a more expensive algorithm
  – Fourier-Motzkin

Page 14: Lecture 14 Loop Optimization and Array Analysis


GCD Test

• Ignore bounds:
  ∃ integer vector i:  A1·i = b1
• Single equation:
  a_n·i_n + a_{n-1}·i_{n-1} + ... + a_1·i_1 = b
  There exists an integer solution iff gcd(a_n, a_{n-1}, …, a_1) divides b

Euclid's Algorithm (around 300 BC)
• Assume a and b are positive integers, and a > b.
• Let c be the remainder of a/b.
  – If c = 0, then gcd(a, b) = b.
  – Otherwise, gcd(a, b) = gcd(b, c).
• gcd(a_1, a_2, …, a_n) = gcd(gcd(a_1, a_2), a_3, …, a_n)
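A direct sketch of both pieces in a few lines, using the standard library's gcd (the function name is mine):

from math import gcd
from functools import reduce

def gcd_test_has_solution(coeffs, b):
    # a_n*i_n + ... + a_1*i_1 = b is solvable in integers iff the gcd of the coefficients divides b
    g = reduce(gcd, (abs(a) for a in coeffs))
    return b % g == 0

print(gcd_test_has_solution([2, -2], 1))   # False: 2x - 2y = 1 has no integer solution
print(gcd_test_has_solution([2, -2], 4))   # True: e.g. x = 2, y = 0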

Page 15: Lecture 14 Loop Optimization and Array Analysis


GCD Test

• Multiple equations: A·i = b

  x − 2y + z = 0
  3x + 2y + z = 5

  x = 2y − z; substitute into the second equation:
  6y − 3z + 2y + z = 5,  i.e. 8y − 2z = 5

  Replace the 5 with a 4:
  8y − 2z = 4
  Many solutions, but a simplified system of equations
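The reduced equation is exactly where the single-equation GCD test applies: gcd(8, 2) = 2 does not divide 5, so the original system has no integer solution, while a right-hand side of 4 does admit solutions. A quick check:

from math import gcd
g = gcd(8, 2)
print(5 % g)   # 1: 2 does not divide 5, so 8y - 2z = 5 has no integer solution
print(4 % g)   # 0: 2 divides 4, so 8y - 2z = 4 does (e.g. y = 1, z = 2)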

Page 16: Lecture 14 Loop Optimization and Array Analysis


GCD Test

• Multiple equations: A·i = b

  2x − 2y + z = 0
  3x + 2y + z = 5

  The LCM of 2 and 3 is 6:
  6x − 6y + 3z = 0
  6x + 4y + 2z = 10

  6x = 6y − 3z; substitute into the second equation:
  6y − 3z + 4y + 2z = 10,  i.e. 10y − z = 10

Page 17: Lecture 14 Loop Optimization and Array Analysis


After GCD Test

∃ integer vector i:  A1·i ≤ b1

• Where i is a reduced set of variables subject to the bounds constraints
  – Originally, say we had an N-dimensional array in a loop nest m deep
  – N equality constraints, 2m variables
  – Assuming the equality constraints are independent, each equality constraint eliminates one variable
  – 2m − N variables remain


Page 18: Lecture 14 Loop Optimization and Array Analysis


Fourier-Motzkin Elimination

• Used to eliminate a variable from a set of linear inequalities.
• To eliminate a variable x1:
  – Rewrite all expressions as lower or upper bounds on C·x1
  – Create a transitive constraint for each pair of a lower and an upper bound
• Example: let the L's and U's be lower and upper bounds respectively. To eliminate x1,

  L1(x2, …, xn) ≤ C·x1 ≤ U1(x2, …, xn)
  L2(x2, …, xn) ≤ C·x1 ≤ U2(x2, …, xn)

  becomes

  L1(x2, …, xn) ≤ U1(x2, …, xn)
  L1(x2, …, xn) ≤ U2(x2, …, xn)
  L2(x2, …, xn) ≤ U1(x2, …, xn)
  L2(x2, …, xn) ≤ U2(x2, …, xn)

If at any point you get an inconsistent inequality, e.g. 3 < 2, the accesses are independent.
Otherwise there exists a real, but not necessarily integral, solution.
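A minimal real-valued Fourier-Motzkin sketch (my own representation, not the lecture's code: each constraint is a pair (coeffs, c) meaning coeffs·x <= c):

def eliminate(constraints, k):
    # project away variable k by combining every lower bound with every upper bound
    lower, upper, rest = [], [], []
    for coeffs, c in constraints:
        a = coeffs[k]
        (upper if a > 0 else lower if a < 0 else rest).append((coeffs, c, a))
    for lc, lconst, la in lower:            # la < 0: this row gives a lower bound on x_k
        for uc, uconst, ua in upper:        # ua > 0: this row gives an upper bound on x_k
            combined = [ua * l + (-la) * u for l, u in zip(lc, uc)]   # the x_k terms cancel
            rest.append((combined, ua * lconst + (-la) * uconst, 0))
    return [(coeffs, c) for coeffs, c, _ in rest]

def real_feasible(constraints, nvars):
    for k in range(nvars):
        constraints = eliminate(constraints, k)
    return all(c >= 0 for _, c in constraints)   # only rows of the form 0 <= c remain

# The constraints 1 <= i <= 4, i+2 <= j, j <= 5 from the worked example later in the lecture:
cons = [([-1, 0], -1), ([1, 0], 4), ([1, -1], -2), ([0, 1], 5)]
print(real_feasible(cons, 2))   # True: a real (and in this case integral) solution exists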

Page 19: Lecture 14 Loop Optimization and Array Analysis


Fourier-Motzkin Elimination

• One variable left and a series of bounds
  – Does the range contain an integer?
    • No: there is no integral solution
    • Yes: pick an integer in the middle of the range, and go back to the system with two variables
• More than one variable left and a series of bounds
  – Does the range contain an integer?
    • No: there still might be an integral solution. Branch and bound:
      – e.g. if x = 3.5, branch into two subproblems, one with the added constraint x <= 3 and one with x >= 4, and start over
    • Yes: pick an integer in the middle of the range, and go back to the system with one more variable

Page 20: Lecture 14 Loop Optimization and Array Analysis


Example

FOR i = 1 to 5
    FOR j = i+1 to 5
        A[i,j] = f(A[i,i], A[i-1,j])


1 <= i <= 5,  i + 1 <= j <= 5     // bounds on the write iteration (i, j)

1 <= i' <= 5,  i' + 1 <= j' <= 5  // bounds on the read iteration (i', j')

A[i,j] vs. A[i', i']:  i = i',  j = i'

Bounds on read: 1 <= i'; i' <= 5; i'+1 <= j'; j' <= 5

Bounds on write: 1 <= i; i <= 5
  Sub in i = i':  1 <= i'; i' <= 5   // redundant
Bounds on write: i+1 <= j; j <= 5
  Sub in i = i' and j = i':  i'+1 <= i'; i' <= 5
  i'+1 <= i' is equivalent to 1 <= 0   // inconsistent, so the two accesses are independent

Page 21: Lecture 14 Loop Optimization and Array Analysis


Example

FOR i = 1 to 5
    FOR j = i+1 to 5
        A[i,j] = f(A[i,i], A[i-1,j])


1 <= i <= 5,  i + 1 <= j <= 5     // bounds on the write iteration (i, j)

1 <= i' <= 5,  i' + 1 <= j' <= 5  // bounds on the read iteration (i', j')

A[i,j] vs. A[i'-1, j']:  i' = i + 1,  j' = j

1. 1 <= i; i <= 5       // i bounds
2. i+1 <= j; j <= 5     // j bounds
3. 1 <= i+1; i+1 <= 5   // i' bounds, equivalent to 0 <= i; i <= 4
4. i+2 <= j; j <= 5     // j' bounds

1 <= i <= 4       // combine 1 and 3
i+2 <= j; j <= 5  // 4
i+2 <= 5          // first step of Fourier-Motzkin on 4, equivalent to i <= 3

Pick i = 2, j = 4, i' = 3, j' = 4:  both the write and the read access A[2,4]
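A brute-force enumeration of the small iteration space (a sanity check, not the compiler's method) finds the same kind of witness:

deps = [((i, j), (i2, j2))
        for i in range(1, 6) for j in range(i + 1, 6)      # write iteration (i, j)
        for i2 in range(1, 6) for j2 in range(i2 + 1, 6)   # read iteration (i2, j2)
        if (i, j) == (i2 - 1, j2)]                         # write A[i,j], read A[i2-1,j2]
print(deps[:2])   # [((1, 3), (2, 3)), ((1, 4), (2, 4))]: e.g. (1,3) writes A[1,3] and (2,3) reads it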

Page 22: Lecture 14 Loop Optimization and Array Analysis


Distance Vectors

for i = 1 to n
    for j = 1 to n
        A[j][i] = A[j-1][i-1] * b[i]

jw = jr − 1,  iw = ir − 1
jr − jw = 1,  ir − iw = 1

We say there is a dependence from the write to the read of distance (1, 1)

How do we compute distances?
  GCD test: look for normalized equations of the form i' − i = c during the GCD test
  More sophisticated: add the constraint i' − i = d to Fourier-Motzkin and see whether d is constrained to a single value
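A brute-force sketch of the same computation for a fixed n (n = 6 assumed here): collect the difference (ir − iw, jr − jw) over every pair of iterations that touches the same element.

n = 6
dists = {(ir - iw, jr - jw)
         for iw in range(1, n + 1) for jw in range(1, n + 1)
         for ir in range(1, n + 1) for jr in range(1, n + 1)
         if (jw, iw) == (jr - 1, ir - 1)}   # write A[jw][iw], read A[jr-1][ir-1]
print(dists)   # {(1, 1)}: every dependence has distance (1, 1)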



Page 23: Lecture 14 Loop Optimization and Array Analysis


Direction Vectors

for i = 1 to n
    for j = 1 to i
        A[i] = A[j] * b[i]

jr = iw
jw ≤ iw,  jw ≤ jr,  jr − jw ≥ 0
jr ≤ ir,  iw ≤ ir,  ir − iw ≥ 0

We say there is a dependence from the write to the read of direction (≥, ≥). All this means is that the two references may refer to the same location on iteration pairs where jr ≥ jw and ir ≥ iw.
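Again this can be confirmed by brute force for a small n (n = 6 assumed), recording only the sign of each difference rather than its value:

n = 6
sign = lambda d: (d > 0) - (d < 0)
dirs = {(sign(ir - iw), sign(jr - jw))
        for iw in range(1, n + 1) for jw in range(1, iw + 1)   # write A[iw] at iteration (iw, jw)
        for ir in range(1, n + 1) for jr in range(1, ir + 1)   # read A[jr] at iteration (ir, jr)
        if iw == jr}                                           # same element
print(sorted(dirs))   # [(0, 0), (0, 1), (1, 0), (1, 1)], i.e. direction (>=, >=)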


Page 24: Lecture 14 Loop Optimization and Array Analysis


Direction Vectors

We say there is a dependence from the write to the read of direction (≥, ≥). All this means is that the two references may refer to the same location on iteration pairs where jr ≥ jw and ir ≥ iw.

If (ir, jr) − (iw, jw) = some vector d (or a set of vectors), then clearly (iw, jw) − (ir, jr) = −d (for every d). So we don't run dependence analysis twice, once for the write versus the read and once for the read versus the write.

Instead we run it once and negate the result to get the opposite edges.

Dependence vectors are a relation between two iterations of two references. We choose to draw the dependence from the reference with the earlier iteration to the later one.

Why? We get the property that a location is written by the source of an edge before the sink, and we don't want to change the order in which we access any memory location.

As long as the edges are still lexicographically positive after the transformation, the transformation is legal.
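A sketch of that legality rule for distance vectors (the function names are mine; loop interchange simply swaps the two components of every vector):

def lex_positive(v):
    for d in v:
        if d > 0:
            return True
        if d < 0:
            return False
    return False   # the all-zero vector is handled separately (same-iteration dependences)

def interchange_is_legal(distance_vectors):
    # interchanging the two loops permutes each distance vector the same way
    return all(lex_positive((d2, d1)) for d1, d2 in distance_vectors)

print(interchange_is_legal([(1, 1)]))    # True: the loop-interchange example from earlier
print(interchange_is_legal([(1, -1)]))   # False: interchange would reverse this dependence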


Page 25: Lecture 14 Loop Optimization and Array Analysis


Lexicographically Positive Direction Vectors

• Represent dependences as lexicographically positive
  – The first non-"=" component must be ">"
  – The all-"=" vector is only allowed if the source precedes the sink in the same iteration
• There was a dependence from the write to the read of direction (≥, ≥)
  – In a given iteration, the read happens before the write
  – Dependence from write to read: the positive components of (≥, ≥)
    • (>, ≥), (=, >)
    • Don't include (=, =) because the read happens before the write
  – Dependence from read to write: the positive components of (≤, ≤)
    • (=, =)


Page 26: Lecture 14 Loop Optimization and Array Analysis


Direction Vectors

Computing Direction Vectors
• Start with no constraints: (*, *, …)
  – Independent: done
  – Dependent: check (>, *, *, …), (=, *, *, …) and (<, *, *, …)
    • Add in a constraint i' − i > 0, etc.
  – If, for example, (<, *, *, …) is dependent, check (<, >, *, …), (<, =, *, …) and (<, <, *, …)
• Exponential blow-up
  – Don't do it for constant distances
  – Unused variables are automatically *

for i
    for j
        a[j] = a[j+1]
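A small sketch of this hierarchical search (my own representation; `dependent` stands in for the real test, which would add the i' − i constraints to Fourier-Motzkin, and here just consults brute-force iteration differences for the loop above with tiny bounds):

REL = {'<': lambda d: d < 0, '=': lambda d: d == 0, '>': lambda d: d > 0}

def dependent(prefix, diffs):
    # is some dependence consistent with the fixed prefix? unset trailing components are '*'
    return any(all(REL[r](v[k]) for k, r in enumerate(prefix)) for v in diffs)

def directions(depth, diffs, prefix=()):
    if not dependent(prefix, diffs):
        return []                      # prune: nothing under this prefix is dependent
    if len(prefix) == depth:
        return [prefix]
    out = []
    for r in ('<', '=', '>'):
        out += directions(depth, diffs, prefix + (r,))
    return out

# Differences (i'-i, j'-j) between the write a[j] and the read a[j+1] above, with bounds 1..3:
diffs = [(i2 - i1, j2 - j1)
         for i1 in range(1, 4) for j1 in range(1, 4)
         for i2 in range(1, 4) for j2 in range(1, 4)
         if j1 == j2 + 1]              # write a[j1] and read a[j2+1] touch the same element
print(directions(2, diffs))   # [('<', '<'), ('=', '<'), ('>', '<')]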
