A Random Parallel Approximation Algorithm for the 0-1 Knapsack Problem
Michael Adams


Agenda
  Problem Description
  Related Work
  Exact Solution Algorithm Design
  Heuristic and Random Approximation Algorithm Design
  Results and Analysis
  Research Answers
  Future Work
  Lessons Learned

Problem Description
  Items with weights and values
  A knapsack with a weight capacity
  Find the highest-value selection of items that fits within the capacity
  NP-hard
  Combinatorial optimization

  [Figure: knapsack problem illustration, Reference [1]]

Quality-Up

Related Work
  Exact algorithm
    Lists and dominance [2]
    Essentially a brute-force search with trimming
    Would require complex bookkeeping
  Heuristic
    Dual-population co-evolutionary genetic algorithm [3]
    Greedy decreasing profit density [4]
  Random approximation is new research
  Quality-up is new research

Item Generation
  Given N, LW, UW, LV, UV, seed:
    for each id in range 0..N-1
      weight = random in [LW, UW]
      value = random in [LV, UV]
      add a new item with that id to the list
  Weights and values can be tightly or loosely correlated
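The item-generation step above maps naturally onto a small Java helper. This is a minimal sketch under my own assumptions: the Item record and the class and method names are illustrative, not the author's actual code.

import java.util.ArrayList;
import java.util.List;
import java.util.Random;

// Hypothetical item representation; the slides only specify id, weight, and value.
record Item(int id, int weight, int value) { }

class ItemGenerator {
    // Generate N items with weights in [LW, UW] and values in [LV, UV],
    // driven by a fixed seed so a run can be reproduced.
    static List<Item> generate(int n, int lw, int uw, int lv, int uv, long seed) {
        Random prng = new Random(seed);
        List<Item> items = new ArrayList<>(n);
        for (int id = 0; id < n; id++) {
            int weight = lw + prng.nextInt(uw - lw + 1);
            int value  = lv + prng.nextInt(uv - lv + 1);
            items.add(new Item(id, weight, value));
        }
        return items;
    }
}

In the knapsack literature, tightly correlated instances typically derive each value from the item's weight plus a small constant, so profit densities vary little between items; the loosely correlated case draws values independently as above.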

Sequential Exact Algorithm Design
  Exhaustive brute-force search

  for every possible combination of items
    evaluate the combination
    if (total weight > capacity)
      go to the next combination
    if (total value > current best)
      set the combination as the best

  Must perform 2^N evaluations

Parallel Exact Algorithm Design
  Divvy up the evaluations among the available CPUs
  Theoretically K times faster with K CPUs
  Reduce the per-CPU answers and keep the highest (still exact)
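A minimal sketch of the exhaustive search, reusing the hypothetical Item record from the item-generation sketch and representing an assignment as a long bit pattern, one bit per item (which limits N to 63 here); names are illustrative, not the author's code. The parallel exact design then amounts to splitting the mask range across the K available threads and keeping the best of the K partial results.

import java.util.List;

class ExactKnapsack {
    // Evaluate every combination whose bit pattern lies in [fromMask, toMask).
    // The sequential solver scans the full range [0, 2^N); a parallel solver can
    // give each of K threads an equal sub-range and reduce the per-thread bests.
    static long bestInRange(List<Item> items, int capacity, long fromMask, long toMask) {
        long bestMask = 0;
        long bestValue = -1;
        for (long mask = fromMask; mask < toMask; mask++) {
            long weight = 0, value = 0;
            for (int i = 0; i < items.size(); i++) {
                if ((mask & (1L << i)) != 0) {
                    weight += items.get(i).weight();
                    value  += items.get(i).value();
                }
            }
            if (weight <= capacity && value > bestValue) {
                bestValue = value;
                bestMask  = mask;
            }
        }
        return bestMask;   // bit pattern of the best assignment in the range
    }

    // Sequential exact search: 2^N evaluations.
    static long solve(List<Item> items, int capacity) {
        return bestInRange(items, capacity, 0, 1L << items.size());
    }
}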

Decreasing Profit Density Heuristic
  sort items by decreasing profit density
  start with an empty assignment
  for each item in order
    if (total weight plus item weight > capacity)
      stop
    add the item weight to the total weight
    add the item value to the total value
    note the item in the assignment

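A compact Java rendering of the heuristic, again assuming the hypothetical Item record from the earlier sketch; the sort key and the early stop follow the slide's pseudocode.

import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

class GreedyHeuristic {
    // Greedy by decreasing profit density (value / weight), stopping at the
    // first item that would exceed the remaining capacity.
    static List<Item> solve(List<Item> items, int capacity) {
        List<Item> sorted = new ArrayList<>(items);
        sorted.sort(Comparator.comparingDouble((Item it) -> (double) it.value() / it.weight())
                              .reversed());
        List<Item> assignment = new ArrayList<>();
        int totalWeight = 0;
        for (Item it : sorted) {
            if (totalWeight + it.weight() > capacity) {
                break;                     // the slide stops at the first overflow
            }
            totalWeight += it.weight();
            assignment.add(it);
        }
        return assignment;
    }
}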

Random Approximation Algorithm
  Start with a random combination of items
  Make slight adjustments to improve the value
  Stop when improvement ends
  Hill-climbing
  One iteration = one run of the algorithm

Random Approximation Algorithm Design
  Reference [5]
  use a unique seed to get a pseudo-random number generator
  for (this thread's iterations)
    use the prng to get a random combination of items
    perform addRemove on the assignment
      OR perform bitFlip on the assignment
      OR perform both on the assignment
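One way to sketch the per-thread driver in Java: each iteration draws a fresh random combination and hands it to the chosen local-search move (addRemove, bitFlip, or both, sketched after the next two slides). The seed handling, the functional-interface parameter, and the feasibility check at the end are my assumptions, not the author's design.

import java.util.List;
import java.util.Random;
import java.util.function.LongUnaryOperator;

class RandomApproximation {
    // Run this thread's share of iterations. 'improve' is the local-search move;
    // the assignment is a bit pattern over at most 63 items.
    static long run(List<Item> items, int capacity, long threadSeed, int iterations,
                    LongUnaryOperator improve) {
        Random prng = new Random(threadSeed);            // unique seed per thread
        long bestMask = 0;
        long bestValue = 0;
        for (int iter = 0; iter < iterations; iter++) {
            long mask = prng.nextLong() & ((1L << items.size()) - 1);  // random combination
            mask = improve.applyAsLong(mask);            // hill-climb from the random start
            long weight = 0, value = 0;
            for (int i = 0; i < items.size(); i++) {
                if ((mask & (1L << i)) != 0) {
                    weight += items.get(i).weight();
                    value  += items.get(i).value();
                }
            }
            if (weight <= capacity && value > bestValue) {   // discard infeasible results
                bestValue = value;
                bestMask  = mask;
            }
        }
        return bestMask;
    }
}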

Add/Remove Algorithm
  while item combination weight > capacity
    remove a random included item from the combination
  while item combination weight < capacity
    get a random item that was excluded
    if adding it would overflow the capacity
      stop
    add that item to the combination
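A sketch of the add/remove move on a bit-pattern assignment, following the slide's two loops; the helper layout and the Random parameter are assumptions of mine.

import java.util.ArrayList;
import java.util.List;
import java.util.Random;

class AddRemove {
    // First drop random included items until the combination fits, then add
    // random excluded items until a chosen item would overflow the capacity.
    static long addRemove(List<Item> items, int capacity, long mask, Random prng) {
        while (weight(items, mask) > capacity) {
            List<Integer> included = indices(items, mask, true);
            int pick = included.get(prng.nextInt(included.size()));
            mask &= ~(1L << pick);                         // remove a random included item
        }
        while (weight(items, mask) < capacity) {
            List<Integer> excluded = indices(items, mask, false);
            if (excluded.isEmpty()) {
                break;                                     // nothing left to add
            }
            int pick = excluded.get(prng.nextInt(excluded.size()));
            if (weight(items, mask) + items.get(pick).weight() > capacity) {
                break;                                     // adding it would overflow
            }
            mask |= (1L << pick);                          // add the random excluded item
        }
        return mask;
    }

    static long weight(List<Item> items, long mask) {
        long w = 0;
        for (int i = 0; i < items.size(); i++) {
            if ((mask & (1L << i)) != 0) w += items.get(i).weight();
        }
        return w;
    }

    static List<Integer> indices(List<Item> items, long mask, boolean included) {
        List<Integer> out = new ArrayList<>();
        for (int i = 0; i < items.size(); i++) {
            if (((mask & (1L << i)) != 0) == included) out.add(i);
        }
        return out;
    }
}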

BitFlip Algorithm
  while the assignment evaluation improves
    start with the current best assignment
    for each i in range 0..N-1
      if i is included, remove it; else add it
      if this is better than the current best
        set the assignment as the best
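And a sketch of the bit-flip move: repeatedly try flipping each item in or out of the current best assignment, keep any flip that yields a better feasible value, and stop when a full pass makes no progress. As with the other sketches, the Item record and helper methods are assumed, and the starting assignment is assumed to be feasible (for example, after addRemove).

import java.util.List;

class BitFlip {
    static long bitFlip(List<Item> items, int capacity, long mask) {
        boolean improved = true;
        while (improved) {
            improved = false;
            for (int i = 0; i < items.size(); i++) {
                long candidate = mask ^ (1L << i);         // flip item i in or out
                if (weight(items, candidate) <= capacity
                        && value(items, candidate) > value(items, mask)) {
                    mask = candidate;                      // keep the improving flip
                    improved = true;
                }
            }
        }
        return mask;
    }

    static long weight(List<Item> items, long mask) {
        long w = 0;
        for (int i = 0; i < items.size(); i++) {
            if ((mask & (1L << i)) != 0) w += items.get(i).weight();
        }
        return w;
    }

    static long value(List<Item> items, long mask) {
        long v = 0;
        for (int i = 0; i < items.size(); i++) {
            if ((mask & (1L << i)) != 0) v += items.get(i).value();
        }
        return v;
    }
}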

Data Loosely Correlated
java KnapsackExact 27 50 500 50 500 4250 2314156

Columns: N = number of items; R = run (BF = bitFlip, AR = addRemove, BOTH = both, each followed by its iteration count); A = best assignment bit pattern; V = best value found; T = running time; Q = solution quality (V divided by the exact value); S = speedup over the sequential exact run.

N   R             A                 V     T      Q      S
27  exact         0000000007B0E7AC  6043  29524
27  parallel      0000000007B0E7AC  6043  937           31.5
27  heuristic     0000000007B0F7A8  5975  0      0.989
27  BF 40         0000000007B0EBAC  5900  127    0.976  232.5
27  BF 400        0000000007B0EBAC  5900  160    0.976  184.5
27  BF 4000       0000000007B0EBAC  5900  169    0.976  174.7
27  BF 40000      0000000007B0EBAC  5900  271    0.976  108.9
27  BF 400000     000000000730F3BC  5947  282    0.984  104.7
27  BF 4000000    000000000730F3BC  5947  872    0.984  33.9
27  AR 40         0000000005A975B2  5137  121    0.850  244.0
27  AR 400        0000000005AAE78C  5360  136    0.887  217.1
27  AR 4000       000000000338E7AD  5682  212    0.940  139.3
27  AR 40000      0000000007F8EFA0  5853  230    0.969  128.4
27  AR 400000     00000000073CF7A8  5918  291    0.979  101.5
27  AR 4000000    0000000007B0E7B8  6018  505    0.996  58.5
27  BOTH 40       0000000007B06B2E  5325  124    0.881  238.1
27  BOTH 400      0000000005BAE78C  5617  157    0.930  188.1
27  BOTH 4000     0000000005B8F3F0  5697  177    0.943  166.8
27  BOTH 40000    0000000007F8EFA0  5853  242    0.969  122.0
27  BOTH 400000   000000000778F7A4  5925  323    0.980  91.4
27  BOTH 4000000  0000000007B0E7AC  6043  1125   1.000  26.2

[Chart: Quality versus Iterations]

[Chart: Speedup versus Quality]

Data Tightly Correlated
java KnapsackExact 31 50 500 5500 9422134

(Same columns as the loosely correlated table above.)

N   R             A                 V     T       Q      S
31  exact         000000002B63AF9E  5542  538206
31  parallel      000000002B63AF9E  5542  14613          36.8
31  heuristic     000000002B63ADDC  5186  0       0.936
31  BF 40         000000004BC77EB0  5516  142     0.995  3790.2
31  BF 400        000000004BC77EB0  5516  154     0.995  3494.8
31  BF 4000       000000002FE1B364  5526  158     0.997  3406.4
31  BF 40000      000000002FE1B364  5526  191     0.997  2817.8
31  BF 400000     0000000023C7B7C6  5530  311     0.998  1730.6
31  BF 4000000    000000002B436FFC  5536  1048    0.999  513.6
31  AR 40         000000005E6CBE5C  5510  135     0.994  3986.7
31  AR 400        000000001E23FFEA  5518  134     0.996  4016.5
31  AR 4000       000000006B61C757  5523  150     0.997  3588.0
31  AR 40000      000000002F63A775  5533  223     0.998  2413.5
31  AR 400000     000000003363BFF2  5534  321     0.999  1676.7
31  AR 4000000    000000002E60EDFC  5537  573     0.999  939.3
31  BOTH 40       000000006D419FED  5513  125     0.995  4305.6
31  BOTH 400      000000001E23FFEA  5518  160     0.996  3363.8
31  BOTH 4000     000000006B61C757  5523  182     0.997  2957.2
31  BOTH 40000    000000002F63A775  5533  229     0.998  2350.2
31  BOTH 400000   000000003363BFF2  5534  379     0.999  1420.1
31  BOTH 4000000  000000002E60EDFC  5537  1311    0.999  410.5

[Chart: Quality versus Iterations]

[Chart: Speedup versus Quality]

Analysis
  For small problems, use the exact algorithm
  The heuristic achieves greater than 90% quality
  Random approximation can reach 100% quality
  Loosely correlated: use the heuristic
  Tightly correlated: use random approximation

Research Answers
How does the effort of writing a parallel approximate algorithm for the problem compare to the effort of writing an exact sequential algorithm for the problem?

  The exact algorithm was a simple concept
  Easier to code: about 3 hours
  The random algorithm required more thought
  More to code and debug: about 5 hours

Research Answers
For various sizes of the problem, what happens to the parallel approximate algorithm's solution quality as the number of repetitions increases?

  More iterations yield higher-quality solutions
  More random starting combinations increase the chance that one climbs to a higher maximum
  Occasionally, increasing the iterations does not increase quality

Research Answers
For various sizes of the problem, how much faster than the exact sequential algorithm is the parallel approximate algorithm as a function of solution quality?

  Higher-quality solutions mean lower speedup, because more iterations are being executed
  The decrease in speedup is slight
  The exact solution can be found in a fraction of the exact algorithm's running time

Future Work
  Allow more variables
  Compare the heuristic versus random approximation
  GPU CUDA implementation
    Add/remove would not be a prime candidate
  Multi-constraint knapsack
    Weight capacity and volume

Lessons Learned
  Getting ahead of schedule is good
    Reduces the pressure of deadlines
    Allows time for exploring other areas
  New bitwise and serializing techniques
    Use a bit pattern as included/excluded
    Externalizable
  Enjoyable work feels more like play
    Actually wanted to sit down and code
    Helped me choose my cluster and thesis

References
[1] Dake. "Knapsack Problem." Wikipedia, The Free Encyclopedia. Wikimedia Foundation, Inc., 7 August 2006. Web. 14 May 2012.
[2] El Baz, M. E. D. "Load Balancing in a Parallel Dynamic Programming Multi-Method Applied to the 0-1 Knapsack Problem." 2006 14th Euromicro International Conference on Parallel, Distributed, and Network-Based Processing, Feb. 2006.
[3] Wei Shen, Beibei Xu, and Jiang-ping Huang. "An Improved Genetic Algorithm for 0-1 Knapsack Problems." 2011 Second International Conference on Networking and Distributed Computing (ICNDC), Sept. 2011.
[4] Sartaj Sahni. "Approximate Algorithms for the 0/1 Knapsack Problem." J. ACM 22, 1 (January 1975).
[5] Montgomery, John. "Tackling the Travelling Salesman Problem: Simulated Annealing." Psychic Origami, 28 June 2007. Web. 14 May 2012.

Questions?
Thank you