Empirical Study on Mining Association Rules Using Population Based Stochastic Search Algorithms

Presented by
K. Indira

Under the Guidance of
Dr. S. Kanmani,
Professor, Department of Information Technology, Pondicherry Engineering College
Contents

Objective
Introduction
Why Association Rule Mining
Existing Methods and their Limitations
Evolutionary Algorithms in AR Mining: GA and PSO, an Introduction
Empirical Study
Conclusion
Publications
References
Objective

To propose a methodology for mining association rules both effectively and efficiently, using population based search methods, namely the Genetic Algorithm and Particle Swarm Optimization.
Data Mining

Extraction of interesting information or patterns from data in large databases is known as data mining.
Association Rule Mining

Association rule mining finds interesting associations and/or correlation relationships among large sets of data items.
Tid   Items bought
10    Milk, Nuts, Sugar
20    Milk, Coffee, Sugar
30    Milk, Sugar, Eggs
40    Nuts, Eggs, Bread
50    Nuts, Coffee, Sugar, Eggs, Bread
Association Rules

Rules are of the form X → Y with minimum support and confidence.
Support, s: probability that a transaction contains X ∪ Y.
Confidence, c: conditional probability that a transaction having X also contains Y.
Let minsup = 50%, minconf = 50%.
Frequent patterns: Milk:3, Nuts:3, Sugar:4, Eggs:3, {Milk, Sugar}:3
Association rules: Milk → Sugar (60%, 100%), Sugar → Milk (60%, 75%)
(Venn diagram: customers who buy milk, customers who buy sugar, and customers who buy both.)
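The figures above can be reproduced directly. A minimal Python sketch (not part of the original slides) computing support and confidence over the toy transactions:

```python
transactions = [
    {"Milk", "Nuts", "Sugar"},
    {"Milk", "Coffee", "Sugar"},
    {"Milk", "Sugar", "Eggs"},
    {"Nuts", "Eggs", "Bread"},
    {"Nuts", "Coffee", "Sugar", "Eggs", "Bread"},
]

def support(itemset):
    """Fraction of transactions containing every item in `itemset`."""
    return sum(itemset <= t for t in transactions) / len(transactions)

def confidence(antecedent, consequent):
    """Conditional probability that a transaction with the antecedent also contains the consequent."""
    return support(antecedent | consequent) / support(antecedent)

print(support({"Milk", "Sugar"}))       # 0.6  -> {Milk, Sugar} is frequent at minsup = 50%
print(confidence({"Milk"}, {"Sugar"}))  # 1.0  -> Milk -> Sugar holds with 100% confidence
print(confidence({"Sugar"}, {"Milk"}))  # 0.75 -> Sugar -> Milk holds with 75% confidence
```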
Limitations of Existing Systems

Apriori, FP-Growth and Eclat are some of the popular algorithms for mining ARs. Their limitations:
They traverse the database many times, so I/O overhead and computational complexity are high, and they cannot meet the requirements of large-scale database mining.
The tree structure may not fit in memory and is expensive to build.
Time is wasted (especially if the support threshold is high), as the only pruning that can be done is on single items.
Uniqueness of Evolutionary Algorithms

Applicable to problems where no (good) method is available:
  Discontinuities, non-linear constraints, multi-modalities.
  Discrete variable spaces.
  Implicitly defined models (if-then-else constructs).
  Noisy problems.
Most suitable for problems where multiple solutions are required:
  Multi-modal optimization problems.
  Multi-objective optimization problems.
Parallel implementation is easier.
Evolutionary algorithms provide a robust and efficient approach to exploring large search spaces.
GA and PSO: An Introduction

Genetic Algorithm (GA) and Particle Swarm Optimization (PSO) are both population based search methods: they move from one set of points (a population) to another set of points in a single iteration, with likely improvement, using a set of control operators.
Genetic Algorithm

A Genetic Algorithm (GA) is a procedure used to find approximate solutions to search problems through the application of the principles of evolutionary biology.

Particle Swarm Optimization

PSO's mechanism is inspired by the social and cooperative behaviour displayed by various species such as birds and fish, including human beings.
Block Diagram of Research Modules

Association Rule (AR) Mining
Population Based Evolutionary Methods: Genetic Algorithm (GA), Particle Swarm Optimization (PSO)
Mining Association Rules using GA
Analyzing the role of control parameters in GA for mining ARs
Mining ARs using Self Adaptive GA
Elitist GA for Association Rule Mining
Mining Association Rules with PSO
Mining Association Rules with Chaotic PSO
Mining Association Rules with Dynamic Neighborhood Selection in PSO
Mining Association Rules with Self Adaptive PSO
Hybrid GA/PSO (GPSO) for AR Mining
Datasets Used
Lenses
Haberman
Car Evaluation
Lenses

Age of the patient: 1 = Young, 2 = Pre-Presbyopic, 3 = Presbyopic
Spectacle Prescription: 1 = Myopic, 2 = Hypermetropic
Astigmatic: 1 = No, 2 = Yes
Tear Production Rate: 1 = Reduced, 2 = Normal
Result: 1 = Hard Contact Lenses, 2 = Soft Contact Lenses, 3 = No Lenses
Haberman

Age of the patient: 30-83, numeric
Patient's year of operation: numeric (e.g. 67)
Number of positive axillary nodes detected: 0-46, numeric
Result: 1 = the patient survived 5 years or longer, 2 = the patient died within 5 years
Car Evaluation

Buying price: Very high, High, Medium, Low
Maintenance price: Very high, High, Medium, Low
Doors: 2, 3, 4, 5
Persons: 2, 4, More
Luggage boot: Small, Medium, Big
Safety: Low, Medium, High
Result: Unacceptable, Acceptable, Good, Very good
Mining ARs using GA

Methodology
Selection: Tournament
Crossover probability: Fixed (tested with 3 values)
Mutation probability: No mutation
Fitness function:
Dataset: Lenses, Iris, Haberman from the UCI Machine Learning Repository
Population: Fixed (tested with 3 values)
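As an illustration of the loop described above, here is a hedged Python sketch of one GA generation with tournament selection, a fixed crossover probability and no mutation; the slide's actual fitness function and rule encoding are not reproduced in this preview, so `fitness` is left as a caller-supplied function over list-encoded chromosomes.

```python
import random

def tournament(population, fitness, k=2):
    """Tournament selection: return the fitter of k randomly chosen chromosomes."""
    return max(random.sample(population, k), key=fitness)

def one_point_crossover(a, b, pc):
    """With probability pc, swap the tails of two list-encoded parents."""
    if random.random() < pc and len(a) > 1:
        cut = random.randrange(1, len(a))
        return a[:cut] + b[cut:], b[:cut] + a[cut:]
    return a[:], b[:]

def evolve(population, fitness, pc=0.75, generations=50):
    """Run the GA for a fixed number of generations and return the best individual found."""
    for _ in range(generations):
        nxt = []
        while len(nxt) < len(population):
            p1 = tournament(population, fitness)
            p2 = tournament(population, fitness)
            nxt.extend(one_point_crossover(p1, p2, pc))
        population = nxt[:len(population)]
    return max(population, key=fitness)
```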
Flow chart of the GA
Results Analysis

Comparison based on variation in population size:

Dataset     Pop = No. of Instances      Pop = No. of Instances × 1.25   Pop = No. of Instances × 1.5
            Accuracy %   Generations    Accuracy %   Generations        Accuracy %   Generations
Lenses      75           7              82           12                 95           17
Haberman    71           114            68           88                 64           70
Iris        77           88             87           53                 82           45
Comparison based on variation in minimum support and minimum confidence:

Dataset     Sup=0.4, Conf=0.4      Sup=0.9, Conf=0.9      Sup=0.9, Conf=0.2      Sup=0.2, Conf=0.9
            Accuracy %  Gen.       Accuracy %  Gen.       Accuracy %  Gen.       Accuracy %  Gen.
Lenses      22          20         49          11         70          21         95          18
Haberman    45          68         58          83         71          90         62          75
Iris        40          28         59          37         78          48         87          55
Comparison based on variation in crossover probability:

Dataset     Pc = 0.25                 Pc = 0.5                  Pc = 0.75
            Accuracy %  Generations   Accuracy %  Generations   Accuracy %  Generations
Lenses      95          8             95          16            95          13
Haberman    69          77            71          83            70          80
Iris        84          45            86          51            87          55

Optimum parameter values for the maximum accuracy achieved:

Dataset     Instances   Attributes   Population Size   Min. Support   Min. Confidence   Crossover Rate   Accuracy %
Lenses      24          4            36                0.2            0.9               0.25             95
Haberman    306         3            306               0.9            0.2               0.5              71
Iris        150         5            225               0.2            0.9               0.75             87
Charts: Population size vs. accuracy; minimum support and confidence vs. accuracy.
Inferences

The values of minimum support, minimum confidence and mutation rate decide the accuracy of the system more than the other GA parameters.
The crossover rate affects the convergence rate rather than the accuracy of the system.
The optimum values of the GA parameters vary from dataset to dataset, and the fitness function plays a major role in optimizing the results.
Mining ARs using Self Adaptive GA (implemented in Java)

Methodology
Selection: Roulette wheel
Crossover probability: Fixed (tested with 3 values)
Mutation probability: Self adaptive
Fitness function:
Dataset: Lenses, Iris, Car from the UCI Machine Learning Repository
Population: Fixed (tested with 3 values)
Procedure SAGA
Begin
  Initialize population p(k);
  Define the crossover and mutation rate;
  Do
  {
    Do
    {
      Calculate support of all k rules;
      Calculate confidence of all k rules;
      Obtain fitness;
      Select individuals for crossover / mutation;
      Calculate the average fitness of the nth and (n-1)th generations;
      Calculate the maximum fitness of the nth and (n-1)th generations;
      Based on the fitness of the selected items, calculate the new crossover and mutation rates;
      Choose the operation to be performed;
    } k times;
  } until the termination condition is met;
End
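The step "calculate the new crossover and mutation rates" is the self-adaptive part. The exact update rule is not shown in this preview; the sketch below assumes the common Srinivas-Patnaik style adaptation, in which rates shrink for above-average individuals and stay high for below-average ones.

```python
def adaptive_rates(f_selected, f_max, f_avg, k1=1.0, k2=0.5, k3=1.0, k4=0.5):
    """Return (crossover_rate, mutation_rate) for one selected individual (assumed scheme)."""
    spread = max(f_max - f_avg, 1e-9)               # guard against division by zero
    if f_selected >= f_avg:
        pc = k1 * (f_max - f_selected) / spread     # fitter individuals get lower rates
        pm = k2 * (f_max - f_selected) / spread
    else:
        pc, pm = k3, k4                             # weaker individuals keep high rates
    return pc, pm
```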
Self Adaptive GA
Results Analysis

Accuracy comparison between GA and SAGA when parameters are ideal for the traditional GA:

Dataset         Traditional GA              Self Adaptive GA
                Accuracy    Generations     Accuracy    Generations
Lenses          75          38              87.5        35
Haberman        52          36              68          28
Car Evaluation  85          29              96          21

Accuracy comparison between GA and SAGA when parameters are set according to the termination of SAGA:

Dataset         Traditional GA              Self Adaptive GA
                Accuracy    Generations     Accuracy    Generations
Lenses          50          35              87.5        35
Haberman        36          38              68          28
Car Evaluation  74          36              96          21
Chart: Predictive accuracy (%) of Traditional GA vs. Self Adaptive GA on Lenses, Haberman and Car Evaluation, when parameters are ideal for the traditional GA.
Chart: Predictive accuracy (%) of Traditional GA vs. Self Adaptive GA on Lenses, Haberman and Car Evaluation, when parameters are set according to the termination of SAGA.
Inferences
Self Adaptive GA gives better accuracy than Traditional GA.
GA with Elitism for Mining ARs

Methodology
Selection: Elitism with roulette wheel
Crossover probability: Fixed at Pc
Mutation probability: Self adaptive
Fitness function: Fitness(x) = conf(x) * (log(sup(x) * length(x) + 1))
Dataset: Lenses, Iris, Car from the UCI Machine Learning Repository
Population: Fixed
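A hedged Python sketch of the elitist generation step, using the fitness function stated above and roulette-wheel selection for the non-elite slots; the rule encoding and the `sup`/`conf` callables are assumptions, not taken from the slides.

```python
import math
import random

def fitness(rule, sup, conf):
    """Fitness(x) = conf(x) * log(sup(x) * length(x) + 1), as stated on the slide."""
    return conf(rule) * math.log(sup(rule) * len(rule) + 1)

def roulette(population, scores):
    """Roulette-wheel selection proportional to (non-negative) scores."""
    pick, acc = random.uniform(0, sum(scores)), 0.0
    for individual, s in zip(population, scores):
        acc += s
        if acc >= pick:
            return individual
    return population[-1]

def next_generation(population, scores, elite_count=2):
    """Copy the best chromosomes unchanged, then fill the rest by roulette selection."""
    order = sorted(range(len(population)), key=lambda i: scores[i], reverse=True)
    new_pop = [population[i] for i in order[:elite_count]]     # elitism
    while len(new_pop) < len(population):
        new_pop.append(roulette(population, scores))
    return new_pop
```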
Results Analysis

Predictive accuracy for mining ARs based on GA with elitism:

No. of Iterations   Lenses   Car Evaluation   Haberman
4                   90       94.4             70
6                   87.5     91.6             75
8                   91.6     92.8             91.6
10                  90       87.5             75
15                  87.5     90               83.3
20                  91.6     87.5             91.6
25                  87.5     87.5             92.5
30                  83.3     93.75            83.3
50                  90       75               75
Chart: Predictive accuracy (%) vs. number of iterations for mining ARs with the elitist GA on Lenses, Car Evaluation and Haberman.
No of matches vs. No of iterations
Execution time for mining ARs based on GA with elitism:

No. of Iterations   Lenses (ms)   Car Evaluation (ms)   Haberman (ms)
4                   15            547                   125
6                   16            721                   156
8                   31            927                   187
10                  31            1104                  203
15                  32            1525                  281
20                  47            1967                  359
25                  63            2504                  421
30                  78            2935                  530
50                  94            4753                  998
Chart: Execution time (ms) vs. number of iterations for Lenses, Car Evaluation and Haberman.
Inferences

Marginally better accuracy is achieved.
Computational efficiency is found to be optimum.
Elitism helps retain chromosomes with good fitness values for the next generation.
Mining ARs using PSO

Methodology
Each data itemset is represented as a particle.
The particles move based on their velocities.
The particles' positions are updated based on their velocities and the best positions found so far.
Particle Swarm Optimization (PSO)

Flow chart of the general PSO algorithm:
Start: initialize particles with random position and velocity vectors.
For each particle's position p, evaluate its fitness.
If fitness(p) is better than fitness(pbest), set pbest = p; loop until all particles are exhausted.
Set the best of the pbests as gbest.
Update each particle's velocity (eq. 1) and position (eq. 3).
Loop until the maximum number of iterations is reached.
Stop: gbest is the optimal solution.
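Equations (1) and (3) are not reproduced in this preview; the sketch below assumes the standard PSO velocity and position updates with acceleration constants c1 and c2.

```python
import random

def update_particle(position, velocity, pbest, gbest, c1=2.0, c2=2.0):
    """Return the new (velocity, position) of one particle (standard PSO update)."""
    new_v, new_x = [], []
    for x, v, pb, gb in zip(position, velocity, pbest, gbest):
        r1, r2 = random.random(), random.random()
        vi = v + c1 * r1 * (pb - x) + c2 * r2 * (gb - x)   # velocity update (eq. 1 assumed)
        new_v.append(vi)
        new_x.append(x + vi)                               # position update (eq. 3 assumed)
    return new_v, new_x
```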
Results Analysis

Predictive accuracy (%):

Dataset          Traditional GA   Self Adaptive GA   PSO
Lenses           87.5             91.6               92.8
Haberman         75.5             92.5               91.6
Car Evaluation   85               94.4               95
Charts: Execution time (msec) vs. number of iterations for PSO and SAGA on Haberman, Lenses and Car Evaluation.
Inferences

PSO produces results as effective as the self adaptive GA.
The computation of PSO is marginally faster compared to SAGA.
In PSO only the best particle passes information to the others, hence the computational capability of PSO is marginally better than that of SAGA.
Mining ARs using Chaotic PSO

Methodology
A new chaotic map model is formulated, with the initial points u0 and v0 set to 0.1.
The velocity of each particle is updated using the chaotic map.
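The chaotic map and velocity equations themselves are not included in this preview. As a hedged illustration only, the sketch below uses a logistic map (a common choice in chaotic PSO) to replace the random coefficients in the velocity update; the actual map used in the work may differ.

```python
def logistic_map(u, mu=4.0):
    """One step of the logistic chaotic map (assumed map, not taken from the slide)."""
    return mu * u * (1.0 - u)

def chaotic_velocity(x, v, pbest, gbest, u, c1=2.0, c2=2.0):
    """Scalar velocity update where chaotic sequences replace the random coefficients."""
    u1 = logistic_map(u)
    u2 = logistic_map(u1)
    new_v = v + c1 * u1 * (pbest - x) + c2 * u2 * (gbest - x)
    return new_v, u2        # return the map state so the chaotic sequence continues
```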
Flow chart: Mining ARs using Chaotic PSO
Start with k = 1: initialize xi(k) and vi(k), compute f(xi(k)), and generate the neighborhoods.
For each particle i, determine the best particles in the neighborhood of i and update the previous best if necessary.
Compute xi(k+1) and f(xi(k+1)), reorder the particles, and set k = k + 1.
Repeat over i until i = N within each iteration; stop when k = K.
Chart: Accuracy comparison. Predictive accuracy (%) of SAGA, PSO and CPSO on Haberman, Lenses and Car Evaluation.
Chart: Convergence rate comparison for Lenses (SAGA, PSO, CPSO over 4-50 iterations).
Chart: Convergence rate comparison for Car Evaluation (SAGA, PSO, CPSO over 4-50 iterations).
Chart: Convergence rate comparison for Haberman's Survival (SAGA, PSO, CPSO over 4-50 iterations).
Inferences

Better accuracy than standard PSO.
The chaotic operators can be changed by altering the initial values in the chaotic operator function.
The balance between exploration and exploitation is maintained.
Mining ARs using Dynamic Neighborhood Selection in PSO

Methodology
The concept of a local best particle (lbest) replacing the particle best (pbest) is introduced.
The neighborhood best (lbest) is selected as follows:
  Calculate the distance of the current particle from the other particles.
  Find the nearest m particles as the neighbors of the current particle, based on the calculated distance.
  Choose the local optimum lbest among the neighborhood in terms of fitness values.
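A hedged Python sketch of the lbest selection just described, using Euclidean distance to find the m nearest neighbours of a particle and returning the fittest of them:

```python
import math

def lbest(index, positions, fitnesses, m=3):
    """Position of the best particle among the m nearest neighbours of particle `index`."""
    me = positions[index]
    neighbours = sorted(
        (i for i in range(len(positions)) if i != index),
        key=lambda i: math.dist(me, positions[i]),   # distance to the current particle
    )[:m]
    best = max(neighbours, key=lambda i: fitnesses[i])
    return positions[best]
```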
Interestingness Measure

The interestingness measure for a rule is derived from relative confidence, where k is the rule, x is the antecedent part of the rule and y is the consequent part of the rule k.
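The formula itself is not included in this preview. As an assumed illustration only, one common relative-confidence style measure scores a rule by how much the antecedent raises the probability of the consequent above its baseline frequency:

```python
def interestingness(sup_xy, sup_x, sup_y):
    """Relative-confidence style measure for rule x -> y (assumed form, not the slide's formula)."""
    confidence = sup_xy / sup_x
    return confidence - sup_y   # > 0 means x makes y more likely than its baseline support
```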
Chart: Predictive accuracy (%) comparison for dynamic neighborhood selection in PSO (SAGA, PSO, NPSO) on Haberman, Lenses and Car Evaluation.
Measure of Interestingness

Dataset               Interestingness Value
Lenses                0.82
Car Evaluation        0.73
Haberman's Survival   0.80
Chart: Execution time comparison for dynamic neighborhood selection in PSO; PSO vs. NPSO on Lenses, Haberman's Survival and Car Evaluation over 4-50 iterations.
Charts: Predictive accuracy (%) over generations for PSO and NPSO on a) Car Evaluation, b) Lenses, c) Haberman's Survival.
Inferences

Avoiding premature convergence at local optimal points tends to enhance the results.
Selecting the local best (lbest) based on neighbors, rather than the particle's own best (pbest), enhances the accuracy of the mined rules.
Mining ARs using Self Adaptive Chaotic PSO

A slight variant of PSO is the inertia-weight PSO, in which a weight parameter w is added to the adopted velocity equation. The inertia weight w plays the role of balancing the global search and the local search.

An adaptive method is used to vary the weight:

w = wmax - (wmax - wmin) * g / G

where g is the generation index representing the current number of evolutionary generations and G is a predefined maximum number of generations. The maximal and minimal weights wmax and wmin are usually set to 0.9 and 0.4, respectively.
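A short sketch of the linearly decreasing inertia weight described above, and of how it enters the velocity update (the velocity form with fixed random coefficients is an illustrative assumption):

```python
def inertia_weight(g, G, w_max=0.9, w_min=0.4):
    """w = w_max - (w_max - w_min) * g / G for generation g out of at most G generations."""
    return w_max - (w_max - w_min) * g / G

def weighted_velocity(x, v, pbest, gbest, w, c1=2.0, c2=2.0, r1=0.5, r2=0.5):
    """Scalar inertia-weight velocity update (assumed standard form)."""
    return w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
```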
Effect of changing w

Highest predictive accuracy (%) achieved within 50 iterations:

Dataset     No weight (normal PSO)   w = 0.5   w = 0.7
Lenses      87.5                     88.09     84.75
Haberman    87.5                     96.07     99.80
Car         96.4                     99.88     99.84
POP Care    91.6                     98.64     97.91
Zoo         83.3                     96.88     98.97
Chart: Lenses. Predictive accuracy over generations (5-50) for CPSO, weighted CPSO and self adaptive CPSO.
Chart: Haberman's Survival. Predictive accuracy over generations (5-50) for CPSO, weighted CPSO and self adaptive CPSO.
Chart: Post Operative Patient Care. Predictive accuracy over generations (5-50) for CPSO, weighted CPSO and self adaptive CPSO.
Chart: Zoo. Predictive accuracy over generations (5-50) for CPSO, weighted CPSO and self adaptive CPSO.
Chart: Car Evaluation. Predictive accuracy over generations (5-50) for CPSO, weighted CPSO and self adaptive CPSO.
Inferences

In terms of computational efficiency, SACPSO is faster than GA.
Setting appropriate values for the control parameters involved in these heuristic methods is the key to their success.
Mining AR using Hybrid GA/PSO
Conclusion

When the Genetic Algorithm is used for mining association rules, an improvement in predictive accuracy is achieved.
Particle Swarm Optimization, when adopted for mining association rules, produces results close to those of GA but with minimum execution time.
Premature convergence, the major drawback of PSO, was handled by introducing inertia weights, chaotic maps, neighborhood selection and an adaptive inertia weight.
Papers Published
Conferences
K. Indira, Dr. S. Kanmani, "Framework for Comparison of Association Rule Mining Using Genetic Algorithm", International Conference on Computers, Communication & Intelligence, 2010.
K. Indira, Dr. S. Kanmani, "Mining Association Rules Using Genetic Algorithm: The Role of Estimation Parameters", International Conference on Advances in Computing and Communications, Communications in Computer and Information Science, Springer, Volume 190, Part 8, pp. 639-648, 2011.
K. Indira, Dr. S. Kanmani, Gaurav Sethia D., Kumaran S., Prabhakar J., "Rule Acquisition in Data Mining Using a Self Adaptive Genetic Algorithm", First International Conference on Computer Science and Information Technology, Communications in Computer and Information Science, Springer, Volume 204, Part 1, pp. 171-178, 2011.
K. Indira, Dr. S. Kanmani, Prasanth, Harish, Jeeva, "Population Based Search Methods in Mining Association Rules", Third International Conference on Advances in Communication, Network, and Computing (CNC 2012), LNICST, pp. 255-261, 2012.
Journal

K. Indira, Dr. S. Kanmani, "Performance Analysis of Genetic Algorithm for Mining Association Rules", IJCSI International Journal of Computer Science Issues, Vol. 9, Issue 2, No. 1, pp. 368-376, March 2012.
K. Indira, Dr. S. Kanmani, "Rule Acquisition using Genetic Algorithm", accepted for publication in Journal of Computing.
K. Indira, Dr. S. Kanmani, "Enhancing Particle Swarm Optimization using Chaotic Operators for Association Rule Mining", communicated to International Journal of Computer Science and Techniques.
K. Indira, Dr. S. Kanmani, "Association Rule Mining by Dynamic Neighborhood Selection in Particle Swarm Optimization", communicated to World Science Publications.
References
Jing Li, Han Rui-feng, "A Self-Adaptive Genetic Algorithm Based on Real-Coded", International Conference on Biomedical Engineering and Computer Science, pp. 1-4, 2010.
Chuan-Kang Ting, Wei-Ming Zeng, Tzu-Chieh Lin, "Linkage Discovery through Data Mining", IEEE Computational Intelligence Magazine, Volume 5, February 2010.
Caises, Y., Leyva, E., Gonzalez, A., Perez, R., "An Extension of the Genetic Iterative Approach for Learning Rule Subsets", 4th International Workshop on Genetic and Evolutionary Fuzzy Systems, pp. 63-67, 2010.
Shangping Dai, Li Gao, Qiang Zhu, Changwu Zhu, "A Novel Genetic Algorithm Based on Image Databases for Mining Association Rules", 6th IEEE/ACIS International Conference on Computer and Information Science, pp. 977-980, 2007.
Peregrin, A., Rodriguez, M.A., "Efficient Distributed Genetic Algorithm for Rule Extraction", Eighth International Conference on Hybrid Intelligent Systems (HIS '08), pp. 531-536, 2008.
References Contd..
Mansoori, E.G., Zolghadri, M.J., Katebi, S.D., "SGERD: A Steady-State Genetic Algorithm for Extracting Fuzzy Classification Rules From Data", IEEE Transactions on Fuzzy Systems, Volume 16, Issue 4, pp. 1061-1071, 2008.
Xiaoyuan Zhu, Yongquan Yu, Xueyan Guo, "Genetic Algorithm Based on Evolution Strategy and the Application in Data Mining", First International Workshop on Education Technology and Computer Science (ETCS '09), Volume 1, pp. 848-852, 2009.
Hong Guo, Ya Zhou, "An Algorithm for Mining Association Rules Based on Improved Genetic Algorithm and its Application", 3rd International Conference on Genetic and Evolutionary Computing (WGEC '09), pp. 117-120, 2009.
Genxiang Zhang, Haishan Chen, "Immune Optimization Based Genetic Algorithm for Incremental Association Rules Mining", International Conference on Artificial Intelligence and Computational Intelligence (AICI '09), Volume 4, pp. 341-345, 2009.
References Contd..
Maria J. Del Jesus, Jose A. Gamez, Pedro Gonzalez, Jose M. Puerta, "On the Discovery of Association Rules by means of Evolutionary Algorithms", Advanced Review, John Wiley & Sons, Inc., 2011.
Junli Lu, Fan Yang, Momo Li, Lizhen Wang, "Multi-objective Rule Discovery Using the Improved Niched Pareto Genetic Algorithm", Third International Conference on Measuring Technology and Mechatronics Automation, 2011.
Hamid Reza Qodmanan, Mahdi Nasiri, Behrouz Minaei-Bidgoli, "Multi Objective Association Rule Mining with Genetic Algorithm without Specifying Minimum Support and Minimum Confidence", Expert Systems with Applications 38 (2011) 288-298.
Miguel Rodriguez, Diego M. Escalante, Antonio Peregrin, "Efficient Distributed Genetic Algorithm for Rule Extraction", Applied Soft Computing 11 (2011) 733-743.
J.H. Ang, K.C. Tan, A.A. Mamun, "An Evolutionary Memetic Algorithm for Rule Extraction", Expert Systems with Applications 37 (2010) 1302-1315.
References Contd..
R.J. Kuo, C.M. Chao, Y.T. Chiu, "Application of Particle Swarm Optimization to Association Rule Mining", Applied Soft Computing 11 (2011) 326-336.
Bilal Alatas, Erhan Akin, "Multi-objective Rule Mining Using a Chaotic Particle Swarm Optimization Algorithm", Knowledge-Based Systems 22 (2009) 455-460.
Mourad Ykhlef, "A Quantum Swarm Evolutionary Algorithm for Mining Association Rules in Large Databases", Journal of King Saud University - Computer and Information Sciences (2011) 23, 1-6.
Haijun Su, Yupu Yang, Liang Zhao, "Classification Rule Discovery with DE/QDE Algorithm", Expert Systems with Applications 37 (2010) 1216-1222.
References Contd..
Yamina Mohamed Ben Ali, "Soft Adaptive Particle Swarm Algorithm for Large Scale Optimization", IEEE, 2010.
References Contd..
Yan Chen, Shingo Mabu, Kotaro Hirasawa, "Genetic Relation Algorithm with Guided Mutation for the Large-Scale Portfolio Optimization", Expert Systems with Applications 38 (2011) 3353-3363.
Feng Lu, Yanfeng Ge, LiQun Gao, "Self-adaptive Particle Swarm Optimization Algorithm for Global Optimization", Sixth International Conference on Natural Computation (ICNC 2010), 2010.
Fevrier Valdez, Patricia Melin, Oscar Castillo, "An Improved Evolutionary Method with Fuzzy Logic for Combining Particle Swarm Optimization and Genetic Algorithms", Applied Soft Computing 11 (2011) 2625-2632.
Thank You