MINICON Project
Overview of the MINICON Project
Condition monitoring and diagnostics for elevators
Dale Addison
CENTRE FOR ADAPTIVE SYSTEMS
University of Sunderland
School of Computing & Technology
Project overview
• MINICON: Minimum Cost, Maximum Benefit Condition Monitoring
• Framework 5, Competitive and Sustainable Growth (project value £3 million)
• Two aspects:
– Condition monitoring of elevators
– Condition monitoring of high-speed machine tools (>15,000 rpm)
Project partners
• Kone (4th largest supplier of elevators in the world) (Finland)
• VTT (Finland)
• Goratu (Bilbao, Spain)
• Tekniker (Bilbao, Spain)
• Rockwell Manufacturing (Belgium & Czech Republic)
• Technical University of Tallinn (Estonia)
• IB Krates (Estonia)
• Monitran Ltd (UK)
• University of Sunderland (UK)
• Entek (UK)
• Truth (Athens, Greece)
• ICCS/NTUA (Athens, Greece)
System overview (diagram):
• Plant or machine to be monitored, e.g. elevator or machine tool
• Sensors, transducers etc.
• Signal processing unit
• On-board intelligent alarm system (first level of intelligence)
• Data store
• CMMS
• Application software and database with second level of intelligence (prior-knowledge intelligence system)
• Maintenance management system with third level of intelligence: human, providing decision support
• Service engineer with hand-held PC, email or paging device etc.
Neural networks
• Adaptive technology based upon the "neurons" found in the human brain
• Neurons are connected together into networks and send signals to each other
• Signals are summed and, when they exceed a certain limit, the neuron "fires" (sends a signal to other neurons)
• Networks can be trained using algorithms which adjust the network in response to the data
A Neural network
• Multi-layer perceptron
• Each neuron performs a biased weighted sum of its inputs.
• This activation level is passed through a transfer function (usually sigmoidal) to produce its output.
• Neurons are arranged in a layered feed forward topology.
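The biased weighted sum and sigmoid transfer function described above can be sketched as follows (function and parameter names are illustrative, not from the project):

```python
import math

def neuron_output(inputs, weights, bias):
    """Biased weighted sum of the inputs, squashed by a sigmoid transfer function."""
    activation = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-activation))  # sigmoid maps to (0, 1)

# One neuron responding to a 3-dimensional input vector
out = neuron_output([0.5, -1.0, 2.0], weights=[0.4, 0.3, -0.2], bias=0.1)
```

A full layer applies this to every neuron, and a feed-forward network chains layers so each layer's outputs become the next layer's inputs.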
Multi-layer perceptrons
• Network weights and thresholds are adjusted by a training algorithm which alters the weights according to the training data.
• This minimises the difference between the network's outputs and the target outputs.
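A minimal sketch of such weight adjustment, for a single sigmoid neuron trained by gradient descent on squared error (learning rate, data, and iteration count are invented for illustration):

```python
import math

def sigmoid(a):
    return 1.0 / (1.0 + math.exp(-a))

def train_step(weights, bias, x, target, lr=0.5):
    """One gradient-descent update on E = 0.5*(y - target)^2 for a sigmoid neuron."""
    y = sigmoid(sum(w * xi for w, xi in zip(weights, x)) + bias)
    delta = (y - target) * y * (1.0 - y)   # dE/d(activation) via sigmoid derivative
    new_w = [w - lr * delta * xi for w, xi in zip(weights, x)]
    new_b = bias - lr * delta
    return new_w, new_b

# Repeatedly nudge the weights toward reproducing the target output
w, b = [0.1, -0.2], 0.0
for _ in range(200):
    w, b = train_step(w, b, x=[1.0, 0.5], target=1.0)

final = sigmoid(sum(wi * xi for wi, xi in zip(w, [1.0, 0.5])) + b)
```

A real MLP trainer backpropagates the same delta terms through every layer, but the update rule per weight has exactly this form.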
Dimensionality reduction techniques
• Non-linear Principal Components Analysis
• Weight regularisation techniques (Weigend method)
• Genetic algorithms
Non-linear Principal Components Analysis (auto-associative network)
• A neural network which uses its inputs as its outputs
• Has at least one hidden layer with fewer neurons than the input and output layers, which have the same number of neurons
• Data is effectively "squeezed" through a lower dimensionality
Non-linear Principal components analysis
Auto associative training
• Produce an auto-associative training set (inputs map to outputs)
• Create an auto-associative MLP:
– 5 layers
– Middle hidden layer has fewer units than the output layer
– The other two hidden layers have a relatively large number of neurons; both should have the same number
• Train the network on the data set
• Delete the last two layers
• Collect the reduced-dimensionality input data, replace the original input data, and retain the original output variables
• Create a second neural network and train it on the reduced data set
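A minimal numpy sketch of this procedure, under assumptions not in the slides: toy data with two underlying factors, layer sizes 4-6-2-6-4, tanh hidden layers, and plain gradient descent in place of the project's actual trainer.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 20 samples of 4 variables driven by 2 underlying factors
t = rng.normal(size=(20, 2))
X = np.hstack([t, t @ rng.normal(size=(2, 2))])
X = (X - X.mean(axis=0)) / X.std(axis=0)

# 5-layer auto-associative MLP: 4 -> 6 -> 2 -> 6 -> 4, tanh hidden layers
sizes = [4, 6, 2, 6, 4]
Ws = [rng.normal(scale=0.3, size=(a, b)) for a, b in zip(sizes[:-1], sizes[1:])]
bs = [np.zeros(b) for b in sizes[1:]]

def forward(data):
    acts = [data]
    for i, (W, b) in enumerate(zip(Ws, bs)):
        z = acts[-1] @ W + b
        acts.append(z if i == len(Ws) - 1 else np.tanh(z))  # linear output layer
    return acts

err_before = np.mean((forward(X)[-1] - X) ** 2)

lr = 0.05
for _ in range(2000):
    acts = forward(X)
    grad = (acts[-1] - X) / len(X)                  # dE/d(output) for squared error
    for i in reversed(range(len(Ws))):
        if i != len(Ws) - 1:
            grad = grad * (1.0 - acts[i + 1] ** 2)  # tanh derivative
        gW, gb = acts[i].T @ grad, grad.sum(axis=0)
        grad = grad @ Ws[i].T                       # propagate to previous layer
        Ws[i] -= lr * gW
        bs[i] -= lr * gb

err_after = np.mean((forward(X)[-1] - X) ** 2)

# "Delete the last two layers": the bottleneck activations are the reduced data,
# ready to replace the original inputs for a second network
reduced = forward(X)[2]
```

The key point is the last line: once the network can reproduce its inputs, the middle layer's activations carry the same information in fewer dimensions.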
Use of Genetic Algorithms
• GAs are an optimisation technique which uses Darwin's concept of "survival of the fittest" to breed successively better strings according to an objective function
• In this problem, that function helps to determine subsets of inter-related bits (correlated or mutually required inputs)
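A toy sketch of such a GA over input-selection bitstrings. The fitness function, bit count, and "useful" set below are invented stand-ins for the real network-error objective; the selection, crossover, and mutation steps are the generic technique.

```python
import random

random.seed(1)
N_BITS = 10                      # one bit per candidate input variable
USEFUL = {0, 3, 4, 7}            # toy stand-in for the truly informative inputs

def fitness(bits):
    """Toy objective: reward keeping useful inputs, penalise keeping noise."""
    kept = {i for i, b in enumerate(bits) if b}
    return len(kept & USEFUL) - 0.5 * len(kept - USEFUL)

def evolve(pop_size=30, generations=60, p_mut=0.05):
    pop = [[random.randint(0, 1) for _ in range(N_BITS)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]          # "survival of the fittest"
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, N_BITS)     # one-point crossover
            child = a[:cut] + b[cut:]
            child = [bit ^ (random.random() < p_mut) for bit in child]  # mutation
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
```

In the project setting, evaluating `fitness` would mean training or testing a network on the selected input subset, so runs are far more expensive than this toy.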
Sensitivity analysis
• The sensitivity of a particular variable is determined by running the network on a set of test cases and accumulating the network error.
• The network is then run on the same cases, but with a specific input variable withheld, and the network error is again accumulated.
• The sensitivity ratio is the error with missing-value substitution divided by the original error.
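The steps above can be sketched as follows; a least-squares linear model stands in for the trained network, mean substitution stands in for the missing value, and the data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy data: y depends strongly on x0, weakly on x1, not at all on x2
X = rng.normal(size=(200, 3))
y = 3.0 * X[:, 0] + 0.3 * X[:, 1] + rng.normal(scale=0.1, size=200)

# Least-squares fit stands in for the trained network
w, *_ = np.linalg.lstsq(X, y, rcond=None)

def sse(Xin):
    """Accumulated squared error over all test cases."""
    return float(np.sum((Xin @ w - y) ** 2))

base_error = sse(X)
ratios = {}
for j in range(3):
    Xm = X.copy()
    Xm[:, j] = X[:, j].mean()         # "missing value" substitution for input j
    ratios[j] = sse(Xm) / base_error  # sensitivity ratio; ~1 means dispensable
```

A ratio near 1 marks an input the model can do without; the larger the ratio, the more the model relies on that input.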
Neural Network weight regularisation
• Promotes low curvature by encouraging small weights when modelling the feature surface
• Adds an extra term to the error function which penalises gratuitously large weights
• Also prefers to "tolerate" a mixture of large and small weights rather than medium-sized weights
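That "tolerate large-or-small" behaviour is characteristic of the weight-elimination penalty of Weigend et al.; a minimal sketch (the λ and w0 values are illustrative, not from the project):

```python
def weight_elimination_penalty(weights, w0=1.0, lam=0.1):
    """Weigend-style penalty: lam * sum (w/w0)^2 / (1 + (w/w0)^2).

    Near-zero weights cost almost nothing, large weights saturate at lam each,
    so a few large weights are cheaper than many medium-sized ones.
    """
    return lam * sum((w / w0) ** 2 / (1.0 + (w / w0) ** 2) for w in weights)

mixed  = [4.0, 0.01, 0.02, 4.0, 0.01]   # a few large plus several tiny weights
medium = [1.0, 1.0, 1.0, 1.0, 1.0]      # all medium-sized weights
```

During training this term is simply added to the error function, so the gradient pushes unneeded weights toward zero while leaving genuinely useful large weights alone.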
Neural networks used
• Multi Layer Perceptrons (MLP)
• Radial basis function (RBF)
• Self-Organising Feature Maps (Kohonen)
• Experiments were run on several different data sets, recorded over a number of time periods (one day, two days, one week)
Radial basis function nets
• Feature space is divided up using hyperspheres (circles in two dimensions)
• Each is characterised by a centre and a radius
• The response surface is a Gaussian (bell-shaped curve)
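A single radial unit's Gaussian response can be sketched as (names are illustrative):

```python
import math

def rbf_activation(x, centre, sigma):
    """Gaussian radial unit: response falls off with distance from the centre."""
    dist_sq = sum((xi - ci) ** 2 for xi, ci in zip(x, centre))
    return math.exp(-dist_sq / (2.0 * sigma ** 2))

at_centre = rbf_activation([1.0, 2.0], centre=[1.0, 2.0], sigma=0.5)  # peak response
far_away = rbf_activation([4.0, 6.0], centre=[1.0, 2.0], sigma=0.5)   # near zero
```

The deviation `sigma` plays the role of the radius: it controls how quickly the response decays away from the centre.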
Radial basis function networks
• An RBF network consists of a single hidden layer of radial units, each modelling a Gaussian response surface
• Training an RBF network:
– Centres and deviations of the radial units are set
– The linear output layer is optimised
– Centres are assigned to reflect the clustering of the data
Radial basis function networks
• Centre assignment methods:
– Sub-sampling: a random selection of training points is copied to the radial units
– K-means algorithm: a set of points is selected and placed at the centres of clusters of training data
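A minimal k-means sketch of the centre-assignment step; the two-cluster toy data, cluster count, and iteration budget are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

# Two well-separated clusters of training points
data = np.vstack([rng.normal(0.0, 0.2, (30, 2)), rng.normal(5.0, 0.2, (30, 2))])

def kmeans_centres(data, k, iters=20):
    """Place k radial-unit centres at the means of clusters of training data."""
    centres = data[rng.choice(len(data), k, replace=False)]
    for _ in range(iters):
        # Assign each point to its nearest centre...
        labels = np.argmin(
            np.linalg.norm(data[:, None] - centres[None], axis=2), axis=1
        )
        # ...then move each centre to the mean of its cluster (keep it if empty)
        centres = np.array(
            [data[labels == j].mean(axis=0) if np.any(labels == j) else centres[j]
             for j in range(k)]
        )
    return centres

centres = kmeans_centres(data, k=2)
```

After convergence, each radial unit sits in a region where training data actually concentrates, which is exactly what the RBF hidden layer needs.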
Radial basis function networks
• Deviation assignment (determines how spiky the Gaussian functions are):
– Explicit (chosen by hand)
– Isotropic: heuristic method using the number of centres and the volume of space occupied
– K-NN: each unit's deviation is individually set to the mean distance of its K nearest neighbours
• Small in tightly packed areas (preserves detail)
• Higher in sparse areas of space
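The K-NN deviation rule can be sketched as follows (the example centres are invented to show the tight-versus-sparse effect):

```python
import numpy as np

def knn_deviations(centres, k=2):
    """Set each unit's deviation to the mean distance of its k nearest centres."""
    centres = np.asarray(centres, dtype=float)
    dists = np.linalg.norm(centres[:, None] - centres[None], axis=2)
    devs = []
    for row in dists:
        nearest = np.sort(row)[1 : k + 1]   # skip the zero distance to itself
        devs.append(nearest.mean())
    return np.array(devs)

# Three tightly packed centres and one isolated centre
centres = [[0.0, 0.0], [0.1, 0.0], [0.0, 0.1], [5.0, 5.0]]
devs = knn_deviations(centres, k=2)
```

The packed centres get small deviations (narrow, detail-preserving Gaussians) while the isolated centre gets a wide one that covers its sparse region.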
Artificial Neuron, and the Kohonen SOFM
Application to MINICON project
• The picture shows the test elevator used at the KONE site; sensors were mounted at various points and the results used as input to the neural networks
• Self-Organising Feature Maps and Multi-Layer Perceptrons were trained on a variety of elevator data
Results
Results for Genetic Algorithm
Results for Sensitivity Analysis
Results of weight regularisation techniques
Results for auto association
Best performing technique per data set
Data set Best Technique
Data set 1 Equal
Data set 2 Weigend
Data set 3 Unreduced & Weigend
Data set 4 Non-linear PCA (94%)
Data set 5 Unreduced & Weigend
Data set 6 Weigend
Data set 7 Unreduced & Weigend
Data set 8 Unreduced
Data set 9 Unreduced & Weigend
Final Networks
• Two types of neural network were used in the final product:
– Multi-Layer Perceptrons (with weight regularisation applied)
– SOFMs
• Both networks require different numbers of inputs depending on the data set (5–15)
Alternative methods
• Use of statistical techniques
• Mean, kurtosis, standard deviation
For example, the mean of one parameter shows a significant rise for data set 5. Since all the other data sets show a consistent mean, this rise appears to be highly significant.
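A sketch of computing such summary statistics; the signals below are synthetic stand-ins for recorded parameters, chosen so that a handful of fault-like spikes shows up in the kurtosis.

```python
import numpy as np

def summary_stats(x):
    """Mean, standard deviation, and kurtosis of one recorded parameter."""
    x = np.asarray(x, dtype=float)
    m, s = x.mean(), x.std()
    kurtosis = np.mean(((x - m) / s) ** 4)   # equals 3.0 for a Gaussian signal
    return m, s, kurtosis

rng = np.random.default_rng(4)
healthy = rng.normal(0.0, 1.0, 5000)                    # baseline-like signal
faulty = np.concatenate([healthy, [8.0, -9.0, 10.0]])   # plus a few large spikes

m0, s0, k0 = summary_stats(healthy)
m1, s1, k1 = summary_stats(faulty)
```

The mean and standard deviation barely move when a few spikes are added, while the fourth-moment kurtosis jumps, which is why kurtosis is a common impulsive-fault indicator; the slides' point, however, is that across these data sets such statistics were not consistent enough on their own.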
Use of mean value
Conclusions
• Removing input data does not improve classification performance
• Statistical techniques are not consistent enough to make reliable estimates
• MLPs and SOFMs are the best-performing neural network techniques
• MLP performance can be improved by applying weight regularisation techniques