
ORNL is managed by UT-Battelle for the US Department of Energy

Evolutionary Optimization: A Training Method for Neuromorphic Systems

Catherine Schuman, Computational Data Analytics

NICE Workshop

March 8, 2016


Neuroscience-Inspired Dynamic Architecture (NIDA)

• Spiking neural network embedded in 3D space.
• Simple neuron and synapse implementation.
• Discrete event simulation.

[Figure: NIDA network structure, with an input neuron, hidden neurons, and an output neuron connected by excitatory and inhibitory synapses.]

[Figure: Neuron charge dynamics in the discrete event simulation. Change-in-charge events arriving on incoming synapses increase a neuron's charge; when the charge reaches the threshold, the neuron fires at that firing time, places events on its outgoing synapses (with no elapsed simulation time), and returns to its neutral charge.]
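To make the discrete event simulation concrete, here is a minimal C++ sketch of the idea. It is an illustration only, not the NIDA code: the types (Synapse, Neuron, ChargeEvent) and the weights, delays, and thresholds are all assumptions. Charge-change events are processed in time order from a priority queue; a neuron whose charge reaches threshold fires, schedules events on its outgoing synapses, and returns to neutral charge.

```cpp
// Hypothetical discrete event spiking simulation in the NIDA style.
#include <cstdio>
#include <functional>
#include <queue>
#include <vector>

struct Synapse { int target; double weight; double delay; };  // weight < 0 would be inhibitory

struct Neuron {
    double charge = 0.0;       // accumulated charge (neutral charge = 0)
    double threshold = 1.0;    // fire when charge reaches this
    std::vector<Synapse> out;  // outgoing synapses
};

struct ChargeEvent {
    double time; int target; double delta;
    bool operator>(const ChargeEvent& o) const { return time > o.time; }
};

int main() {
    std::vector<Neuron> net(3);                // input (0), hidden (1), output (2)
    net[0].out = {{1, 1.2, 1.0}};              // excitatory synapse, delay 1.0
    net[1].out = {{2, 1.0, 1.0}};

    std::priority_queue<ChargeEvent, std::vector<ChargeEvent>,
                        std::greater<ChargeEvent>> events;
    events.push({0.0, 0, 0.7});                // two input pulses to neuron 0
    events.push({1.0, 0, 0.7});

    while (!events.empty()) {
        ChargeEvent e = events.top(); events.pop();
        Neuron& n = net[e.target];
        n.charge += e.delta;                   // apply change-in-charge event
        if (n.charge >= n.threshold) {         // firing time reached
            std::printf("neuron %d fires at t=%.1f\n", e.target, e.time);
            for (const Synapse& s : n.out)     // schedule downstream events
                events.push({e.time + s.delay, s.target, s.weight});
            n.charge = 0.0;                    // return to neutral charge
        }
    }
    return 0;
}
```

Because only events are simulated, no work is done for the time between spikes, which is what makes an event-driven implementation efficient for sparse spiking activity.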


Dynamic Adaptive Neural Network Array (DANNA)

• Array of programmable neuromorphic elements.
• Elements can connect to up to 16 neighbors.
• Implementations:
  – Current: FPGA.
  – Future: VLSI, memristors.
• Hardware-accurate software simulation in C.
  – Current: Event-driven implementation.
  – Future: GPU implementation.
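As a rough picture of what "up to 16 neighbors" can mean on a 2D array, the sketch below enumerates, for an element at grid position (r, c), the 8 adjacent cells plus the 8 cells two steps away in the same compass directions. This particular neighborhood pattern is an assumption made for illustration; the actual DANNA connectivity may be defined differently.

```cpp
// Hypothetical DANNA-style grid neighborhood: two rings of 8 directions.
#include <cstdio>
#include <utility>
#include <vector>

std::vector<std::pair<int,int>> neighbors(int r, int c, int rows, int cols) {
    std::vector<std::pair<int,int>> out;
    const int dr[] = {-1,-1,-1, 0, 0, 1, 1, 1};
    const int dc[] = {-1, 0, 1,-1, 1,-1, 0, 1};
    for (int ring = 1; ring <= 2; ++ring)       // distance-1 and distance-2 rings
        for (int k = 0; k < 8; ++k) {
            int nr = r + ring * dr[k], nc = c + ring * dc[k];
            if (nr >= 0 && nr < rows && nc >= 0 && nc < cols)
                out.push_back({nr, nc});        // clip at the array boundary
        }
    return out;
}

int main() {
    std::printf("element (5,5) reaches %zu neighbors\n",
                neighbors(5, 5, 10, 10).size());  // 16 in the array interior
    return 0;
}
```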


Training/Design: Evolutionary Optimization

Task specific:
(1) Input
(2) Output
(3) Fitness function
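The loop itself is independent of the architecture: evaluate a population of networks with the task's fitness function, keep the best, and generate new candidates by mutation (and, in the full framework, crossover). The C++ sketch below is a minimal stand-in; the Network genome, the toy fitness function, and the truncation selection scheme are assumptions, not the framework's actual operators.

```cpp
// Minimal evolutionary optimization loop (illustrative only).
#include <algorithm>
#include <cstddef>
#include <cstdio>
#include <random>
#include <vector>

struct Network { std::vector<double> genome; double fitness = 0.0; };

// Task-specific scoring: in the real framework this would run the network
// on the task's inputs and compare outputs. Here: a toy objective whose
// optimum is genome values of 0.5.
double fitness(const Network& n) {
    double s = 0.0;
    for (double g : n.genome) s -= (g - 0.5) * (g - 0.5);
    return s;
}

int main() {
    std::mt19937 rng(42);
    std::uniform_real_distribution<double> u(0.0, 1.0);

    std::vector<Network> pop(32);                     // random initial population
    for (auto& n : pop) {
        n.genome.resize(8);
        for (auto& g : n.genome) g = u(rng);
    }

    for (int gen = 0; gen < 50; ++gen) {
        for (auto& n : pop) n.fitness = fitness(n);   // evaluate
        std::sort(pop.begin(), pop.end(),             // best first
                  [](const Network& a, const Network& b) { return a.fitness > b.fitness; });
        for (std::size_t i = pop.size() / 2; i < pop.size(); ++i) {
            pop[i] = pop[i - pop.size() / 2];         // copy a surviving parent
            std::size_t j = rng() % pop[i].genome.size();
            pop[i].genome[j] = u(rng);                // point mutation
        }
    }
    std::printf("best fitness: %f\n", pop[0].fitness);
    return 0;
}
```

The three task-specific pieces on the slide map directly onto this loop: the input and output conventions live inside the fitness evaluation, and everything else is reusable across tasks and architectures.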


Iris

Data set: https://archive.ics.uci.edu/ml/datasets/Iris

Schuman et al., "An Evolutionary Optimization Framework for Neural Networks and Neuromorphic Architectures," 2016, submitted.
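For a classification benchmark like Iris, one natural fitness function is the fraction of training examples the evolved network labels correctly. This is an assumed scoring used for illustration (the cited paper may score candidates differently), with a trivial stand-in for the network itself:

```cpp
// Classification-accuracy fitness (illustrative).
#include <cstdio>
#include <vector>

struct Example { std::vector<double> features; int label; };

// Stand-in for running the evolved network on one example and decoding
// its output spikes into a class label (here: a trivial threshold rule).
int classify(const std::vector<double>& f) { return f[0] > 2.5 ? 1 : 0; }

// Fitness = fraction of training examples labeled correctly.
double accuracy_fitness(const std::vector<Example>& train) {
    int correct = 0;
    for (const auto& ex : train)
        if (classify(ex.features) == ex.label) ++correct;
    return static_cast<double>(correct) / train.size();
}

int main() {
    std::vector<Example> train = {{{1.4}, 0}, {{4.7}, 1}, {{1.3}, 0}, {{4.5}, 1}};
    std::printf("fitness = %.2f\n", accuracy_fitness(train));
    return 0;
}
```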


Wisconsin Breast Cancer

Data set: https://archive.ics.uci.edu/ml/datasets/Breast+Cancer+Wisconsin+%28Original%29

Schuman et al., "An Evolutionary Optimization Framework for Neural Networks and Neuromorphic Architectures," 2016, submitted.


Pima Indian Diabetes

Data set: https://archive.ics.uci.edu/ml/datasets/Pima+Indians+Diabetes

Schuman et al., "An Evolutionary Optimization Framework for Neural Networks and Neuromorphic Architectures," 2016, submitted.


Library for Arbitrary Graph Optimization

• Can be quickly applied to new neuromorphic architectures that can be represented as graph structures.

• User specifications:
  – Graph template file.
  – EO template file.
  – Two user-defined functions per architecture:
    • ConvertToGraph()
    • ConvertFromGraph()
  – Two user-defined functions per application:
    • InitializeGraph() – used to initialize graphs in the EO population.
    • Fitness() – returns a numerical score for the graph.
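The slide names these hooks but not their signatures, so the sketch below is only a guess at their shape: every type and signature is an assumption, meant to show how an architecture-specific network plugs into a generic graph representation that the optimizer can mutate and recombine.

```cpp
// Hypothetical shape of the LAGO user-defined hooks.
#include <cstdio>
#include <random>
#include <utility>
#include <vector>

struct Graph {                         // generic structure the optimizer evolves
    std::vector<std::pair<int,int>> edges;
    std::vector<double> weights;       // one weight per edge
};

struct MyNetwork { Graph g; };         // stand-in architecture representation

// Per architecture: map between the native network and the generic graph.
Graph ConvertToGraph(const MyNetwork& net) { return net.g; }
MyNetwork ConvertFromGraph(const Graph& g) { return {g}; }

// Per application: seed the population and score candidates.
Graph InitializeGraph(std::mt19937& rng) {
    std::uniform_real_distribution<double> u(-1.0, 1.0);
    return {{{0, 1}, {1, 2}}, {u(rng), u(rng)}};   // tiny random 3-node chain
}
double Fitness(const Graph& g) {       // placeholder: would run the task here
    double s = 0.0;
    for (double w : g.weights) s += w;
    return s;
}

int main() {
    std::mt19937 rng(7);
    Graph g = InitializeGraph(rng);
    MyNetwork net = ConvertFromGraph(g);           // optimizer -> architecture
    std::printf("fitness = %f\n", Fitness(ConvertToGraph(net)));
    return 0;
}
```

With this division of labor, supporting a new architecture means writing only the two conversion functions, and supporting a new application means writing only the initializer and the fitness function.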


Summary

• Evolutionary optimization is a convenient way to explore the characteristics and capabilities of new neuromorphic architectures.
• We have had success on basic benchmark tasks using an EO framework for two distinct architectures.
• The EO framework can be applied to (relatively) arbitrary network structures.
  – Can interact with hardware or software simulations.
  – Can learn hyper-parameters on top of existing learning methods.
  – Scalable for HPC implementation.
  – Generates lots of networks and their performance characteristics for study.


References

• N. Pavlidis, D. Tasoulis, V. P. Plagianakos, G. Nikiforidis, and M. Vrahatis, "Spiking neural network training using evolutionary algorithms," in Neural Networks, 2005. IJCNN '05. Proceedings. 2005 IEEE International Joint Conference on, vol. 4. IEEE, 2005, pp. 2190–2194.
• M. Valko, N. C. Marques, and M. Castellani, "Evolutionary feature selection for spiking neural network pattern classifiers," in Artificial Intelligence, 2005. EPIA 2005. Portuguese Conference on. IEEE, 2005, pp. 181–187.
• L. Bako, "Real-time classification of datasets with hardware embedded neuromorphic neural networks," Briefings in Bioinformatics, p. bbp066, 2010.
• R. Vazquez, A. Cachón et al., "Integrate and fire neurons and their application in pattern recognition," in Electrical Engineering Computing Science and Automatic Control (CCE), 2010 7th International Conference on. IEEE, 2010, pp. 424–428.
• R. A. Vazquez, "Pattern recognition using spiking neurons and firing rates," in Advances in Artificial Intelligence – IBERAMIA 2010. Springer, 2010, pp. 423–432.
• P. P. Palmes, T. Hayasaka, and S. Usui, "Mutation-based genetic neural network," Neural Networks, IEEE Transactions on, vol. 16, no. 3, pp. 587–600, 2005.
• H.-Y. Hsieh and K.-T. Tang, "Hardware friendly probabilistic spiking neural network with long-term and short-term plasticity," Neural Networks and Learning Systems, IEEE Transactions on, vol. 24, no. 12, pp. 2063–2074, 2013.
• T. Chen, Y. Chen, M. Duranton, Q. Guo, A. Hashmi, M. Lipasti, A. Nere, S. Qiu, M. Sebag, and O. Temam, "BenchNN: On the broad potential application scope of hardware neural network accelerators," in Workload Characterization (IISWC), 2012 IEEE International Symposium on. IEEE, 2012, pp. 36–45.
• N. García-Pedrajas, D. Ortiz-Boyer, and C. Hervás-Martínez, "An alternative approach for neural network evolution with a genetic algorithm: Crossover by combinatorial optimization," Neural Networks, vol. 19, no. 4, pp. 514–528, 2006.
• D. Soudry, D. Di Castro, A. Gal, A. Kolodny, and S. Kvatinsky, "Memristor-based multilayer neural networks with online gradient descent training," 2015.
• M. Suri and V. Parmar, "Exploiting intrinsic variability of filamentary resistive memory for extreme learning machine architectures," Nanotechnology, IEEE Transactions on, vol. 14, no. 6, pp. 963–968, Nov. 2015.


• D. Yeung, J.-C. Li, W. Ng, and P. Chan, "MLPNN training via a multiobjective optimization of training error and stochastic sensitivity," Neural Networks and Learning Systems, IEEE Transactions on, vol. PP, no. 99, pp. 1–1, 2015.
• A. Belatreche, L. P. Maguire, and M. McGinnity, "Advances in design and application of spiking neural networks," Soft Computing, vol. 11, no. 3, pp. 239–248, 2007.
• R. Vazquez et al., "Training spiking neural models using cuckoo search algorithm," in Evolutionary Computation (CEC), 2011 IEEE Congress on. IEEE, 2011, pp. 679–686.
• S. McKennoch, T. Voegtlin, and L. Bushnell, "Spike-timing error backpropagation in theta neuron networks," Neural Computation, vol. 21, no. 1, pp. 9–45, 2009.
• E. Alba and J. F. Chicano, "Training neural networks with GA hybrid algorithms," in Genetic and Evolutionary Computation – GECCO 2004. Springer, 2004, pp. 852–863.
• D. B. Fogel, E. C. Wasson, and E. M. Boughton, "Evolving neural networks for detecting breast cancer," Cancer Letters, vol. 96, no. 1, pp. 49–53, 1995.
• M. M. Islam and X. Yao, "Evolving artificial neural network ensembles," in Computational Intelligence: A Compendium. Springer, 2008, pp. 851–880.
• S. Cawley, F. Morgan, B. McGinley, S. Pande, L. McDaid, S. Carrillo, and J. Harkin, "Hardware spiking neural network prototyping and application," Genetic Programming and Evolvable Machines, vol. 12, no. 3, pp. 257–280, 2011.
• X. Yao and Y. Liu, "A new evolutionary system for evolving artificial neural networks," Neural Networks, IEEE Transactions on, vol. 8, no. 3, pp. 694–713, 1997.
• H. A. Abbass, "An evolutionary artificial neural networks approach for breast cancer diagnosis," Artificial Intelligence in Medicine, vol. 25, no. 3, pp. 265–281, 2002.
• R. Hasan and T. M. Taha, "Enabling back propagation training of memristor crossbar neuromorphic processors," in Neural Networks (IJCNN), 2014 International Joint Conference on. IEEE, 2014, pp. 21–28.
• Y. Jin, R. Wen, and B. Sendhoff, "Evolutionary multi-objective optimization of spiking neural networks," in Artificial Neural Networks – ICANN 2007. Springer, 2007, pp. 370–379.


Acknowledgements

University of Tennessee Neuromorphic Research Team

• Special thanks to:
  – Jim Plank
  – Adam Disney
  – John Reynolds
  – Doug Birdwell
  – Mark Dean
  – Garrett Rose
  – Tom Potok
  – Robert Patton


Neuromorphic Computing Workshop: Architectures, Models, and Applications

June 29, June 30, and July 1, 2016

Oak Ridge National Laboratory

http://ornlcda.github.io/neuromorphic2016/

Email: schumancd [at] ornl.gov