ECE2_Neuraology 1-3


  • 7/29/2019 ECE2_Neuraology 1-3


    Neural Network RKJHA

    Lecture-01

Neural Network: Work on artificial neural networks is commonly referred to as neural networks.

Motivation: The human brain computes in an entirely different fashion from the conventional digital computer. It is a highly complex, nonlinear, parallel information-processing system. It has the capability to organize its structural constituents, known as neurons, so as to perform certain computations (e.g. pattern recognition, perception, etc.) many times faster than the fastest digital computer in existence today.

E.g. human vision: The visual system provides a representation of the environment around us and, more importantly, the information we need to interact with the environment, such as recognizing a familiar face in an unfamiliar setting.

A brain has a great structure and the ability to build up its own rules through what we usually refer to as experience.

A developing neuron is synonymous with a plastic brain. Plasticity permits the developing nervous system to adapt to its surrounding environment. In general, plasticity is also an essential property for the functioning of any information-processing machine.

A neural network is a machine that is designed to model the way in which the human brain performs a particular task or function.

This N/W can be realized either by using electronic components or simulated in software on a digital computer.

An important class of neural networks is one that performs useful computation through a process of learning.

A neural network viewed as an adaptive machine can be defined as: A neural network is a massively parallel distributed processor made up of simple processing units, which has a natural propensity for storing experiential knowledge and making it available for use. It resembles the brain in two respects:


    Soft Computing

    1. Knowledge is acquired by the N/W from its environment through a learning

    process.

2. Interneuron connection strengths, known as synaptic weights, are used to store the acquired knowledge.

The procedure used to perform the learning process is called a learning algorithm, the function of which is to modify the synaptic weights of the network in an orderly fashion to attain a desired design objective.
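As a concrete illustration (not from the notes themselves), a learning algorithm is simply a rule applied repeatedly to the synaptic weights. A minimal sketch in Python, using the classic Hebbian rule with an illustrative learning rate:

```python
# Minimal sketch of a learning algorithm as an orderly weight-update rule.
# The Hebbian rule is used purely for illustration: a synaptic weight grows
# when its input and the neuron's output are active at the same time.

def hebbian_update(weights, inputs, output, rate=0.1):
    """Return the synaptic weights after one Hebbian learning step."""
    return [w + rate * x * output for w, x in zip(weights, inputs)]

w = [0.0, 0.0, 0.0]
w = hebbian_update(w, inputs=[1, 0, 1], output=1)
print(w)  # weights for the two active inputs have grown
```

Any rule of this shape, old weights in and adjusted weights out, fits the definition above; supervised algorithms additionally use an error signal to steer the update.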

Benefits of Neural Networks:

Neural networks derive their computing power through:

1. Their massively parallel distributed structure.

2. Their ability to learn and therefore generalize.

Generalization means that the neural network can produce reasonable outputs for inputs not encountered during training (learning).

** A NN cannot provide a solution by working individually; rather, it needs to be integrated into a consistent system engineering process.


    Lecture-02


NN offers the following useful properties:

1. Nonlinearity:- A neuron can be linear as well as nonlinear.

2. Input-Output Mapping:- A NN can be trained using sample data or task examples. Each example consists of a unique input signal and a corresponding desired response. The network is trained by adjusting the weights to minimize the difference between the desired o/p and the actual o/p.
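The train-by-examples idea can be sketched as a small error-correction loop (the LMS/delta rule is assumed here as the update rule; the names and learning rate are illustrative):

```python
# Minimal sketch of training by weight adjustment for one linear neuron.
# Each step reduces the difference between the desired and actual output.

def train_step(weights, inputs, desired, rate=0.1):
    actual = sum(w * x for w, x in zip(weights, inputs))  # current o/p
    error = desired - actual                              # desired - actual
    # nudge each synaptic weight in proportion to its input and the error
    return [w + rate * error * x for w, x in zip(weights, inputs)]

w = [0.0, 0.0]
for _ in range(50):  # repeated presentations of one training example
    w = train_step(w, inputs=[1.0, 1.0], desired=1.0)
# the actual output w1 + w2 now approaches the desired value 1.0
```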

3. Adaptivity:- Neural networks have a built-in capability to adapt their synaptic weights to changes in the surrounding environment.

In particular, a neural network trained to operate in a specific environment can easily be retrained to deal with minor changes in the operating environmental conditions. Also, if a NN is meant to function in a nonstationary environment, it can be designed to change its synaptic weights in real time. This makes it a useful tool in adaptive pattern classification, adaptive signal processing, and adaptive control.

** To realize the full benefit of adaptivity, the principal time constants of the system should be long enough for the system to ignore spurious disturbances and yet short enough to respond to meaningful changes in the environment.

4. Evidential Response:- In the context of pattern classification, a neural network can be designed to provide information not only about which pattern to select but also about the confidence in the decision made. The latter information can be used to reject ambiguous patterns.

5. Contextual Information:- Knowledge is represented by the very structure and activation state of a neural network. Every neuron in the network is potentially affected by the global activity of all other neurons in the network.


6. Fault Tolerance:- A neural network implemented in hardware form has the potential to be inherently fault tolerant, or capable of robust computation, in the sense that its performance degrades gradually under adverse operating conditions. Thus, in principle, a neural network exhibits a graceful degradation in performance rather than catastrophic failure.

7. VLSI Implementability:- The massively parallel nature of a neural network makes it potentially fast for the computation of certain tasks. This same feature makes a neural network well suited for implementation using very-large-scale-integration (VLSI) technology.

8. Uniformity of Analysis & Design:- Neural networks enjoy universality as information processors, i.e. the same notation is used in all domains involving the application of NNs.

9. Neurobiological Analogy:- The design of a neural network is motivated by analogy with the brain, which is living proof that fault-tolerant parallel processing is not only physically possible but also fast and powerful.


    Lecture-03


Neural Networks:

The brain contains about 10^10 basic units called neurons. A neuron is a small cell that receives electrochemical signals from its various sources and in turn responds by transmitting electrical impulses to other neurons.

Some neurons perform input operations and are referred to as afferent cells; some perform output operations and are referred to as efferent cells; the remaining form part of an interconnected network of neurons which are responsible for signal transformation and storage of information.

Structure of a Neuron: (figure)

    Dendrites: Behave as input channels, i.e. all inputs from other neurons arrive

    through the dendrites.

Axon: The axon is electrically active and serves as an output channel. It is a nonlinear threshold device which produces a voltage pulse called the action potential. If the cumulative inputs received by the soma raise the internal electric potential of the cell, known as the membrane potential, above a threshold, then the neuron fires by propagating the action potential down the axon to excite or inhibit other neurons.

Synapse or Synaptic Junction:

The axon terminates in a specialized contact called a synapse or synaptic junction that connects the axon to the dendritic links of other neurons.


This synaptic junction, which is a very minute gap at the end of the dendritic link, contains a neurotransmitter fluid.

The size of the synaptic junction, or synapse, is believed to be related to learning. Thus, synapses with a large area are thought to be excitatory, while those with a small area are believed to be inhibitory.

Model of an Artificial Neuron:

The human brain is a highly interconnected network of simple processing elements called neurons. The behavior of a neuron can be captured by a simple model termed an artificial neuron.

In artificial neurons, the acceleration and retardation of incoming signals are modeled by weights. An efficient synapse, which transmits a stronger signal, will have a correspondingly larger weight.

I = w1x1 + w2x2 + … + wnxn
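The weighted sum above, followed by a threshold (a common choice of activation, assumed here rather than taken from the notes), completes the artificial-neuron model. A minimal sketch:

```python
# Minimal sketch of the artificial neuron described above.
# It computes I = w1*x1 + ... + wn*xn and fires (outputs 1) when I
# reaches a threshold; the step activation is an assumed choice.

def neuron_output(weights, inputs, threshold=0.5):
    total = sum(w * x for w, x in zip(weights, inputs))  # I = sum of wi*xi
    return 1 if total >= threshold else 0

# an efficient (large-weight) synapse dominates the response
print(neuron_output([0.7, 0.2], [1, 1]))  # 0.9 >= 0.5, neuron fires -> 1
print(neuron_output([0.1, 0.2], [1, 0]))  # 0.1 <  0.5, no firing    -> 0
```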
