Cognitive Radio Technologies, 2008
Implementing Cognitive Radio
How does a radio become cognitive?
Presentation Overview
• Architectural Approaches
• Observing the Environment
  – Autonomous Sensing
  – Collaborative Sensing
  – Radio Environment Maps and Observation Databases
• Recognizing Patterns
  – Neural Nets
  – Hidden Markov Models
• Making Decisions
  – Common Heuristic Approaches
  – Case-based Reasoning
• Representing Information
• A Case Study
Architectural Overview
What are the components of a cognitive radio and how do they relate to each other?
Strong Artificial Intelligence
• Concept: make a machine aware (conscious) of its environment and self-aware
• A complete failure so far (probably a good thing)
Weak Artificial Intelligence
• Concept: develop powerful (but limited) algorithms that intelligently respond to sensory stimuli
• Applications
  – Machine translation
  – Voice recognition
  – Intrusion detection
  – Computer vision
  – Music composition
Implementation Classes
• Weak cognitive radio
  – Radio's adaptations determined by hard-coded algorithms and informed by observations
  – Many may not consider this to be cognitive (see discussion related to Fig 6 in the 1900.1 draft)
• Strong cognitive radio
  – Radio's adaptations determined by conscious reasoning
  – Closest approximation is the ontology-reasoning cognitive radios
• In general, strong cognitive radios have the potential to achieve both much better and much worse behavior in a network
Weak/Procedural Cognitive Radios
• Radio's adaptations determined by hard-coded algorithms and informed by observations
• Many may not consider this to be cognitive (see discussion related to Fig 6 in the 1900.1 draft)
  – A function of the fuzzy definition
• Implementations:
  – CWT genetic algorithm radio
  – MPRG neural net radio
  – Multi-dimensional hill climbing (DoD LTS, Clancy)
  – Genetic algorithm (Grambling)
  – Simulated annealing/GA (Twente University)
  – Existing RRM algorithms?
Strong Cognitive Radios
• Radio's adaptations determined by some reasoning engine which is guided by its ontological knowledge base (which is informed by observations)
• Proposed implementations:
  – CR One model-based reasoning (Mitola)
  – Prolog reasoning engine (Kokar)
  – Policy reasoning (DARPA xG)
DFS in 802.16h
• Drafts of 802.16h defined a generic DFS algorithm which implements observation, decision, action, and learning processes
• Very simple implementation
Modified from Figure h1 IEEE 802.16h-06/010 Draft IEEE Standard for Local and metropolitan area networks Part 16: Air Interface for Fixed Broadband Wireless Access Systems Amendment for Improved Coexistence Mechanisms for License-Exempt Operation, 2006-03-29
[Flowchart, summarized:]
• Channel availability check on the next channel (Observation). If the channel is not available, choose a different channel and check again.
• Once an available channel is found, service is in function, with in-service monitoring of the operating channel and background in-service monitoring of non-operational channels (Observation).
• On detection, stop transmission, start the Channel Exclusion timer (the channel is unavailable for the Channel Exclusion time), and select and change to a new available channel in a defined time with a maximum transmission time (Decision, Action).
• A log of channel availability is maintained (Learning).
Example Architecture from CWT
[Block diagram: a Cognitive System Module built around a Cognitive System Controller and a knowledge base (short-term and long-term memory) holding WSGA parameter sets and regulatory information. User, policy, and security models supply objectives and weights to a WSGA (wireless system genetic algorithm) evolver, which breeds system chromosomes from initial chromosomes; candidate configurations are scored by |(simulated meters) − (actual meters)|. A decision maker passes the chosen configuration over a CE–radio interface to the radio platform (an x86/Unix terminal) with its resource monitor, performance and hardware/platform APIs, radio performance monitor, waveform recognizer, and channel identifier. The observation, orientation, decision, action, and learning processes, together with the models, are mapped onto these blocks.]
Architecture Summary
• Two basic approaches
  – Implement a specific algorithm or specific collection of algorithms which provide the cognitive capabilities (specific algorithms)
  – Implement a framework which permits algorithms to be changed based on needs (cognitive engine)
• Both implement the following processes
  – Observation, decision, action
• Either approach could implement
  – Learning, orientation
  – Negotiation, policy engines, models
• Process boundaries may blur based on the implementation
  – Signal classification could be orientation or observation
• Some processes are very complementary
  – Orientation and learning
• Some processes make the most intuitive sense with specific instantiations
  – Learning and case-based reasoning
Observations
How does the radio find out about its environment?
The Cognitive Radio and its Environment
What the information is about, and how the cognitive radio gets it:
• User: observes the user's applications and incoming/outgoing data streams; performs speech analysis
• Spectrum (communication opportunities): passively "listens" to the spectrum; performs channel quality estimation; spectrum information is provided by the network; spectrum information is shared by other cognitive radios
• Environment (physical quantities, position, situations): receives GPS signals to determine position; parses short-range wireless broadcasts in buildings or urban areas for a mapped environment; observes the network for e.g. weather forecasts, reported traffic jams, etc.; measures temperature, light level, humidity, …
• Other opportunities to get information exist as well
Signal Detection
• Optimal technique is the matched filter
• While sometimes useful, the matched filter may not be practical for cognitive radio applications as the signals may not be known
• Frequency domain analysis often required
• Periodogram
  – Fourier transform of the autocorrelation function of the received signal
  – More commonly implemented as the magnitude squared of the FFT of the signal
P_xx(F) = lim_{T0→∞} E[ (1/(2·T0)) · | ∫_{−T0}^{T0} x(t) e^{−j2πFt} dt |² ]
Comments on Periodogram
• Spectral leaking can mask weak signals
• Resolution is a function of the number of data points
• Significant variance in samples
  – Can be improved by averaging, e.g., Bartlett, Welch
  – Less resolution for the complexity
• Significant bias in estimations (due to finite length)
  – Can be improved by windowing the autocorrelation, e.g., Blackman–Tukey
Quality factor: Q = (E[P_xx(f)])² / var[P_xx(f)]

Estimation | Quality Factor | Complexity
Periodogram | 1 | —
Bartlett | 1.11·N·Δf | (N/2)·log₂(0.9/Δf)
Welch (50% overlap) | 1.39·N·Δf | N·log₂(1.28/Δf)
Blackman–Tukey | 2.34·N·Δf | N·log₂(5.12/Δf)
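The variance-reduction trade described above can be sketched in a few lines of Python (a minimal illustration, not from the slides; the tone frequency, noise level, and segment count are arbitrary choices):

```python
import numpy as np

def periodogram(x):
    """Periodogram: squared magnitude of the FFT, scaled by length."""
    return np.abs(np.fft.fft(x)) ** 2 / len(x)

def bartlett(x, n_seg):
    """Bartlett's method: average periodograms of non-overlapping
    segments, trading frequency resolution for reduced variance."""
    seg_len = len(x) // n_seg
    segs = x[:n_seg * seg_len].reshape(n_seg, seg_len)
    return np.mean([periodogram(s) for s in segs], axis=0)

rng = np.random.default_rng(0)
n = np.arange(8192)
x = np.cos(2 * np.pi * 0.2 * n) + rng.normal(0, 1, len(n))  # tone in noise
P = bartlett(x, 16)                          # average over 16 segments
peak = np.argmax(P[:len(P) // 2]) / len(P)   # strongest normalized frequency
```

With 16 segments, each estimate has 1/16 the variance of a single full-length periodogram, at the cost of 16× coarser resolution; the detected peak still lands near the tone at 0.2 cycles/sample.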
Other Detection Techniques
• Nonparametric
  – Goertzel: evaluates the Fourier transform for a small band of frequencies
• Parametric approaches
  – Need some general characterization (perhaps as general as a sum of sinusoids)
  – Yule–Walker (autoregressive)
  – Burg (autoregressive)
• Eigenanalysis
  – Pisarenko harmonic decomposition
  – MUSIC
  – ESPRIT
Sub-Noise-Floor Detection
• Detecting narrowband signals with negative SNRs is actually easy and can be performed with the preceding techniques
• The problem arises when the signal PSD is close to or below the noise floor
• Pointers to techniques:
  – (White noise) C. L. Nikias and J. M. Mendel, "Signal processing with higher-order spectrum," Signal Processing, July 1993.
  – (Works with colored noise and time-varying frequencies) K. Hock, "Narrowband Weak Signal Detection by Higher Order Spectrum," Signal Processing, April 1996.
  – C. T. Zhou, C. Ting, "Detection of weak signals hidden beneath the noise floor with a modified principal components analysis," AS-SPCC 2000, pp. 236–240.
Signal Classification
• Detection and frequency identification alone are often insufficient, as different policies are applied to different signals
  – Radar vs. 802.11 in 802.11h,y
  – TV vs. 802.22
• However, we would prefer not to have to implement processing to recover every possible signal
• Spectral correlation permits feature extraction for classification
Cyclic Autocorrelation
• Quick terminology:
  – Purely stationary
  – Purely cyclostationary
  – Exhibiting cyclostationarity
• Meaning: periods of cyclostationarity correspond to carrier frequencies, pulse rates, spreading code repetition rates, frame rates
• Classify by the periods exhibited in R_x^α
R_x^α(τ) = lim_{Δt→∞} (1/Δt) ∫_{−Δt/2}^{Δt/2} x(t + τ/2) x*(t − τ/2) e^{−j2παt} dt

• Purely stationary: R_x^α(τ) = 0 for all α ≠ 0, R_x^0(τ) ≠ 0
• Purely cyclostationary: R_x^α(τ) ≠ 0 only for α = n/T₀, integer n
• Exhibiting cyclostationarity: R_x^α(τ) ≠ 0 for some α ≠ 0
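A discrete-time estimate of the cyclic autocorrelation is a short computation; the sketch below (my illustration, not from the slides) uses a pure carrier at normalized frequency f0, whose strongest nonzero cycle frequency is 2·f0:

```python
import numpy as np

def cyclic_autocorr(x, alphas, tau=0):
    """Estimate R_x^alpha(tau) for each normalized cycle frequency
    alpha (cycles/sample) by averaging x[n+tau] x*[n] e^{-j2 pi alpha n}."""
    n = np.arange(len(x) - tau)
    prod = x[n + tau] * np.conj(x[n])
    return np.array([np.mean(prod * np.exp(-2j * np.pi * a * n))
                     for a in alphas])

f0 = 0.1                                   # carrier frequency (assumed)
n = np.arange(4096)
x = np.cos(2 * np.pi * f0 * n)
alphas = np.linspace(0.01, 0.5, 200)       # scan alpha > 0
R = np.abs(cyclic_autocorr(x, alphas))
peak = alphas[np.argmax(R)]                # strongest cycle frequency
```

The peak lands near α = 2·f0 = 0.2, the doubled-carrier cycle frequency a purely cyclostationary carrier exhibits.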
Spectral Correlation
• Estimation of the spectral correlation density (SCD):

S_x^α(t, f)_Δt = (1/Δt) ∫_{t−Δt/2}^{t+Δt/2} (1/T) X_T(u, f + α/2) X_T*(u, f − α/2) du

  – For α = 0, the above is the periodogram and, in the limit, the PSD
• The SCD is equivalent to the Fourier transform of the cyclic autocorrelation:

S_x^α(f) = ∫ R_x^α(τ) e^{−j2πfτ} dτ
Spectral Coherence Function

C_x^α(f) = S_x^α(f) / [ S_x^0(f + α/2) · S_x^0(f − α/2) ]^{1/2}

• Normalized, i.e., |C_x^α(f)| ≤ 1
• Terminology:
  – α = cycle frequency
  – f = spectrum frequency
• Utility: peaks of C_x^α correspond to the underlying periodicities of the signal that may be obscured in the PSD
• Like the periodogram, variance is reduced by averaging
From Figure 4.1 in I. Akbar, “Statistical Analysis of Wireless Systems Using Markov Models,” PhD Dissertation, Virginia Tech, January 2007
Practical Implementation of Spectral Coherence Function
[Block diagram: x(t) → N-point FFT → X(f); frequency-shift to form X(f + α/2) and X(f − α/2); correlate X(f + α/2) with X*(f − α/2); block-average over time t; normalize to obtain the spectral coherence function C_x^α(f).]
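The FFT/shift/correlate/block-average pipeline can be sketched directly in Python (an illustration under simplifying assumptions: integer-bin shifts stand in for ±α/2, and the test tone is placed exactly on an FFT bin so that blocks average coherently):

```python
import numpy as np

def spectral_correlation(x, nfft=64):
    """Block-averaged spectral correlation estimate: S[a, k] correlates
    X(f_{k+a}) with X*(f_k), averaged over FFT blocks; the bin shift a
    plays the role of the cycle frequency alpha."""
    nblocks = len(x) // nfft
    X = np.fft.fft(x[:nblocks * nfft].reshape(nblocks, nfft), axis=1)
    S = np.empty((nfft, nfft), dtype=complex)
    for a in range(nfft):
        # roll by -a so element k of the product is X[k+a] * conj(X[k])
        S[a] = np.mean(np.roll(X, -a, axis=1) * np.conj(X), axis=0)
    return S

n = np.arange(4096)
x = np.cos(2 * np.pi * (8 / 64) * n)          # carrier exactly on bin 8
S = spectral_correlation(x, nfft=64)
profile = np.abs(S).max(axis=1)               # alpha-profile: max over f
peak_shift = 1 + int(np.argmax(profile[1:33]))   # skip a = 0 (the PSD)
```

For a carrier on bin 8 of a 64-point FFT, the strongest nonzero cycle frequency appears at shift 16, i.e., α = 16/64 = 0.25 = twice the carrier frequency.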
Example Magnitude Plots
[Figure: example SCF magnitude plots for DSB-SC AM, BPSK, FSK, and MSK]
α-Profile
• The α-profile of the SCF reduces the data set size but captures most periodicities:

I(α) = max_f | C_x^α(f) |
[Figure: cycle-frequency profiles I(α) — maximum amplitude of spectral coherence vs. cycle frequency α/Fs — for DSB-SC AM, BPSK, FSK, and MSK, each at SNR = 0 dB with observation length 100]
Combination of Signals
[Figure: α-profiles for BPSK alone, MSK alone, and the combined BPSK + MSK signal]
Impact of Signal Strength
[Figure: BPSK α-profiles at SNR = −9 dB and SNR = +9 dB; the main signature remains at low SNR]
Resolution
• High resolution may be needed to capture the feature space
  – High computational burden
• Lower resolution is possible if there are expected features
  – Legacy radios should be predictable
  – A CR may not be predictable
  – Also implies an LPI strategy
Plots from A. Fehske, J. Gaeddert, J. Reed, "A new approach to signal classification using spectral correlation and neural networks," DySPAN 05, pp. 144–150.
[Plots: AM; BPSK at 200×200 resolution; BPSK at 100×100 resolution]
Additional Comments on Spectral Correlation
• Even though PSDs may overlap, spectral correlation functions for many signals are quite distinct, e.g., BPSK, QPSK, AM, PAM
• Uncorrelated noise is theoretically zeroed in the SCF
  – A technique for sub-noise-floor detection
• Permits extraction of information in addition to classification
  – Phase, frequency, timing
• Higher-order techniques are sometimes required
  – Some signals will not be very distinct, e.g., QPSK, QAM, PSK
  – Some signals do not exhibit the requisite second-order periodicity
Collaborative Observation
• Possible to combine estimations
• Reduces variance, improves P_D vs. P_FA
• Should be able to improve resolution
• Proposed for use in 802.22
  – Partition the cell into disjoint regions
  – Each CPE feeds back what it finds
    • Number of incumbents
    • Occupied bands
[Figure: simulated sensing grid (Grid Index X vs. Grid Index Y), CPE Number = 400, IT Number = 4]
Source: IEEE 802.22-06/0048r0
More Expansive Collaboration: Radio Environment Map (REM)
• An "integrated database consisting of multi-domain information, which supports global cross-layer optimization by enabling CR to 'look' through various layers"
• Conceptually, all the information a radio might need to make its decisions
  – Shared observations, reported actions, learned techniques
• Significant overhead to set up, but simplifies a lot of applications
• Conceptually not just cognitive radio, but omniscient radio
From: Y. Zhao, J. Gaeddert, K. Bae, J. Reed, “Radio Environment Map Enabled Situation-Aware Cognitive Radio Learning Algorithms,” SDR Forum Technical Conference 2006.
Example Application
• Overlay network of secondary users (SUs) free to adapt power, transmit time, and channel
• Without the REM:
  – Decisions based solely on link SINR
• With the REM:
  – Radios effectively know everything
• Upshot: a little gain for the secondary users; a big gain for the primary users
From: Y. Zhao, J. Gaeddert, K. Bae, J. Reed, “Radio Environment Map Enabled Situation-Aware Cognitive Radio Learning Algorithms,” SDR Forum Technical Conference 2006.
Observation Summary
• Numerous sources of information are available
• Tradeoff between collection time and spectral resolution
• Finite run length introduces bias
  – Can be managed with windowing
• Averaging reduces variance in estimations
• Several techniques exist for negative-SNR detection and classification
• Cyclostationarity analysis yields hidden "features" related to periodic signal components such as baud rate and frame rate, which can vary by modulation type
• Collaboration improves detection and classification
• The REM is the logical extreme of collaborative observation
Pattern Recognition
Hidden Markov Models, Neural Networks, Ontological Reasoning
Hidden Markov Model (HMM)
• A model of a system which behaves like a Markov chain except we cannot directly observe the states, transition probabilities, or initial state
• Instead, we only observe random variables with distributions that vary by the hidden state
• To build an HMM, we must estimate:
  – Number of states
  – State transition probabilities
  – Initial state distribution
  – Observations available for each state
  – Probability of each observation for each state
• The model can be built from observations using the Baum–Welch algorithm
• With a specified model, the likelihood of an output sequence can be evaluated using the forward–backward algorithm
• With a specified model, a sequence of states can be estimated from observations using the Viterbi algorithm
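The Viterbi step can be sketched compactly; the two-state "sticky" model below is a toy of my own construction (not from the slides), foreshadowing the ball-and-bin example:

```python
import numpy as np

def viterbi(obs, pi, A, B):
    """Most likely hidden-state path for a discrete-output HMM.
    pi: initial state probabilities; A[i, j]: transition i -> j;
    B[i, o]: probability that state i emits observation o."""
    logd = np.log(pi) + np.log(B[:, obs[0]])   # log-probs avoid underflow
    back = []
    for o in obs[1:]:
        scores = logd[:, None] + np.log(A)     # best way into each state
        back.append(np.argmax(scores, axis=0))
        logd = scores.max(axis=0) + np.log(B[:, o])
    path = [int(np.argmax(logd))]
    for bp in reversed(back):                  # backtrack the best path
        path.append(int(bp[path[-1]]))
    return path[::-1]

# toy model: two sticky bins, each favoring one ball color
pi = np.array([0.5, 0.5])
A = np.array([[0.9, 0.1], [0.1, 0.9]])
B = np.array([[0.8, 0.2], [0.2, 0.8]])
path = viterbi([0, 0, 1, 1, 1], pi, A, B)
```

With sticky transitions and well-matched emissions, the recovered path simply follows the observed colors: [0, 0, 1, 1, 1].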
Example
• A hidden machine selects balls from an unknown number of bins
• Bin selection is driven by a Markov chain
• You can only observe the sequence of balls delivered to you and want to be able to predict future deliveries
[Diagram: hidden states (bins) emitting an observation sequence of balls]
HMM for Classification
• Suppose several different HMMs have been calculated with Baum–Welch for different processes
• A sequence of observations could then be classified as being most like one of the different models
• Techniques:
  – Apply Viterbi to find the most likely sequence of state transitions through each HMM and classify as the one with the smallest residual error
  – Build a new HMM based on the observations and apply an approximation of the Kullback–Leibler divergence to measure the "distance" between the new and existing HMMs. See M. Mohammed, "Cellular Diagnostic Systems Using Hidden Markov Models," PhD Dissertation, Virginia Tech, October 2006.
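A closely related (and simpler) classifier scores the observation sequence against each trained model with the scaled forward algorithm and picks the most likely one; the two "signal models" below are hypothetical stand-ins, not trained parameters from the slides:

```python
import numpy as np

def log_likelihood(obs, pi, A, B):
    """Scaled forward algorithm: log P(obs | HMM)."""
    alpha = pi * B[:, obs[0]]
    ll = np.log(alpha.sum())
    alpha = alpha / alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]    # propagate, then weight by emission
        c = alpha.sum()
        ll += np.log(c)                  # accumulate the log scale factor
        alpha = alpha / c
    return ll

# two hypothetical models: one mostly emits symbol 0, one symbol 1
pi = np.array([0.5, 0.5])
A = np.array([[0.9, 0.1], [0.1, 0.9]])
B_model0 = np.array([[0.9, 0.1], [0.6, 0.4]])
B_model1 = np.array([[0.1, 0.9], [0.4, 0.6]])
obs = [0, 0, 0, 1, 0, 0]
scores = [log_likelihood(obs, pi, A, B) for B in (B_model0, B_model1)]
best_model = int(np.argmax(scores))
```

A mostly-zero observation sequence scores higher under the zero-favoring model, so `best_model` selects model 0.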
System Model for Signal Classification
Signal Classification Results
Effect of SNR and Observation Length
[Figure: BPSK signal classification performance — detection rate vs. observation length (one block is 100 symbols) — for SNRs from −12 dB to 9 dB in 3 dB steps; the BPSK HMM was trained at 9 dB]
• BPSK signal detection rate for various SNRs and observation lengths (the BPSK HMM is trained at 9 dB)
• Decreasing SNR increases the observation time needed to obtain a good detection rate
Location Classifier Design
• Designing a classifier requires two fundamental steps
  – Extraction of a set of features that ensures highly discriminatory attributes between locations
  – Selection of a suitable classification model
• Features are extracted based on the received power delay profile, which includes information regarding the surrounding environment (NLoS/LoS, multipath strength, delay, etc.)
• The selection of the hidden Markov model (HMM) as a classification tool was motivated by its success in other applications, e.g., speech recognition
[Flow: location of interest → collect statistics to form signature features → use a pattern matching algorithm to classify the features]
Determining Location by Comparing HMM Sequences
[Diagram: a candidate received power profile undergoes feature extraction and vector quantization to produce an observation sequence O; O is scored against the HMMs for positions 1…n, and the index of the recognized position is v* = argmax_{1 ≤ v ≤ n} p(O | λ_v)]
• In the testing phase, the candidate power profile is compared against all the HMMs previously trained and stored in the database
• The HMM with the closest match identifies the corresponding position
Feature Vector Generation
• Each location of interest was characterized by its channel characteristics, i.e., the power delay profile
• Three-dimensional feature vectors were derived from the power delay profile: excess time, and the magnitude and phase of the Fourier transform (FT) of the power delay profile in each direction
[Figure: measured power delay profile — received power (dBm) vs. excess delay (ns); noise threshold = −80 dBm]
Measurement Setup (cont.)
• Transmitter location 1 represents NLOS propagation from a room to another room, and from a room to a hallway. The transmitter and receivers were separated by drywall containing metal studs.
• The transmitter was located in a small laboratory. Receiver locations 1.1–1.3 were in adjacent rooms, whereas receiver location 1.4 was in an adjacent hallway. Additionally, for locations 1.1–1.3, a standard office dry-erase "whiteboard" was located on the wall separating the transmitter and receiver.
[Map (~17′ × ~58′): Measurement Locations 1.1–1.4, 4th Floor, Durham Hall, Virginia Tech. The transmitter is located in Room 475; Receivers 1.1 and 1.2 are located in Room 471; Receiver 1.3 is in the conference room in the 476 computer lab; and Receiver 1.4 is located in the hallway adjacent to 475.]
Vector Quantization (VQ)
• Since a discrete observation density is required to train the HMMs, a quantization step is required to map the "continuous" vectors into a discrete observation sequence
• Vector quantization (VQ) is an efficient way of representing multi-dimensional signals. Features are represented by a small set of vectors, called the codebook, based on a minimum-distance criterion
• The entire space is partitioned into disjoint regions, known as Voronoi regions
[Figure: example vector quantization in a two-dimensional space, from http://www.geocities.com/mohamedqasem/vectorquantization/vq.htm]
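Codebook training is typically done with Lloyd's (k-means) algorithm; the sketch below is my own minimal illustration on synthetic two-dimensional "feature" clusters (the cluster centers, codebook size, and seeds are arbitrary):

```python
import numpy as np

def train_codebook(vectors, k, iters=20, seed=3):
    """Lloyd (k-means) training of a VQ codebook; returns the codebook
    and the discrete observation sequence (nearest-codeword indices)."""
    rng = np.random.default_rng(seed)
    code = vectors[rng.choice(len(vectors), size=k, replace=False)].copy()
    idx = np.zeros(len(vectors), dtype=int)
    for _ in range(iters):
        # assign each feature vector to its nearest codeword (Voronoi region)
        d = np.linalg.norm(vectors[:, None, :] - code[None, :, :], axis=2)
        idx = d.argmin(axis=1)
        for j in range(k):                  # move codewords to region means
            if np.any(idx == j):
                code[j] = vectors[idx == j].mean(axis=0)
    return code, idx

rng = np.random.default_rng(0)
a = rng.normal([0.0, 0.0], 0.05, (50, 2))   # two synthetic feature clusters
b = rng.normal([5.0, 5.0], 0.05, (50, 2))
vectors = np.vstack([a, b])
code, obs_seq = train_codebook(vectors, k=2)
err = np.linalg.norm(vectors - code[obs_seq], axis=1).max()
```

The resulting `obs_seq` is exactly the kind of discrete observation sequence a discrete-output HMM can be trained on.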
Classification Result
• A four-state HMM was used to represent each location (Rx 1.1–1.4)
• Codebook size was 32
• Confusion matrix for Rx locations 1.1–1.4 — rows: candidate received power profile (true); columns: HMM based on Rx location (estimated); diagonal entries are correct classifications:

Position | Rx 1.1 | Rx 1.2 | Rx 1.3 | Rx 1.4
Rx 1.1 | 95% | 5% | 0% | 0%
Rx 1.2 | 5% | 95% | 0% | 0%
Rx 1.3 | 0% | 0% | 100% | 0%
Rx 1.4 | 0% | 10% | 0% | 90%

• Overall accuracy 95%
Some Applications of HMMs to CR from VT
• Signal Detection and Classification
• Position Location from a Single Site
• Traffic Prediction
• Fault Detection
• Data Fusion
The Neuron and Threshold Logic Unit
• Several inputs are weighted, summed, and passed through a transfer function
• Output passed onto other layers or forms an output itself
• Common transfer (activation) functions
  – Step: f(a) = 1 for a > θ, else 0
  – Linear threshold: f(a) clips the activation a to the range [0, 1]
  – Sigmoid: f(a) = 1 / (1 + e^(−a/ρ))
  – tanh: f(a) = tanh(a)
with activation a = w1·x1 + w2·x2 + … + wn·xn
[Diagram: a biological neuron (image from http://en.wikipedia.org/wiki/Neuron) alongside a threshold logic unit: inputs x1…xn are weighted by w1…wn, summed into the activation a, and passed through f(a)]
Neuron as Classifier
• The threshold of a multilinear neuron defines a hyperplane decision boundary
• The number of inputs defines the dimensionality of the hyperplane
• Sigmoid or tanh activation functions permit soft decisions

Example (inputs x1, x2; weights w1 = −0.5, w2 = 0.5; bias w3 = 0.5; activation a = w1·x1 + w2·x2; output 1 if a + w3 > 0.25):

x1 x2 | a | output
0 0 | 0 | 1
0 1 | 0.5 | 1
1 0 | −0.5 | 0
1 1 | 0 | 1
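As a sketch, the single-neuron classifier above is only a few lines of Python (the weights, bias, and threshold are the values reconstructed from the slide's truth table):

```python
import numpy as np

def tlu(x, w, bias, theta):
    """Threshold logic unit: weighted sum plus bias, hard step output."""
    a = float(np.dot(x, w))            # activation from the weighted inputs
    return 1 if a + bias > theta else 0

# weights from the truth table: w1 = -0.5, w2 = 0.5, bias = 0.5,
# threshold = 0.25 (a single hyperplane separating the four points)
w, bias, theta = np.array([-0.5, 0.5]), 0.5, 0.25
outputs = [tlu(np.array(x), w, bias, theta)
           for x in [(0, 0), (0, 1), (1, 0), (1, 1)]]
```

The outputs reproduce the table: 1, 1, 0, 1 — a linearly separable Boolean function, which is exactly what one hyperplane can handle.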
Training Algorithm
• Perceptron (linear transfer function)
  – Basically an LMS training algorithm
  – Steps: given a sequence of input vectors v and correct outputs t, for each (v, t) update the weights as
    w_{k+1} = w_k + α (t − y) v
    where y is the actual output (thus t − y is the error)
• Delta rule (differentiable transfer function)
  – Adjusts based on the slope of the transfer function:
    w_{k+1} = w_k + α (t − y) (df/da) v
  – Originally used with the sigmoid f(a) = 1 / (1 + e^(−a/ρ)), as its derivative is easy to implement:
    df/da = (1/ρ) f(a) (1 − f(a))
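The perceptron update rule above can be run directly on a separable toy target (my own illustration; the target function and learning rate are arbitrary choices):

```python
import numpy as np

def train_perceptron(X, t, lr=0.1, epochs=50):
    """Perceptron rule: w <- w + lr * (t - y) * v, with a bias input
    appended so the hyperplane need not pass through the origin."""
    V = np.hstack([X, np.ones((len(X), 1))])
    w = np.zeros(V.shape[1])
    for _ in range(epochs):
        for v, target in zip(V, t):
            y = int(np.dot(w, v) > 0)      # step activation
            w += lr * (target - y) * v     # update only when y is wrong
    return w

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
t = np.array([1, 1, 0, 1])                 # a linearly separable target
w = train_perceptron(X, t)
preds = [int(np.dot(w, np.append(x, 1.0)) > 0) for x in X]
```

Because the target is separable, the perceptron convergence theorem guarantees the rule settles on weights that classify all four points correctly.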
The Perceptron
• A more sophisticated version of the TLU
• Prior to weighting, inputs are processed with Boolean logic blocks
• The Boolean logic is fixed during training
[Diagram: inputs x1…xn pass through Boolean logic blocks, are weighted by w1…wn, and are summed into a threshold logic unit producing f(a)]
More Complex Decision Rules
• Frequently, it is impossible to correctly classify with just a single hyperplane
• Solution: define several hyperplanes via several neurons and combine the results (perhaps in another neuron)
• This combination is called a neural net
• The size of the hidden layer is the number of hyperplanes in the decision rules
[Diagram: a network with an input layer (x1, x2), a hidden layer, and an output layer; the combined hyperplanes carve out a decision region in the (x1, x2) plane]
Backpropagation Algorithm
• Just using outputs and inputs doesn't tell us how to adjust the hidden-layer weights
• The trick is figuring out how much of the error can be ascribed to each hidden neuron
• Output neuron: δ_o = (t − y) · df/da |_{a_o}
• Hidden neuron m: δ_m = df/da |_{a_m} · Σ_{j ∈ I_m} w_{jm} δ_j, where I_m is the set of neurons fed by m
• Weight update: w_{k+1} = w_k + α δ_k v
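The delta propagation above can be sketched on the classic XOR problem, which a single hyperplane cannot solve (my own illustration; the hidden-layer width, seed, learning rate, and iteration count are arbitrary):

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

# XOR: not linearly separable, so a hidden layer is required
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
t = np.array([0.0, 1.0, 1.0, 0.0])

rng = np.random.default_rng(1)
W1, b1 = rng.normal(0, 1, (2, 4)), np.zeros(4)   # hidden layer (4 neurons)
W2, b2 = rng.normal(0, 1, 4), 0.0                # output neuron

mse0 = np.mean((t - sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)) ** 2)
lr = 0.5
for _ in range(5000):
    h = sigmoid(X @ W1 + b1)                     # forward pass
    y = sigmoid(h @ W2 + b2)
    d_out = (t - y) * y * (1 - y)                # delta_o = (t - y) f'(a_o)
    d_hid = h * (1 - h) * np.outer(d_out, W2)    # ascribe error to hidden
    W2 += lr * h.T @ d_out                       # w <- w + lr * delta * input
    b2 += lr * d_out.sum()
    W1 += lr * X.T @ d_hid
    b1 += lr * d_hid.sum(axis=0)

mse = np.mean((t - sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)) ** 2)
```

The mean-squared error after training is lower than at initialization; the hidden deltas are exactly the output delta pushed back through the output weights, scaled by each hidden neuron's sigmoid slope.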
Example Application
• Each signal class is a multilayer linear perceptron network with 4 neurons in the hidden layer
• Trained with a 199-point α-profile, back propagation
• Activation function: tanh
• MAXNET chooses the one with the largest value
• 295 trials: unknown carrier, BW, 15 dB SNR
• 460 trials: known carrier, BW, −9 dB SNR
Results from: A. Fehske, J. Gaeddert, J. Reed, "A new approach to signal classification using spectral correlation and neural networks," DySPAN 2005, pp. 144–150.
Comments on Orientation
• By itself, ontological reasoning is likely inappropriate for dealing with signals
• HMMs and neural nets are somewhat limited in how much they can scale up arbitrarily
• Implementations should probably feature both classes of techniques, where:
  – HMMs and NNs identify the presence of objects, locations, or scenarios
  – The meaning of the presence of these objects is then inferred by the ontological reasoning engine, which combines them
Decision Processes
Genetic algorithms, case-based reasoning, and more
Decision Processes
• Goal: choose the actions that maximize the radio's goal
• The very large number of nonlinearly related parameters tends to make solving for the optimal solution quite time consuming
Case-Based Reasoning
• An elaborate switch (or if-then-else) statement informed by "cases" defined by orientation (or context)
• A "case" is identified by orientation; the decision for the case is specified in a database
• The database can be built up over time
• Problem of what to do when a new case is identified
A. Aamodt, E. Plaza (1994); Case-Based Reasoning: Foundational Issues, Methodological Variations, and System Approaches. AI Communications. IOS Press, Vol. 7: 1, pp. 39–59.
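A bare-bones sketch of the retrieve/reuse/retain loop looks like the following; the context tuples and action strings are hypothetical placeholders of my own, not from the slides:

```python
# minimal case-based reasoning sketch: nearest-case retrieval with a
# fallback-and-retain path for previously unseen cases
cases = [
    {"context": (0.9, 0.1), "action": "narrowband high-power waveform"},
    {"context": (0.2, 0.8), "action": "wideband low-power waveform"},
]

def distance(case, context):
    return sum((a - b) ** 2 for a, b in zip(case["context"], context))

def decide(context, threshold=0.25):
    """Retrieve the nearest stored case; if none is close enough, fall
    back to a default action and retain the new case for later reuse."""
    best = min(cases, key=lambda c: distance(c, context))
    if distance(best, context) > threshold:
        new_case = {"context": context, "action": "default search"}
        cases.append(new_case)       # the database is built up over time
        return new_case["action"]
    return best["action"]

a1 = decide((0.85, 0.15))   # close to the first stored case
a2 = decide((9.0, 9.0))     # unknown case: triggers the fallback path
```

The fallback path is one plausible answer to the "new case" problem above: apply a default (e.g., a search algorithm) and store the result as a new case.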
Local Search
• Steps:
  1. Search a "neighborhood" of the current solution s_k to find the s* that improves performance the most
  2. Set s_{k+1} = s*
  3. Repeat 1 and 2 until s_{k+1} = s_k
• Variants: gradient search, fixed number of iterations, minimal improvement
• Issue: gets trapped in local maxima
Figure from Fig 2.6 in I. Akbar, "Statistical Analysis of Wireless Systems Using Markov Models," PhD Dissertation, Virginia Tech, January 2007
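The steps above are a plain hill climb; a minimal sketch (my own toy objective and neighborhood, chosen only for illustration):

```python
def hill_climb(f, s, neighbors, max_iters=1000):
    """Local search: repeatedly move to the best neighbor of s_k;
    stop when no neighbor improves (a local maximum)."""
    for _ in range(max_iters):
        best = max(neighbors(s), key=f)
        if f(best) <= f(s):
            return s
        s = best
    return s

f = lambda s: -(s - 3) ** 2          # toy concave objective, maximum at 3
neighbors = lambda s: [s - 1, s + 1]
s_star = hill_climb(f, 10, neighbors)
```

On this single-peaked objective the climb reaches the maximum at s = 3; on a multimodal objective the same loop would stop at whichever local maximum it climbed into — the trapping issue noted above.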
Genetic Algorithms
• Concept: apply the concept of evolution to searching complex spaces
• Really a random search with some structure
• Successive populations (or generations) of solutions are evaluated for their fitness
• The least fit solutions are removed from the population
• The most fit survive to breed replacement members of the population
• Breeding introduces mutations and cross-overs so that the new population is not identical to the original population
  – Like parents and kids
• Lots of variants
  – Parents die off
  – Niches
  – Tabu for looping
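The evaluate/select/breed loop can be sketched on the classic "one-max" toy problem (maximize the number of 1 bits); population size, tournament size, and mutation rate here are arbitrary illustrative choices:

```python
import random

def genetic_algorithm(fitness, n_bits=16, pop_size=30, gens=60,
                      p_mut=0.02, seed=0):
    """Minimal GA sketch: tournament selection, one-point cross-over,
    and per-bit mutation."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)]
           for _ in range(pop_size)]
    for _ in range(gens):
        def select():
            # tournament: the least fit rarely survive selection
            return max(rng.sample(pop, 3), key=fitness)
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = select(), select()
            cut = rng.randrange(1, n_bits)                 # cross-over point
            child = p1[:cut] + p2[cut:]
            child = [b ^ (rng.random() < p_mut) for b in child]  # mutation
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

best = genetic_algorithm(sum)      # "one-max": fitness = number of 1 bits
```

The GA finds a near-all-ones chromosome quickly — consistent with the observation below that GAs reach good solutions fast, even though guaranteeing the exact optimum can take longer than exhaustive search.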
Genetic Algorithm Example
[Diagram: a population of chromosomes, each encoding PWR, F, MAC, and NET genes, is evaluated for fitness (e.g., 7, 5, 9, 1); the fittest breed replacement members via cross-over and mutation]
Comments on GA
• Tends to reach a good solution very quickly
• Takes a long time (perhaps no better than a random search) to find the optimum
  – Often paired with a local search
• Low mutation rates can cause "genetic drift"
• High mutation rates can limit convergence
• Cross-over is like high mutation but without damaging convergence; however, it can get stuck on local maxima
• In theory, reaches the global optimum, but requires more time to guarantee this than an exhaustive search
• Lots of freedom in the design
  – Mutation rate, cross-over rate, chromosome size, number of generations, population size, number of survivors, breeding rules, survival rules
  – Even more variation is used when the fitness function or data sets change over time (e.g., setting the mutation rate or population as a function of fitness)
  – Theoretically, the best combination of parameters is a function of the characteristics of the solution space
  – In practice, empirically setting the parameters tends to work better (a GA to program a GA?)
Simulated Annealing
• Steps:
  1. Generate a random solution s*
  2. If s* is better than s_k, then s_{k+1} = s*; else generate a random variable r. If r is less than some function f(s_k − s*, T) of the temperature T and the difference in value between s_k and s*, then s_{k+1} = s*
  3. From time to time, decrease T so that f(s_k − s*, T) decreases over time
  4. Repeat steps 1–3 until a stopping criterion is met
• Comments:
  – Important to store the best result
  – In theory, reaches the global optimum, but requires more time to guarantee this than an exhaustive search
  – Often finished with a local search applied to the best solution
• Freedom in the algorithm
  – Distributions for generating s*, schedules for T, changes in the distributions with T
• Threshold accepting can be less costly
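The accept-worse-with-temperature rule can be sketched as follows; the objective, neighbor distribution, and cooling schedule are illustrative choices of my own (here the Metropolis acceptance exp(Δ/T) stands in for the unspecified f(s_k − s*, T)):

```python
import math
import random

def simulated_annealing(f, s, neighbor, t0=1.0, cooling=0.995,
                        iters=2000, seed=0):
    """SA sketch (maximizing f): always accept improvements; accept a
    worse candidate with probability exp(delta / T); cool T over time."""
    rng = random.Random(seed)
    best, T = s, t0
    for _ in range(iters):
        cand = neighbor(s, rng)
        delta = f(cand) - f(s)
        if delta > 0 or rng.random() < math.exp(delta / T):
            s = cand
        if f(s) > f(best):
            best = s                 # important: store the best result seen
        T *= cooling                 # decrease the temperature over time
    return best

f = lambda s: -abs(s) + 2 * math.cos(s)    # multimodal; global max at s = 0
neighbor = lambda s, rng: s + rng.uniform(-1.0, 1.0)
best = simulated_annealing(f, 8.0, neighbor)
```

Early on, the high temperature lets the search cross valleys between local maxima; as T shrinks, the acceptance rule hardens into the pure local search described above.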
Comments on Decision Processes
• Execution time
  – Case-based reasoning < searches
• A good architectural decision is to combine approaches:
  – CBR except when an unknown case arises
  – GA for a quick good solution
  – Refine with a local search
  – Searches can be revisited later when excess cycles are available
• CBR can provide initial solution(s) to search algorithms
• Sometimes simpler algorithms are all that is required, and they will run much faster than any of these
  – Adjust the power level for a target SINR
Representing Information
How can a radio store and manipulate knowledge?
Types of Knowledge
• Conceptual knowledge
  – Analytic or axiomatic
  – Analytic if it expresses or follows from the meaning of objects
    • E.g., a mobile radio is a radio with the property of mobility
  – Axiomatic: fundamental conceptual relationships not based on meaning alone
• Rules
  – Relationships or theorems committed to memory
  – Some authors draw a distinction between rules and conceptual knowledge, but it could be argued that a rule is just an axiom (or property)
• Can be expressed symbolically (e.g., UML), ontologically, or behaviorally (e.g., GA)
Why Use Languages to Represent Information?
• Negotiation
  – Heterogeneous devices can exchange information
• Sharing learned information between devices
• Permits reasoning and learning to be abstracted away from specific platforms and algorithms
  – Portability, maintainability
• Permits the appearance of intelligence by reasoning in a manner that appears familiar to a human
• Note: much of the preceding could also be done with behavioral knowledge (e.g., sharing GA states), but it is somewhat clumsier
Proposed Languages
• UML
• Radio Knowledge Representation Language (RKRL)
  – Describes the environment and radio capabilities
  – Part of "radioOne"
• Resource Description Framework (RDF)
• Web Ontology Language (OWL)
  – Proposed to facilitate queries between radios
• DAML (used by BBN)
• Issues of language interoperability, testability, and actual "thought" processes
Language Capabilities and Complexity
• Increasing capabilities significantly increases complexity
Modified from Table 13.1 in M. Kokar, "The Role of Ontologies in Cognitive Radio," in Cognitive Radio Technology, ed. B. Fette, 2006.

Language | Features | Reasoning | Complexity
XTM | Higher-order relationships | None | O(N)
RDF | Binary relationships | None | O(N)
RDFS | RDF plus subclass, subproperty, domain, and range | Subsumption | O(N^m)
OWL Lite | RDFS plus some class constructors; no crossing of metalevels | Limited form of description logic | O(e^N)
OWL-DL | All class constructors; no crossing of metalevels | General description logic | > O(e^N)
OWL Full | No restrictions | Limited form of first-order predicate logic | ?
Comments on Knowledge Representation
• Ontologies are conceptually very appealing for realizing thinking machines
• Personal concern that the goals of very high-level abstraction, platform independence, lack of a detailed specification, and automated interoperability will lead to JTRS-like implementation difficulties (and theoretically unbounded complexity; JTRS is at least bounded)
– However, these are really the benefits of using ontologies…
• Building an ontology is a time-intensive and complex task
• Combining ontologies will frequently lead to logical inconsistencies
– Makes code validation hard
• Encourage development of domain-standardized ontologies
– Policy, radio, network
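One simple kind of inconsistency from merging independently developed ontologies can be illustrated directly: an individual asserted to belong to two classes that an axiom declares disjoint. The check below is a toy sketch, and the channel/class names are hypothetical.

```python
# Illustrative consistency check after an ontology merge: flag individuals
# that belong to two classes declared disjoint. Names are hypothetical.

def find_disjointness_violations(memberships, disjoint_pairs):
    """memberships: dict mapping individual -> set of asserted classes.
    disjoint_pairs: set of frozensets {A, B} declared mutually disjoint."""
    violations = []
    for individual, classes in memberships.items():
        for pair in disjoint_pairs:
            if pair <= classes:  # individual is in both disjoint classes
                violations.append((individual, tuple(sorted(pair))))
    return violations

# Ontology 1 declares licensed and unlicensed channels disjoint:
disjoint = {frozenset({"LicensedChannel", "UnlicensedChannel"})}
# Merging ontology 2's assertions produces a clash on channel 5:
members = {"ch5": {"LicensedChannel", "UnlicensedChannel"},
           "ch6": {"UnlicensedChannel"}}
clashes = find_disjointness_violations(members, disjoint)
```

A full description-logic reasoner catches far subtler clashes than this, which is part of why validating merged ontologies is hard.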
Virginia Tech Cognitive Radio Testbed - CORTEKS -
Researchers:Joseph Gaeddert, Kyouwoong Kim, Kyung Bae, Lizdabel Morales, and Jeffrey H. Reed
[Annotated testbed diagram. Screen panels: policy display (4 channels, modulation schemes, transmit spectral power mask); waveform display (modulation type, transmit power, symbol rate, center frequency); image display (transmitted and received images); packet history display (bit error rate, spectral efficiency); spectrum display (available spectrum, detected interference, current CR spectrum usage). Hardware: a personal computer running MPRG’s OSSIE (Open Source SCA Implementation Embedded) platform and the CoRTekS software application (rudimentary neural-network-based cognitive engine), connected over GPIB to an AWG430 arbitrary waveform generator (multi-mode transmitter), an AWG710B arbitrary waveform generator (signal upconverter), and an RSA3408A real-time spectrum analyzer (receiver and spectrum sensor).]
Current Setup (CORTEKS)
[Diagram: a personal computer running MPRG’s OSSIE (Open Source SCA Implementation Embedded) platform and the CoRTekS software application (rudimentary neural-network-based cognitive engine), connected via GPIB to the AWG430 arbitrary waveform generator (multi-mode transmitter), the AWG710B arbitrary waveform generator (signal upconverter), and the RSA3408A real-time spectrum analyzer (receiver and spectrum sensor).]
Current Waveform Architecture
[Diagram: within OSSIE (Open Source SCA Implementation Embedded), the application consists of a Cognitive Engine component, an Assembly Controller, an AWG component, and an RSA component. The AWG component drives the AWG430 and AWG710B (transmitter side); the RSA component reads the RSA3408A (receiver side). A wireless microphone acts as an interferer at the RF interface.]
CoRTekS Screenshot
[Screenshot callouts: policy display (4 channels, transmit spectral power mask, modulation schemes); waveform display (center frequency, symbol rate, modulation type, transmit power); image display (transmitted and received images); packet history display (bit error rate, spectral efficiency); spectrum display (available spectrum, detected interference, current CR spectrum usage).]
CoRTekS Decision Process
[Diagram: inputs — required QoS (acceptable BER, latency, Tx power), policy, and available spectrum — feed a utility function, which drives a neural network with memory; the output is the waveform parameter set (Tx power, frequency, symbol rate, modulation type).]
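The decision flow on this slide (QoS targets, policy, and spectrum in; waveform parameters out) can be sketched as a simple exhaustive utility search. The BER model, utility weights, and parameter grids below are all hypothetical stand-ins for the slide's neural-network engine; the point is only to illustrate the data flow.

```python
import itertools

# Hypothetical stand-in for the CoRTekS decision step: score every candidate
# waveform against QoS goals and policy, then pick the best. (The real engine
# uses a neural network; this exhaustive search just shows the data flow.)

POLICY_ALLOWED_FREQS = [2.40e9, 2.41e9, 2.42e9]  # hypothetical policy channels
MAX_TX_POWER_DBM = 20                            # hypothetical policy limit

def estimated_ber(mod_type, tx_power_dbm):
    """Toy BER model: more power helps, denser constellations hurt."""
    penalty = {"BPSK": 0.0, "QPSK": 1.0, "16QAM": 3.0}[mod_type]
    return 10 ** (-(tx_power_dbm - penalty) / 4)

def utility(mod_type, tx_power_dbm, target_ber):
    """Reward spectral efficiency, lightly penalize power; veto QoS misses."""
    bits = {"BPSK": 1, "QPSK": 2, "16QAM": 4}[mod_type]
    if estimated_ber(mod_type, tx_power_dbm) > target_ber:
        return float("-inf")                     # required QoS not met
    return bits - 0.05 * tx_power_dbm

def decide(target_ber):
    candidates = itertools.product(
        ["BPSK", "QPSK", "16QAM"],
        range(0, MAX_TX_POWER_DBM + 1),
        POLICY_ALLOWED_FREQS)
    return max(candidates, key=lambda c: utility(c[0], c[1], target_ber))

mod, power, freq = decide(target_ber=1e-3)
```

With these toy numbers the search lands on the densest modulation at the lowest power that still meets the BER target, which mirrors the efficiency-versus-power tradeoff the utility function encodes.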
Demonstration of CoRTekS
Implementation Summary
• Broad differences in architectural approaches to implementing cognitive radio
– Engines vs. algorithms
– Procedural vs. ontological
• Numerous different techniques available to implement cognitive functionalities
– Some tradeoffs in efficiencies
– Likely need a meta-cognitive radio to find optimal parameters
• Process boundaries are sometimes blurred
– Observation/orientation
– Orientation/learning
– Learning/decision
– Implies need for pooled memory
• Good internal models will be important for the success of many processes
• Lots of research going on all over the world; lots of low-hanging fruit
– See DySPAN, CrownCom, SDR Forum, MILCOM for papers; upcoming JSACs
• No clue as to how to make a radio conscious, or whether we even should