NEST for Knowledge Assisted Analysis
Petr Berka, UEP, Praha
Thanos Athanasiadis, NTUA, Athens
Knowledge Assisted Analysis
KAA for Images
- A set of regions is generated by an initial segmentation of images
- MPEG-7 visual descriptors (dominant color, texture, shape) are extracted
- Spatial relations (left-of, above-of, etc.) are computed
- A Region Adjacency Graph serves as the image representation
Region Adjacency Graph
- A graph node represents a segment/region and stores its visual information (MPEG-7 descriptors, spatial relations, region mask, contour, etc.)
- A graph edge represents a link between two regions, holding the overall neighboring information
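Such a graph can be sketched in a few lines; the class and attribute names below are illustrative, not the actual KAA data structures:

```python
# Minimal sketch of a Region Adjacency Graph (illustrative names): nodes
# store per-region visual information, edges hold the neighboring
# information between two regions.

class Region:
    def __init__(self, region_id, descriptors=None, mask=None):
        self.id = region_id
        self.descriptors = descriptors or {}  # e.g. MPEG-7 dominant color, texture
        self.mask = mask                      # placeholder for region mask/contour
        self.labels = {}                      # fuzzy set: concept -> confidence

class RAG:
    def __init__(self):
        self.nodes = {}   # region_id -> Region
        self.edges = {}   # frozenset({a, b}) -> edge info (e.g. spatial relation)

    def add_region(self, region):
        self.nodes[region.id] = region

    def add_adjacency(self, a, b, relation=None):
        self.edges[frozenset((a, b))] = {"relation": relation}

    def neighbors(self, region_id):
        return [next(iter(e - {region_id}))
                for e in self.edges if region_id in e]
```

Merging two regions then amounts to collapsing an edge of this graph.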
Region labeling
For each region:
- visual descriptor matching with the instances of the concepts in the domain ontology
- calculation of a combined distance from multiple descriptors
- assignment of labels (concepts) along with a confidence value -> a fuzzy set of labels
- hierarchical merging of regions based on the fuzzy set of labels
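The matching step can be sketched as follows, assuming normalized descriptor values and hypothetical prototype instances for each ontology concept; weighted averaging of per-descriptor distances is one plausible combination, not necessarily the one KAA uses:

```python
# Illustrative sketch of the labeling step: match a region's descriptors
# against prototype instances of ontology concepts, combine the
# per-descriptor distances, and turn the result into a confidence value.
# All names (prototypes, weights) are hypothetical.

def combined_distance(region_desc, concept_desc, weights):
    # weighted average of per-descriptor distances
    total = sum(weights[d] * abs(region_desc[d] - concept_desc[d]) for d in weights)
    return total / sum(weights.values())

def fuzzy_labels(region_desc, prototypes, weights):
    # confidence = 1 - distance (clipped to [0, 1]) -> fuzzy set of labels
    labels = {}
    for concept, proto in prototypes.items():
        d = combined_distance(region_desc, proto, weights)
        labels[concept] = max(0.0, 1.0 - d)
    return labels
```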
KAA architecture
Semantic based segmentation (1/2)
Approach:
- graph-based representation of images
- semantic vs. syntactic: regions are assigned a fuzzy set of labels instead of low-level features
- modification of traditional segmentation algorithms to operate on labeled regions
- simultaneous image segmentation and region labeling
Semantic based segmentation (2/2)
Target:
- solve over-segmentation problems
- assign labels with confidence values to regions
- link labels with concepts existing in ontologies
KAA example results
NEST …
Expert systems at UEP - history of the NEST
- May 2003: beginning of implementation (P. Berka, V. Laš)
- implemented in Delphi under Windows
- knowledge base represented in XML
- stand-alone + client/server (web) versions
- knowledge base editor
- Czech and English versions
http://lisp.vse.cz/NEST
Knowledge representation (1/4)
Attributes and propositions:
- binary (True, False)
- single nominal (one value of an attribute)
- multiple nominal (several values of an attribute)
- numeric (fuzzy intervals)
- sources and actions related to attributes
- an attribute describes either the case or the environment
Knowledge representation (2/4)
Rules with priorities

IF condition THEN conclusion AND action

where condition is a disjunctive form (disjunction of conjunctions) of literals (propositions or their negations), conclusion is a list of literals, and action is a list of actions (external programs)
- compositional - each literal in the conclusion has a weight
- apriori - compositional rules without a condition
- logical - non-compositional rules without weights; only these rules can infer true or false
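One possible in-memory encoding of these rule types (the names and structure below are hypothetical; the real NEST knowledge base is represented in XML):

```python
# Hypothetical encoding of NEST-style rules: a condition in disjunctive
# form, a weighted conclusion for compositional rules, an external-action
# list, and a flag distinguishing logical (non-compositional) rules.

from dataclasses import dataclass, field

@dataclass
class Rule:
    # condition: disjunction of conjunctions of literals, where a literal
    # is a (proposition, positive?) pair; an empty condition -> apriori rule
    condition: list = field(default_factory=list)
    conclusion: list = field(default_factory=list)  # (literal, weight) pairs
    actions: list = field(default_factory=list)     # external programs to run
    compositional: bool = True                      # False -> logical rule

# apriori rule: compositional, no condition
prefer_no_merge = Rule(conclusion=[("dont_merge", 0.5)])

# ordinary compositional rule with a one-conjunction condition
sky_rule = Rule(condition=[[("old_label_sky", True)]],
                conclusion=[("new_label_sky", 1.0)])
```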
Knowledge representation (4/4)
Integrity constraints

ANT => SUC (degree)

where ANT and SUC are disjunctive normal forms of literals and degree is a number expressing the importance of the integrity constraint
- used to check the logical consistency of the consultation
- example: diagnosis(TBC) => not diagnosis(healthy)

Contexts - a disjunctive form of literals that (iff having a positive weight) determines the applicability of a rule or integrity constraint
Inference
- inference in the network of rules as a combination of backward and forward chaining
- compositional inference for compositional and apriori rules (combining the contributions of rules)
- non-compositional inference for logical rules (modus ponens + disjunction)
- evaluation of integrity constraints: IMPL(a, s) = max(0, min(1, a - s)) for a > 0
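The IMPL evaluation follows directly from the formula above; scaling the violation by the constraint's degree is an assumption made here for illustration:

```python
# Evaluation of an integrity constraint ANT => SUC: for antecedent weight
# a > 0 and succedent weight s, IMPL(a, s) = max(0, min(1, a - s)).
# A positive value signals an inconsistency in the consultation; weighting
# it by the constraint's degree is an assumption, not taken from NEST.

def impl(a, s):
    if a <= 0:
        return 0.0  # constraint not triggered
    return max(0.0, min(1.0, a - s))

def violated(a, s, degree, threshold=0.0):
    return degree * impl(a, s) > threshold
```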
Uncertainty processing (1/4)
(Based on the algebraic theory of P. Hájek)

Combination functions defined on [-1, 1]:
- NEG to compute the weight of a negation
- CONJ to compute the weight of a conjunction
- DISJ to compute the weight of a disjunction
- CTR to compute the contribution of a rule to the weight of its conclusion
- GLOB to combine the contributions of several rules
Uncertainty processing (3/4)
Inference mechanism

Function CTR(a, w) for antecedent weight a > 0, and function GLOB(w1, ..., wk):

variant    CTR(a, w)                      GLOB(w1, ..., wk)
standard   a * w                          (w1 + w2) / (1 + w1 * w2)
logical    sign(w) * max(0, a + |w| - 1)  min(sum of wi > 0, 1) - min(sum of |wi| for wi < 0, 1)
neural     a * w                          min(max(sum of wi, -1), 1)
hybrid     standard CTR                   logical GLOB

NEG(w) = -w
CONJ(w1, w2) = min(w1, w2)
DISJ(w1, w2) = max(w1, w2)
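A sketch of these combination functions in the standard and logical variants (the hybrid variant uses the standard CTR together with the logical GLOB); reconstructed from the table above and intended as illustration only:

```python
# Combination functions on [-1, 1]: NEG/CONJ/DISJ plus the standard,
# logical, and neural CTR/GLOB variants from the table above.

import math

def neg(w):
    return -w

def conj(w1, w2):
    return min(w1, w2)

def disj(w1, w2):
    return max(w1, w2)

def ctr_standard(a, w):
    # contribution of a rule with weight w, antecedent weight a > 0
    return a * w

def ctr_logical(a, w):
    # sign(w) * max(0, a + |w| - 1)
    return math.copysign(max(0.0, a + abs(w) - 1.0), w)

def glob_standard(w1, w2):
    # combines two contributions; applied repeatedly for more rules
    return (w1 + w2) / (1.0 + w1 * w2)

def glob_logical(*ws):
    positive = min(sum(w for w in ws if w > 0), 1.0)
    negative = min(sum(-w for w in ws if w < 0), 1.0)
    return positive - negative

def glob_neural(*ws):
    return min(max(sum(ws), -1.0), 1.0)
```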
Modes of consultation
- dialogue mode - classical question/answer mode that selects the current question using backward chaining
- dialogue/questionnaire mode - the user can volunteer some information (using the questionnaire); during the further consultation the system asks questions if needed
- questionnaire mode - after filling in the questionnaire, the system directly infers the goals using forward chaining
- input answers from a file - answers can be changed using the questionnaire
Types of answers
- binary attribute - weight
- single nominal attribute - value and weight
- multiple nominal attribute - list of values and their weights
- numeric attribute - value

Questions not answered during the consultation get the default answer "unknown" [-1, 1] or "irrelevant" [0, 0].
Answers can be postponed - the user can return to them after finishing the consultation.
Consultation setup
Input from file (batch mode)
… for KAA
Basic idea (1/2)
The expert system NEST (or its principles) can be used in KAA for:
- re-labeling a region if the original labeling has low confidence
- proposing to merge a region with its neighbors

These two tasks can be solved separately, by two different knowledge bases (expert systems - ES).
Basic idea (2/2)
Because NEST cannot express relations between objects, NEST will be used to process the image locally, i.e. to process one object in one step. So NEST will be activated repeatedly for different regions in the image. This requires some control mechanism that will decide:
- which region to take next
- when to stop processing
Processed part of image – version 1 (Athens meeting)
NEST for re-labeling (1/5)
Input:
- labels (and confidences) of the processed region
- labels (and confidences) of the neighboring regions
- some global info?
NEST for re-labeling (2/5)

So the input can be, for example, "sky(0.6), sea(0.8), ...". To be able to reason about the confidences, NEST has to turn them into (fuzzy) intervals like "very_low", "low", "medium", "high", "very_high" - this can easily be done when creating the knowledge base:
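A crisp approximation of such a mapping might look like this; the interval boundaries below are assumptions, since in NEST they would be defined as fuzzy intervals in the knowledge base:

```python
# Crisp stand-in for the fuzzy intervals (boundaries are assumptions).

def linguistic(confidence):
    bands = [(0.2, "very_low"), (0.4, "low"), (0.6, "medium"),
             (0.8, "high"), (1.01, "very_high")]
    for upper, term in bands:
        if confidence < upper:
            return term

# the input "sky(0.6), sea(0.8)" becomes:
region = {"sky": 0.6, "sea": 0.8}
terms = {label: linguistic(c) for label, c in region.items()}
# with these boundaries: {"sky": "high", "sea": "very_high"}
```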
NEST for re-labeling (3/5)

Output:
- (new) labels (and confidences) of the processed region

Used knowledge:
IF the labels have high confidence THEN don't change the labels
ELSE change the labels according to the neighbors
NEST for re-labeling (4/5)
Examples of rules:

IF old_label(sky) THEN new_label(sky) (1)

IF region_confidence_sky(very_low) AND region_confidence_sea(very_low) AND region_confidence_sand(very_low) AND west_confidence_sky(high) AND east_confidence_sky(high)
THEN new_label(sky) (0.6), new_label(unidentified) (0.9)
NEST for re-labeling (5/5)

General strategy: this module can be activated once for each region, e.g. in order of the confidence of the labeling, starting with the lowest confidence. The stopping criterion can take the form of a confidence threshold.
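The control mechanism itself sits outside NEST; one minimal sketch of the strategy above, with a hypothetical `relabel` callback standing in for a NEST consultation:

```python
# Visit each region once, ordered by the confidence of its current
# labeling, stopping at a confidence threshold. The control loop is an
# assumption for illustration, not part of NEST.

def relabel_pass(regions, relabel, threshold=0.7):
    """regions: {region_id: confidence}; relabel: callback into the ES
    that returns the confidence of the new labeling."""
    for rid, conf in sorted(regions.items(), key=lambda kv: kv[1]):
        if conf >= threshold:        # stopping criterion: remaining regions
            break                    # are already confident enough
        regions[rid] = relabel(rid)  # ask the expert system for new labels
    return regions
```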
NEST for merging (1/3)

Input:
- labels (and confidences) of the processed region
- labels (and confidences) of the neighboring regions
- some global info?

Output:
- a recommendation for the processed region, e.g. merge_west, merge_east, merge_north, merge_south, don't_merge
- probably also labels (and confidences) for the merged region
NEST for merging (2/3)

The used knowledge can be generalized as follows:
IF the neighbors have the same labels THEN merge ELSE don't merge

Example:

THEN don't_merge (0.5) {apriori rule saying that we prefer not to merge}

IF region_confidence_sky(high) AND west_confidence_sky(high)
THEN merge_west (1), merged_confidence(sky) (0.85)
NEST for merging (3/3)

General strategy: this module can be activated repeatedly for the regions (performing a kind of bottom-up clustering of regions), starting e.g. with the region with the highest visual salience? The stopping criterion can take the form of a salience threshold (so as not to handle uninteresting regions)?
Processed part of image – version 2 (EKAW poster)
NEST for merging (1/3)

Input:
- labels (and confidences) of the regions A and B
- labels (and confidences) of the neighboring regions
- some global info?

Output:
- the (dis)similarity between regions A and B
NEST for merging (2/3)

The used knowledge can be generalized as follows:
IF regions A and B have the same labels THEN similar ELSE dissimilar

Example:

IF regionA_confidence_sky(high) AND regionB_confidence_sky(high) THEN similar (1)
NEST for merging (3/3)

General strategy (Semantic Recursive Shortest Spanning Tree):
1. use NEST to evaluate the dissimilarity of current neighbors (edges in the ARG)
2. select the pair of neighbors with the lowest dissimilarity
3. IF this value is below the given threshold THEN
   a. merge neighbors A and B
   b. assign a new label
   c. go to step 1
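The loop can be sketched as follows, with a stand-in dissimilarity function in place of the NEST evaluation and fuzzy union as one hypothetical way to label the merged region; all names are illustrative:

```python
# Sketch of the Semantic RSST loop: repeatedly merge the least dissimilar
# pair of neighboring regions until even the best pair is too dissimilar.

def semantic_rsst(labels, edges, dissim, threshold):
    """labels: {region_id: fuzzy label set}; edges: set of frozenset pairs."""
    while edges:
        # 1. evaluate the dissimilarity of current neighbors (graph edges)
        scored = {e: dissim(labels[min(e)], labels[max(e)]) for e in edges}
        # 2. select the pair of neighbors with the lowest dissimilarity
        best = min(scored, key=scored.get)
        if scored[best] >= threshold:
            break                      # 3. even the best pair is too dissimilar
        a, b = sorted(best)
        # 3a + 3b: merge B into A and assign a new label (fuzzy union here)
        labels[a] = {c: max(labels[a].get(c, 0.0), labels[b].get(c, 0.0))
                     for c in set(labels[a]) | set(labels[b])}
        del labels[b]
        # rewire B's edges to A and go back to step 1
        edges = {frozenset(a if r == b else r for r in e)
                 for e in edges if e != best}
        edges = {e for e in edges if len(e) == 2}  # drop self-loops
    return labels
```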
Alternative approach

Merge two adjacent regions if they have the same distribution (in a quantitative sense) of classes:
- assign labels according to the different combinations of majority classes - so for k classes (like sea, sky, ...) we will have 2^k - 1 labels
- merge neighboring regions with the same label

(Inspired by my algorithms for discretization and grouping)
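A minimal sketch of this idea, assuming a confidence margin decides which classes count as "majority" (the margin, and this reading of the scheme, are assumptions):

```python
# Label each region with the set of its majority classes (for k classes
# there are 2**k - 1 non-empty sets), then merge neighbors carrying the
# same set label. Illustrative only.

def majority_label(fuzzy_labels, margin=0.1):
    # every class whose confidence is within `margin` of the maximum
    top = max(fuzzy_labels.values())
    return frozenset(c for c, w in fuzzy_labels.items() if w >= top - margin)

def mergeable(labels, edges):
    # neighboring pairs whose regions carry identical set labels
    return [e for e in edges
            if len({majority_label(labels[r]) for r in e}) == 1]
```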
NEST - to do
- create the knowledge base
- implement converters to transform data between KAA and NEST (XML -> XML)
- build the inference mechanism of NEST into KAA