A Web 2.0-based collaborative annotation system for enhancing knowledge sharing in collaborative learning environments

Authors: Addison Y.S. Su, Stephen J.H. Yang, Wu-Yuin Hwang, Jia Zhang

CE 2010 (Computers & Education)

Presenter: Yu-hui Huang

Intelligent Database Systems Lab, National Yunlin University of Science and Technology (N.Y.U.S.T.)


Outline

Motivation

Objective

Methodology

Experiments

Conclusion

Comments


Motivation

Taking notes on textbooks or handouts is a common learning behavior.

Annotation can increase deep reading of textual resources.

A sharing mechanism is needed so that multiple users can discuss shared annotations.


Objective

This study designed a personalized annotation management system 2.0 (PAMS 2.0) for managing, sharing, and reusing individual and collaborative annotations.

The purposes of this study are three-fold:

(1) to understand students’ perceived attitudes toward the use of PAMS 2.0;

(2) to investigate the effects of different annotation sharing scenarios on quantity of annotation and its influence on learning achievements;

(3) to examine the relationship between learning achievements and quantity of annotation.

A quasi-experiment was conducted with two classes of college students for fifteen weeks.


Methodology

(Figure: research structure and research variables)


(Figure: annotation management model)



PAMS 2.0 comprises two separate parts: an annotator side (client side) and a server side.

The server side has four managers: an anchoring position mechanism, a document manager, an association manager, and a user manager.

Its repositories store annotations, annotated objects, annotated documents, their associations, and the corresponding users.



On the annotator side, four handlers collaborate to process annotations, as sketched below:

Creation: annotators can create, edit, and remove annotations.

Retrieval: users can retrieve their own annotations or those of other users in the same group.

Discussion: users can discuss with each other through the annotation discussion handler.

Management: access control is handled per user, using a predefined set of roles.
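The slides do not give implementation details for these handlers, so the following is only a minimal sketch of how the four responsibilities could be organized; all class, method, and role names are illustrative assumptions, not the paper's API.

# Minimal sketch of the four annotator-side handlers; names are assumptions.
class AnnotationHandlers:
    def __init__(self):
        self.annotations = {}   # annotation_id -> annotation record
        self.roles = {"teacher": {"read", "write", "manage"},
                      "student": {"read", "write"}}   # predefined roles (assumed)

    # Creation: create, edit, and remove annotations.
    def create(self, annotation_id, user, text):
        self.annotations[annotation_id] = {"user": user, "text": text, "replies": []}

    def remove(self, annotation_id):
        self.annotations.pop(annotation_id, None)

    # Retrieval: a user's own annotations plus those of group members.
    def retrieve(self, group_members):
        return [a for a in self.annotations.values() if a["user"] in group_members]

    # Discussion: replies attached to an existing annotation.
    def discuss(self, annotation_id, user, reply):
        self.annotations[annotation_id]["replies"].append((user, reply))

    # Management: access control through the predefined role set.
    def allowed(self, role, action):
        return action in self.roles.get(role, set())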



The storage layer of the annotation management model defines three kinds of metadata (modeled in the sketch below):

Annotation metadata, with four categories:

(1) definitions (e.g., descriptions and explanations),

(2) comments (e.g., opinions and arguments),

(3) questions (e.g., problems and answers),

(4) associations (e.g., links to other resources).

Annotated object metadata: covers three kinds of annotated objects: text, image, or audio.

Annotated document metadata: title, short description, creator, publisher, publication date, URI on the Internet, type, format, and language used in the document.
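As a sketch, the three kinds of metadata could be represented as plain records; the field names follow the list above, but the concrete structure is an assumption rather than the paper's schema.

from dataclasses import dataclass, field

# Sketch of the three metadata kinds; fields follow the slide, types are assumed.
@dataclass
class AnnotationMetadata:
    kind: str                 # "definition", "comment", "question", or "association"
    content: str
    links: list = field(default_factory=list)   # used by associations

@dataclass
class AnnotatedObjectMetadata:
    object_id: str
    media_type: str           # "text", "image", or "audio"

@dataclass
class AnnotatedDocumentMetadata:
    title: str
    description: str
    creator: str
    publisher: str
    publication_date: str
    uri: str
    doc_type: str
    doc_format: str
    language: str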



Annotation creation

Step 1: Retrieve the metadata of the annotated object in a specific document.

Step 2: Specify the desired annotated object (object_id).

Step 3: Compute the anchoring position as a tuple comprising the starting and ending positions of the anchor (anchor_position), based on the specified object (object_id); see the sketch after these steps.

Step 4: Handwrite an annotation using the annotation creation handler.
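A rough illustration of steps 2 and 3: given the object's text, the anchoring position can be computed as a (start, end) tuple over the document. The data shapes here are assumed for illustration.

# Sketch of steps 2-3: locate the specified object and compute its
# anchoring position as a (start, end) tuple. Data shapes are assumed.
def compute_anchor(document_text, objects, object_id):
    """objects maps object_id -> the annotated object's text."""
    target = objects[object_id]             # step 2: specify the object
    start = document_text.find(target)      # step 3: locate it in the document
    if start == -1:
        raise ValueError(f"object {object_id} not found in document")
    return (start, start + len(target))     # anchor_position

doc = "Web 2.0 annotation supports collaborative learning."
print(compute_anchor(doc, {"obj-1": "collaborative learning"}, "obj-1"))  # (28, 50)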


(Figure: annotation creation)


Annotation retrieval

The system uses a simple classification approach to semantically organize knowledge.

It computes the similarity of each annotation with respect to specific concepts, and then groups annotations with similar concepts into clusters. A sketch of this clustering follows.
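The slides do not specify the similarity measure, so the sketch below uses simple keyword overlap (Jaccard similarity, my choice) to assign each annotation to its most similar concept.

# Sketch of concept-based clustering; Jaccard keyword overlap is an assumed
# similarity measure, since the slides do not specify one.
def jaccard(a, b):
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def cluster(annotations, concepts):
    """Both arguments map a name to a keyword list; returns annotation -> concept."""
    return {name: max(concepts, key=lambda c: jaccard(kws, concepts[c]))
            for name, kws in annotations.items()}

print(cluster({"an1": ["wiki", "blog"], "an2": ["quiz", "test"]},
              {"web2.0": ["wiki", "blog", "tag"], "assessment": ["quiz", "exam"]}))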



Step 1: The annotation retrieval handler waits for an incoming user query with a set of keywords.

Step 2: The handler computes the semantic similarity between the user's query and each of the clusters in the personalized annotation model, and selects the target cluster with the highest semantic similarity.

Step 3: The semantic similarity between the user's query and each annotation in the target cluster is calculated by comparing normalized keywords. Each result is recorded as a weighted annotation value (ann_weight = 0.6) and assigned to the corresponding annotation.



Step 4: The semantic similarity between the user's query and each annotated document in the target cluster is calculated by comparing normalized keywords. Each result is recorded as a weighted document value (doc_weight = 0.4) and assigned to the corresponding document.

Step 5: Compute the weighted similarity as annotation similarity × ann_weight + document similarity × doc_weight. If the weighted similarity exceeds a predefined threshold, the handler adds the annotation and its document to the final result set.

Step 6: If the result set is not empty, the handler returns the query results to the user; otherwise, it asks the user to refine the query. A sketch of steps 2-6 follows.
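Putting steps 2-6 together: the weights (ann_weight = 0.6, doc_weight = 0.4) and the threshold test follow the slides, while the similarity measure, threshold value, and data shapes below are assumptions.

# Sketch of retrieval steps 2-6. The 0.6/0.4 weights follow the slides;
# the similarity measure, threshold value, and data shapes are assumed.
ANN_WEIGHT, DOC_WEIGHT, THRESHOLD = 0.6, 0.4, 0.5

def similarity(query, keywords):
    q, k = set(query), set(keywords)
    return len(q & k) / len(q | k) if q | k else 0.0

def retrieve(query, clusters):
    """clusters: {name: [(annotation_keywords, document_keywords, item), ...]}"""
    # Step 2: choose the cluster most similar to the query.
    target = max(clusters, key=lambda c: similarity(
        query, [kw for ann, _, _ in clusters[c] for kw in ann]))
    results = []
    for ann_kws, doc_kws, item in clusters[target]:
        ann_sim = similarity(query, ann_kws)                   # step 3
        doc_sim = similarity(query, doc_kws)                   # step 4
        score = ann_sim * ANN_WEIGHT + doc_sim * DOC_WEIGHT    # step 5
        if score > THRESHOLD:                                  # threshold test
            results.append((item, score))
    return results   # step 6: an empty set means the user should refine the query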



Annotation discussion

Group members can review all the annotations in the same group, denoted AN_(u,i) with u = 1...m, attached to an annotated object in a document, denoted AO_i.

A related discussion can further be treated as a new annotation, denoted AN_(m+1,i), and associated with the same annotated object AO_i (see the sketch below).
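In this notation, a discussion can be folded back in as a new annotation on the same object; a minimal sketch, with list positions standing in for the subscripts.

# Sketch of the discussion model: annotations AN_(1,i)..AN_(m,i) attach to
# an annotated object AO_i; a discussion becomes a new annotation AN_(m+1,i).
annotated_object = {"id": "AO_i", "annotations": []}

def add_annotation(obj, text):
    obj["annotations"].append(text)   # appends AN_(m+1,i)

add_annotation(annotated_object, "definition of collaborative annotation")
add_annotation(annotated_object, "question about annotation sharing")
# The group's discussion is itself stored as a new annotation on AO_i:
add_annotation(annotated_object, "discussion summary chosen by the group")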


Experiments

46 students were assigned to the experiment class, and 40 students were assigned to the control class.


Experiment class

Using PAMS 2.0:

Step 1: Log in and access the assigned articles.

Step 2: Identify the format of the target content (e.g., PDF, Word, ...).

Step 3: Underline, circle, or highlight important passages, and so on.

Step 4: Use the system to inquire, explain, and clarify, so that the group can cooperatively work out the meaning of difficult questions.

Step 5: Students can look at each collaborator's answers and choose the best explanation or answer to the problem.

Step 6: If any group member does not complete his or her tasks, the entire group cannot hold cooperative discussions.


(Figure: annotation discussion handler)

(Figure: student learning situation)


Questionnaire

(1) Perceived usefulness, which examines how PAMS 2.0 helps students learn (five questions);

(2) Perceived ease of use, which examines how easy PAMS 2.0 is to use (five questions);

(3) Learning satisfaction, which examines students' satisfaction with the learning scenarios in PAMS 2.0 (three questions);

(4) Willingness for future use, which examines students' willingness to use PAMS 2.0 in the future (two questions).




(Table: results of the independent-samples t-test on learning achievements)

Correlation between learning achievements and quantity of annotation

According to the Pearson correlation, there was a significant positive correlation (r = 0.337, p = 0.022 < 0.05) between learning achievements and quantity of annotation (total annotations = 608, mean = 13.22, SD = 3.04).
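The reported r = 0.337 is a standard Pearson coefficient; the sketch below shows how it would be computed from per-student annotation counts and achievement scores. The sample data is hypothetical, not from the paper.

import math

# Pearson correlation between annotation quantity and achievement.
# The sample data below is hypothetical, for illustration only.
def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

annotation_counts = [10, 13, 15, 9, 18]    # hypothetical per-student counts
achievement_scores = [72, 80, 85, 70, 90]  # hypothetical exam scores
print(round(pearson(annotation_counts, achievement_scores), 3))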


Conclusion

According to the questionnaire results, most students had positive attitudes toward the questions in all four dimensions.

Students would like to use PAMS 2.0 in group learning scenarios because they believe PAMS 2.0 is easy to use and stable.

Comments

Advantage: the paper describes the system architecture in great detail (e.g., the ERD).

The authors conducted many experiments to demonstrate the relationship between learning achievement and quantity of annotation.

Drawback

Application: mobile learning...
