Supporting teachers’ intervention in collaborative
knowledge building
Weiqin Chen*
Department of Information Science and Media Studies, University of Bergen,
P.O. Box 7800, N-5020 Bergen, Norway
Received 20 October 2004; accepted 20 October 2004
Abstract
In the context of distributed collaborative learning, the teacher's role differs from that in traditional
teacher-centered environments: teachers are coordinators/facilitators, guides, and co-learners. They monitor
the collaboration activities within a group, detect problems and intervene in the collaboration to give
advice and learn alongside students at the same time. We have designed an Assistant to support teachers’
intervention in collaborative knowledge building. The Assistant monitors the collaboration, visualizes it
and provides advice to the teacher on the subject domain and the collaboration process. The goal of the
research presented in this paper is to explore the possibilities of enriching Computer Supported
Collaborative Learning (CSCL) environments with tools to support collaborative interaction.
© 2005 Elsevier Ltd. All rights reserved.
Keywords: Software agents; CSCL; Knowledge building
1. Introduction
In collaborative learning, instruction is learner-centered rather than teacher-centered
and knowledge is viewed as a social construct, facilitated by peer interaction, evaluation
and cooperation. Therefore, the role of the teacher changes from transferring knowledge to
students (the ‘sage on the stage’) to being a facilitator in the students’ construction of their
own knowledge (the ‘guide on the side’) (McKenzie, 1998).
Journal of Network and
Computer Applications 29 (2006) 200–215
www.elsevier.com/locate/jnca
1084-8045/$ - see front matter © 2005 Elsevier Ltd. All rights reserved.
doi:10.1016/j.jnca.2005.01.001
* Tel.: +47 555 841 43; fax: +47 555 891 49.
E-mail address: [email protected]
According to Dillenbourg (1999), the teacher retains an important role in the success
of collaborative learning. This role is more important as the size of the group increases.
As a ‘facilitator’ instead of a tutor, a teacher should not provide the right answer or
say which group member is right, but perform a minimal pedagogical intervention
(e.g. provide some hint) in order to redirect the group work in a productive direction
or monitor which members are left out of the interaction. He further pointed out that in the
context of CSCL, the external regulator needs specific tools for monitoring the interactions
occurring in different places and/or at different times. The design of this tool is a main item
on the CSCL research agenda.
The teacher’s role in distributed collaborative learning depends heavily upon
observation of the interaction. An intensive collaboration, however, which includes a
relatively large number of messages or interactions, makes it difficult to follow. It is
always time and effort consuming to analyze the collaboration, detect problems and give
useful advice to facilitate the collaboration. In order to lessen the problem, the use of
agents to analyze the collaboration and support effective collaboration has been
investigated. For example, iDLCE (Okamoto et al., 1995) developed an Expert System
Coordinator; GRACILE (Ayala and Yano, 1996) implemented two types of agents, a
mediator agent and a domain agent; Dillenbourg et al. (1997) proposed agents that
compute statistics regarding interaction; EPSILON (Soller, 2001) developed a
facilitation agent to provide pedagogical support to students learning collaboratively
on-line; COLER (Constantino-Gonzalez et al., 2000) developed an agent that coaches
collaborative Entity-Relationship modeling; and Mørch and his students from our group
(Mørch et al., 2003) developed an agent that interacts directly with students (giving
advice to students) based on such statistics. Most of these efforts, however, have been
placed on designing intelligent modules that replace the teacher’s role in the
collaboration. In order to obtain this goal, students are restricted to using ‘semi-
structured’ interfaces such as menu-driven or sentence-openers to collaborate, which
restrains the interaction channels and slows the communication process. Furthermore,
the advice generated by these intelligent systems is based on their own understanding of
the collaboration process, which carries a high risk of misinterpretation or
misunderstanding. As a result, the advice might sometimes be inappropriate and
confuse the students. While closely related to these and other CSCL research efforts, our
research has taken a somewhat different approach in that we have aimed at developing a
software agent, which, instead of taking the place of teachers, acts as a supplement to
them. To support the teacher’s facilitation role in collaborative knowledge building, we
have designed an Assistant for FLE3—a distributed collaborative learning environment
developed by Media Lab, University of Helsinki in Finland.
The rest of this paper is organized as follows. Following the introduction, Section 2
gives a brief introduction of FLE3 and the collaborative knowledge building process in
order for readers to understand the role and functions of the Assistant. Section 3
presents the design of the Assistant and its integration with FLE3. Preliminary evaluation
results are presented in Section 4. Section 5 discusses related work and places the
Assistant into a bigger research context. Section 6 concludes the paper and presents
some issues for further discussions.
Fig. 1. Progressive inquiry model (Muukkonen et al., 1999).
2. Collaborative knowledge building
FLE3 is a web-based groupware for computer supported collaborative learning (CSCL)
(Muukkonen et al., 1999). It is an asynchronous environment and designed to support
a collaborative process of progressive inquiry learning. According to Muukkonen et al. (1999),
the basic idea of progressive inquiry is that students gain deeper understanding by engaging
in a research-like process where they generate their own problem, make hypotheses and search
out explanatory scientific information collaboratively with others (Fig. 1).
As a starting point, the teacher has to set up the context and the goal for a study
project in order for the students to understand why the topic is worth investigating.
Then the teacher or the students present their research problems that define the directions
where the inquiry goes. As the inquiry proceeds, more refined questions will be posted.
Focusing on the research problems, the students construct their working theories,
hypotheses, and interpretations based on their background knowledge and their research.
Then the students assess strengths and weaknesses of different explanations and identify
contradictions and gaps of knowledge. To refine the explanation, fill in the knowledge
gaps and provide deeper explanation, the students have to do research and acquire new
information on the related topics, which may result in new working theories. In so doing,
the students move step by step toward answering the initial question.
To support the collaborative progressive inquiry process, FLE3 provides several modules,
such as WebTop and Knowledge Building module. The WebTop module is a supporting
module where teachers and students can store and share resources such as documents and
links. The Knowledge Building module is considered to be the scaffolding module for
progressive inquiry, where the students post their messages to the common workspace
according to predefined categories. The categories they can use are Problem,
My Explanation, Scientific Explanation, Evaluation of the Process, and Summary.
These categories are defined to reflect the different phases in the progressive inquiry
process.
3. Assistant design and implementation
In the collaborative learning process in FLE3, teachers can contribute to the progressive
inquiry process in two respects: process facilitation and content facilitation. Process
facilitation includes monitoring participation in KB discussion, encouraging non-active
students to be more active, suggesting what messages to reply to and who should do so,
suggesting what category to choose for the next posting in the discussion forum, and
advising when postings do not follow the scientific method of knowledge building.
Content facilitation includes setting up a context, enhancing the discussion by presenting
problems or working theories, encouraging students to join the knowledge building
session by sending student emails with links to relevant and interesting notes in the
knowledge building, and uploading learning materials and informing students so that they
can visit the new material. To support the teacher's facilitation role, the Assistant is
designed to include a domain model and a collaboration model. It helps the teacher to
monitor the updates in WebTop and Knowledge Building module. The Assistant also
presents statistical information and gives advice to teachers based on the domain and
collaboration models. It can also learn from the teacher’s feedback in order to improve its
performance.
Fig. 2 shows the integration of the Assistant with FLE3. The Assistant receives
messages and activities of both students and the teacher from FLE3 and stores
them in a database. The activities are mainly logon/off, updates on the virtual WebTop
module, updates in the Knowledge Building module and teacher’s activities on the advice
from the Assistant. Each activity has a timestamp and other properties. For example,
a message posted in the Knowledge Building module also includes the message content,
the posting student, the category and the corresponding message.
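Concretely, each such activity can be thought of as a timestamped record. The following is a minimal sketch in Python; the class and field names are our own illustration, not FLE3's actual database schema:

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional

# Illustrative record for one Knowledge Building posting. Field names
# are assumptions for this sketch, not FLE3's actual schema.
@dataclass
class KBMessage:
    content: str
    student: str              # who posted the message
    category: str             # e.g. 'Problem', 'My Explanation', ...
    parent_id: Optional[int]  # the message this one responds to, if any
    timestamp: datetime = field(default_factory=datetime.utcnow)

msg = KBMessage("What is the Turing Test?", "tove", "Problem", None)
```

Other activity types (logging on/off, WebTop updates) would carry their own properties alongside the same timestamp.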
The Assistant is responsible for providing statistical information of the collaboration
process, sending emails, and presenting advice. The Statistic Computation module goes
through the database, computes statistics on the collaboration process and presents them to
the teachers and students in the form of tables or charts. The Advice Generation module
creates advice by reasoning on the domain model and the information from the database.
The Assistant can also send emails to students on behalf of the teacher.
The Assistant has two interfaces in FLE3, one for the teacher and one for the students.
The teacher interface has links to the following information:
† Who is online. By clicking on this link, the teacher can see all the students who are online.
† Update in WebTop. This links to the updates in the students' WebTops. The teacher can see all the new documents on the WebTop.
† Update in Knowledge Building. This links to the updates in the Knowledge Building module. The teacher can check all the new messages posted since the last time he/she
Fig. 2. Integration with FLE3.
logged out. If he/she finds some notes are interesting, he/she can send emails to the
students with links to those notes.
† Check statistics. Clicking on this link will trigger the Statistic Computation module to go through the database, compute statistics on the collaboration process and present them in the form of tables or charts.
† Check advice. Clicking on this link will trigger the Advice Generation module to create advice by reasoning on the domain model and the information from the database using the rules in the knowledge base. The teacher can accept/reject or tailor the advice generated by the Assistant. He/she can also ask the Assistant to explain the advice or delegate the Assistant to send emails or present the advice to students.
† Topic management. This link allows the teacher to create and edit the domain model represented by a Topic map.
Fig. 3 is a snapshot of the teacher’s interface. Except for the ‘Check advice’ and ‘Topic
management’ links, the student interface has links to all the other information.
3.1. Domain model
A conceptual domain model is used to describe the domain concepts and the
relationships among them, which collectively describe the domain space. This domain
model is usually represented by an ontology. It is particularly appropriate for modeling
concepts and their relationships. Various tools and environments can be used to build
Fig. 3. Teacher’s interface.
a domain ontology. For example, Protégé (Noy et al., 2001) is an ontology editor for
constructing domain ontologies. With its storage plug-ins, the domain ontology can be
saved into various formats, including XML, RDF, etc. A simple conceptual domain model
can also be represented by a topic map. Topic maps (Pepper and Moore, 2001) are a new
ISO standard for describing knowledge structures and associating them with information
resources. They are used to model topics and their relations at different levels. The main
components in Topic maps are topics, associations, and occurrences. The topics represent
the subjects, i.e. the things in the application domain, and make them machine
understandable. A topic association represents a relationship between topics. Occurrences
link topics to one or more relevant information resources. Topic maps provide a way to
represent semantically the conceptual knowledge in a certain domain.
In our project, we need to represent the topics and their relations and link them to the
related notes accordingly. Topic maps can fulfill this requirement in a simple and friendly
way. Furthermore, Topic maps are relatively easy for teachers to understand and manage. This
domain model includes topics in Artificial Intelligence (the course domain), such as
machine learning, agents, knowledge representation and search algorithms, and the relations among them.
These topics are described as topics in the topic map. Relations between the topics are
represented as associations. The occurrence describes the links to the messages where the
topic was discussed in the knowledge building process.
In the earlier prototypes of the Assistant, teachers had to write XML in order to create
Topic maps for their course domains, and when a message was posted, the topics associated
with the message had to be selected manually by the contributors (students/teachers). This
proved rather tedious. In the current version, we have developed a tool
‘AnnForum’ for teachers to create a domain Topic map interactively (Fig. 4). This tool can
also automatically associate the messages to the related topics using automatic
classification techniques in information retrieval. Teachers can also use this tool to edit
and verify the associations (Chen, 2004).
Using AnnForum, teachers can create Topic maps for their course domain and
load/reload them into FLE3. Because Topic maps are written in XML format, it is easy for
teachers to understand and maintain the topics, and the domain model can also be easily
reused in other contexts. Furthermore, the evaluation in a university course in fall 2003
shows that topic maps provide students with domain visualization and topic navigation
which help them to get oriented within the course domain and deepen their understanding
of the topics and the conceptual associations.
The following code describes a part of the topic map in XML format.
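A simplified XTM-style fragment might look like the following. The topic names, ids and occurrence link are illustrative, and the namespace declarations and association role types required by the full XTM standard are omitted:

```xml
<topicMap>
  <!-- A topic in the AI course domain -->
  <topic id="machine-learning">
    <baseName>
      <baseNameString>Machine Learning</baseNameString>
    </baseName>
    <!-- Occurrence: link to a Knowledge Building message where
         the topic was discussed -->
    <occurrence>
      <resourceRef xlink:href="fle3/kb/message-42"/>
    </occurrence>
  </topic>
  <topic id="agents">
    <baseName>
      <baseNameString>Agents</baseNameString>
    </baseName>
  </topic>
  <!-- Association: a relation between the two topics -->
  <association>
    <member>
      <topicRef xlink:href="#machine-learning"/>
    </member>
    <member>
      <topicRef xlink:href="#agents"/>
    </member>
  </association>
</topicMap>
```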
Fig. 4. AnnForum (Topic and Association Management).
To analyze the interaction in the collaborative knowledge building process, the agent
combines the structure (progressive inquiry model) and domain (conceptual domain
model). The interaction is mapped to the progressive inquiry model and the course domain
model. The progressive inquiry model is used to check if the discussion has followed
the sequence of the knowledge building process. The conceptual domain model is used to
check how the discussion covers the topics in the course domain.
3.2. Collaboration model
In the knowledge building process of FLE3, the main activity of the students is to post
messages according to categories. Therefore, the information collected and stored by the
Assistant includes the properties of the messages posted by the students. It includes:
† Topic: to what topic/topics is the message related?
† Category: to which category (knowledge type) is the message posted?
† Student-Post: who posts the message?
† Msg-Correspond: to which message does the message correspond?
† Depth: at which depth of the thread is the message?
† Time-Stamp: when is the message posted?

By querying the database, the Assistant is able to provide statistical information on the
collaboration process. For example, how many notes have been posted in each category?
How many notes has a certain student posted? How often does a certain student post
messages? How many notes has each student posted in a certain category? How many
Fig. 5. Statistical information: (a) number of messages by each user; (b) number of messages in each category.
notes has a certain student posted corresponding to a certain message? How many notes
are related to a certain topic in the domain model?
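These questions reduce to simple aggregate queries over the stored messages. The idea can be sketched in Python, with an in-memory list standing in for the Assistant's database (student names and postings are illustrative):

```python
from collections import Counter

# Toy message log standing in for the Assistant's database;
# each entry is (student, category). Names are illustrative.
messages = [
    ("tove", "Problem"),
    ("tove", "My Explanation"),
    ("kari", "My Explanation"),
    ("kari", "Scientific Explanation"),
]

# Number of messages posted by each student (cf. Fig. 5(a)).
per_student = Counter(student for student, _ in messages)

# Number of messages in each category (cf. Fig. 5(b)).
per_category = Counter(category for _, category in messages)

print(per_student["tove"])             # 2
print(per_category["My Explanation"])  # 2
```

A student who has posted nothing simply never appears in the counter, which is exactly the kind of gap the teacher is meant to spot in the charts.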
The Assistant presents the statistical information in tables or charts to teachers.
Although this information is rather simple, it can provide a valuable overview of the
collaboration so that the teacher can follow the collaboration easily and detect problems
quickly. For example, Fig. 5(a) shows the number of messages posted by each student.
The teacher would notice that student ‘hegullak’ has not made any contribution to the
knowledge building. He/she can send ‘hegullak’ an email to encourage this student to join
the knowledge building. From Fig. 5(b), the teacher can easily see that there are not
enough messages in the category of 'Scientific Explanation'. This means that the research
step in the progressive inquiry model is not being carried out properly by the students.
This could be because the students do not understand the 'Scientific Explanation' category
in the progressive inquiry model well enough, or because they did not spend time working
on scientific material. The teacher can further look into these possible problems and intervene when
necessary.
The statistical information is also available to the students so that they can be aware of
the collaboration process and their performance with respect to the group. The evaluation
in a university course in fall 2003 shows that this could also help with the students' self-regulation (see Section 4).
3.3. Advice generation
Knowledge about how students interact is useful to a system only if it can apply
this knowledge to recognize specific situations that call for intervention. Although
the statistical information can provide the teacher with an overview of the
collaboration and the teacher can find some possible problems from checking this
information, the problems that could be found based on this information are rather
limited. To find other problems, the teacher needs to look at the collaboration at a
deeper level.
For example, according to the progressive inquiry model, the sequence of posting
messages should be ‘Problem’, ‘My explanation’ and ‘Scientific explanation’.
Fig. 6. Assistant presents advice to teachers.
It means that the student should first post a message in a ‘Problem’ category, which
should be followed by a message in a ‘My explanation’ category. Then he/she
should post a message in a ‘Scientific explanation’ category. However, some
students do not follow this order when they post messages. Although this problem
can be found by looking at the 'category versus number of messages' table in the statistical
information, it is not so straightforward. In addition, finding which student has this
problem is even more complicated if the teacher only looks at the statistical
information. In order to help the teacher find this problem, we create rules in the
knowledge base to represent the ‘perfect’ sequence of the messages. The Assistant
checks each student’s sequences of messages against these rules. If discrepancies are
found, an advice will be generated to the teacher.
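The check of a student's postings against the 'perfect' sequence can be sketched as a rule over consecutive category pairs. This is our own simplified illustration of the idea, not the Assistant's actual rule syntax:

```python
# Allowed "next category" transitions per the progressive inquiry
# model; this rule table is a simplified illustration.
EXPECTED_NEXT = {
    "Problem": {"My Explanation"},
    "My Explanation": {"Scientific Explanation"},
}

def sequence_violations(categories):
    """Return (position, seen, expected) for each out-of-order posting."""
    violations = []
    for i in range(len(categories) - 1):
        expected = EXPECTED_NEXT.get(categories[i])
        if expected and categories[i + 1] not in expected:
            violations.append((i + 1, categories[i + 1], expected))
    return violations

# A student posting 'Problem' right after 'Problem' (like 'tove'
# in the example below) triggers advice; a correct sequence does not.
print(sequence_violations(["Problem", "Problem"]))
```

Each violation found this way would become one item of advice presented to the teacher.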
Fig. 6 shows a short list of advice generated by the Assistant. The ‘to’ column shows the
student’s name to whom the email or advice should be sent and ‘all’ means to all students.
The title column shows the title of the advice, which is also the subject of the email if
the teacher decides to send the advice to the student. In Fig. 6, if the teacher clicks
on the link title ‘knowledge building process’ to student ‘tove’, he/she will see a window
pop up and it contains the content of the advice:
From: [email protected]
Subject: knowledge building process
Hi tove,
I have noticed that you posted problems right after problems. Are you aware of the
sequence in the progressive inquiry model?
Weiqin.
In Fig. 6, if the teacher clicks on the link title ‘topic discussed’ to ‘all’, he/she will see a
popup window which contains the suggested topic:
From: [email protected]
Subject: topic discussed
Hi all,
I suggest that you should read ‘Computing machinery and intelligence’ (http://www.
abelard.org/turpap/turpap.htm) and discuss ‘Turing Test’.
Weiqin.
The advice is given to the teacher and the teacher can view the advice and ask the
Assistant to explain it. It is up to the teacher to make a decision on whether he/she should
intervene, delegate the Assistant to present advice to the student or send emails to the
student. The teacher can also save the advice to a file and review it later. If a student posts
bogus messages in the discussion forum to boost his/her participation, in the current
prototype the Assistant is not able to detect this or to prevent it from happening. It is
the task of teachers to figure out whether the contributions are valid and to intervene
when they think it is necessary.
3.4. Learning
The Assistant uses a fixed rule set to generate advice, and this lack of adaptivity
affects its performance. In order for the Assistant to adapt the
advice it generates and improve its performance, we tried two methods. One is to
design a rule editor for the teacher to create and manage the rules in the knowledge
base. The adaptivity is improved manually by allowing the teacher to create
different rules for different situations. However, we found that this method added extra
workload to the teacher. The other method we tried is machine learning. By learning
from the teacher’s feedback, the Assistant can automatically improve its
performance.
Among the existing learning algorithms, we selected those that can learn rules.
So far, the learning algorithm we have experimented with is CN2 (Clark and Niblett,
1989). It can induce new production rules periodically instead of doing so each time
new feedback is provided. We believe that this feature fits asynchronous
environments, where real-time updates are not as crucial as they are in synchronous
environments.
The input to the CN2 algorithm consists of the features of the advice and the teacher's
actions on the advice. The features of advice include:
† Message features: category, student-post, timestamp, and topics;
† Student features: last-logout and last-message-post;
† Confidence factor: how confident the Assistant is about the advice.

The teacher's activities include:

† Present (delegate the Assistant to send/show the advice to students);
† Explain (ask the Assistant to explain how it generated the advice);
† View (view the content of the message to be sent to students).

Each advice presented to the teacher becomes one training example for the CN2
algorithm in the form of feature set: {msg_feature, student_feature, teacher_activity,
confidence_factor}. Going through the training examples, CN2 creates a new set of
rules and saves them. Afterward these new rules can be verified and used in generating
advice.
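The shape of one such training example can be sketched as follows. The helper function, dictionary layout and concrete values are our own illustration; CN2 itself is not implemented here:

```python
def make_training_example(msg_feature, student_feature, teacher_activity, confidence):
    """Package one advice instance plus the teacher's reaction as a
    training example. The teacher's activity (present/explain/view)
    is the behavior the induced rules are meant to predict."""
    return {
        "msg_feature": msg_feature,          # category, student-post, timestamp, topics
        "student_feature": student_feature,  # last-logout, last-message-post
        "confidence_factor": confidence,     # Assistant's confidence in the advice
        "teacher_activity": teacher_activity,
    }

# Hypothetical example: the teacher delegated the Assistant to present
# a piece of advice about a 'Problem' posting.
example = make_training_example(
    {"category": "Problem", "student-post": "tove", "topics": ["Turing Test"]},
    {"last-logout": "2003-10-01", "last-message-post": "2003-09-28"},
    teacher_activity="present",
    confidence=0.8,
)
print(example["teacher_activity"])  # present
```

Batches of such examples are what CN2 would periodically consume to induce a new rule set.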
4. Evaluation
The evaluation of the collaboration supporting system is divided into two phases.
We first conducted an informal evaluation in a university course INFO281 (Introductory
Artificial Intelligence) in fall 2002. The goal of this evaluation was to discover potential
improvements to the design of the prototype. We focused on functionality and user interface
issues. A more thorough evaluation with focus on the performance of the Assistant
is carried out in fall 2003 in INFO281. In this scenario, 53 students in INFO281 discuss
issues related to Artificial Intelligence through FLE3.
In the first lecture of the course, we gave the students an introduction to FLE3 with a
focus on its functionalities and what they could do with it for the assignment. From the
FLE3 User Management, we sent out invitation emails to all the students so that they could
register themselves in the environment and start trying out the different functionalities. The
experiment was divided into two stages. The first stage lasted until the middle of the
semester when we used the original FLE3 without the Assistant. The second stage was
from the middle of the semester until the end of the semester, when we used FLE3 with the
Assistant. In the second stage, the teacher accepted all the advice on ‘topic discussed’ and
‘knowledge building process’ generated by the Assistant and delegated the Assistant to
present the advice to students. For the advice on participation, the teacher did not take any
of it, because the discussion contribution counted for 20% of the final grade and all students
participated at a broadly similar level. Data were collected by system log, questionnaires
and interviews.
The total number of messages is 237. Nine were posted to 'Problem', 196 were posted to
'My Explanation', 31 were posted to 'Scientific Explanation', one was posted to
'Evaluation of the Process' and no messages were posted to 'Summary'.
We found some changes in the usage of categories after introducing the Assistant. As
shown in Table 1, before introducing the Assistant, the number of messages in 'Scientific
Explanation' was only 9% of the total number of messages. After introducing the Assistant,
it became 16%. We cannot claim that these changes were caused by the Assistant by only
looking at the table. So we also looked into the data from questionnaires and interviews.
Of the 31 students who answered the questionnaires, seven thought the advice
presented in the knowledge building was very informative and helpful, 16 thought
the advice provided some guidance in both the knowledge building process and the
discussion topics, one thought it was somewhat confusing, and seven students did not
notice the advice at all.
When asked how they like the advice, one student responded:
I particularly like the recommended discussion topics and the link because they
point to something that we have not thought about.

Table 1
Comparison of the number of messages in 'Scientific Explanation'

                                              Before Assistant    After Assistant
No. of messages in 'Scientific Explanation'   9                   22
Total no. of messages                         97                  140
Students also reflected on the advice related to the knowledge building process.
Although we kept getting advice on the “Scientific Explanation” vs. “My
Explanation”, most of us just did not response to it. That’s why there are so many
messages in “My explanation”. “My Explanation” is the default thinking type. If I do
not know where to put a message, I put it in the “My Explanation”.
Of the 31 students who answered the questionnaires, 16 thought the statistical information
was helpful and five thought it was a little confusing. The rest did not look at it.
Those who thought the statistics were helpful mentioned in the interviews that they
checked the statistics to see (1) where they stood in the knowledge building process,
e.g. how the number of messages they posted compared with that of other students, and
(2) how other students and the whole group used the categories.
By checking this information, they could decide for themselves what to do next. This
helped the students in regulating their own activities in the knowledge building process.
For example, one student responded:
I feel that I have to be a little more active after I looked at the statistics.
5. Related work
Roehler and Cantlon classified the teacher's role in distributed learning environments
into five categories: offering explanations, inviting students’ participation, verifying
and clarifying student understandings, modeling of desired behaviors and inviting
students to contribute clues. The Assistant presented in this paper can help the teacher
with inviting students’ participation, modeling of desired behaviors and inviting
students to contribute clues. In addition, the Assistant can assist the teacher in finding
problems in the coverage of the discussion topics and direct the discussion to other
topics.
Classroom teachers analyze and assess student interaction through close
observation of group interaction. In distributed collaborative learning environments,
developing tools to analyze student interaction is a challenge. Jermann et al. (2001)
provided a conceptual framework for collaboration supporting tools and the
capabilities they can offer based on the work by Barros and Verdejo (2000) and
a review of collaborative learning supporting systems (Fig. 7). In Jermann and his
colleagues' terms, collaboration management can be described as a repetitive cycle
containing four phases:
(1) Data collection involves observing and recording the interaction.
(2) Indicator selection involves selecting one or more high-level variables to represent the current state of the interaction.
(3) Diagnosing interaction involves comparing the current state of the interaction to an ideal model of the interaction.
(4) Remedial actions are proposed when discrepancies are found in phase (3).

Fig. 7. The collaboration management cycle (Jermann et al., 2001).
They further divided the collaborative learning supporting systems into three
categories:
† systems that reflect actions (mirroring systems): collect raw data and display it to the collaborators;
† systems that monitor the interaction (metacognitive tools): model the state of the interaction and provide collaborators with visualizations that can be used to self-diagnose the collaboration;
† systems that offer advice: guide collaborators by recommending actions students might take to improve their interaction.
In their work, the teachers were treated in the same way as the students; there was no
emphasis on, or support for, the teacher's facilitation role.
To assist the teacher’s facilitation role in the collaborative learning environment, the
Assistant needs to have the ability to understand the collaboration to a certain degree.
Several studies have been published on analyzing the interaction in
collaboration. Gaßner et al. (2003) categorized the methods that have been used in
analyzing interaction into two dimensions. The first dimension is classified into two
categories based on raw data which the analysis methods operate on: activity-based and
state-based analysis. The second dimension is classified into two categories based on the
viewpoints under which the interaction was analyzed: summary analysis and structural
analysis. In the second dimension, they further divided it into domain-independent and
domain-specific interpretation of the analyzed data. In our research, we use both of the
two dimensions for analyzing the collaboration in a simpler manner. For example, we
use structural analysis only in domain-independent situation and summary analysis in
both domain-independent and domain-specific situation.
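The difference between summary and structural analysis (both shown here in their domain-independent form) can be illustrated with a small Python sketch over a hypothetical reply log; the log format and function names are our own assumptions, not part of the Gaßner et al. framework:

```python
# Hypothetical reply log: (message_id, author, replies_to);
# replies_to is None for a thread-starting message.
LOG = [
    (1, "anna", None),
    (2, "bjorn", 1),
    (3, "anna", 2),
    (4, "carl", None),
]

def summary_analysis(log):
    """Summary analysis: aggregate counts, ignoring the reply structure."""
    counts = {}
    for _, author, _ in log:
        counts[author] = counts.get(author, 0) + 1
    return counts

def structural_analysis(log):
    """Structural analysis: recover who replied to whom."""
    author_of = {mid: author for mid, author, _ in log}
    return [(author, author_of[parent])
            for _, author, parent in log if parent is not None]
```

The summary view reports only how much each student posted, while the structural view exposes the reply relations between students, which a summary discards.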
6. Discussion and future work
This paper presented our ongoing project: an Assistant to support the teacher's intervention in a collaborative knowledge building environment. The Assistant is designed to support the teacher's facilitation role in distributed collaborative learning by providing an overview of the collaboration and advice. It does not replace the teacher; rather, it complements the teacher's role. The Assistant has limitations in truly understanding the collaboration, while the teacher has difficulties in following all of it. Therefore, the intervention is performed by a Teacher–Assistant team. The Assistant's abilities to explain its advice and to learn and improve its performance help to build a trust relationship between the Assistant and the teacher.
During the design and development of the collaboration supporting tools in DoCTA-NSS, we considered several issues, some of which merit further discussion.
† Agent design. Unlike the agents in many intelligent tutoring systems, the agents in distributed collaborative learning environments work mainly in the background. They monitor the collaboration, collect data, analyze the interaction, and provide statistical information and advice, which can be ignored if it is considered of low priority. In our research, the Assistant presents information and advice in a fixed text area in the environment, so it is not intrusive and the teachers/students can concentrate on their task. However, the findings from our experiment show that some students did not notice the advice at all, so pop-up windows may help to draw attention to the advice from the Assistant. We will look into the agent presentation mechanism in further studies.
† Understanding collaboration. In order to effectively support the collaboration, it is crucial to understand the interaction. Classroom teachers learn to analyze and assess student interaction through close observation of group interaction, trial and error, and experience. For agents, fulfilling this task is a real challenge. The Assistant in our project can understand the collaboration only to a limited degree. For example, it cannot detect whether students post bogus messages to boost their participation. Teachers are needed to detect this kind of problem and prevent it from happening.
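A naive heuristic shows why the Assistant alone is insufficient here. The sketch below is our own illustration and not part of the Assistant; the five-word threshold is an arbitrary assumption. It flags very short posts as possible participation padding, but word count is a weak proxy for substance, so flagged posts still require the teacher's judgment:

```python
def flag_suspect_posts(posts, min_words=5):
    """Flag posts shorter than min_words as possible 'participation padding'.
    Word count is a weak proxy for substance, so this only shortlists
    posts for the teacher to inspect; it cannot judge content."""
    return [(author, text) for author, text in posts
            if len(text.split()) < min_words]

suspects = flag_suspect_posts([
    ("anna", "I agree."),
    ("bjorn", "Photosynthesis also depends on the available light spectrum."),
])
```

Such a heuristic would also flag short but substantive posts, which is exactly the kind of false positive only a human can resolve.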
FLE3 with the Assistant is currently under investigation in a university course. In this study, we will look into how students react to advice from the teacher and from the Assistant; it is possible that they react differently when they know who created the advice. Another issue we will investigate further is the balance between flexibility and structure. One goal of the Assistant within FLE3 is to regulate the collaboration. However, one can ask whether this regulation is beneficial, or whether it is better to give students more flexibility. For example, is it better to let students use whatever categories (knowledge types) they think appropriate, or to force them to follow the predefined sequence? We hope further experiments will help us answer these questions.
Acknowledgements
This project is a part of DoCTA-NSS, a project funded by the ITU (IT in Education)
program of KUF (Norwegian Ministry of Church Affairs, Education, and Research). The
author would like to thank Prof. Barbara Wasson and the pedagogical agent group within
the DoCTA-NSS project. Special thanks to anonymous reviewers for their constructive
comments which helped improve this paper.
References
Ayala G, Yano Y. Intelligent agents to support the effective collaboration in a CSCL environment. In:
Proceedings of the world conference on educational communications (Ed-Telecom’96), Boston, MA, USA;
1996. p. 19–24.
Barros B, Verdejo MF. Analysing student interaction processes in order to improve collaboration—the
DEGREE approach. Int J Artif Intell Educ 2000;11:221–41.
Chen W. Reuse of collaborative knowledge in discussion forums. In: Lester JC, Vicari RM, Paraguacu F,
editors. Proceedings of intelligent tutoring systems (ITS2004). Berlin: Springer; 2004. p. 800–2.
Clark P, Niblett T. The CN2 induction algorithm. Mach Learn J 1989;3:261–83.
Constantino-Gonzalez M, Suthers D. A coached collaborative learning environment for entity-relationship
modeling. In: Proceedings of the fifth international conference on intelligent tutoring system (ITS’00),
Montreal, Canada; 2000. p. 324–33.
Dillenbourg P. What do you mean by collaborative learning? In: Dillenbourg P, editor. Collaborative learning:
cognitive and computational approaches. Amsterdam: Pergamon Press; 1999. p. 1–19.
Dillenbourg P, Traum D, Jermann P, Schneider D, Buiu C. The design of MOO agents: implications from an
empirical CSCW study. In: Proceedings of the eighth world conference on artificial intelligence in
education (AIED’97), Kobe, Japan; 1997. p. 15–22.
Gaßner K, Jansen M, Harrer A, Herrmann K, Hoppe U. Analysis methods for collaborative models and
activities. In: Proceedings of the international conference on computer supported collaborative learning
(CSCL’03); 2003. p. 369–77.
Jermann P, Soller A, Muehlenbruck M. From mirroring to guiding: a review of state of the art technology for
supporting collaborative learning. In: Proceedings of the European conference on computer supported
collaborative learning (ECSCL'01), Maastricht, The Netherlands: Maastricht McLuhan Institute; 2001.
p. 324–31.
McKenzie J. The wired classroom: creating technology enhanced student-centered learning environments.
FNO: Educ Technol J 1998;7(6).
Mørch A, Dolonen J, Omdahl K. Integrating agents with an open source learning environment. In: Proceedings
of the international conference on computers in education (ICCE2003), HongKong; 2003. p. 393–401.
Muukkonen H, Hakkarainen K, Lakkala M. Collaborative technology for facilitating progressive inquiry:
future learning environment tools. In: Proceedings of the international conference on computer supported
collaborative learning (CSCL’99); 1999. p. 406–15.
Noy NF, Sintek M, Decker S, Crubezy M, Fergerson RW, Musen MA. Creating semantic web contents with
Protege-2000. IEEE Intell Syst 2001;16(2):60–71.
Okamoto T, Inaba A, Hasaba Y. The intelligent learning support system on the distributed cooperative
environment. In: Proceedings of the seventh world conference on artificial intelligence in education
(AIED’95), Washington, DC, USA; 1995. p. 210–8.
Pepper S, Moore G. XML topic maps; 2001. At URL: http://www.topicmaps.org/xtm/1.0
Roehler L, Cantlon D. A powerful tool in social constructivist classrooms. In: Hogan K, Pressley M, editors.
Scaffolding student learning: instructional approaches and issues. Cambridge, MA: Brookline Books; 1997.
Soller AL. Supporting social interaction in an intelligent collaborative learning system. Int J Artif Intell Educ
2001;12:40–62.