
How to Submit Proof Corrections Using Adobe Reader

Using Adobe Reader is the easiest way to submit your proposed amendments for your IGI Global proof. If you don’t have Adobe Reader, you can download it for free at http://get.adobe.com/reader/. The comment functionality makes it simple for you, the contributor, to mark up the PDF. It also makes it simple for the IGI Global staff to understand exactly what you are requesting, ensuring a flawless end result.

Please note, however, that at this point in the process the only things you should be checking for are:

• Spelling of Names and Affiliations
• Accuracy of Chapter Titles and Subtitles
• Figure/Table Accuracy
• Minor Spelling Errors/Typos
• Equation Display

As chapters should have been professionally copy edited and submitted in their final form, please remember that no major changes to the text can be made at this stage.

Here is a quick step-by-step guide on using the comment functionality in Adobe Reader to submit your changes.

1. Select the Comment bar at the top of the page to View or Add Comments. This will open the Annotations toolbar.

2. To note text that needs to be altered, like a subtitle or your affiliation, you may use the Highlight Text tool. Once the text is highlighted, right-click on the highlighted text and add your comment. Please be specific, and include what the text currently says and what you would like it to be changed to.


3. If you would like text inserted, like a missing comma or punctuation mark, please use the Insert Text at Cursor tool. Please make sure to include exactly what you want inserted in the comment box.

4. If you would like text removed, such as an erroneous duplicate word or punctuation mark, please use the Add Note to Replace Text tool and state specifically what you would like removed.


IJMBL Editorial Board

Editor-in-Chief: David Parsons, Massey U., Auckland, New Zealand

International Advisory Board: A.Y. Al-Zoubi, Princess Sumaya U. for Technology, Jordan Marcelo Milrad, Linnaeus U., Sweden Hiroaki Ogata, Tokushima U., Japan Jaime Sánchez, U. of Chile, Chile Mike Sharples, The Open U., UK

Associate Editors: Hokyoung Ryu, Hanyang U., Korea Elizabeth Stacey, Elizabeth Stacey Educational Consulting, Australia Rosemary Stockdale, Swinburne U. of Technology, Australia John Traxler, U. of Wolverhampton, UK Norman Vaughan, Mount Royal U., Canada Giasemi Vavoula, U. of Leicester, UK

International Editorial Review Board: Sohaib Ahmed, Massey U., New Zealand Panagiotes Anastasiades, U. of Crete, Greece Trish Andrews, U. of Queensland, Australia Rajarathinam Arangarasan, The Raj Organization, USA Inmaculada Arnedillo-Sánchez, Trinity College, Ireland Margaret Baguley, U. of Southern Queensland, Australia Brenda Bannan, George Mason U., USA Adele Botha, Meraka Institute, South Africa Maiga Chang, Athabasca U., Canada Yunhi Chang, Dankook U., Korea Dragan Cisic, U. of Rijeka, Croatia Thom Cochrane, AUT U., New Zealand John Cook, U. of the West of England, UK Rob Cooper, Southampton Solent U., UK Patrick Danaher, U. of Southern Queensland, Australia Linda De George-Walker, Central Queensland U., Australia Rogerio De Paula, Emerging Markets Platforms Group at Intel Corporation, Brazil Peter Doolittle, Virginia Tech, USA Tony Dowden, U. of Southern Queensland, Australia Laurel Dyson, U. of Technology, Australia Kay Fielden, Unitec, New Zealand Elizabeth Fitzgerald, The Open U., UK Bob Folden, Texas A&M U.-Commerce, USA Rahul Ganguly, U. of Southern Queensland, Australia Margarete Grimus, Technical U. of Graz, Austria Dion Hoe-Lian Goh, Nanyang Technological U., Singapore Tiong-Thye Goh, Victoria U. of Wellington, New Zealand Sam Goundar, United Nations U., Macau Joachim Griesbaum, U. of Hildesheim, Germany Louise Hawkins, Central Queensland U., Australia Aleksej Heinze, U. of Salford, UK Debbie Holley, Anglia Ruskin U., UK Andreas Holzinger, Medical U. Graz, Austria Joaquim Jorge, Technical U. of Lisboa, Portugal Terry Kidd, U. of Texas Health Science Center, USA Michelle Kilburn, Southeast Missouri State U., USA Andrew Kitchenham, U. of Northern British Columbia, Canada Christian Kittl, Karl-Franzens-U., Austria Jayne Klenner-Moore, King’s College, USA Shirlee-ann Knight, Edith Cowan U., Australia Agnes Kukulska-Hulme, The Open U., UK Kwan Min Lee, U. of Southern California, USA Marshall Lewis, Westpac, New Zealand Heide Lukosch, Delft U. of Technology, The Netherlands Andrew Luxton-Reilly, U. of Auckland, New Zealand Kathy Lynch, U. of the Sunshine Coast, Australia Ross Malaga, Montclair State U., USA Masood Masoodian, Waikato U., New Zealand David Metcalf, U. of Central Florida, USA Warren Midgley, U. of Southern Queensland, Australia Mahnaz Moallem, National Science Foundation, USA Julian Newman, Glasgow Caledonian U., UK Norbert Pachler, U. of London, UK Krassie Petrova, AUT U., New Zealand Christoph Pimmer, U. of Applied Sciences, Switzerland Amarolinda Zanela Saccol, U. of Vale do Rio dos Sinos, Brazil Daniyar Sapargaliyev, International Academy of Business, Kazakhstan Eunice Sari, Online Learning Community for Teacher Professional Development, Finland Abdolhossein Sarrafzadeh, Unitec, New Zealand Lori Scarlatos, Stony Brook U., USA Eric Seneca, Our Lady of the Lake College, USA Robina Shaheen, Open U., UK Marcus Specht, Open U. of the Netherlands, The Netherlands Susan Stoney, Edith Cowan U., Australia Thomas Sweeney, U. of Nottingham, UK Siobhán Thomas, Pervasive Learning, UK Mark Tyler, Griffith U., Australia Ruth Wallace, Charles Darwin U., Australia Marilyn Wells, Central Queensland U., Australia Janet Williams, U. of Glamorgan Business School, UK Jocelyn Wishart, U. of Bristol, UK Jane Yau, Malmö U., Sweden Ronda Zelezny-Green, London U., UK

IGI Editorial: Lindsay Johnston, Managing Director; Christina Henning, Production Editor; Allyson Stengel, Managing Editor; Jeff Snyder, Copy Editor; Sean Eckman, Development Editor; Ian Leister, Production Assistant


The International Journal of Mobile and Blended Learning is indexed or listed in the following: ACM Digital Library; Applied Social Sciences Index & Abstracts (ASSIA); Bacon’s Media Directory; Cabell’s Directories; Compendex (Elsevier Engineering Index); DBLP; GetCited; Google Scholar; INSPEC; JournalTOCs; Library & Information Science Abstracts (LISA); MediaFinder; Norwegian Social Science Data Services (NSD); PsycINFO®; SCOPUS; The Index of Information Systems Journals; The Standard Periodical Directory; Ulrich’s Periodicals Directory

Copyright
The International Journal of Mobile and Blended Learning (IJMBL) (ISSN 1941-8647; eISSN 1941-8655), Copyright © 2014 IGI Global. All rights, including translation into other languages, reserved by the publisher. No part of this journal may be reproduced or used in any form or by any means without written permission from the publisher, except for noncommercial, educational use, including classroom teaching purposes. Product or company names used in this journal are for identification purposes only. Inclusion of the names of the products or companies does not indicate a claim of ownership by IGI Global of the trademark or registered trademark. The views expressed in this journal are those of the authors but not necessarily of IGI Global.

International Journal of Mobile and Blended Learning
July-September 2014, Vol. 6, No. 3

Table of Contents

Editorial Preface
iv David Parsons, Massey University, Auckland, New Zealand

Research Articles

1 Integrating Mobile Learning in an Undergraduate Course: An Exploration of Affordances and Challenges for Learners in UAE
Fawzi Ishtaiwa, Al Ain University of Science and Technology, Al Ain, United Arab Emirates

19 Mobile Voting Systems for Creating Collaboration Environments and Getting Immediate Feedback: A New Curriculum Model of a University Lecture
Svetlana Titova, Lomonosov Moscow State University, Moscow, Russia
Tord Talmo, Sør-Trøndelag University College, Trondheim, Norway

37 Development and Use of an EFL Reading Practice Application for an Android Tablet Computer
Yasushige Ishikawa, Kyoto University of Foreign Studies, Kyoto, Japan
Craig Smith, Kyoto University of Foreign Studies, Kyoto, Japan
Mutsumi Kondo, Tezukayamagakuin University, Osakasayama, Japan
Ichiro Akano, Kyoto University of Foreign Studies, Kyoto, Japan
Kate Maher, Kyoto University of Foreign Studies, Kyoto, Japan
Norihisa Wada, IE Institute Co., Ltd., Tokyo, Japan

53 Delivering and Assessing Learning Material through Gquest: A Case Study on Patient Education
Giordano Lanzola, Department of Computer, Electrical and Biomedical Engineering, University of Pavia, Pavia, Italy
Germana Ginardi, Department of Computer, Electrical and Biomedical Engineering, University of Pavia, Pavia, Italy
Paola Russo, Department of Computer, Electrical and Biomedical Engineering, University of Pavia, Pavia, Italy
Silvana Quaglini, Department of Computer, Electrical and Biomedical Engineering, University of Pavia, Pavia, Italy


International Journal of Mobile and Blended Learning, 6(3), 19-36, July-September 2014 19

Copyright © 2014, IGI Global. Copying or distributing in print or electronic forms without written permission of IGI Global is prohibited.

Mobile Voting Systems for Creating Collaboration Environments and Getting Immediate Feedback: A New Curriculum Model of a University Lecture

Svetlana Titova, Lomonosov Moscow State University, Moscow, Russia
Tord Talmo, Sør-Trøndelag University College, Trondheim, Norway

ABSTRACT
Mobile devices can enhance learning and teaching by providing instant feedback and better diagnosis of learning problems, helping design new assessment models, enhancing learner autonomy and creating new formats of enquiry-based activities. The objective of this paper is to investigate the pedagogical impact of mobile voting tools. The authors’ research demonstrated that Student Response System (SRS) supported approaches influenced not only lecture design - time management, the mode of material presentation, activity switch patterns - but also learner-teacher interaction, student collaboration and output, and the formats of activities and tasks. SRS-supported lectures help instructors gradually move towards flipped classrooms and MOOC lecturing. The authors’ analysis, based on qualitative and quantitative data collected from two student groups (56 undergraduate students) in the 2012-2013 academic year, showed that SRS-supported lectures encouraged foreign language learners to produce more output in the target language, improved their intercultural competence and language skills, and enhanced their motivation.

Keywords: Collaboration Environment, Formative Assessment, Immediate Feedback, Intercultural Competence, M-Learning, Mobile Voting Tools

INTRODUCTION

ICT integration into teaching and learning is one of the pivotal trends of modernization of higher education in the Russian Federation.

The new national standards of higher education, which were introduced in 2011-2012, contain several references to the use of ICT: modern technologies and web resources have to become an integral part of the curriculum; student ICT competence is included in both professional and research competencies and skills; and 65% of all classes should be conducted in an interactive learning environment - webinars, slide presentations, round table discussions, case studies - whereas the lectures traditional for Russian universities are to constitute no more than 35% of all classes (Titova, 2012).

DOI: 10.4018/ijmbl.2014070102

Without any doubt, higher education institutions and universities do not have to be driven only by imperatives, ICT procurement strategies and plans for the development of their estates. It is possible to transform institutional strategies by building continual research on student practices with technology into the practice of teaching and by creating environments where students and teachers are in ongoing dialogue (Kukulska-Hulme & Jones, 2011a). When students arrive at university they already have certain skills and competences in a variety of practices related to learning and the use of digital and networked technologies. Therefore educators have, first of all, to meet the expectations of a new generation of young learners, commonly referred to as the Net Generation (Tapscott, 2009) and Digital Natives (Prensky, 2009), whose perception of their own responsibilities and roles in relation to lecturers and universities has changed drastically. Teachers who would like to make creative use of new technologies and to support collaborative, learner-oriented environments need to follow a transformational approach to the development of traditional skills alongside digital literacies (Dudeney, Hockly & Pegrum, 2013). This approach has to be viewed as the transformation of education from “a contrived performance, on a stage, to a shared experience of a contingent reality that no-one, lecturer or student, has experienced before” (Traxler, 2010, p.14).

This paper, supported by both current m-learning theory and enquiry-based learning theory, focuses on working out a new educational design of the university lecture within a high level collaboration environment.

THEORETICAL FRAMEWORK

Mobile Technologies: The Pedagogical Potential to Transform a Traditional University Lecture Design and to Create a High Level Collaboration Environment

Initially mobile technologies and digital devices were used in education just for a limited number of activities and mostly as an alternative way to get access to learning materials. Nowadays, as mobile apps have become the globally dominant technology and digital devices are “curiously both pervasive and ubiquitous, both conspicuous and unobtrusive, both noteworthy and taken-for-granted in the lives of most of the people” (Traxler, 2010, p.3), we witness the proliferation of mobile learning due to numerous publications which have proved conclusively that mobile technologies can not only enhance but also transform the learning/teaching experience in many ways, because they help: enhance learner autonomy, as they offer better opportunities to acquire skills at one’s own pace that may be missing when using shared computer facilities (Kukulska-Hulme, 2010); empower learners to work outside of the classroom with a freedom that is difficult to achieve with more traditional technologies such as desktop computers (Traxler, 2009); deliver educational experiences that would otherwise be difficult or impossible; provide new forms of content dispersion like course casts, moblogs, and Twitter feeds (Kumar, 2010); offer immediate diagnosis of learning problems and design new assessment models (Talmo, Sivertsen Korpås, Mellingsæter & Einum, 2012); create mobile networking collaboration and provide instant feedback (DeGani, Martin, Stead & Wade, 2010; Voelkel & Bennett, 2013); modify the educational environment of online courses (Kuklev, 2010); and create new formats of problem solving tasks based on augmented reality, geo-location awareness and video-capture (Cook, 2010; Driver, 2012).

Collaboration is a critical element of learning. It is often interpreted as social interaction, conversation and dialogue, which are fundamental to learning from a socio-cultural perspective (Vygotsky, 1978).

Many researchers today highlight social aspects of mobile technologies, proposing complex frameworks of m-learning pedagogy built on Vygotsky’s theory, foregrounding the importance of conversations in an educational context (Laurillard, 2007; Sharples, Taylor & Vavoula, 2007). Danaher, Gururajan, and Hafeez-Baig (2009) propose an m-learning framework based on three key principles: engagement, presence and flexibility. Presence is interpreted as interaction, which is sub-divided into three types: cognitive (student-content), social (peer) and teaching (student-teacher).

Kearney, Schuck, Burden and Aubusson (2012) argue that the key constructs of m-learning pedagogy are authenticity, collaboration and personalization. The authenticity feature provides opportunities for contextualized, participatory, situated learning; the collaboration feature captures the often-reported conversational, connected aspects of m-learning; while the personalization feature has strong implications for ownership, agency and autonomous learning. They distinguish between two sub-scales of the collaboration construct - low level and high level collaboration - that are crucial for our research. High level collaboration involves deep, dynamic dialogue mediated by a mobile networking environment and learner-generated content creation (Kearney, Schuck, Burden & Aubusson, 2012).

Mobile technologies enable instructors to create a high level collaboration environment (HLCE) based on an enquiry-based learning approach which inspires students to learn for themselves, bringing a genuinely research-like approach to the subject. This interactive, dialogic model of learning is similar to the processes of participation in research (Sambell, 2010). The particular emphasis in this case is placed on fostering the development of collaborative, informal communities in which students learn by seeing and engaging with other people’s approaches. Ubiquitous access to information mediated by mobile devices potentially enables a paradigmatic shift in education; it changes the way classes are managed and the instructor’s role (Betty, 2004). Kahn and O’Rourke (2005) argue that enquiry-based learning encourages students to actively explore and seek out new evidence for themselves and can help support the development of peer networks and relationships with staff. This approach implies a fundamental change in the philosophy of teaching and learning; mobile devices and tools are particularly applicable as they effectively act as accelerators of the social discourse (DeGani, Martin, Stead & Wade, 2010).

Mobile networks enable learners to create an HLCE where they can communicate with their peers, instructors and other specialists at any time, get access to any data available on the net across time and place, and share and exchange their own content - now everyone can produce content to learn - and everyone can discuss and share it “anywhere/anytime and just-in-time, just-for-them” (Traxler, 2010, p.14). In research into language teaching, the mobile tools most efficiently and frequently used for collaborative in-class activities are social network tools, moblogs, instant messaging apps and mobile voting systems (Dudeney, Hockly & Pegrum, 2013).

Any kind of collaborative activity is evaluated not just on overall outcomes but on group dynamics (Johnson, Adams & Cummins, 2012). More than that, “collaborative activities are best assessed collaboratively: this might include students’ self-assessment of their own contributions alongside with peer-assessment of other participants contributions” (Pallof & Pratt, 2009, p.9). Self- and peer-assessment, which can be formative or summative, is not only motivating but it enables students to develop their feedback skills and techniques while interacting with group-mates and giving extensive feedback (Dudeney, Hockly & Pegrum, 2013). However, large class sizes make it difficult to offer frequent formative assessments in combination with high quality, timely feedback without implementing automated tools such as mobile voting systems into the teaching process (Talmo, Sivertsen Korpås, Mellingsæter & Einum, 2012; Voelkel & Bennett, 2013).


In other words, mobile technologies change the way teachers have access to learning materials, present them, interact inside and outside the classroom, and assess and evaluate learners’ participation. They liberate learning environments from “the standardized straightjacket of the methodology (how one learns), the content (what one learns), the spaces (where one learns), the time (when one learns), and the social (with whom one learns) of the current educational paradigm” (Cavallo, 2012, p. 2).

On the basis of what has been said above, we came to the conclusion that a high level collaboration environment, which can be created with the help of a mobile network, commonly results in the transformation of four main constituents of any teaching process - material presentation, tasks and activities, feedback, and evaluation and assessment (Figure 1).

Wireless Voting Tools in the Educational Context

Electronic voting systems, also known as Audience Response Systems, or clickers, which directly introduce dialogue and interactivity between teacher and student, have been used successfully within the context of the classroom for the last decade (Bruff, 2009; Dangel & Wang, 2008; Fies & Marshall, 2006; Rubner, 2012).

Quite a few mobile voting tools (Socrative, PollEverywhere, Xorro-Q, Mentimeter, MbClick, the SMART Response interactive response system, etc.) are currently available on the market. These tools share some common technological characteristics to facilitate material presentation and feedback: an audience can interact with the presenter’s computer and any interactive display or whiteboard and can respond using whatever mobile device they own; there is no need for expensive and bulky devices; presenters can send questions to participants’ devices, and participants can vote and reply in different formats, including a basic A, B, C style or full text answers; questions can be created on the fly or set up beforehand using the session leader’s own web page; polling results can be presented in a variety of formats (histograms, pie charts, etc.); some tools have software that allows response data to be downloaded into spreadsheet programs or course management systems; and they can be used in small group training sessions or in a large auditorium of 200 people or more.
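The core mechanics shared by these tools - anonymous votes on a fixed set of options, tallied and rendered as a histogram - can be illustrated with a short sketch. This is not the API of any of the tools named above; the class and method names here are invented for illustration.

```python
# Minimal sketch of a voting session: anonymous tallying plus a text
# histogram of the kind these tools display. VotingSession and its
# methods are illustrative names, not a real SRS/PollEverywhere API.
from collections import Counter

class VotingSession:
    def __init__(self, question, options):
        self.question = question
        self.options = list(options)   # e.g. ["A", "B", "C"]
        self.votes = Counter()

    def vote(self, option):
        if option not in self.options:
            raise ValueError(f"unknown option: {option}")
        self.votes[option] += 1        # anonymous: no student identity stored

    def histogram(self, width=20):
        """Render one bar per option, scaled to the total vote count."""
        total = sum(self.votes.values()) or 1
        lines = [self.question]
        for opt in self.options:
            n = self.votes[opt]
            bar = "#" * round(width * n / total)
            lines.append(f"{opt}: {bar} {n}")
        return "\n".join(lines)

session = VotingSession("Which tense fits here?", ["A", "B", "C"])
for v in ["A", "B", "A", "C", "A"]:
    session.vote(v)
print(session.histogram())
```

The `Counter` keeps only per-option totals, which mirrors how anonymity is preserved: nothing links a vote back to a device.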

Figure 1. HLCE created on the basis of mobile technologies

Many researchers have analyzed the best pedagogical practices for using these tools: they allow for anonymous participation and add a game-like approach to the classroom environment (Martyn, 2007); initially used to promote active learning in large classrooms, they can be used successfully in small classrooms as well (Gilbert, 2005); they can turn multiple-choice questions - often seen as limited assessment tools - into effective tools for engaging all students during class: students are more invested in participating in discussion and are more likely to have generated some ideas to share in that discussion (Bruff, 2009); polling results can be saved to spreadsheet programs for semester-long analyses that may inform subsequent curriculum development (Hodges, 2010); peer evaluation mediated by polling tools provides honest, constructive feedback and promotes more engaged class discussion (Bruff, 2010); they can promote deep learning when teaching and questioning strategies center on higher-level thinking skills and increase student engagement by providing prompt feedback (Dangel & Wang, 2008); and they help design formative assessment activities (Rubner, 2012). Mobile voting tools are, however, very challenging. They require instructors to rethink their instruction to leverage their potential advantages (Tarr & Beasley, 2012). Teachers may start with just minor changes, but major pedagogical changes may also be introduced.

The Student Response System that was piloted in our research is a web-based mobile voting system designed by HiST (Norway) to enable asking multiple choice questions during teaching sessions in the classroom or in distance learning.1 Since 2009 it has been used by university and school teachers from 17 different countries. It allows students to respond to questions asked by teachers anonymously in the classroom, making every student’s voice heard at the lecture. SRS enables instructors to get instant assessment of tests, to evaluate group dynamics, to visualize group results immediately and to conduct feedback with the class by polling their opinion. Automatically generated feedback is followed by post-test activities provided by the lecturer aiming at clarification of common misconceptions. After that, students may be given one more attempt to re-vote on the misconceived question.
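The vote / feedback / clarification / re-vote cycle just described can be sketched as a small control flow. This is a hedged reconstruction of the workflow as the text describes it, not the HiST SRS software itself; the function names and the 50% pass threshold are assumptions made for illustration.

```python
# Sketch of the SRS cycle described above: collect anonymous votes,
# visualize/assess instantly, run a clarification activity if too many
# students missed the question, then offer one re-vote.
# `collect_votes` and `clarify` stand in for the real classroom steps.

def run_srs_question(question, correct, collect_votes, threshold=0.5):
    """One round of voting; returns (passed, share_correct)."""
    votes = collect_votes(question)            # e.g. ["A", "B", "A", ...]
    share = votes.count(correct) / max(len(votes), 1)
    return share >= threshold, share

def srs_cycle(question, correct, collect_votes, clarify):
    passed, share = run_srs_question(question, correct, collect_votes)
    if not passed:
        clarify(question, share)               # lecturer addresses misconceptions
        # one more attempt to re-vote on the misconceived question
        passed, share = run_srs_question(question, correct, collect_votes)
    return passed, share
```

In a real lecture, `collect_votes` would block until the SRS time counter expires and `clarify` would be a peer-discussion or post-test activity rather than a function call.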

The research reported here, which addresses the pedagogical impact of SRS at HiST since 2009, showed significant improvement especially in student motivation and academic performance when SRS was implemented into language classes (Talmo, Sivertsen Korpås, Mellingsæter & Einum, 2012). SRS-supported tests integrated into courses of the science and engineering education department (HiST) encouraged student peer discussions and peer instruction, facilitated learners’ engagement, and enabled them to become actively involved in discussion, improving learner academic performance and research skills (Arnesen, 2012; Nielsen, 2012). The technological characteristics and pedagogical potential of SRS are summed up in Table 1.

Methodology and Objectives of the Research

The key objectives of a long-term research project, “Mobile devices in the Language Classroom: theory and practice”, which was launched in 2011 at the Department of Foreign Languages and Area Studies, Lomonosov Moscow State University, are to evaluate learners’ and instructors’ preparedness to integrate mobile technologies into the foreign language classroom and to work out mobile learning strategies to create collaborative language learning environments. The research results proved that the pressure for mobile device implementation came from students, and that it is necessary to work out a pedagogical framework for mobile technology implementation in the traditional classroom to avoid undesirable consequences of their misuse in the learning experience (Titova, Talmo & Avramenko, 2013).

The first stage of our project consisted of working out sound pedagogical strategies on how to implement mobile voting tools into a traditional lecture course. One of the main objectives of the international research project “Enhancing Technology Awareness and Usage of m-Learning in Russia and Norway” was to pilot and evaluate the pedagogical impact of SRS integration into a traditional university lecture course. The following three research questions were proposed:


1. What is the didactic potential of SRS to create a high level interactive environment?

2. How does SRS implementation enable lecturers to re-design a traditional university lecture course?

3. How does the SRS-supported approach change student learning and academic performance?

The general hypothesis for this research was that the students could enhance their learning following implementation of an SRS-supported approach.

Lecture Design: From a Traditional Lecture to a Flipped Classroom

The task/enquiry-based learning approach and SRS implementation are central to the transformation of the lecture design as well as assessment and feedback patterns. SRS-supported lecture design presents challenges for a lecturer because, first, the content material under discussion (PowerPoint presentation) has to be re-arranged into chunks of 5-6 slides, each followed by a short SRS-supported test that consists of 4-5 statements; second, at least three SRS-supported tests should be created to provide better diagnosis of learning problems and to highlight weak points of content presentation on the part of a lecturer; third, a lecturer has to be ready

Table 1. Technological characteristics and pedagogical potential of SRS

Technological characteristic: Immediate test assessment and feedback
Pedagogical potential:
• Immediate diagnosis of teaching problems
• Instant feedback on learning problems in large auditoriums
• Group dynamics evaluation: the instructor can witness the students' learning progress
• Any aspect of student output is under control and can immediately be drawn attention to
• Increased participation of all students, not just a vocal minority
• Skill practice by means of formative SRS tests

Technological characteristic: Instant visualization of the test results
Pedagogical potential:
• Enhanced learner motivation
• Encouragement of peer discussions and collaborative post-test activities
• Evaluation of group dynamics

Technological characteristic: Anonymous submission of the test results
Pedagogical potential:
• Creation of a low-anxiety environment: everybody is involved; shy or reluctant students can feel relaxed and self-confident
• Correction is supportive, done in the form of collaborative activities

Technological characteristic: "Tag-It" function
Pedagogical potential:
• Visualization of learning materials: teachers can ask multiple choice questions using multimedia material such as photos and video
• Maintains students' attention longer

Technological characteristic: Time counter installed in SRS
Pedagogical potential:
• No time for cheating

Technological characteristic: The teacher interface for SRS forms an invisible "layer" on top of other windows and applications on the computer
Pedagogical potential:
• The system is very flexible and handy: no matter what program is used to ask a question, SRS is just a click away when one wants to run voting sessions

Technological characteristic: Equipment necessary: one Internet-enabled teacher computer and Internet-enabled student mobile devices (wireless or cable access)
Pedagogical potential:
• Teaching in technologically limited environments
• No need for bulky, costly equipment
• No need for profound technical preparation

Technological characteristic: Use of students' own devices
Pedagogical potential:
• No need for technical instructions: familiar devices


with some enquiry-based activities to initiate post-test group discussion or brainstorming.

We have attempted to map out and compare the time management of the traditional lecture (low-level interaction) and the SRS-supported lecture (high-level interaction) in Table 2.

SRS is likely to become a supportive mobile tool for lecturers who would like to implement the flipped classroom, an example of substitutional and augmentative use of technologies (Tucker, 2012), because traditional content presentation is removed from classroom time. Given the time management pattern of SRS-supported lectures, it is recommended to transform the traditional design and present the material in the form of out-of-class activities and tasks. Students are asked to watch videos or listen to lecture podcasts in their own time, so classroom time is devoted to discussion and collaboration with intensive teacher support. In other words, the challenges of learning a new content delivery system encourage faculty to think outside the box. The SRS mobile voting system enables lecturers to transform the way they present material: first turning traditional lecturing into interactive SRS-supported lectures, then into the flipped classroom, a valuable model of blended learning aided by open educational resources, and, in the long run, into MOOC-style lecturing (Figure 2).

SRS Implementation Impact on Assessment and Feedback Patterns

SRS is perfectly suited to evaluating group dynamics. In our research it was primarily used for formative or low-stakes assessment, which serves to give learners feedback on their performance and provides them with a gauge of how close they are to reaching a pre-specified learning goal (Sambell & Hubbard, 2004). Formative assessment is specifically intended to provide feedback on performance to improve and accelerate learning (Sadler, 1998). For the teacher, the design of formative assessment activities is motivated by wanting to increase students' desire to learn, to engage in self-evaluation and self-assessment, and to take control of their own learning (Rubner, 2012). SRS provides instructors with an opportunity to quickly determine the level of class understanding at any given point in time, without the extra burden of grading (Hodges, 2010). SRS implementation allows for significant changes in feedback patterns and a re-design of material assessment (Table 2). We offer the following framework of the SRS-supported lecture (Figure 3).

HLCE is created by initiating group discussions or brainstorming on the basis of SRS-supported activities aimed at figuring out the

Table 2. Time management comparison of the traditional lecture vs. the SRS supported lecture

Material presentation
• Traditional lecture: 80-90 minutes
• SRS supported lecture: 40-50 minutes of PowerPoint presentation

Material assessment and collaboration activities
• Traditional lecture: low-context interaction
• SRS supported lecture: weekly tests (15 minutes); brainstorming (0-15 minutes); brief group discussion (0-15 minutes)

Questions for lecturer
• Traditional lecture: 0-10 minutes; questions (if any) are asked orally after the presentation
• SRS supported lecture: 0-10 minutes; questions (if any) are sent via mobile instant messaging apps (Twitter, SMS, WhatsApp, Google Talk) during the presentation


correct answer to the test statement. This kind of formative feedback received from peers resembles 'internal feedback' as defined by Nicol and Macfarlane-Dick (2006), in which feedback is generated in relation to peers' perspectives rather than transmitted by the lecturer. The collaborative post-test activities, as well as instant messaging apps, also help highlight weak points of material presentation and transform students' approach to their own learning into a matter of active enquiry and meaning-making, rather than students seeing themselves as passive recipients of their lecturers' knowledge. SRS provides deeper conceptual understanding when used

with a peer instruction methodology: "Engaging students in peer discussions can challenge them to generate explanations and convincing arguments for their solution and in this way also facilitate deeper understanding of scientific phenomena" (Nielsen, 2012, p. 45). On the other hand, students need more guidance, more practice at tackling assessment-related activity and more feedback on their learning than is traditionally the case in many university courses (Sambell, 2010).

Figure 2. Traditional lecture course transformation on the basis of SRS implementation

Figure 3. SRS supported lecture framework


Participants

The participants of the research were 56 (12 male and 44 female) second-year undergraduate Russian students enrolled at Lomonosov Moscow State University in Russia. The students, aged 19-22, were Intercultural Communication Studies majors who took part in the SRS piloting as volunteers during the two semesters of the 2012-2013 academic year in the lecture course Introduction to American Studies. The objectives of this course, which is taught in English, are twofold: to help learners develop, on the one hand, their intercultural skills by gaining a better understanding of the contemporary U.S. (beliefs, values, traditions, geography, the political and economic situation, education, religious and social life) and, on the other hand, their language skills (listening, reading, speaking). The language competence of the students was B1-B2 according to the Common European Framework of Reference. Written consent was obtained for the collection, analysis and publishing of learner data. Students were informed that the survey was anonymous and that they could not be identified by their answers.

Data Collection

Data collection was done in three cycles that took place in the academic year 2012-2013:

1. Pre-study evaluation of the ICT (mobile) competence of the experimental group students and their attitude to mobile learning before SRS implementation (36 students - 30 female, 6 male);

2. Intervention of SRS-supported tests (3 per lecture) as formative assessment tools and re-design of the traditional lecture pattern (Figure 3). Comparison of summative test result data of the control group (20 students - 16 female, 4 male) and the experimental group (36 students); calculation of the average time of student speech production of the experimental group on the basis of video recording;

3. Post-study evaluation of learner experience and attitudes to SRS-supported lectures (30 students - 24 female, 6 male).

REPORTS OF FINDINGS AND DATA ANALYSIS

Cycle 1

The pre-study questionnaire was aimed at evaluating the ICT (mobile) competence of the 36 learners of the experimental group. The online questionnaire, which consisted of 25 multiple choice questions, was published on surveymonkey.com, and the participants were sent a link via e-mail. It was adapted from the one used in our previous study (Titova, Talmo & Avramenko, 2013) and comprised three sections on (1) student mobile technology skills (10 questions); (2) their experience of mobile device and app use for in-class and out-of-class work (10 questions); and (3) their attitude to mobile device implementation in the language classroom (5 questions). The data were subjected to a descriptive statistical analysis enabling us to examine both our students' mobile competence and their readiness to integrate mobile devices into the learning process. The data analysis for section 1 of the survey demonstrated that the majority of students had a very advanced level of mobile competence and skills. We present some examples of student mobile technology skills in Table 3.

Student experience in mobile device use for in-class and out-of-class work (section 2) is summed up in Table 4.

Students are likely to use mobile devices and apps on their own, both in class and outside the classroom, to access reference materials (dictionaries, encyclopedias), for multimedia material playback (podcasts, video casts) and as a means of interaction with group mates via Twitter, moblogs and e-mail. Only a few students had experience of using mobile apps designed for learning foreign languages (mostly pronunciation skills). In lecture courses students commonly used their digital devices


to record a lecturer's speech, to take photos of lecture slides or to take lecture notes. Unfortunately, mobile-supported activities were not incorporated into course syllabi, so students never used mobile devices for educational collaboration with peers or for getting instant feedback from instructors.

Answering the questions that revealed their attitude to mobile device integration into the language classroom (section 3), the vast majority of students (34) said that they would like to use their own mobile devices for class activities, and 35 were not against the bring-your-own-device approach.

Table 3. Examples of students' mobile technology skills

Statement: Students (%)
Can download an app: 95%
Can record a presentation: 90%
Can save and text materials on their device: 84%
Can use mobile dictionaries and encyclopedias: 100%
Can share links with group mates: 96%
Can search the web via mobile access: 98%
Can publish video and audio files online via smart phone: 82%
Can write a post in a moblog: 96%
Can set up a moblog: 84%
Can use instant messaging apps: 100%

Table 4. Student experience in mobile device use in their learning (% of students)

Use of mobile devices on their own in class and outside the classroom:
• every day: 78%
• to access reference materials (dictionaries, encyclopedias) outside the classroom every day: 95%
• for multimedia material playback (podcasts, video casts): 65%
• as a means of interaction with group mates via Twitter, moblogs, e-mail: 68%
• to make a record of a lecturer's speech: 82%
• to take photos of lecture slides: 96%
• to take notes of lectures: 30%

Use of mobile apps incorporated into the course syllabus:
• at seminars, for learning foreign languages in in-class activities: 12%
• at seminars, for group and pair interaction activities or mobile networking activities: 0%
• at lectures, for educational collaboration and interaction with peers or the instructor: 0%
• at lectures, for getting instant feedback from the instructor: 0%


Data analysis demonstrated, first, that on average students had an advanced level of mobile competence; second, that the most convenient mobile devices to use in our classroom were smart phones and tablet computers; third, that both technologically and psychologically the students of the target group were ready to regularly use their own mobile devices in the classroom and for autonomous work.

Cycle 2

The learners of the control group had a traditional university lecture course. For the second, experimental group an SRS-supported lecture course was introduced. During the course the students of the control and experimental groups were supposed to fulfil practically the same compulsory in-class and out-of-class activities, which made up their final course grade: pass 3 summative tests (2 midterm and 1 final); do weekly reading in order to participate in three course colloquiums (the control group) or in weekly SRS-supported tests and post-test discussions (the experimental group); write an essay; and create a group web project. Formative assessment was provided in the form of in-class SRS tests, usually 3 tests per lecture. The SRS-supported tests were multiple choice (single best answer out of a number of options) or multiple answer (several correct choices out of a number of options). The variety of question types was chosen to promote understanding rather than

memorization. The test topics were based on the lecture material. The SRS-supported tests were presented in class with PowerPoint. Students responded with their smart phones or tablets, accessing the SRS tests via the in-class Wi-Fi.
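The two question formats described above can be sketched as a small data structure. This is a hypothetical illustration only: the class name, fields and scoring rules below are our own assumptions for the sketch, not the actual SRS software's API, and the sample item content is invented.

```python
from dataclasses import dataclass

@dataclass
class SRSQuestion:
    """One SRS test item: single-best-answer or multiple-answer."""
    statement: str
    options: list[str]
    correct: set[int]              # indices of the correct options
    multiple_answers: bool = False

    def is_correct(self, chosen: set[int]) -> bool:
        if not self.multiple_answers:
            # Single best answer: exactly one option, and it must be the key.
            return len(chosen) == 1 and chosen <= self.correct
        # Multiple answers: the full set of correct options must be selected.
        return chosen == self.correct

# A multiple-answer item of the kind used in the lectures (content invented):
q = SRSQuestion(
    statement="Which of the following are branches of the U.S. federal government?",
    options=["Executive", "Judicial", "Municipal", "Legislative"],
    correct={0, 1, 3},
    multiple_answers=True,
)
print(q.is_correct({0, 1, 3}))  # True
print(q.is_correct({0, 1}))     # False
```

The all-or-nothing scoring for multiple-answer items is one possible convention; partial-credit schemes would change only the `is_correct` method.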

The learners from both groups were given the same midterm and final tests, which were used for summative assessment. Average scores were compared to gauge the overall performance of the control and experimental groups after the implementation of the intervention. The data collected on the overall scores of the 3 summative tests in the two groups during the two semesters of the 2012-2013 academic year suggest that the introduction of the SRS-supported approach improved the academic performance of the experimental group in the overall results of the midterm 2 and final tests more than that of the control group. Figure 4 charts the percentage of the overall test scores of the control and experimental groups.

It has to be mentioned that the average score of midterm 1 of the control group (Figure 4) was slightly higher (64%) than that of the experimental one (62%). This may be explained by the fact that midterm 1 took place after four weekly lectures, so one month's experience with SRS was not enough for the students and the lecturer to become accustomed to the more active format required by SRS use. It can also take time for an instructor to learn how to design tests that

Figure 4. A chart of the overall test scores of the control and experimental groups


include good-quality questions (Bruff, 2010), and an instructor must be well prepared to ensure that the expected outcomes are achieved.

The likely interpretation of the improvement in the academic performance of the experimental group in the overall results of the midterm 2 and final tests (Figure 4) is that the results of the regular formative SRS-supported tests, based on the lecture and required reading materials, helped the instructor determine what difficulties students had with different areas of the course, and to what extent. They were also helpful in designing better-quality questions and feedback to improve students' understanding and knowledge of the subject. The increase in the overall exam results was encouraging, but not conclusive enough to show that only the SRS tests were beneficial. One more reason for the better academic performance is that the students of the experimental group were actively involved in post-test activities based on active learning theory (Rubner, 2012). However, it could also be due to the lecture design transformation, the division of the lecture material into chunks or flips, and the activity switch framework (Tucker, 2012).

The SRS-supported approach encouraged the students of the experimental group to produce more output in the target language (English) due to a number of post-test activities aimed at improving the learners' speaking skills and enhancing their vocabulary. We calculated the time of student speech production of the experimental group on the basis of video recordings

of three SRS-supported lectures. On average, students were involved in speech production for approximately 20-30 minutes per lecture. The calculation for the semester is thus: 15 lectures x 30 minutes = 450 minutes = 7.5 hours of speech production.
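The semester estimate above is a straight multiplication, using the upper end of the observed 20-30 minute range; as a quick sanity check:

```python
# Semester estimate of student speech production in the experimental group.
lectures = 15
minutes_per_lecture = 30           # upper end of the observed 20-30 minute range
total_minutes = lectures * minutes_per_lecture
print(total_minutes)               # 450 minutes
print(total_minutes / 60)          # 7.5 hours
```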

Cycle 3

The intervention data were supplemented by student feedback gained from a post-study paper-based questionnaire and some interviews conducted after the final test. The post-study questionnaire contained 7 questions in the format of a four-level Likert scale and 3 free-text comments aiming to get student views on the strengths and weaknesses of SRS-supported lectures. The questionnaire was completed by 30 students (24 female, 6 male) of the experimental group. Responses are provided in Table 5.

In response to statement 1, students averaged 3.0, a finding that indicates that the new lecture design prepared them well for the SRS tests. The majority of students commented favorably on the fact that the SRS-supported tests helped them understand the topic in focus and get ready for the midterm and final tests, with responses to the second statement averaging 3.5 and to the third statement 3.1. In response to statement 4, the average was 3.6. This suggests that a majority of the students indicated that the SRS-supported approach made them do the required and recommended reading in order to complete

Table 5. Results and mean scores of the post-study questionnaire

Statement (Strongly Disagree / Disagree / Agree / Strongly Agree; Mean Score)
1. The new lecture design prepared me well for SRS tests: 0 / 3 / 20 / 7; 3.0
2. SRS tests helped me understand the topic in focus: 1 / 2 / 19 / 8; 3.5
3. SRS tests helped me get ready for midterms and final a lot: 0 / 6 / 15 / 9; 3.1
4. SRS tests and post-test activities made me read a lot at home: 0 / 2 / 8 / 20; 3.6
5. SRS tests were frustrating, they complicated my learning a lot: 7 / 21 / 2 / 0; 1.9
6. Instant feedback was very supportive and encouraging for my learning: 0 / 1 / 15 / 14; 3.5
7. Activity switching kept me involved during the lectures: 0 / 0 / 9 / 21; 3.7
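Mean scores of this kind are typically a weighted mean over the four-level scale (Strongly Disagree = 1 ... Strongly Agree = 4). As a sketch (the one-decimal rounding convention is our assumption), the means for statements 4 and 7 can be recomputed from their response counts:

```python
def likert_mean(counts):
    """Mean of a four-level Likert item.

    counts: (strongly disagree, disagree, agree, strongly agree)
    """
    total = sum(counts)
    weighted = sum(level * n for level, n in zip((1, 2, 3, 4), counts))
    return round(weighted / total, 1)

# Statement 4 ("...made me read a lot at home"): 0, 2, 8, 20 responses
print(likert_mean((0, 2, 8, 20)))  # 3.6
# Statement 7 ("Activity switching kept me involved"): 0, 0, 9, 21 responses
print(likert_mean((0, 0, 9, 21)))  # 3.7
```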


in-class SRS tests successfully and to take part in post-test activities.

Statements 5 to 7 were designed to elicit the students' attitude to the SRS-supported approach. In reaction to statement 5, the majority of students disagreed that SRS tests were frustrating and complicated their learning a lot; the average for statement 5 was 1.9. The largely positive reaction to statements 6 and 7, where the mean scores were 3.5 and 3.7 respectively, emphasizes that immediate feedback on test results was very supportive and encouraging for student learning, and that the activity switch approach (material presentation - SRS test - post-test activities) kept students involved during lectures.

Some free-text comments provided additional insight into learner experiences and revealed their positive attitude to the SRS-supported approach:

1. “Mobile devices are the best tools to be used for collaborative work.”

2. “The use of mobile devices and tasks based on SRS was fun and changed my attitude to learning.”

3. “It was not just a traditional lecture course, it was a permanent interaction and col-laboration with my group mates and the instructor, I mean, it was a kind of active learning course.”

4. “We were not passive learners, we worked hard to contribute even during lecture time, it was a very unusual and challenging experience.”

5. “SRS based tests are motivating and challenging.”

Student answers indicated that they had an overall positive outlook regarding the SRS approach to university lecture courses. However, some participants noted initial difficulties in dealing with this approach, commenting on the challenging nature of the weekly tests and post-test activities. Nevertheless, they claimed that this approach improved their overall satisfaction with the program of study because of its innovative way of enabling interaction in large lecture formats. There was general agreement that smart phones

and tablets were the handiest and most suitable devices to use in large auditoriums. Some students commented that they came to understand what an active learning approach meant in practice: rather than taking notes from slides or pictures of slides, students were involved in tests, discussions, polling and brainstorming activities. Students appreciated the prompt feedback they got on their own understanding of the material and pointed out the motivating nature of the immediate response to tests. They also mentioned that the pair discussion time of the post-test activities was valuable because it gave them a chance to learn from each other, and a peer's explanation could be more helpful to them than an instructor's explanation. These findings were confirmed by researcher observations.

RESEARCH LIMITATIONS

The study has some limitations. Firstly, the first version of SRS, which was used for the research, could not pinpoint the errors of individual learners or save group dynamics data, so, unfortunately, the instructors did not have an opportunity to collect and analyze group dynamics data from the weekly SRS-supported tests. Recently (in May 2013) the second, updated version of SRS was introduced; it enables instructors to handle test results more easily by exporting them into different formats (Excel, charts), and the system now includes a database that stores the results of each session held. Such data can be valuable to a teacher when reviewing the design and delivery of lectures and other course materials (Rubner, 2012). A further limitation was that, to figure out the influence of SRS implementation on student performance, we analyzed and compared only midterm and final test result data; essay and group project grades were not analyzed. Another limitation was that although the SRS-supported approach encouraged students to produce more output in the target language due to a number of post-test activities aimed at improving learners' speaking skills and enhancing their vocabulary,


we have not evaluated improvements in learners' language skills.

SUGGESTIONS FOR FURTHER RESEARCH

Although the SRS-supported approach enables instructors to create HLCE and, according to our survey results, influenced our learners' academic attainment and motivation, there is still much room for improvement. First, it is necessary to introduce new formats of interactive in-class activities based on instant messaging tools, because SRS provides teachers with only one-way, albeit instant, feedback support. Second, we are planning to pilot a more advanced mobile assessment system - PeLe, with SRS installed - as an assessment tool for both summative and formative purposes, because this tool enables instructors to save the test results of individuals and group dynamics, to give students the opportunity to go through as many attempts as they want, and to provide more test formats. Third, for creating HLCE it is recommended to analyze the impact of learners' more habitual mobile social apps and instant messaging services on their motivation, class performance and output. Fourth, another direction of further research consists of crafting questions that help students to engage more meaningfully with course content and that foster critical thinking skills.

CONCLUSION

Many researchers argue that universities today are poorly aligned with the needs and expectations that learners bring with them, owing to fast-developing digital and mobile technologies and ICT resources, which are becoming the dominant infrastructure for knowledge (Kukulska-Hulme, Pettit, Bradley, Carvalho, Herrington, Kennedy & Walker, 2011b; Tapscott & Williams, 2010). Unfortunately, in many Russian universities a traditional lecture course presupposes transmission of content material that is designed and presented by the lecturer to students, who are

regarded as passive recipients of knowledge. The framework discussed in this paper, based on SRS implementation and HLCE, will assist instructors' understanding of the unique challenges of emerging m-learning environments. SRS implementation supported by a task/enquiry-based learning approach becomes crucial for transforming the design of the traditional lecture and the student approach to learning into a matter of active enquiry, rather than students seeing themselves as passive recipients of their lecturers' knowledge. Learners expressed a positive attitude towards the use of mobile tools in university lecture courses and said that this approach was very different from traditional university lecture courses.

The framework discussed in this paper is by no means prescriptive. While such a pedagogical framework provides a spotlight for examining m-learning experiences, account still needs to be taken of ways of driving student motivation in HLCE, working out valid criteria for evaluating mobile-supported collaborative activities, introducing new formats of interactive in-class and out-of-class activities based on instant messaging tools, etc.

Mobile voting tools are not yet mature enough to provide every kind of feedback and to support instructors in both summative and formative assessment, and it might take some time before we see a significant breakthrough in mobile assessment and feedback tools. However, the research described here suggests that the mobile voting tools available today can be successfully integrated into the language classroom, "enabling teachers to design for learning beyond the boundaries of their institution" (Kukulska-Hulme & Jones, 2011a, p. 67), encouraging language learners to produce more output in the target language and to improve their intercultural competence and language skills.

REFERENCES

Arnesen, K. (2012). Experiences with Use of Various Pedagogical Methods Using Student Response System. In Proceedings from the 11th European Conference on e-Learning (pp. 20-27). Reading, UK: Academic Publishing Limited.


Beatty, I. (2004). Transforming Student Learning with Classroom Communication Systems. EDUCAUSE Research Bulletin, 3. Colorado: ECAR. Retrieved December 29, 2013, from http://net.educause.edu/ir/library/pdf/ERB0403.pdf

Bruff, D. (2009). Teaching with classroom response systems: Creating active learning environments. San Francisco: Jossey-Bass.

Bruff, D. (2010). Multiple-choice questions you wouldn't put on a test: Promoting deep learning using clickers. Essays on Teaching Excellence, 21(3), 25-34. Retrieved December 12, 2013, from http://www.podnetwork.org/publications/teachingexcellence/09-10/V21,%20N3%20Bruff.pdf

Cavallo, D. (2012). Liberating Learning: How Ubiquitous Access to Connected Computational Devices Releases Education from the Tyranny of Information Recall. In Program of the 7th IEEE International Conference on Wireless and Ubiquitous Technologies in Education (p. 2). Japan: Kagawa University Press.

Cook, J. (2010). Mobile phones as mediating tools within augmented contexts for development. In E. Brown (Ed.), Education in the wild: contextual and location-based mobile learning in action (pp. 23-26). University of Nottingham, UK: Learning Sciences Research Institute. doi:10.4018/jmbl.2010070101

Danaher, P. A., Gururajan, R., & Hafeez-Baig, A. (2009). Transforming the practice of mobile learning: Promoting pedagogical innovation through educational principles and strategies that work. In H. Ryu & D. P. Parsons (Eds.), Innovative mobile learning: Techniques and technologies (pp. 21-46). Hershey, PA and New York: Information Science Reference/IGI Global. doi:10.4018/978-1-60566-062-2.ch002

Dangel, H. L., & Wang, C. X. (2008). Student response systems in higher education: Moving beyond linear teaching and surface learning. Journal of Educational Technology Development and Exchange, 1(1), 93-104.

DeGani, A., Martin, G., Stead, G., & Wade, F. (2010). E-learning standards for an m-learning world – informing the development of e-learning standards for the mobile web. Research in Learning Technology, 25(3), 181–185. Retrieved December 12, 2013, from http://www.researchinlearningtechnology.net/index.php/rlt/article/view/19153

Driver, P. (2012). Pervasive Games and Mobile Technologies for Embodied Language Learning. International Journal of Computer-Assisted Language Learning and Teaching, 2(4), 23–37. doi:10.4018/ijcallt.2012100104

Dudeney, G., Hockly, N., & Pegrum, M. (2013). Digital Literacies: Research and Resources in Language Learning. UK: Pearson.

Fies, C., & Marshall, J. (2006). Classroom response systems: A review of the literature. Journal of Science Education and Technology, 15(1), 101–109. doi:10.1007/s10956-006-0360-1

Gilbert, A. (2005). New for back-to-school: ‘Clickers’. CNET News. Retrieved December 12, 2013, from http://news.cnet.com/New-for-back-to-school-Clickers/2100-1041_3-5819171.html

Hodges, L. (2010). Engaging students, assessing learning: Just a click away. Essays on Teaching Excellence, 21, 18–24. Retrieved December 29, 2013, from http://cft.vanderbilt.edu/library/articles-and-essays/essays-on-teaching-excellence/

Johnson, L., Adams, S., & Cummins, M. (2012). The NMC Horizon Report: 2008 Australia-New Zealand Edition. Austin, TX: The New Media Consortium. Retrieved December 12, 2013, from http://nmc.org/pdf/2008-Horizon-Report-ANZ.pdf

Kahn, P., & O’Rourke, K. (2005). Guide to Curriculum Design: Enquiry-Based Learning. Retrieved December 12, 2013, from https://www.academia.edu/460509/Guide_to_Curriculum_Design_Enquiry-Based_Learning

Kearney, M., Schuck, S., Burden, K., & Aubusson, P. (2012). Viewing mobile learning from a pedagogical perspective. Research in Learning Technology, 20(1), 21–34. Retrieved December 29, 2013, from http://www.researchinlearningtechnology.net/index.php/rlt/article/view/14406/html#AF0001

Kuklev, V. A. (2010). M-learning within the context of open online courses. Unpublished Doctoral Thesis, Ulyanovsk Technical University, Russia. Retrieved December 12, 2013, from http://www.dissercat.com/content/stanovlenie-sistemy-mobilnogo-obucheniya-v-otkrytom-distantsionnom-obrazovanii

Kukulska-Hulme, A. (2010). Mobile learning for quality education and social inclusion. Moscow: UNESCO IITI.


Kukulska-Hulme, A., & Jones, C. (2011a). The next generation: Design and the infrastructure for learning in a mobile and networked world. In A. D. Olofsson & J. O. Lindberg (Eds.), Informed Design of Educational Technologies in Higher Education: Enhanced Learning and Teaching (pp. 57–78). Hershey, PA: Information Science Reference (an imprint of IGI Global).

Kukulska-Hulme, A., Pettit, J., Bradley, L., Carvalho, A., Herrington, A., Kennedy, D., & Walker, A. (2011b). Mature students using mobile devices in life and learning. International Journal of Mobile and Blended Learning, 3(1), 18–52. doi:10.4018/jmbl.2011010102

Kumar, S. (2010). Blackboards to Blackberries: Mobile Learning Buzzes Across Schools and Universities. Retrieved December 12, 2013, from http://www.learningsolutionsmag.com/authors/315/sesh-kumar

Laurillard, D. (2007). Pedagogical forms of mobile learning: framing research questions. In N. Pachler (Ed.), Mobile learning: Towards a research agenda (pp. 153–176). London: WLE Centre, Institute of Education.

Martyn, M. (2007). Clickers in the classroom: An active learning approach. EDUCAUSE Quarterly, (2), 71–74. Retrieved December 12, 2013, from http://net.educause.edu/ir/library/pdf/EQM0729.pdf

Nicol, D., & Macfarlane-Dick, D. (2006). Formative assessment and self-regulated learning: A model and seven principles of good feedback practice. Studies in Higher Education, 31(2), 199–218. doi:10.1080/03075070600572090

Nielsen, K. (2012). How the Initial Thinking Period affects Student Argumentation during Peer Instruction: Students’ Experiences Versus Observations. In Student Response Systems in Science and Engineering Education. Unpublished Doctoral Thesis, Norwegian University of Science and Technology (NTNU), Trondheim.

Palloff, R., & Pratt, K. (2009). Assessing the Online Learner: Resources and Strategies for Faculty. San Francisco: Jossey-Bass.

Prensky, M. (2009). H. Sapiens Digital: From Digital Immigrants and Digital Natives to Digital Wisdom. Innovate: Journal of Online Education, 5(3). Retrieved December 29, 2013 from http://www.editlib.org/p/104264

Rubner, G. (2012). MbClick: An Electronic Voting System that returns individual feedback. Retrieved December 12, 2013, from http://www.heacademy.ac.uk/assets/documents/stem-conference/gees/Geoff_Rubner.pdf

Sadler, D. R. (1998). Formative assessment: Revisiting the territory. Assessment in Education: Principles, Policy & Practice, 5(1), 77–84. doi:10.1080/0969595980050104

Sambell, K. (2010). Enquiry-based learning and formative assessment environments: Student perspectives. Practitioner Research in Higher Education, University of Cumbria, 4(1), 52–61.

Sambell, K., & Hubbard, A. (2004). The role of formative ‘low-stakes’ assessment in supporting nontraditional students’ retention and progression in higher education: Student perspectives. Widening Participation and Lifelong Learning, 6(2), 25–36.

Sharples, M., Taylor, J., & Vavoula, G. (2007). A theory of learning for the mobile age. In R. Andrews & C. Haythornthwaite (Eds.), The SAGE handbook of e-learning research (pp. 221–224). London: Sage.

Talmo, T., Sivertsen Korpås, G., Mellingsæter, M., & Einum, E. (2012). Experiences with Use of New Digital Learning Environments to Increase Academic and Social Competence. In Proceedings of the 5th International Conference of Education, Research and Innovation (pp. 4540–4545). Madrid, Spain.

Tapscott, D. (2009). Grown up digital: How the Net Generation is changing your world. New York, NY: McGraw-Hill.

Tarr, T., & Beasley, J. (2012). Tips for Using Clickers in the Classroom. Retrieved December 12, 2013, from http://ctl.iupui.edu/Resources/Teaching-Strategies/Tips-for-Using-Clickers-in-the-Classroom

Titova, S. (2012). Developing Of ICT Competence Of Language Teachers Through An Online Professional Development Course In Moodle. In 6th International Technology, Education and Development Conference Proceedings (pp. 4739–4746). Valencia, Spain. Retrieved December 15, 2013, from http://library.iated.org/view/TITOVA2012DEV

Titova, S., Talmo, T., & Avramenko, A. (2013). Language Acquisition Through Mobile Technologies: A New Fad Or An Unavoidable Necessity? In Proceedings of EDULEARN13 Conference (pp. 5046–5050). Barcelona, Spain. Retrieved December 12, 2013, from http://library.iated.org/view/TITOVA2013LAN


Traxler, J. (2009). Learning in a mobile age. International Journal of Mobile and Blended Learning, 1(1), 1–12. doi:10.4018/jmbl.2009010101

Traxler, J. (2010). The ‘learner experience’ of mobiles, mobility and connectedness. Background paper to presentation at the ELESIG Symposium: Digital Futures, University of Reading, UK. Retrieved December 12, 2013, from http://cloudworks.ac.uk/cloud/view/3472

Tucker, B. (2012). The Flipped Classroom. Education Next, 12(1), 82–83. Retrieved December 23, 2013, from http://educationnext.org/the-flipped-classroom/

Voelkel, S., & Bennett, D. (2013). Combining the formative with the summative: The development of a two-stage online test to encourage engagement and provide personal feedback in large classes. Research in Learning Technology, 21(1), 75–92. Retrieved December 23, 2013, from http://www.researchinlearningtechnology.net/index.php/rlt/article/view/19153

Vygotsky, L. (1978). Mind in society. Cambridge, MA: Harvard University Press.


Svetlana Titova, PhD, is Vice Dean and Professor in the Foreign Languages Methodology Department, School of Foreign Languages and Area Studies, Lomonosov Moscow State University (Russian Federation). She has over 24 years of experience in EFL teaching, teacher training, and publishing in foreign language education and e-learning. She is a materials developer and a tutor of online professional development courses on the IT and mobile competence of foreign language teachers. Her research areas are ICT integration in the EFL classroom, mobile learning, and the professional development of foreign language teachers. She coordinates several research projects: Mobile Devices in the Language Classroom: Theory and Practice, and two international projects supported by SIU (Norway), Enhancing Technology Awareness and Usage of m-Learning in Russia and Norway and MOBILL, with Sør-Trøndelag University College (Trondheim) as the Norwegian partner. To date she has published more than 80 articles and books, and she is the author and coordinator of the educational website Learning and Teaching with the Web, which aims at facilitating the implementation of ICT and mobile technologies in learning. She is a member of international professional organizations including EuroCALL, IATEFL, and NATE.

Tord Talmo is an assistant professor at Sør-Trøndelag University College (HiST), where he has worked for the last 11 years. He holds a master’s degree in Norwegian from the University of Trondheim (NTNU), Norway. He has been involved in several projects co-funded by the European Union and/or regional offices such as SIU. Currently he is working on the iLike project, which aims at utilizing handheld devices in language training. He specializes in language training, new learning environments, and the methodology and application of technical services used in class. He has published several articles relating to his project involvement over the last three years.