

Assessing university students’ perceptions of their Physics instructors’ TPACK development in two contexts

Yahui Chang, Syh-Jong Jang and Yang-Hsueh Chen

Yahui Chang is an assistant professor in the School of Education at Shaanxi Normal University, Xian, Shaanxi Province, China. Syh-Jong Jang is a professor in the Graduate School of Education at Chung-Yuan Christian University, Chung-Li, Taiwan. Yang-Hsueh Chen is an assistant professor in the Department of Education at National University of Tainan, Tainan, Taiwan. Address for correspondence: Prof Syh-Jong Jang, 200 Chung-bei Road, Graduate School of Education, Chung-Yuan Christian University, Chung-Li, Taiwan. Email: [email protected]

Abstract
Technological Pedagogical and Content Knowledge (TPACK) has been gaining traction among educational researchers; however, studies documenting university students’ perceptions of their teachers’ TPACK remain limited. This study investigates the professional development of two physics instructors through the lens of the TPACK framework over an 18-week semester in two contexts, Taiwan and China. Multiple data sources were collected and analyzed, including pretest and posttest TPACK surveys, instructor interviews, in-class observations, and students’ feedback and opinions. The results revealed that John’s instructional representations and strategies and technology integration and application scores increased significantly, and that Mike’s knowledge of students’ understanding score increased significantly from the middle to the end of the semester. John (Taiwan) emphasized life examples and the use of multimedia, while Mike (China) chose to emphasize students’ knowledge and evaluation. These results reflect different teaching characteristics in the two contexts. Implications and suggestions are put forward based on the results of this study.

Introduction
Many university instructors find teaching to be quite stressful upon entering the profession; however, over time and with experience, they gradually develop a collection of knowledge, skills and strategies that helps reduce their stress levels. Although these developed teaching styles and strategies may help instructors cope with their daily routines, they may also endanger their professional development (Jang, 2011). In today’s world, educational technology has undeniably come to play a critical role in our lives, probably because applications of technology in learning environments can be closely linked to students’ learning achievement (Kopcha, 2010). Angeli and Valanides (2009) argued that if teachers learn to make good use of technology (information and communication technology), they are more likely to create better learning environments for students. However, successfully assimilating technologies into instruction is different from haphazardly adding a new technology to the curriculum on a whim. Among other things, teachers must have adequate pedagogical content knowledge (PCK) and technological knowledge to maximize teaching effectiveness and efficiency (Jang, 2010). Numerous studies (eg, Archambault & Barnett, 2010; Chai et al, 2010; Jang & Tsai, 2012; Koh et al, 2010; Lee & Tsai, 2010) have emphasized Technological Pedagogical and Content Knowledge (TPACK) as a means to help university teachers attain better outcomes of learning and instruction.

British Journal of Educational Technology (2014) doi:10.1111/bjet.12192

© 2014 British Educational Research Association


There are a number of existing surveys that measure TPACK; however, most of them investigate P-12 teachers. Few studies have measured university teachers’ TPACK. What is more, most TPACK questionnaires and surveys are designed only for teachers’ self-description and overlook students’ opinions of their teachers’ knowledge.

Shih and Chuang (2013), who developed a questionnaire to evaluate college students’ perceptions of their professor’s TPACK, noted that students perceived instructional and environmental influences on their learning processes and outcomes. Tuan, Chang, Wang and Treagust (2000) further pointed out that students’ perception of their teachers’ knowledge provides an understanding of how students think and learn, which allows for the development of more cohesive classroom processes. Students naturally expect their teachers to have a vast amount of knowledge of the subject they teach and to be able to teach this knowledge effectively. Such expectations imply that students are actively assessing their teacher’s PCK. In a study by Jaskyte, Taylor and Smariga (2009), students’ ranking of innovative teacher behaviors was quite different from

Practitioner Notes
What is already known about this topic

• Several Technological Pedagogical and Content Knowledge (TPACK) surveys have since been designed, and those validated through exploratory or confirmatory factor analyses with large samples of respondents generally report difficulty isolating all seven constructs (Archambault & Barnett, 2010; Chai, Koh & Tsai, 2010; Jang & Tsai, 2012; Koh, Chai & Tsai, 2010; Lee & Tsai, 2010).

• A review of TPACK-related questionnaires and surveys shows that most studies investigated preservice and in-service teachers’ TPACK (Angeli & Valanides, 2009; Chai et al, 2010; Kramarski & Michalsky, 2010; Niess, 2005).

What this paper adds

• Focusing on the professional development of two college Physics instructors through the lens of the TPACK framework.

• Using a survey instrument to assess college students’ perceptions of their instructor’s TPACK.

• Using the same TPACK instrument to facilitate teachers’ self-reflection on their teaching performance.

• Exploring/contrasting different teaching patterns and TPACK development of college Physics instructors across two contexts (Taiwan and China).

Implications for practice and/or policy

• The use of the survey helped us understand the overall teaching performance of the instructors from the students’ point of view and provided the instructors with materials for teaching reflection. Furthermore, compared with traditional end-of-semester evaluations, which yield little feedback and leave no time for teaching improvement in the same class because the semester has ended (Jang, 2011), the design of this study facilitates the collection of many student opinions through open-ended questions and provides a diagnostic function that allows university instructors to make changes after a given period of teaching.

• We believe that the combined use of qualitative and quantitative data makes the instrument adaptive and flexible, enabling us to capture college science instructors’ TPACK and track their knowledge development.


that of the faculty, even though the two groups identified similar characteristics of innovative teaching. Jaskyte et al’s study suggests that the views of both the instructor and the students are important sources of feedback when examining codifiable teacher knowledge in classroom contexts.

Following the literature reviewed above, we contend that although students’ perceptions may not exactly match teachers’ self-perceptions, multi-student surveys provide a relatively objective account and a secondary analysis of instructors’ methods when reviewing their TPACK. The practicality of surveying a large sample of students is another benefit. The main purpose of this study was to investigate and plot any change in students’ views of their Physics instructors’ TPACK over the course of one semester in the contexts of Taiwan and China. Two research questions guided this study:

1. Do the university students’ perceptions of the case instructors’ TPACK change from the middle to the end of the semester?

2. What are the differences between the Taiwanese and Chinese Physics instructors’ TPACK?

Theoretical framework
Shulman (1986) claimed that when studying teachers’ knowledge of professional development, we should combine the domains of content and pedagogy rather than look at each domain separately. He further proposed the PCK model, which consists of pedagogical knowledge (PK), content knowledge (CK) and PCK. The concept of TPACK (see Figure 1 and Table 1) was elaborated from PCK (Angeli & Valanides, 2009; Cox & Graham, 2009; Koehler & Mishra, 2005; Mishra & Koehler, 2006; Niess et al, 2009) by adding technological knowledge (TK) that is contextually situated within content knowledge, pedagogical knowledge and the interrelated knowledge between the two.

Figure 1: The TPACK diagram (Archambault & Barnett, 2010, p. 1657)


In a qualitative study that targeted preservice teachers and their development of TPACK, Pamuk (2011) found that while preservice teachers had cultivated backgrounds in using technology (ie, TK), a lack of pedagogical knowledge and experience limited their appropriate use of technology in instruction. Pamuk concluded that PCK was an important factor in the overall development of TPACK and that teachers must prioritize cultivating PCK before integrating technologies into their classrooms. The Pamuk (2011) study provides evidence that PCK is a critical building block of TPACK; it also portrays the practical utility of deriving the TPACK framework from Shulman’s PCK theorizing.

A strand of TPACK research focuses on measuring and validating the constructs under the TPACK framework. Based on Schmidt et al’s (2009) work, Chai et al (2010) developed a TPACK instrument to measure 889 preservice teachers’ self-perceived TPACK. By omitting three intermediary knowledge constructs (PCK, technological content knowledge [TCK] and technological pedagogical knowledge [TPK]), they attained a confirmatory four-factor model with good fit indexes. However, the reduced four-factor model left out vital information about intermediary stages of TPACK formation. Chai, Koh, Tsai and Tan (2011) further addressed the construct validity of the seven TPACK components based on survey responses of 834 preservice teachers in Singapore. While a better model fit was obtained compared with other TPACK studies, still only five of the seven constructs were defined. Jang and Tsai (2012) studied Taiwanese elementary Mathematics and Science teachers’ TPACK in terms of interactive whiteboard usage. Exploratory factor analysis showed that while TK and CK were clearly identified, items from PK and PCK loaded together and items from TPK, TCK and TPACK loaded as a group. The above studies reflect the “construct boundary issue” (Graham, 2011) of the TPACK framework; that is, only three to five factors were retrieved respectively, deviating from Koehler and Mishra’s (2005) seven-factor theorizing. Angeli and Valanides (2009) noted that some components of TPACK have similar definitions, such as TCK and TPK, which jeopardizes the precision of the TPACK framework.

In order to better capture the gist of teacher knowledge, Jang and Chen (2013) constructed a “transformative” TPACK survey. This deviated from the majority of TPACK instruments, which were designed directly from the seven-component TPACK diagram (see Figure 1). The authors started with the essence of Shulman’s PCK theorizing, including subject matter knowledge (SMK), instructional representation and strategies (IRS) and knowledge of students’

Table 1: The TPACK knowledge types and their descriptions

Technology knowledge (TK): Knowledge and skills of traditional, current and emerging technologies.
Content knowledge (CK): Knowledge about the subject matter for teaching and learning.
Pedagogical knowledge (PK): Knowledge about methods and processes of teaching, such as classroom management, assessment and student teaching.
Pedagogical content knowledge (PCK): The tacit knowledge of blending content and pedagogy for developing better teaching practices.
Technological content knowledge (TCK): The tacit knowledge of blending content and technology for developing better teaching practices.
Technological pedagogical knowledge (TPK): Knowledge of the affordances of technologies and what teaching strategies can be combined with those affordances to leverage learning outcomes.
Technological Pedagogical and Content Knowledge (TPACK): Teachers’ understanding of the interplay among content, pedagogy and technology, as well as the procedural knowledge of integrating technologies into their teaching routines.


understandings (KSU). Then, technology integration and application (TIA) was added to the instrument. Conceptually, the TIA dimension includes the TK, TCK, TPK and TPACK of the TPACK diagram. Notably, Jang and Chen’s work measured college students’ perceptions of their professors’ TPACK, differing greatly from most existing surveys, which evaluated P-12 teachers’ self-described knowledge and skills.

Methods
This section presents the professional development of two university Physics instructors through the lens of TPACK. Our rationale was that the TPACK of science instructors is flexible and can be facilitated through feedback from peers, students and experts, as well as through self-reflection (Jang, 2011). Such feedback and reflection help instructors understand the reasons for using certain instructional strategies, as well as how improvements in instruction can positively affect students’ learning (Lee, 2005). As the voices of both instructors and students were deemed important in understanding college instructors’ TPACK, we collected data from both sides. More specifically, Jang and Chen’s (2013) TPACK questionnaire (see Appendix for a full list of questionnaire items) was administered to students, while the Physics instructors underwent observation and interviews.

Participants
This study involved two college instructors from the Physics departments of two universities, one in Taiwan and the other in Shaanxi Province, China, and the students of their General Physics courses. With regard to their backgrounds, John (Taiwan) had 3 years of postdoctoral research experience in Physics and 4 years of undergraduate teaching experience. Mike (China) had earned a doctorate in Physics and had 5 years of undergraduate teaching experience. Both instructors were enthusiastic about teaching, and both were motivated to gain new knowledge and make changes accordingly. The questionnaire was used to examine the students’ perceptions of the two Physics instructors’ TPACK in the middle and at the end of the semester. Completed questionnaires were gathered from 108 college students: 56 first-year students from John’s General Physics classes and 52 first-year students from Mike’s General Physics class.

Research design and data collection
The study employed a quantitative, pre-post research design aided by qualitative data from the students and instructors. The pre-post method enabled us to identify the two Physics instructors’ TPACK profiles at two data points and to track changes in their TPACK scores. Additionally, the qualitative data (ie, observations and interviews) from the views of the instructors and students supplement or elaborate the quantitative findings. We believe that such an approach is “more likely to capture the complex, multifaceted aspects of teaching and learning” (Kagan, 1990, p. 459).

The data collected consisted of the following sources: (1) survey results pertaining to university students’ perceptions of the instructors’ TPACK; (2) observation notes written by the researchers during the instructors’ science lectures and instructional activities; and (3) two interviews with each of the science instructors. Interview recordings were transcribed verbatim. The worksheets and other documents developed by the instructors and students were treated as supplementary materials for this study.

During the first workshop, we held discussions with the two chosen instructors. The researchers introduced the concepts of TPACK and guided the instructors to discuss and share their teaching experiences. We then provided consultation and feedback on teaching strategies through the subsequent workshops. In addition, the questionnaire was administered at mid-semester (week 9) to gauge students’ perceptions of the instructors’ teaching so that the instructors could reflect upon and improve their teaching approaches. The survey employed


in this study also included one open-ended question for students to comment on the course. In order to understand the two instructors’ development of TPACK and their actual teaching situations, the questionnaire was distributed to the students in the middle and at the end of the semester respectively.

We carried out two observations (ie, in the third and sixth weeks) of the two instructors’ classes to assess their actual teaching practices before the mid-semester survey. During these observations, the focus was on recording the instructors’ instructional representations and strategies and how they incorporated technology into their lessons, if at all. After mid-semester, we carried out two more observations (ie, in the 12th and 15th weeks) of the two instructors’ classes. The focus of these two classroom observations was on whether changes had occurred in the two instructors’ instructional demonstrations, strategies and technology use.

Following the observations, the first interviews of the two instructors were conducted at mid-semester. The interviews served to explain and support the statistical analysis and the average ratings given to the instructors by the students. We conducted the second interview at the end of the semester; this time we wanted to know whether the two instructors’ TPACK had changed. Multiple data sources were collected and then provided to the instructors for further discussion and reflection, including the pretest and posttest TPACK worksheets and survey data.

Data analysis
For the quantitative data, SPSS statistical software (SPSS, Inc., Chicago, IL, USA) was employed to perform basic descriptive statistical analysis on the 5-point TPACK questionnaires returned in the middle of the semester. The instructors were informed of the analysis results, including the mean and standard deviation for each question, the average score for each aspect, and the feedback and suggestions from the students. These results were provided to the instructors to elicit reflection for improving their teaching approaches. The questionnaire was then administered again at the end of the semester. In addition to the basic descriptive statistical analysis, a paired sample t-test was conducted to compare each instructor’s pretest and posttest scores.
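The paired-sample comparison described above can be illustrated with a minimal sketch. The ratings below are hypothetical, not the study’s data, and in practice a statistics package (the authors used SPSS) would also report the corresponding p-value:

```python
import math

def paired_t(pre, post):
    """Paired-sample t statistic for matched pre/post ratings.

    t = mean(d) / (sd(d) / sqrt(n)), where d are the paired differences;
    compare against the t distribution with n - 1 degrees of freedom.
    """
    assert len(pre) == len(post) and len(pre) > 1
    d = [b - a for a, b in zip(pre, post)]
    n = len(d)
    mean_d = sum(d) / n
    var_d = sum((x - mean_d) ** 2 for x in d) / (n - 1)  # sample variance
    return mean_d / math.sqrt(var_d / n)

# Hypothetical 5-point Likert ratings from the same six students,
# collected at mid-semester (pre) and at the end of semester (post)
pre = [3.2, 3.5, 3.8, 3.0, 3.6, 3.4]
post = [3.9, 3.7, 4.1, 3.5, 4.0, 3.8]
t = paired_t(pre, post)  # df = 5; a large positive t suggests a gain
```

Because the same students rate the same instructor twice, the test operates on the per-student differences rather than treating the two surveys as independent samples.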

Qualitative data analysis combined documentary interpretation and qualitative analysis (Bogdan & Biklen, 1998). The qualitative data were mainly collected from observations of and interviews with the two instructors. In order to understand the ways in which the instructors’ knowledge about science teaching influenced enactment, the results of the coding scheme measuring the two instructors’ knowledge were compared and contrasted with the coding of their practices in order to identify commonalities. Independent examination of the data by each researcher and comparison of results were used to establish the interrater credibility of our findings. Whenever disagreements occurred, the coders discussed their differing opinions and eventually agreed on one person’s interpretation or a compromise among all researchers. To employ the constant comparison and triangulation methods, we compared data from interviews with the instructors, observations of their teaching, students’ opinions and other complementary documents, such as assignments and worksheets (Merriam, 2009). Through such efforts, we can explain the conjectured changes and development of the two instructors’ TPACK.

Findings
Tables 2 and 3 display the statistics of students’ responses regarding the TPACK performance of the two instructors, John and Mike, in the four categories. In them, we have highlighted the mean scores and standard deviations of the pretests and posttests of both instructors. Overall, the statistics reveal that both instructors’ TPACK performance in each category was higher than the scale midpoint of 3.0. We found that John’s IRS and TIA scores increased significantly from the middle to the end of the semester, but no significant changes in his SMK and KSU scores were


detected. Mike’s KSU score increased significantly from the middle to the end of the semester; however, no significant changes in his SMK, IRS and TIA scores were detected. In order to further understand the significant changes in the IRS, KSU and TIA dimensions of the two instructors’ TPACK, each item of these dimensions in the pretest and posttest was further examined.

IRS
The case of John: emphasizing life examples and demonstrations
Table 4 shows John’s average IRS score for each question. The paired sample t-test results indicate significant increases on items 11, 13, 15 and 19 in the IRS aspect. As for IRS,

Table 2: The statistics of John’s TPACK performance in the four categories

                                              Pretest       Posttest
Category                                      M     SD      M     SD     t
Subject matter knowledge and belief           3.87  0.66    4.02  0.47   1.27
Instructional representation and strategies   3.62  0.62    3.97  0.55   2.37*
Knowledge of students’ understanding          3.72  0.68    3.87  0.56   1.28
Technology integration and application        4.00  0.85    4.36  0.74   2.22*

*p < .05. M, mean; SD, standard deviation; TPACK, Technological Pedagogical and Content Knowledge.

Table 3: The statistics of Mike’s TPACK performance in the four categories

                                              Pretest       Posttest
Category                                      M     SD      M     SD     t
Subject matter knowledge and belief           4.06  0.52    4.30  0.41   1.63
Instructional representation and strategies   3.40  0.63    3.76  0.85   1.79
Knowledge of students’ understanding          3.31  0.47    4.04  0.81   4.06**
Technology integration and application        4.26  0.78    4.22  0.72   −0.21

**p < .01. M, mean; SD, standard deviation; TPACK, Technological Pedagogical and Content Knowledge.

Table 4: The statistics of John’s TPACK performance in the IRS category

       Pretest       Posttest
Item   M     SD      M     SD     t
11     3.83  0.94    4.27  0.67   3.56*
12     3.72  0.67    3.83  0.92   0.52
13     3.22  0.92    4.28  0.67   3.81*
14     3.23  0.84    3.27  0.61   0.57
15     2.83  0.98    3.94  0.54   4.17*
16     3.33  0.96    3.67  0.84   1.85
17     2.31  0.92    2.50  0.81   0.72
18     3.54  0.76    3.46  0.71   −0.39
19     3.35  0.82    4.38  0.80   3.79*
20     3.50  0.86    3.69  0.90   0.90

*p < .05. IRS, instructional representation and strategies; M, mean; SD, standard deviation; TPACK, Technological Pedagogical and Content Knowledge.


Item 13 (“My teacher’s teaching methods keep me interested in this subject”) and Item 19 (“My teacher creates a classroom circumstance to promote my interest for learning”) indicated that the students could perceive the instructor’s efforts in creating an interesting teaching and learning atmosphere. On the other hand, Item 11 (“My teacher uses appropriate examples to explain concepts related to subject matter”) and Item 15 (“My teacher uses demonstrations to help explain the main concept”) showed significant differences, suggesting that John adopted appropriate examples and demonstrations as his instructional representations and strategies.

The students’ feedback on the mid-semester survey showed that some students thought John’s teaching pace was slightly fast.

S1: The teaching pace is slightly fast.

S2: The teaching is good. However, it would make his teaching better if the teaching pace could be slower.

To understand why John’s teaching pace was fast, an interview was conducted to explore the instructor’s use of teaching strategies.

John: According to students’ feedback from the pretest, they think my teaching pace is slightly fast. This might be because I am under pressure to finish the first seven chapters before mid-semester, while after mid-semester there are only four chapters to go. I have compared my teaching pace with other instructors’, and it turns out that I have to keep this pace to follow the syllabus. On the other hand, I think the way I use PowerPoint to present my lectures might also be one of the reasons that makes my teaching pace faster. (First interview)

From the class observations, John often used both PowerPoint and the blackboard in his teaching and seldom used life examples to explain the physics concepts. When he used PowerPoint to introduce formulas and concepts, it seemed fast to the students.

John used PowerPoint to clearly present the content that students had learned in high school. Then he used the blackboard to demonstrate the calculations step by step to derive the formulas. (First observation)

Upon learning this information from the first survey, John started adjusting his teaching pace in accordance with students’ abilities and feedback. He also improved his teaching strategies and representations by using more diverse, real-life examples to explicate physics concepts.

R: In the IRS aspect, what kind of real-life examples did you use to help you explain the physics concepts?

John: I used high-tech life examples, which encouraged students to think critically and ask questions. More specifically, I used roller-skating as the example to explain the principle of momentum. (Second interview)

Through in-class observation, the researchers found that John used demonstrations to explain difficult and abstract concepts. An example is provided below:

John first used the stop-pool-shot as a demonstration to explicate the collision principle. He further explained that a professional billiard player needs to understand the principles behind shooting at the right angles. (Second observation)

The examples shown above indicate that John made some changes to his teaching representations and strategies between the middle and the end of the semester. He connected physics principles and formulas to students’ real-life experiences. In addition, he provided high-tech life examples, including roller-skating as an example of the momentum principle and the stop-pool-shot as an example of the collision principle.
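The billiard stop-shot demonstration rests on standard conservation laws; as a brief sketch of the underlying physics (our illustration, not material taken from the study), the result follows from combining momentum and kinetic-energy conservation for a head-on, approximately elastic collision between equal masses:

```latex
% Two-body collision: total momentum is conserved
m_1 v_1 + m_2 v_2 = m_1 v_1' + m_2 v_2'

% Stop shot: equal masses (m_1 = m_2 = m), object ball at rest (v_2 = 0),
% approximately elastic, so kinetic energy is also conserved
m v_1 = m v_1' + m v_2', \qquad
\tfrac{1}{2} m v_1^2 = \tfrac{1}{2} m v_1'^2 + \tfrac{1}{2} m v_2'^2
\;\Longrightarrow\; v_1' = 0, \quad v_2' = v_1
```

That is, the cue ball stops dead and the object ball leaves with the cue ball’s original velocity, which is what the demonstration lets students observe directly.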

KSU
The case of Mike: emphasizing students’ knowledge and evaluation
Table 5 shows Mike’s average KSU score for each question. The paired sample t-test results indicate a significant increase on Item 23 in the KSU aspect. Item 23 (“My teacher’s questions evaluate my understanding of a topic”) indicated that the students could perceive the instructor’s efforts to gauge the knowledge students needed.


To address this finding, the researchers found that Mike liked to move quickly through PowerPoint slides when presenting the major physics concepts. He used pop quizzes to examine students' learning, although students thought the homework was excessive. The test questions were appropriate for evaluating the understanding of a topic.

S3: I learn much from the course. However, the instructor teaches too fast and gives too many quizzes.

S4: The instructor's teaching pace is too fast and the homework load is pretty heavy.

I saw Mike use PPT to describe the definitions and explain the meaning of the principles and their applications; I did not usually see other practical technologies. The instructor usually emphasized pop quizzes so that students would always remain alert during the learning process. (First observation)

By means of in-class observation, the researchers found that Mike gives the structure of thephysical knowledge to students. He teaches his research field of the frontiers of knowledge tostudents in his instruction. However, owing to course progress and different majors, it was hardto choose the subject for the different level students.

I found that most of the time Mike used the knowledge structure to explain the contents. It seemed that students did not quite understand these abstract concepts. Under such circumstances it is difficult and time-consuming to explain the contents. (Second observation)

During the second interview, Mike explained that he made use of the content and structure of physics to teach students. Moreover, the instructor chose appropriate subject content and physics structure by taking into consideration the course units and students' learning evaluation.

Mike: In general, I use PPT to summarize and explain concepts. One of the reasons is that before the middle of the semester, the focus of the course is to talk more about theories, which are more difficult . . . However, after the middle of the semester, the focus of the course is to apply the theories; and thus, some actual phenomena can be used as examples. The course contents of the first half of this semester are theories, while those of the second half of this semester are theories and their actual applications. (Second interview)

Table 5: The statistics of Mike's TPACK performance in the KSU category

          Pretest          Posttest
Item      M      SD        M      SD       t
21        3.39   0.86      3.88   1.18     1.75
22        3.79   0.83      3.88   1.07     0.38
23        3.71   0.85      4.23   0.86     2.21*
24        3.79   0.83      4.08   0.74     1.35
25        4.00   0.67      3.88   1.18     0.45
26        4.50   0.69      4.46   0.71     0.20

*p < .05. KSU, knowledge of students' understandings; M, mean; SD, standard deviation; TPACK, Technological Pedagogical and Content Knowledge.

TIA
The case of John: emphasizing multimedia and web technologies
John's average scores for each TIA item are shown in Table 6. As shown, there was a significant increase in the Physics instructor's TIA subscale scores from pretest to posttest, which means that students perceived that the instructor increased the incorporation of technology into teaching. When we examined individual survey items, we found that the scores of Items 29 and 32 had the most significant increases. Specifically, Item 29 relates to the utilization of technology for teaching the content of a specific course unit, and Item 32 relates to the professor's synergy of technology, pedagogy and content knowledge. The above results indicated that the Physics instructor not only has knowledge about technology but also knows how to integrate it with course content and pedagogy.
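The paired sample t statistics in Tables 5 and 6 compare each item's pretest and posttest ratings from the same class. A minimal sketch of the computation, using Python's standard library and hypothetical Likert ratings rather than the study's data (the function name `paired_t` is ours):

```python
import math
from statistics import mean, stdev

def paired_t(pre, post):
    """Paired sample t statistic: mean of the pairwise differences over its standard error."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    return mean(diffs) / (stdev(diffs) / math.sqrt(n))  # df = n - 1

# Hypothetical 5-point Likert ratings for one survey item from six students
pre = [3, 4, 3, 4, 3, 4]
post = [4, 4, 4, 4, 4, 5]
t = paired_t(pre, post)  # ≈ 3.16
```

The resulting statistic is then compared against a t distribution with n − 1 degrees of freedom to obtain the significance levels starred in the tables.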

It was found in the interview with John that the instructor generally increased the use of technological resources and was able to combine relevant videos with online Flash animations to add supplementary descriptions in the course, which enabled the students to understand the application of relevant abstract concepts through hands-on operation or simulation processes. For example:

R: As for the application of technology, there are significant differences on some questions and the average scores of all the questions increased . . . Would you give a more explicit explanation of the technologies used, such as software or animation, to strengthen the description of technology integration and application?

John: I looked for some Flash animations as teaching materials. For example, when I give lectures on the topic of heat, the color temperature of a motorcycle headlamp is 7100 K, that of general sunlight is 6500 K, and that of yellow light is 3500–3600 K. If Flash is used to change the temperature coordinate, the distribution curve will change. (Second interview)
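The kelvin figures in John's example are color temperatures: each corresponds to a blackbody-like spectrum whose peak shifts with temperature, which is the curve his Flash animation redraws. A small illustrative sketch (not part of the course materials; the function name is ours), using Wien's displacement law λ_max = b/T:

```python
WIEN_B = 2.898e-3  # Wien's displacement constant, in m·K

def peak_wavelength_nm(temp_k):
    """Peak wavelength (in nm) of a blackbody spectrum at temperature temp_k (in K)."""
    return WIEN_B / temp_k * 1e9

# Hotter sources peak at shorter wavelengths, so the curve shifts as T changes
for label, t in [("headlamp", 7100), ("sunlight", 6500), ("yellow light", 3550)]:
    print(f"{label}: ~{peak_wavelength_nm(t):.0f} nm")
```

Sliding the temperature coordinate in the animation therefore moves the whole distribution curve toward shorter wavelengths as T rises.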

Furthermore, when explaining actual applications or relevant experiments, John increased the use of relevant videos to help students understand them. Moreover, this study further probed into the instructor's use of resources such as videos and teaching technologies. For example:

R: Did you find the teaching videos at school?

John: There are many such videos on YouTube, and I found most of them directly from the Internet.

R: Will you edit or arrange the Flashes and videos found online or use them directly?

John: After downloading these videos, I would edit them slightly, convert the files, and remove irrelevant parts. And then, I would link them to the PPT; therefore, the students may sometimes obtain the PPT with hyperlinked videos. (Second interview)

Table 6: The statistics of John's TPACK performance in the TIA category

          Pretest          Posttest
Item      M      SD        M      SD       t
27        4.66   0.56      4.73   0.78     0.40
28        3.50   1.17      3.96   1.08     2.00
29        3.62   1.17      4.31   0.93     3.05*
30        4.04   1.08      4.35   0.94     1.78
31        4.08   0.84      4.27   0.92     1.15
32        4.04   1.04      4.46   0.71     2.19*
33        4.12   0.99      4.42   0.81     1.44

*p < .05. M, mean; SD, standard deviation; TIA, technology integration and application; TPACK, Technological Pedagogical and Content Knowledge.

Discussion and implications
The use of surveys helped us understand the overall teaching performance of the instructors from the students' points of view and provided the instructors with materials for teaching reflection. Traditional end-of-semester evaluations produce little feedback and come too late to help instructors make teaching improvements in the same class because the semester has ended (Jang, 2011). The design of this study facilitates the collection of many student opinions through open-ended questions and provides a diagnostic function that allows the university instructors to make changes after a given period of teaching. In other words, the research design allows for reflective thinking as well as timely modifications (Clegg, Tan & Saeidi, 2002).


John’s TIA and IRS scores increased significantly from the middle to the end of semester. John’sincrement of TPACK scores was evidenced by making use of multimedia and Internet technolo-gies to teach students during the semester. John also showed that he used appropriate examplesand demonstrations as his instructional representations and strategies. Magnusson, Krajcik andBorko (1999) emphasized that an instructors’ teaching reflection and experiences are the mainfactors in the development of PCK (De Jong, Van Driel & Verloop, 2005; Loughran, Mulhall &Berry, 2004). Nilsson (2008) highlighted that a teachers’ knowledge base can be positivelyaffected by a teacher’s reflection. Her view on the growth of the TPACK knowledge is in line withthe process of transformation in this study.

For KSU, Mike valued students' knowledge and evaluation; he used tests frequently to gauge student learning and absorption of the material. The mean scores of the pretest and posttest showed a significant difference. However, students responded that assignments and tests were too numerous and the teaching pace felt too rushed. The questionnaire, in this instance, provided valuable data for Mike to make a diagnostic evaluation of student opinions and adjust the level of homework and tests for the course (Jang, Tsai & Chen, 2013).

Looking at their respective contexts, John (Taiwan) emphasized life examples and the use of multimedia, and Mike (China) stressed students' knowledge and evaluation. Although it cannot be inferred that their teaching styles are representative of other university instructors in Taiwan and China, they show different teaching strategies and methods in their contexts. We contend that it would be beneficial if the two instructors collaborated and learned from each other.

Via the TPACK survey, we found that college students' perceptions of their Physics instructors' TPACK changed from the middle to the end of the semester. In essence, university instructors tend to have large egos in the traditional classroom. With the survey developed in this study, researchers can understand students' perceptions and in turn determine whether the instructors have achieved the expected goals (Jang, 2011; Tuan et al, 2000). Moreover, the instructors can gain more teaching and learning experiences by participating in investigations. The instructors' growth in TPACK knowledge is not restricted to a few observations or interview data, but rests on the judgments of all the students (De Jong et al, 2005; Jang, 2010). However, quantitative survey data cannot portray the factors behind instructors' professional development, nor do they allow for assessment of content-specific details. To cross-validate research results, the researchers collected supplemental qualitative data including interviews, observations and open-ended opinions of students.

The TPACK instrument is content neutral for general science, in the sense that there are no items that reference content-specific ideas, knowledge or practices relating to physics. Nevertheless, science instructors' TPACK emphasizes specific science content (Clermont, Borko & Krajcik, 1994; van Driel, Verloop & de Vos, 1998) and provides insight into teachers' use of technology to transform the subject matter into understandable formats (Jimoyiannis, 2010). The fact that the TPACK instrument is administered in the context of a specific classroom relevant to the instructor of that course provides a tacit appeal to the specific content associated with that classroom context. So, although the instrument itself does not directly and explicitly measure content-specific aspects of TPACK, the content specificity is implied by virtue of the fact that it is implemented in situ. In this study, we included qualitative data (observations and interviews) from two Physics teachers to supplement survey results. We believe that such tacit uses of qualitative and quantitative data make the instrument adaptive and flexible, enabling us to capture college science instructors' TPACK and track their knowledge development.

Because of practical concerns, this study focused on the TPACK scores of two physics instructors, one in Taiwan and the other in China. To some extent, this limits the generalizability of the study. Moreover, this study is exploratory in nature, and no experimental conclusions could be made at this research stage. Future studies may employ experimental designs with our student-perceived TPACK instrument to gain more robust findings. More studies are suggested to track teachers' TPACK in Taiwan and China, and perhaps in Western contexts.

Several suggestions are proposed from the research process as a reference for revisions of future questionnaires. First, during the interview process, the instructors specifically indicated that authentic feedback and opinions from students can indeed urge an instructor to reflect on their teaching and make improvements. In addition, instructors or researchers should encourage students to answer the open-ended items in the questionnaire as an explanation of the quantitative scores. Finally, we plan to administer the survey periodically to novice college instructors and use the results to inform researchers of the specific times or events at which each knowledge domain is developed. This information will provide valuable insight into college science instructors' development of TPACK, as well as provide feedback on effective approaches that facilitate their ongoing professional development.

References
Angeli, C. & Valanides, N. (2009). Epistemological and methodological issues for the conceptualization, development, and assessment of ICT-TPCK: advances in technological pedagogical content knowledge (TPCK). Computers & Education, 52, 154–168.

Archambault, L. M. & Barnett, J. H. (2010). Revisiting technological pedagogical content knowledge: exploring the TPACK framework. Computers & Education, 55, 4, 1656–1662.

Bogdan, R. C. & Biklen, S. K. (1998). Qualitative research for education: an introduction to theory and methods (3rd ed.). Boston: Allyn and Bacon.

Chai, C. S., Koh, J. H. L. & Tsai, C.-C. (2010). Facilitating pre-service teachers' development of Technological, Pedagogical, and Content Knowledge (TPACK). Educational Technology & Society, 13, 4, 63–73.

Chai, C. S., Koh, J. H. L., Tsai, C.-C. & Tan, L. L. W. (2011). Modeling primary school pre-service teachers' Technological Pedagogical Content Knowledge (TPACK) for meaningful learning with information and communication technology (ICT). Computers & Education, 57, 1184–1193.

Clegg, S., Tan, J. & Saeidi, S. (2002). Reflecting or acting? Reflective practice and continuing professional development in higher education. Reflective Practice, 3, 1, 131–146.

Clermont, C. P., Borko, H. & Krajcik, J. S. (1994). Comparative study of the pedagogical content knowledge of experienced and novice chemical demonstrators. Journal of Research in Science Teaching, 31, 419–441.

Cox, S. & Graham, C. R. (2009). Diagramming TPACK in practice: using an elaborated model of the TPACK framework to analyze and depict teacher knowledge. TechTrends, 53, 5, 60–69.

De Jong, O., Van Driel, J. & Verloop, N. (2005). Preservice teachers' pedagogical content knowledge of using particle models in teaching chemistry. Journal of Research in Science Teaching, 42, 8, 947–964.

van Driel, J., Verloop, N. & de Vos, W. (1998). Developing science teachers' pedagogical content knowledge. Journal of Research in Science Teaching, 35, 6, 673–695.

Graham, C. R. (2011). Theoretical considerations for understanding technological pedagogical content knowledge (TPACK). Computers & Education, 57, 3, 1953–1960.

Jang, S.-J. (2010). Integrating the interactive whiteboard and peer coaching to develop the TPACK of secondary science teachers. Computers & Education, 55, 4, 1744–1751.

Jang, S.-J. (2011). Assessing college students' perceptions of a case teacher's pedagogical content knowledge using a newly developed instrument. Higher Education, 61, 6, 663–678.

Jang, S.-J. & Chen, K.-C. (2013). Development of an instrument to assess university students' perceptions of their science instructors' TPACK. Journal of Modern Education Review, 3, 10, 771–783.

Jang, S.-J. & Tsai, M.-F. (2012). Exploring the TPACK of Taiwanese elementary mathematics and science teachers with respect to use of interactive whiteboards. Computers & Education, 59, 2, 327–338.

Jang, S.-J., Tsai, M.-F. & Chen, H.-Y. (2013). Development of PCK for novice and experienced university physics instructors: a case study. Teaching in Higher Education, 18, 1, 27–39.

Jaskyte, K., Taylor, H. & Smariga, R. (2009). Student and faculty perceptions of innovative teaching. Creativity Research Journal, 21, 1, 111–116.

Jimoyiannis, A. (2010). Designing and implementing an integrated technological pedagogical science knowledge framework for science teachers' professional development. Computers & Education, 55, 1259–1269.

Kagan, D. M. (1990). Ways of evaluating teacher cognition: inferences concerning the Goldilocks principle. Review of Educational Research, 60, 3, 419–469.

Koehler, M. J. & Mishra, P. (2005). What happens when teachers design educational technology? The development of technological pedagogical content knowledge. Journal of Educational Computing Research, 32, 2, 131–152.

Koh, J., Chai, C. S. & Tsai, C. C. (2010). Examining the technological pedagogical content knowledge of Singapore preservice teachers with a large-scale survey. Journal of Computer Assisted Learning, 26, 563–573.

Kopcha, T. J. (2010). A systems-based approach to technology integration using mentoring and communities of practice. Educational Technology Research & Development, 58, 2, 175–190.

Kramarski, B. & Michalsky, T. (2010). Preparing preservice teachers for self-regulated learning in the context of technological pedagogical content knowledge. Learning and Instruction, 20, 5, 434–447.

Lee, H.-J. (2005). Understanding and assessing preservice teachers' reflective thinking. Teaching and Teacher Education, 21, 699–715.

Lee, M. H. & Tsai, C.-C. (2010). Exploring teachers' perceived self efficacy and technological pedagogical content knowledge with respect to educational use of the world wide web. Instructional Science, 38, 1–21.

Loughran, J. J., Mulhall, P. & Berry, A. (2004). In search of pedagogical content knowledge in science: developing ways of articulating and documenting professional practice. Journal of Research in Science Teaching, 41, 370–391.

Magnusson, S., Krajcik, J. & Borko, H. (1999). Nature, sources, and development of pedagogical content knowledge for science teaching. In J. Gess-Newsome & N. Lederman (Eds), Examining pedagogical content knowledge (pp. 95–132). Netherlands: Kluwer.

Merriam, S. B. (2009). Qualitative research: a guide to design and implementation. San Francisco, CA: Jossey-Bass.

Mishra, P. & Koehler, M. J. (2006). Technological pedagogical content knowledge: a framework for teacher knowledge. Teachers College Record, 108, 6, 1017–1054.

Niess, M. L. (2005). Preparing teachers to teach science and mathematics with technology: developing a technology pedagogical content knowledge. Teaching and Teacher Education, 21, 509–523.

Niess, M. L., Ronau, R. N., Shafer, K. G., Driskell, S. O., Harper, S. R., Johnston, C. et al (2009). Mathematics teacher TPACK standards and development model. Contemporary Issues in Technology and Teacher Education, 9, 1, 4–24.

Nilsson, P. (2008). Teaching for understanding: the complex nature of pedagogical content knowledge in pre-service education. International Journal of Science Education, 30, 10, 1281–1299.

Pamuk, S. (2011). Understanding preservice teachers' technology use through TPACK framework. Journal of Computer Assisted Learning, 28, 5, 425–439.

Schmidt, D. A., Baran, E., Thompson, A. D., Mishra, P., Koehler, M. J. & Shin, T. S. (2009). Technological pedagogical content knowledge (TPACK): the development and validation of an assessment instrument for preservice teachers. Journal of Research on Technology in Education, 42, 2, 123–149.

Shih, C. L. & Chuang, H. H. (2013). The development and validation of an instrument for assessing college students' perceptions of faculty knowledge in technology supported class environments. Computers & Education, 63, 109–118.

Shulman, L. (1986). Those who understand: knowledge growth in teaching. Educational Researcher, 15, 2, 4–14.

Tuan, H., Chang, H., Wang, K. & Treagust, D. (2000). The development of an instrument for assessing students' perceptions of teachers' knowledge. International Journal of Science Education, 22, 4, 385–398.

Appendix
The TPACK Instrument
Questionnaire on University Students' Perceptions of Instructor's TPACK

A. Subject Matter Knowledge (SMK)
1. My teacher knows the content he/she is teaching.
2. My teacher explains clearly the content of the subject.
3. My teacher knows how theories or principles of the subject have been developed.
4. My teacher selects the appropriate content for students.
5. My teacher knows the answers to questions that we ask about the subject.
6. My teacher explains the impact of subject matter on society.
7. My teacher knows the whole structure and direction of this subject matter.
8. My teacher makes me clearly understand objectives of this course.
9. My teacher pays attention to students' reactions during class and adjusts his/her teaching attitude.
10. My teacher's belief or value in teaching is active and aggressive.

Physics instructors’ TPACK development in two contexts 13

© 2014 British Educational Research Association

Page 14: Assessing university students' perceptions of their Physics instructors' TPACK development in two contexts

B. Instructional Representation and Strategies (IRS)
11. My teacher uses appropriate examples to explain concepts related to subject matter.
12. My teacher uses familiar analogies to explain concepts of subject matter.
13. My teacher's teaching methods keep me interested in this subject.
14. My teacher provides opportunities for me to express my views during class.
15. My teacher uses demonstrations to help explain the main concept.
16. My teacher uses a variety of teaching approaches to transform subject matter into comprehensible knowledge.
17. My teacher adopts group discussion or cooperative learning.
18. My teacher provides an appropriate interaction or good atmosphere.
19. My teacher creates a classroom circumstance to promote my interest for learning.
20. My teacher prepares some additional teaching materials.

C. Knowledge of Students' Understandings (KSU)
21. My teacher realizes students' prior knowledge before class.
22. My teacher knows students' learning difficulties of subject before class.
23. My teacher's questions evaluate my understanding of a topic.
24. My teacher's assessment methods evaluate my understanding of the subject.
25. My teacher uses different approaches (questions, discussion, etc) to find out whether I understand.
26. My teacher's assignments facilitate my understanding of the subject.

D. Technology Integration and Application (TIA)
27. My teacher knows how to use multimedia (eg, PowerPoint and animation, etc) for teaching.
28. My teacher knows how to use web technologies (eg, teaching website, blog and distance learning) for teaching.
29. My teacher is able to choose multimedia and web technologies that enhance his/her teaching for a specific course unit.
30. My teacher is able to use technology to enhance our understanding and learning of lessons.
31. My teacher is able to use technology to enrich the teaching content and materials.
32. My teacher is able to integrate content, technology and teaching methods in his/her teaching.
33. My teacher is able to choose diverse technologies and teaching methods for a specific course unit.

Comments:
In this course, if you have any learning difficulties or opinions, please describe them below.

________________________________________________________________________________
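The instrument above is scored by subscale. As a minimal sketch of how one student's responses might be aggregated (the item groupings follow the four sections above; the data layout and function name are our assumptions, not the authors' procedure):

```python
from statistics import mean

# Item numbers per TPACK subscale, as listed in the appendix
SUBSCALES = {
    "SMK": range(1, 11),   # items 1-10
    "IRS": range(11, 21),  # items 11-20
    "KSU": range(21, 27),  # items 21-26
    "TIA": range(27, 34),  # items 27-33
}

def subscale_means(responses):
    """Average the 5-point Likert responses (dict: item number -> rating) per subscale."""
    return {name: mean(responses[i] for i in items)
            for name, items in SUBSCALES.items()}

# Hypothetical ratings from a single student for all 33 items
responses = {i: 4 for i in range(1, 34)}
responses[23] = 5
print(subscale_means(responses))
```

Item means like these, averaged over the class, are what the pretest and posttest columns in Tables 5 and 6 summarize.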
