
Multichannel Learning System (MLS) Data Analysis and Collection Plan

Prepared By: 711th HPW/RHAS,

Office of Naval Research (ONR) Reserve Component (RC), and Lockheed Martin International Training Team (LMITT)

23 December 2014


Table of Contents

1 Introduction
1.1 Document Overview
1.2 System Description

2 Administrative Information
2.1 General
2.2 Responsibilities
2.2.1 MLS Project Sponsor
2.2.2 U.S. Air Force Research Lab (AFRL) Warfighter Readiness Research Division (711 HPW/RHA)
2.2.3 Office of Naval Research Reserve Component (ONR RC)
2.2.4 Lockheed Martin International Training Team (LMITT)
2.2.5 GIRAF PM Services GmbH/Siebenundvierzig ING GmbH & Co. KG
2.2.6 Cranfield University
2.2.7 Partner Nations
2.3 Deviations from the Test Plan

3 Release of Information
3.1 Assessment Data
3.1.1 Research Ethics & Informed Consent
3.1.2 Proprietary Data
3.1.3 Proprietary MLS Data Collection Portal
3.1.4 Release of Information to the Press or Commercial Vendors

4 Concept Evaluation Activities

5 Evaluation Goals, Objectives and Analysis
5.1 Evaluation Criteria
5.2 Evaluation Approach
5.3 Test Limitations and Uncontrolled Variables
5.4 Data Analysis
5.5 Data Analysis
5.6 Data Collection
5.7 Analysis Summary Questions

6 Reports
6.1 Final Reports

Figures

Figure 5-1 - MLS Data Collection Website


1 Introduction

The Department of State (DoS), Department of Defense (DOD) and other US Government organizations use training and education as a prime tool for achieving strategic objectives, establishing rapport, opening communication, building capacity and maturing relationships with Coalition and international partners. Training is the critical component in the International Security Cooperation Program managed by the Naval Education & Training Security Assistance Field Activity (NETSAFA), United States Marine Corps Security Cooperation Group (MCSCG), Air Force Security Assistance Training Squadron (AFSAT), United States Coast Guard International Affairs and Foreign Policy (USCG/IA), and the U.S. Army Security Assistance Training Field Activity (SATFA). The Defense Security Cooperation Agency (DSCA) directs, administers and supervises the execution of the Security Cooperation Programs managed by NETSAFA, MCSCG, AFSAT, USCG/IA and SATFA, and the Defense Institute of Security Assistance Management (DISAM) provides professional education to the staffs of the aforementioned organizations and to the Security Cooperation Offices (SCO).

The Multichannel Learning System (MLS) is a NETSAFA, DISAM and MCSCG research project with two goals: (1) identify the best methods for providing distance education that would prepare international military students for a resident training experience in the United States, and (2) convert selected course material from the International Military Student Pre-Departure Briefing (IMSPDB) into multiple formats to support individual learning approaches (i.e., e-book, e-learning, mobile applications, and video).

The project will evaluate the use of four learning formats (i.e., e-book, e-learning, mobile app, and video) delivered via several channels (i.e., personal computers and mobile devices). The evaluations will (1) identify the best methods for providing distance education for international military students, and (2) evaluate the effectiveness of having multiple learning formats to support Security Cooperation Education and Training Program (SCETP) requirements.

A Concept Evaluation will occur in January-March 2015 and include both International Military Students (IMS) and international participants. The IMS population will be derived from a data pull from the Security Assistance Network (SAN) Web Training database that identifies IMS enrolled at US Military training sites in the United States during the Concept Evaluation timeframe. US/International partners will identify no fewer than twenty (20) volunteers to evaluate the IMSPDB course material. Each volunteer will evaluate two learning formats in order to assist in identifying the best methods for providing distance education and to evaluate the learning effectiveness of the learning approaches.

The MLS Project has the endorsement of the Defense Security Cooperation Agency (DSCA), based on the project’s capability to ensure coalition partners have global access to the


Military Services’ state-of-the-art online education and training programs. At the end of this project, the U.S. will have an enhanced operational training capability with participating partner nations, and these capabilities will be integrated into other Security Cooperation Education and Training Programs (SCETP) managed by DSCA.

1.1 Document Overview

This document addresses the evaluation approach to determine if the research project met the intended research goals.

The goals of the MLS research study are to (1) identify the best methods for providing distance education for international military students, and (2) evaluate the effectiveness of having multiple learning formats to support Security Cooperation Education and Training Program (SCETP) requirements.

1.2 System Description

The Multichannel Learning System (MLS) Project will use selected course material from the International Military Student Pre-Departure Briefing (IMSPDB). The course material will be converted into four different formats (i.e., e-book, e-learning, mobile app, and video) to support individual learning approaches.

Each research volunteer will evaluate two different learning formats, complete a demographic survey, and take pre- and post-test questionnaires on the course material related to Culture, Individualism, and Punctuality.

2 Administrative Information

2.1 General

This section describes the MLS Testing & Evaluation Team, the project sponsor, 711th HPW/RHAS, Reserve Component (ONR Program 38 personnel), Defence Academy of the United Kingdom (DAUK), Support Contractors and other US/International Government Personnel.

2.2 Responsibilities

2.2.1 MLS Project Sponsor

The Multichannel Learning System (MLS) is a NETSAFA, MCSCG and DISAM project, which is focused on identifying the best methods for providing distance education that would prepare international military students for a resident training experience in the United States and developing course material from the International Military Student Pre-Departure Briefing


(IMSPDB) informational course into multiple formats to support individual learning approaches (i.e., e-learning, mobile applications, videos, e-book, etc.).

2.2.2 U.S. Air Force Research Lab (AFRL) Warfighter Readiness Research Division (711 HPW/RHA)

The U.S. Air Force Research Lab (AFRL) Warfighter Readiness Research Division (711 HPW/RHA) is the US Government Lead for deploying the MLS Testing & Evaluation (T&E) Plan and assessing the ability of the MLS Project to meet its goals and objectives. The T&E Working Group will conduct an evaluation of the learning formats (i.e., e-book, e-learning, mobile app, and video) in a multinational environment.

The 711 HPW/RHA will work with the ONR Reserve Component (ONR RC), all MLS Working Groups (i.e., Learning Course Material and Research Protocol), GIRAF PM Services GmbH and Siebenundvierzig ING GmbH & Co. KG to ensure the Testing & Evaluation Strategy collects the necessary data to ascertain MLS’s ability to support the evaluation of learning formats in the multinational environment.

RHA’s responsibilities include:

a) Promulgate and make changes, as necessary, to this Test Plan;

b) Develop, in consonance with ONR RC and Lockheed Martin’s International Training Team (LMITT) both technical and non-technical evaluation measures;

c) Provide a test plan briefing during appropriate meetings and workshops;

d) Provide advice to the MLS Project Manager, including keeping all of the MLS Management Team, informed of any condition that may affect execution of this assessment;

e) Collaborate with LMITT and other US/International Organizations providing support to the Testing & Evaluation Working Group;

f) Coordinate a data collection and analysis strategy to ensure the evaluation approach is comprehensive-in-nature;

g) Conduct analysis of data collected;

h) Assist in the preparation of the final analysis for inclusion in the MLS Final Report; and

i) Provide recommendations on how the lessons learned will support SCETP requirements.

2.2.3 Office of Naval Research Reserve Component (ONR RC)


The Office of Naval Research Reserve Component (ONR RC) will provide technical and evaluation advice and assistance to the 711 HPW/RHA, MLS Management Team and LMITT as co-lead on the Testing & Evaluation Working Group.

ONR RC will ensure the Testing & Evaluation Strategy identified collects the necessary data to ascertain MLS’s ability to support the multinational, multicultural environment. ONR RC’s responsibilities include:

a) Participate in Testing & Evaluation Working Group meetings, teleconferences, and other activities required to determine the optimum statistical evaluation approaches;

b) Support the analysis and data identification procedures;

c) Conduct analysis of data collected; and

d) Support the development of Testing & Evaluation Briefings.

2.2.4 Lockheed Martin International Training Team (LMITT)

a) Perform as the Principal Advisor on Human Research Protection Program (HRPP) issues by providing oversight on all data collection, analyzing the test results, and publishing appropriate reports in accordance with the HRPP Guidelines and the MLS Management Plan;

b) Develop the appropriate Informed Consent Forms and seek Institutional Review Board (IRB) Approvals, as required;

c) Provide input for test plan, scenario, and use case development;

d) Provide input to analysis, assessment plans, and reports;

e) Conduct analysis of data collected; and

f) Incorporate inputs from the Testing & Evaluation Team into the MLS Final Report.

2.2.5 GIRAF PM Services GmbH/Siebenundvierzig ING GmbH & Co. KG

GIRAF PM Services GmbH will monitor the MLS Data Collection Portal to ensure the data collected during the evaluations meets US, EU and International Research Protocol Requirements. They will ensure that the data collected by Concept Evaluation participants is in an easy-to-use format for analysis purposes.

Siebenundvierzig ING GmbH & Co. KG will maintain and administer the MLS Data Collection Portal since they are the developer of the portal’s approach. They will coordinate the data collection process with LMITT to ensure all data collection and research protocol requirements are met and that the portal provides access to the data collection effort to support post-evaluation analysis. As the sole owner and developer of the MLS Data Portal, they will develop a proprietary collection suite, to include:


a) A Comma Separated Values (CSV) database that ensures Research Protocol Requirements are fulfilled; specifically:

b) Safeguard the database with tight access controls so that no human subject can use another human subject’s PIN to gain unauthorized access.

c) If a human subject tries to enter the system without an authorized PIN, the notice shown in Figure 2-1 will appear.

d) The Data Collection Portal will check the PIN each time the human subject attempts to enter the portal to (1) confirm the PIN and (2) check the last survey question completed. This ensures that human subjects cannot change responses once provided.

e) After the Concept Evaluation has been completed, the CSV file will be reviewed by the MLS Extramural Investigators before it is provided to the Testing & Evaluation Team for analysis.

f) Once the Concept Evaluation analysis has been completed, a comprehensive CSV file containing the data from both Evaluations will be merged and reviewed by the MLS Extramural Investigators before it is provided to the Testing & Evaluation Team for a comprehensive analysis.

g) After the comprehensive analysis and report is delivered, all data related to the MLS Concept Evaluation will be destroyed.
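The portal behavior described in items (b) through (d) above (confirming the PIN and resuming at the last completed survey question) can be sketched as follows. This is an illustrative sketch only: the actual MLS Data Collection Portal is proprietary to Siebenundvierzig ING GmbH & Co. KG, and the function names and CSV field names used here (load_responses, check_access, question_id) are hypothetical.

```python
import csv

# Illustrative sketch only: the real portal is proprietary, and these
# function and field names (pin, question_id) are hypothetical.

def load_responses(path):
    """Load the CSV response store into a dict keyed by PIN."""
    responses = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            responses.setdefault(row["pin"], []).append(row)
    return responses

def check_access(pin, issued_pins, responses):
    """Confirm the PIN, then return the next question the subject may
    answer; earlier answers cannot be revisited or changed."""
    if pin not in issued_pins:
        return None  # the portal would display its unauthorized-access notice
    completed = responses.get(pin, [])
    last_done = max((int(r["question_id"]) for r in completed), default=0)
    return last_done + 1
```

On each entry the PIN is re-checked and the subject is routed to the question after the last one completed, which is what prevents a subject from changing responses once provided.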

2.2.6 Cranfield University

The Cranfield University (CU) team will participate in all Testing & Evaluation Working Group meetings, teleconferences, and other activities required for determining optimal testing and evaluation processes. The CU team is considered a ‘subject matter expert’ in the field of Learning Assessments and represents the United Kingdom’s Ministry of Defence (MoD) on the Testing & Evaluation Team. CU’s responsibilities include:

a) Participate in all Testing & Evaluation Working Group meetings, teleconferences, and other activities required to determine the optimum statistical evaluation approaches;

b) Support the 711 HPW/RHA Testing & Evaluation Team in conducting the analysis and data identification procedures;

c) Complete the Collaborative Institutional Training Initiative (CITI) Human Research Protection Program (HRPP) On-Line Training Course since they are conducting additional research related to m-Learning;

d) Conduct analysis of data collected; and

e) Support the development of Testing & Evaluation Briefings, as required.


2.2.7 Partner Nations

Each of the international partners will participate in the Testing and Evaluation Working Group. The international representatives will provide inputs on the development of the questionnaire as well as assist in evaluating the initial formats developed by DISAM.

Each international partner will also participate in the Research Protocol Working Group to support the development of the Informed Consent Form and apply to respective Institutional Review Boards (IRB) for approval to participate in the MLS Concept Evaluation.

2.3 Deviations from the Test Plan

The 711 HPW/RHA Project Lead is authorized to deviate from this Test Plan as operational situations and Professional Military Judgment (PMJ) dictate, provided the changes align with the MLS Project goals and objectives. The Project Lead will keep the MLS Project Manager and the MLS Management Team updated on any changes made and their potential impacts.

3 Release of Information

3.1 Assessment Data

The following provides the strategy on the release of MLS Concept Evaluation Assessment Data:

a) Operational assessment data will initially be shared only within the MLS Project Management Team, as expeditiously as possible;

b) Data will not be released during the actual conduct of the evaluation; release will occur only after a valid assessment can be made based on all data collected;

c) All data will be gathered in a manner that protects the rights, privacy and confidentiality of all human subjects; and

d) The MLS Final Analysis Report will be released within 15 days of the end of the assessment period, and as part of the MLS Final Report.

3.1.1 Research Ethics & Informed Consent

The Testing & Evaluation Process will meet the guidelines established by US Department of Defense Directive (DODD) 5400.11 (DOD Privacy Program), DODD 3216.2 (Protection of Human Subjects in DoD-Supported Research), SECNAVINST 3900.39D, 32 CFR 219 (Common Rule), the Canadian Tri-Council Policy Statement on Ethical Conduct for Research, European Union (EU) Data Protection Requirements and UK Ministry of Defence JSP 536 (Research Ethics Committee).


a) Each participant must first register, complete a demographics questionnaire and take the pre-test.

b) Once registered, each participant will evaluate two different learning formats of the IMSPDB course material (i.e., e-book, e-learning, mobile app, or video). Each format contains course material related to American and Military Cultures, Individualism, Punctuality, Informality, Egalitarianism, Short-Term Mentality and Daily Life in America. However, only the Culture, Individualism, and Punctuality material is testable.

c) After evaluating the learning formats, each participant will complete the post-test questionnaire and the survey.

d) Each participant will have the right to answer or decline to answer each question.

3.1.2 Proprietary Data

The MLS Testing & Evaluation Data does not contain any proprietary or personal data. The collection of data from the assessment will be provided in aggregate form.

3.1.3 MLS Data Collection Portal

The MLS Data Collection Portal was developed by Siebenundvierzig ING GmbH & Co. KG.

3.1.4 Release of Information to the Press or Commercial Vendors

No data will be released to the press or to any other organization not involved in the MLS Project without approval of the MLS Management Leads (i.e., NETSAFA and DISAM) prior to the release of the MLS Final Report.

The final report will be released 90 days after the MLS Project terminates, and all information is available at https://wss.apan.org/1539/JKO/MLS/SitePages/Home.aspx.

4 Concept Evaluation Activities

Each evaluation volunteer will be required to complete the following steps.

a) Complete a demographic online questionnaire

1) Age: 18-22, 23-25, 26-30, 31-40, 41-50, 51-60, over 60, Decline to Answer
2) Gender: Male, Female, Decline to Answer
3) Profession: Military Enlisted, Military Officer, Civilian, Decline to Answer
4) Job Occupation: Free text fill in the blank
5) Have you attended US Military Training in the United States in the last two years?: Yes, No, Decline to Answer


b) Take the Pre-Test Questionnaire in order to obtain a baseline of their Culture, Individualism, and Punctuality knowledge

a) Punctuality applies to all of the following EXCEPT. . .
b) American military personnel have identical values and interest.
c) Which of the following IS NOT true about Individualism?
d) Which of the following statements IS NOT true about American Culture?
e) The phrase “each person has a broad range of personal qualities, interest and priorities” refers to. . .

c) Evaluate two different learning formats (i.e., e-book, e-learning, mobile app, and video).

d) Take the Post-Test Questionnaire in order to assess their understanding of American Culture, Individualism, and Punctuality.

a) Which of the following IS NOT true about Individualism?
b) Which of the following IS NOT true about American Culture?
c) Punctuality applies to all of the following EXCEPT. . .
d) The term “melting pot” refers to which of the following sentences?
e) Which word refers to “each person making their own choices and not wanting to depend too much on others”?

e) Complete a questionnaire that requests feedback on the technology as well as the learning experience.

a) Which devices did you use for evaluating the learning formats?
b) Which learning formats did you evaluate?
c) I have control over the pace and sequencing of my learning process.
d) I may not have been able to learn the course material if provided in only one format.
e) I feel confident that I have a good understanding of American Culture based on the training.
f) I feel confident that I can help a colleague understand American Culture.
g) I am willing to share my experiences and lessons learned with those going on training in the U.S.
h) When I had trouble understanding the material, I used another format to help clarify my understanding.
i) In addition to the testable course material, I viewed the following course material. . .
j) The goals of the training were clearly defined.
k) Overall, I was satisfied with this course.
l) What did you like most about this learning experience?
m) What did you like least about this learning experience?


n) List five words that express your learning experience.

f) Complete a questionnaire related to the specific evaluation format.

a) E-book Questions

(a) The e-book was easy to read.
(b) The e-book format was easy to use.
(c) The e-book was easy to understand.
(d) The e-book hyperlinks allowed me to access additional learning material.
(e) The e-book was well constructed.
(f) The e-book is a valuable learning tool.

b) Mobile App Questions

(a) The mobile app was easy to read.
(b) The mobile app format was easy to use.
(c) The mobile app course material was easy to understand.
(d) The mobile app hyperlinks allowed me to access additional learning material.
(e) The mobile app was well constructed.
(f) The mobile app is a valuable learning tool.

c) Video Questions

(a) The video length was acceptable.
(b) The video format was easy to use.
(c) The video course material was easy to understand.
(d) The video was clearly presented.
(e) A video is a valuable learning tool.
(f) The video hyperlinks allowed me to access additional learning material.
(g) It was easy to start/stop the video.

d) E-learning Questions

(a) The e-learning format was easy to use.
(b) The e-learning course material was easy to understand.
(c) The e-learning course material was clearly presented.
(d) E-learning is a valuable learning tool.
(e) The e-learning hyperlinks allowed me to access additional learning material.

5 Evaluation Goals, Objectives and Analysis

The goals of the Concept Evaluation are to (1) identify the best methods for providing distance education that would prepare international military students (IMS) for a resident training


experience in the United States, and (2) evaluate the effectiveness of having multiple learning formats to support Security Cooperation Education and Training Program (SCETP) requirements. The objective of the MLS Concept Evaluation is to develop a capability that ensures IMS have a positive experience alongside their U.S. counterparts.

In order to determine if the Concept Evaluation met the MLS Project Goals and Objectives, the evaluation questions will focus on the following:

a) Evaluate the technology usage (i.e., determine what devices were used to evaluate the learning formats).

b) Evaluate the usability, or the ease-of-use, of the learning formats. Specifically, how easy was it for users to accomplish the basic task the first time they encountered the format, how pleasant was it to use the multiple learning formats, and how pleasant were the features of each learning format to use (e.g., hyperlinks, start/stop features, etc.).

c) Evaluate the usefulness and utility of having multiple formats available to support IMS preparedness for a resident training experience in the United States.

d) Evaluate the learning transfer based on the pre- and post-test questionnaires.

e) Evaluate the interest/desirability to ascertain if the users had an interest in the learning and their desire to pass the information along to their colleagues or other persons traveling to the United States.
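The learning-transfer evaluation in item (d) above amounts to comparing each volunteer's pre- and post-test scores. A minimal sketch of that comparison, assuming scores are held in dicts keyed by participant PIN (the plan does not prescribe an implementation):

```python
# Hypothetical learning-gain computation for the pre-/post-test
# comparison; the data structures are assumptions, not the team's design.

def learning_gain(pre_scores, post_scores):
    """Mean per-participant score gain (post minus pre), over the
    participants who completed both questionnaires."""
    gains = [post_scores[pin] - pre_scores[pin]
             for pin in pre_scores if pin in post_scores]
    return sum(gains) / len(gains) if gains else 0.0
```

A positive mean gain across participants would indicate that learning transfer occurred; restricting the comparison to participants who completed both questionnaires respects the right of subjects to decline questions.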

5.1 Evaluation Criteria

The evaluation criteria are based on evaluating the technology used, usability, utility, usefulness, learning, and desirability/interest of multiple learning formats to support IMS preparedness for a resident training experience in the United States.

The following provides the Terms of Reference for each of the evaluation areas:

a) Technology Used: An understanding of the types of technology devices used to evaluate the learning formats (i.e., personal computer; android, apple or kindle devices).

b) Usability: How easy and pleasant were the multiple formats to use (i.e., learnability, user satisfaction, and ease-of-use).

c) Learning: Did learning transfer occur? Did the user feel confident about sharing their learning experience with others?

d) Usefulness: What is the benefit or availability of using multiple learning formats for this type of training?


e) Desirability/Interest: Based on the learning format evaluations, did the user find the learning medium interesting and desirable?

f) Utility: Did each of the learning formats provide a format that enhanced learning (i.e., was the learning format easy to use, did the hyperlinks support additional learning, etc.)?

5.2 Evaluation Approach

The Concept Evaluation will evaluate the different learning formats that contain the International Military Student Pre-Departure Briefing (IMSPDB) course material related to American and Military Cultures.

5.3 Test Limitations and Uncontrolled Variables

During the Concept Evaluation there will be several ‘uncontrolled variables;’ specifically,

a. User Limitations. Although the IMSPDB course material was provided in several formats (i.e., e-book, e-learning, mobile app, and video), there is a limited evaluation capability for users of Apple devices, since the mobile app is only available to Android users and Apple iPad/iPhone users cannot view flash-based course material.

b. Windows-based Systems. The mobile application is not supported on a Windows-based laptop or mobile device. Therefore, Concept Evaluation participants who use a Windows-based phone or laptop will not be able to evaluate the mobile-app format.

c. Responses to Questions. In accordance with the Human Research Protection Program, no participant can be forced to answer any question they do not wish to answer. Therefore, it is expected there will be questions that some participants do not respond to.

5.4 Data Analysis

Data will reside on the MLS Data Collection Portal. Each human subject is assigned a unique personal identification number (PIN) consisting of a four-digit numerical value. The first two numbers represent an MLS-assigned Country Code, and the last two numbers are a generated Research Identification number assigned in sequential order by Country Code.
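As a sketch, the PIN scheme just described can be expressed as two small helpers. The helper names are illustrative; the portal's actual code is not public.

```python
# Sketch of the four-digit PIN scheme described above: a two-digit
# MLS-assigned Country Code followed by a two-digit sequential
# Research Identification number. Helper names are illustrative.

def make_pin(country_code, sequence):
    """Build a four-digit PIN, e.g. country 12, subject 3 -> '1203'."""
    if not (0 <= country_code <= 99 and 1 <= sequence <= 99):
        raise ValueError("country code and sequence must each fit in two digits")
    return f"{country_code:02d}{sequence:02d}"

def parse_pin(pin):
    """Split a four-digit PIN back into (country_code, research_id)."""
    return int(pin[:2]), int(pin[2:])
```

Because the Research Identification number is sequential within each Country Code, the PIN alone is enough to group responses by country during analysis without storing any personal data.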

The data analysis process will focus on evaluating the following:


Questionnaire Survey Evaluation Attribute Survey Analysis

Data Analysis

Registration Process

Q1: What is your age? Demographic User Interface: Likert scale (18-22; 22-25; 25 – 30; 31 – 40; 41 – 50; 51 – 60; Over 60; Decline to Answer)Data display: Line Graph and Data Table

A line graph and table of each age group and their percentages

Q2: What is your gender?

Demographic User Interface: Likert scale (Male, Female, Decline to Answer)

Data display: Line Graph and Data Table

A line graph and table of the gender types and their percentages

Q3: What is your profession?

Demographic User Interface: Likert scale (Military Enlisted, Military Officer, Civilian, Decline to Answer)

Data display: Line Graph and Data Table

A line graph and table of professions and their percentages

Q4: What is your Job Occupation?

Demographic User Interface: Open-ended text field

Data display: Line Graph and Data Table

A line graph and table of job occupations and their percentages

Q5: Have you attended US Military Training in the States in the last two years?

Demographic User Interface: Likert Scale (Yes, No, Decline to Answer)

Analysis: Use z-score to analyze difference between those who have and who have not attended previous training

Data display: Line Graph, Data Table

A line graph and table showing those that attended compared to those who have not attended.

Pre-Test Questionnaire

Q1: Punctuality applies to all of the following EXCEPT

Q2: American military personnel have identical values and interests.

Q3: Which of the following IS NOT true about Individualism?

Q4: Which of the following statements IS NOT true about American Culture?

Q5: The phrase “each person has a broad range of personal qualities, interest and priorities” refers to

Learning User Interface: Likert Scale with answer options

Analysis: Use z score to analyze difference between agree scores and disagree scores.

Data display: Line Graph, Data Table

A line graph and table of the responses and their percentages

Show descriptive stats including 95% CI before the evaluation starts
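The plan calls repeatedly for a z-score comparing agree and disagree responses but does not specify the statistic. One common reading is a one-sample test of whether the agree proportion differs from 0.5; a minimal sketch under that assumption (neutral and 'Decline to Answer' responses excluded, and the function name is hypothetical):

```python
from math import sqrt

def agree_vs_disagree_z(n_agree: int, n_disagree: int) -> float:
    """Z-statistic for H0: p_agree = 0.5, i.e. agree and disagree
    responses are equally likely among participants who took a side."""
    n = n_agree + n_disagree
    p_hat = n_agree / n
    return (p_hat - 0.5) / sqrt(0.25 / n)

# hypothetical counts: 42 agree, 18 disagree out of 60 usable responses
z = agree_vs_disagree_z(42, 18)
print(round(z, 2))  # -> 3.1, well beyond the 1.96 cutoff at alpha = 0.05
```

Whichever formulation the analysis team settles on should be documented here so results are reproducible.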

Post-Test Questions

Q1: Which of the following IS NOT true about Individualism?

Q2: Which of the following IS NOT true about American Culture?

Q3: Punctuality applies to all of the following EXCEPT

Q4: The term “melting pot” refers to which of the following sentences?

Q5: Which word refers to “each person making their own choices and not wanting to depend too much on others”?

Learning User Interface: Likert Scale with answer optionsAnalysis: Use z score to analyze difference between agrees scores and disagree scores.Data display: Line Graph, Data Table

A line graph and table of the responses and their percentages

Show descriptive stats including 95% CI on the results based on the evaluation

General Questions

Q1: Which devices did you use for evaluating the learning formats? (Select all that apply)

Technical User Interface: Likert scale (Personal Computer, Android Tablet, Android Mobile Phone, Apple iPad/ iPod, Apple Mobile Phone, Amazon Kindle, Decline to Answer).

Data display: Line Graph, Data Table

A line graph and table of formats evaluated and their percentages

Compare the baseline number of devices used with aggregated values displayed as a radar chart

Q2: Which learning formats did you evaluate? (Select all that apply)

Utility User Interface: Likert Scale (e-book, Mobile App, Video, E-learning, Decline to Answer)

Data display: Line graph, Data Table

A line graph and table of devices used and their percentages

Compare the baseline number of learning formats evaluated with aggregated values displayed as a radar chart

Q3: I have control over the pace and sequencing of my learning process.

Utility User Interface: Likert Scale (Strongly Disagree to Strongly Agree and Decline to Answer)

Analysis: Use z-score to analyze difference between agree scores and disagree scores by devices

Data display: Data Table, Scatter plot

Show descriptive stats including 95% CI of the results
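The descriptive statistics with a 95% CI requested throughout could be computed as sketched below; the normal (z) approximation and the 1-5 Likert coding are assumptions, not stated in the plan:

```python
from math import sqrt
from statistics import mean, stdev

def ci95(scores):
    """Mean and normal-approximation 95% confidence interval for a list
    of Likert scores coded 1 (Strongly Disagree) to 5 (Strongly Agree)."""
    m, s, n = mean(scores), stdev(scores), len(scores)
    half = 1.96 * s / sqrt(n)  # half-width of the interval
    return round(m, 2), round(m - half, 2), round(m + half, 2)

# hypothetical responses to one survey item
print(ci95([4, 5, 3, 4, 4, 2, 5, 4, 3, 4]))  # -> (3.8, 3.23, 4.37)
```

For the small per-country samples likely here, a t-based interval would be wider and more conservative than the z approximation shown.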

Q4: I may not have been able to learn the course material if provided in only one format.

Utility User Interface: Likert Scale (Strongly Disagree to Strongly Agree and Decline to Answer)

Analysis: Use z-score to analyze difference between agree scores and disagree scores by devices

Data display: Data Table, Scatter plot

Show descriptive stats including 95% CI of the results

Q5: I feel confident that I have a good understanding of American Culture based on the training.

Learning User Interface: Likert Scale (Strongly Disagree to Strongly Agree and Decline to Answer)

Analysis: Use z-score to analyze difference between agree scores and disagree scores by devices

Data display: Data Table, Scatter plot

Show descriptive stats including 95% CI of the results

Q6: I feel confident that I can help a colleague understand American Culture.

Learning User Interface: Likert Scale (Strongly Disagree to Strongly Agree and Decline to Answer)

Analysis: Use z-score to analyze difference between agree scores and disagree scores by devices

Data display: Data Table, Scatter plot

Show descriptive stats including 95% CI of the results

Q7: I am willing to share my experiences and lessons learned with those going on training in the U.S.

Learning User Interface: Likert Scale (Strongly Disagree to Strongly Agree and Decline to Answer)

Analysis: Use z-score to analyze difference between agree scores and disagree scores by devices

Data display: Data Table, Scatter plot

Show descriptive stats including 95% CI of the results

Q8: When I had trouble understanding the material, I used another format to help clarify my understanding.

Utility User Interface: Likert Scale (Strongly Disagree to Strongly Agree and Decline to Answer)

Analysis: Use z-score to analyze difference between agree scores and disagree scores by devices

Data display: Data Table, Scatter plot

Show descriptive stats including 95% CI of the results

Q9: In addition to the testable course material, I viewed the following course material. (Select all that apply)

Interest User Interface: Likert Scale (Informality, Egalitarianism, Short Term Mentality, Daily Life in America)

Data display: Data Table

A line graph and table of devices used and their percentages

Compare the baseline amount of additional course material viewed, with aggregated values displayed as a radar chart

Q10: The goals of the training were clearly defined.

Usability (Satisfaction) User Interface: Likert Scale (Strongly Disagree to Strongly Agree and Decline to Answer)

Analysis: Use z-score to analyze difference between agree scores and disagree scores by devices

Data display: Data Table, Scatter plot

Show descriptive stats including 95% CI of the results

Q11: Overall, I was satisfied with this course

Usability (Satisfaction) User Interface: Likert Scale (Strongly Disagree to Strongly Agree and Decline to Answer)

Analysis: Use z-score to analyze difference between agree scores and disagree scores by devices

Data display: Data Table, Scatter plot

Show descriptive stats including 95% CI of the results

Q12: Was there a learning format (i.e., e-book, e-learning, mobile app or video) that you preferred but were unable to evaluate? Please explain.

Interest User Interface: Open-ended Text Question

Analysis: Synonym, Pareto, Relationship Mapping, High Frequency Pairwise

Show a table and descriptive stats of the most common terms used

Display a relationship mapping of the formats users preferred but were unable to evaluate

Display a relationship mapping overall as well as by device

Q13: What did you like most about this learning experience?

Interest User Interface: Open-ended Text Question

Analysis: Synonym, Pareto, Relationship Mapping, High Frequency Pairwise

Show a table and descriptive stats of the most common terms used to reflect the positive experience

Display a relationship mapping of what users liked most about the learning experience

Display a relationship mapping overall as well as by device

Q14: What did you like least about this learning experience?

Interest User Interface: Open-ended Text Question

Analysis: Synonym, Pareto, Relationship Mapping, High Frequency Pairwise

Show a table and descriptive stats of the most common terms used to reflect the less-than-positive experience

Display a relationship mapping of what users liked least about the learning experience

Display a relationship mapping overall as well as by device

Q15: List five words that express your learning experience.

Interest User Interface: Open-ended Text Question

Analysis: Synonym, Pareto, Relationship Mapping, High Frequency Pairwise

Show a table and descriptive stats of the most common terms used

Display a relationship mapping of what users expressed about the learning experience

Display a relationship mapping overall as well as by device
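For the open-ended questions above, the plan names Synonym, Pareto, Relationship Mapping, and High Frequency Pairwise analyses without specifying tooling. A minimal sketch of the Pareto (term-frequency ranking) and pairwise co-occurrence steps on hypothetical free-text responses (the stopword list is an illustrative assumption):

```python
from collections import Counter
from itertools import combinations

responses = [
    "easy to use and flexible",          # hypothetical free-text answers
    "flexible format, easy navigation",
    "easy to use on my phone",
]
STOPWORDS = {"to", "and", "on", "my", "the", "a"}

# Pareto: rank the most frequent content words across all responses
terms = Counter(w.strip(",.").lower()
                for r in responses for w in r.split()
                if w.strip(",.").lower() not in STOPWORDS)
print(terms.most_common(3))

# High-frequency pairwise: terms that co-occur within a single response
pairs = Counter(p for r in responses
                for p in combinations(sorted({w.strip(",.").lower()
                                              for w in r.split()}
                                             - STOPWORDS), 2))
print(pairs.most_common(2))
```

The synonym step (collapsing e.g. "simple" into "easy") would run before the counts; the pair counts feed the relationship mapping as edge weights.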

e-book Format Evaluation

Q1: The e-book was easy to read

Usability (Satisfaction) User Interface: Likert Scale (Strongly Disagree to Strongly Agree and Decline to Answer)

Analysis: Use z-score to analyze difference between agree scores and disagree scores by devices

Data display: Data Table, Scatter plot

Show overall descriptive stats including 95% CI of the results

Q2: The e-book was easy to use

Usability (Ease of Use) User Interface: Likert Scale (Strongly Disagree to Strongly Agree and Decline to Answer)

Analysis: Use z-score to analyze difference between agree scores and disagree scores by devices

Data display: Data Table, Scatter plot

Show overall descriptive stats including 95% CI of the results

Q3: The e-book was easy to understand

Usability (Learnability) User Interface: Likert Scale (Strongly Disagree to Strongly Agree and Decline to Answer)

Analysis: Use z-score to analyze difference between agree scores and disagree scores by devices

Data display: Data Table, Scatter plot

Show overall descriptive stats including 95% CI of the results

Q4: The e-book hyperlinks allowed me access to additional learning material

Utility User Interface: Likert Scale (Strongly Disagree to Strongly Agree and Decline to Answer)

Analysis: Use z-score to analyze difference between agree scores and disagree scores by devices

Data display: Data Table, Scatter plot

Show overall descriptive stats including 95% CI of the results

Q5: The e-book was well constructed

Usability (Satisfaction) User Interface: Likert Scale (Strongly Disagree to Strongly Agree and Decline to Answer)

Analysis: Use z-score to analyze difference between agree scores and disagree scores by devices

Data display: Data Table, Scatter plot

Show overall descriptive stats including 95% CI of the results

Q6: The e-book is a valuable learning tool

Utility User Interface: Likert Scale (Strongly Disagree to Strongly Agree and Decline to Answer)

Analysis: Use z-score to analyze difference between agree scores and disagree scores by devices

Data display: Data Table, Scatter plot

Show overall descriptive stats including 95% CI of the results

Mobile App Format Evaluation

Q1: The mobile app was easy to read

Usability (Satisfaction) User Interface: Likert Scale (Strongly Disagree to Strongly Agree and Decline to Answer)

Analysis: Use z-score to analyze difference between agree scores and disagree scores by devices

Data display: Data Table, Scatter plot

Show overall descriptive stats including 95% CI of the results

Q2: The mobile app format was easy to use

Usability (Ease of Use) User Interface: Likert Scale (Strongly Disagree to Strongly Agree and Decline to Answer)

Analysis: Use z-score to analyze difference between agree scores and disagree scores by devices

Data display: Data Table, Scatter plot

Show overall descriptive stats including 95% CI of the results

Q3: The mobile app course material was easy to understand

Usability (Learnability) User Interface: Likert Scale (Strongly Disagree to Strongly Agree and Decline to Answer)

Analysis: Use z-score to analyze difference between agree scores and disagree scores by devices

Data display: Data Table, Scatter plot

Show overall descriptive stats including 95% CI of the results

Q4: The mobile app hyperlinks allowed me to access additional learning material

Utility User Interface: Likert Scale (Strongly Disagree to Strongly Agree and Decline to Answer)

Analysis: Use z-score to analyze difference between agree scores and disagree scores by devices

Data display: Data Table, Scatter plot

Show overall descriptive stats including 95% CI of the results

Q5: The mobile app was well constructed

Usability (Satisfaction) User Interface: Likert Scale (Strongly Disagree to Strongly Agree and Decline to Answer)

Analysis: Use z-score to analyze difference between agree scores and disagree scores by devices

Data display: Data Table, Scatter plot

Show overall descriptive stats including 95% CI of the results

Q6: The mobile app is a valuable learning tool

Utility User Interface: Likert Scale (Strongly Disagree to Strongly Agree and Decline to Answer)

Analysis: Use z-score to analyze difference between agree scores and disagree scores by devices

Data display: Data Table, Scatter plot

Show overall descriptive stats including 95% CI of the results

Video Format Evaluation

Q1: The video length was acceptable

Usability (Satisfaction) User Interface: Likert Scale (Strongly Disagree to Strongly Agree and Decline to Answer)

Analysis: Use z-score to analyze difference between agree scores and disagree scores by devices

Data display: Data Table, Scatter plot

Show overall descriptive stats including 95% CI of the results

Q2: The video format was easy to use

Usability (Ease of Use) User Interface: Likert Scale (Strongly Disagree to Strongly Agree and Decline to Answer)

Analysis: Use z-score to analyze difference between agree scores and disagree scores by devices

Data display: Data Table, Scatter plot

Show overall descriptive stats including 95% CI of the results

Q3: The video course material was easy to understand

Usability (Learnability) User Interface: Likert Scale (Strongly Disagree to Strongly Agree and Decline to Answer)

Analysis: Use z-score to analyze difference between agree scores and disagree scores by devices

Data display: Data Table, Scatter plot

Show overall descriptive stats including 95% CI of the results

Q4: The video was clearly presented

Usability (Ease of Use) User Interface: Likert Scale (Strongly Disagree to Strongly Agree and Decline to Answer)

Analysis: Use z-score to analyze difference between agree scores and disagree scores by devices

Data display: Data Table, Scatter plot

Show overall descriptive stats including 95% CI of the results

Q5: The video is a valuable learning tool

Utility User Interface: Likert Scale (Strongly Disagree to Strongly Agree and Decline to Answer)

Analysis: Use z-score to analyze difference between agree scores and disagree scores by devices

Data display: Data Table, Scatter plot

Show overall descriptive stats including 95% CI of the results

Q6: The video hyperlinks allowed me to access additional learning material

Utility User Interface: Likert Scale (Strongly Disagree to Strongly Agree and Decline to Answer)

Analysis: Use z-score to analyze difference between agree scores and disagree scores by devices

Data display: Data Table, Scatter plot

Show overall descriptive stats including 95% CI of the results

Q7: It was easy to start/stop the video

Utility User Interface: Likert Scale (Strongly Disagree to Strongly Agree and Decline to Answer)

Analysis: Use z-score to analyze difference between agree scores and disagree scores by devices

Data display: Data Table, Scatter plot

Show overall descriptive stats including 95% CI of the results

E-learning Format Evaluation

Q1: The e-learning format was easy to use

Usability (Ease of Use) User Interface: Likert Scale (Strongly Disagree to Strongly Agree and Decline to Answer)

Analysis: Use z-score to analyze difference between agree scores and disagree scores by devices

Data display: Data Table, Scatter plot

Show overall descriptive stats including 95% CI of the results

Q2: The e-learning course material was easy to understand

Usability (Learnability) User Interface: Likert Scale (Strongly Disagree to Strongly Agree and Decline to Answer)

Analysis: Use z-score to analyze difference between agree scores and disagree scores by devices

Data display: Data Table, Scatter plot

Show overall descriptive stats including 95% CI of the results

Q3: The e-learning course material was clearly presented

Usability (Satisfaction) User Interface: Likert Scale (Strongly Disagree to Strongly Agree and Decline to Answer)

Analysis: Use z-score to analyze difference between agree scores and disagree scores by devices

Data display: Data Table, Scatter plot

Show overall descriptive stats including 95% CI of the results

Q4: E-learning is a valuable learning tool

Utility User Interface: Likert Scale (Strongly Disagree to Strongly Agree and Decline to Answer)

Analysis: Use z-score to analyze difference between agree scores and disagree scores by devices

Data display: Data Table, Scatter plot

Show overall descriptive stats including 95% CI of the results

Q5: The e-learning hyperlinks allowed me to access additional learning material

Utility User Interface: Likert Scale (Strongly Disagree to Strongly Agree and Decline to Answer)

Analysis: Use z-score to analyze difference between agree scores and disagree scores by devices

Data display: Data Table, Scatter plot

Show overall descriptive stats including 95% CI of the results

Category Analysis

Q10: The goals of the training were clearly defined

Q11: Overall, I was satisfied with this course

Q1: The e-book was easy to read

Q2: The e-book was easy to use

Q3: The e-book was easy to understand

Q5: The e-book was well constructed

Q1: The mobile app was easy to read

Q2: The mobile app format was easy to use

Q3: The mobile app course material was easy to understand

Q5: The mobile app was well constructed

Q1: The video length was acceptable

Q2: The video format was easy to use

Q3: The video course material was easy to understand

Q4: The video was clearly presented

Q1: The e-learning format was easy to use

Q2: The e-learning course material was easy to understand

Q3: The e-learning course material was clearly presented

Usability User interface: Likert scale (strongly disagree to strongly agree).

Analysis: Use z score to analyze difference between agree scores and disagree scores.

Data display: Scatter plot

Data analysis for a Usability Summary Evaluation of the learning formats and the respective devices, which includes aggregated values displayed as radar chart

Show overall descriptive stats including 95% CI of the results overall and by learning format
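The "aggregated values displayed as radar chart" called for above reduce to one summary number per learning format, one per spoke of the chart. A sketch of that aggregation step (scores and format names are hypothetical; the plotting itself is left to the charting tool):

```python
from statistics import mean

# hypothetical usability scores (1-5) keyed by learning format
scores = {
    "e-book":     [4, 5, 3, 4],
    "mobile app": [3, 3, 4, 2],
    "video":      [5, 4, 4, 5],
    "e-learning": [4, 4, 3, 4],
}

# one aggregated value per format -> the spokes of the radar chart
radar = {fmt: round(mean(vals), 2) for fmt, vals in scores.items()}
print(radar)
```

The same reduction applies per device or per evaluation attribute, whichever dimension the radar chart is drawn over.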

Q2: Which learning formats did you evaluate? (Select all that apply)

Q3: I have control over the pace and sequencing of my learning process.

Q4: I may not have been able to learn the course material if provided in only one format.

Q8: When I had trouble understanding the material, I used another format to help clarify my understanding.

Q4: The e-book hyperlinks allowed me access to additional learning material

Q6: The e-book is a valuable learning tool

Q4: The mobile app hyperlinks allowed me to access additional learning material

Q6: The mobile app is a valuable learning tool

Q5: The video is a valuable learning tool

Q6: The video hyperlinks allowed me to access additional learning material

Q7: It was easy to start/stop the video

Q4: E-learning is a valuable learning tool

Q5: The e-learning hyperlinks allowed me to access additional learning material

Utility User interface: Likert scale (strongly disagree to strongly agree).

Analysis: Use z score to analyze difference between agree scores and disagree scores.

Data display: Scatter plot

Data analysis for a Utility Summary Evaluation of the learning formats and the respective devices, which includes aggregated values displayed as a radar chart

Show overall descriptive stats including 95% CI of the results overall and by learning format

Q5: I feel confident that I have a good understanding of American Culture based on the training.

Q6: I feel confident that I can help a colleague understand American Culture.

Q7: I am willing to share my experiences and lessons learned with those going on training in the U.S.

Learning User interface: Likert scale (strongly disagree to strongly agree).

Analysis: Use z score to analyze difference between agree scores and disagree scores.

Data display: Scatter plot

Data analysis for a Learning Summary Evaluation of the learning formats and the respective devices, which includes aggregated values displayed as radar chart

Pre-Test

Q1: Punctuality applies to all of the following EXCEPT

Q2: American military personnel have identical values and interests.

Q3: Which of the following IS NOT true about Individualism?

Q4: Which of the following statements IS NOT true about American Culture?

Q5: The phrase “each person has a broad range of personal qualities, interest and priorities” refers to

Learning User Interface: Likert Scale with answer options

Analysis: Use z score to analyze difference between agree scores and disagree scores.

Data display: Line Graph, Data Table

A line graph and table of the responses and their percentages

Show descriptive stats including 95% CI on the results

Compare pre-test and post-test scores to determine if improvements can be attributed to the presentation of learning course material in multiple formats

Q9: In addition to the testable course material, I viewed the following course material. (Select all that apply)

Q12: Was there a learning format (i.e., e-book, e-learning, mobile app or video) that you preferred but were unable to evaluate?

Q13: What did you like most about this learning experience?

Q14: What did you like least about this learning experience?

Q15: List five words that express your learning experience.

Interest User interface: Likert scale (strongly disagree to strongly agree).

Data display: Scatter plot

Data analysis for an Interest Summary Evaluation of the learning formats and the respective devices, which includes aggregated values displayed as radar chart

The combination of all Usability and Utility Questions by devices

Useful User interface: Likert scale (strongly disagree to strongly agree).

Analysis: Use z score to analyze difference between agree scores and disagree scores.

Data display: Scatter plot

Data analysis for a Summary Evaluation of the Usefulness of the learning formats and the respective devices, which includes aggregated values displayed as a radar chart

Show overall descriptive stats including 95% CI of the results overall and by learning format

Cross Correlation Analysis

Demographics:

Q1: What is your age?

Q2: What is your gender?

Q3: What is your profession?

Learning Formats:

Q2: Which learning formats did you evaluate? (Select all that apply)

Choice of Learning Formats vs Demographics

User Interface: Likert Scale with answer options

Analysis: Use z score to analyze difference between agree scores and disagree scores.

Data display: Line Graph, Data Table

Determine if there is a correlation between the learning formats and the demographics (i.e., age, gender and profession)

Show overall descriptive stats including 95% CI of the results by learning format

Demographics:

Q1: What is your age?

Q2: What is your gender?

Q3: What is your profession?

Technology:

Q1: Which devices did you use for evaluating the learning formats? (Select all that apply)

Choice of Technology vs. Demographics

User Interface: Likert Scale with answer options

Analysis: Use z score to analyze difference between agree scores and disagree scores.

Data display: Line Graph, Data Table

Determine if there is a correlation between technology selection and demographics (i.e., age, gender and profession)

Show overall descriptive stats including 95% CI of the results by technology

Technology:

Q1: Which devices did you use for evaluating the learning formats? (Select all that apply)

Learning Formats:

Q2: Which learning formats did you evaluate? (Select all that apply)

Choice of Formats vs. Technology

User Interface: Likert Scale with answer options

Data display: Line Graph, Data Table

Determine if there is a correlation between technology and learning formats

Technology:

Q1: Which devices did you use for evaluating the learning formats? (Select all that apply)

Usability

The combination of all Usability Questions by devices

Learning

The combination of all Learning Questions by devices

Usefulness

The combination of all Usefulness Questions by devices

Choice of Technology vs. Usability, Learning and Usefulness

User Interface: Likert Scale with answer options

Analysis: Use z score to analyze difference between agree scores and disagree scores.

Data display: Line Graph, Data Table

Determine if there is a correlation between technology and usability, learning and usefulness

Show overall descriptive stats including 95% CI of the results comparing the technology with usability, learning and usefulness


5.5 Data Analysis

The following section outlines the analytical approach for cross-correlating the MLS survey data in order to draw inferences and present trends found in the data. This section will cross-correlate various combinations of statistically significant measures of demographic data (age, gender, profession, military rank/experience, etc.) with delivery formats (e-book, mobile app, video, e-learning) across the following evaluation criteria: Technology (i.e., personal computer; Android, Apple, or Kindle devices), Usability (how easy and pleasant the multiple formats were to use), Learning (user satisfaction and ease of use), Usefulness (desirability/interest), and Utility.

The following table provides a cross-correlation of the types of delivery formats against the following data domains. This analysis will answer the following critical questions:

a) What role do Demographics play in the choice of learning formats?
b) What role do Demographics and Technology play in the choice of learning formats?
c) What role does the choice of Technology play in the choice of learning formats?
d) What role does the choice of Technology play in Usability, Learning, and Usefulness?

Delivery format (e-book, e-learning, mobile app, video) vs. (a) Demographics (age vs. gender vs. profession vs. military rank/experience), (b) Technology (type of device, etc.), and (c) Usability and Ease of Use
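The plan does not name the statistical test behind these cross-correlations. For pairings of categorical variables such as delivery format vs. a demographic split, a chi-square test of independence is a standard choice; a minimal sketch with hypothetical counts (rows are delivery formats, columns are age groups) is:

```python
def chi_square_stat(table):
    """Chi-square statistic for a contingency table given as a list of rows."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand
            stat += (observed - expected) ** 2 / expected
    return stat

# Hypothetical counts: two formats, split by two age groups
table = [[30, 10],   # e-book:     30 under-35, 10 over-35
         [20, 40]]   # mobile app: 20 under-35, 40 over-35
print(round(chi_square_stat(table), 2))  # -> 16.67
```

The statistic would then be compared against a chi-square critical value (3.84 for one degree of freedom at the 95% level); in practice a library routine such as one from SciPy would also return the p-value directly.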

5.6 Data Collection

Upon completion of the Concept Evaluation, all data will be downloadable and provided to the Testing & Evaluation Analysis Team, as shown in Figure 5-1. This includes:

1) An Excel Spreadsheet will contain either the raw data or a summary file.

2) A Summary Sheet will display each PIN, the respective question, and the human subject's response.

3) A Summary Table will display the unique ID (which is not the same as the PIN), the respective question, and the participant's response.

4) Summary Charts provide graphics based on data requirements.

5) The Participants Progress Tab will show each PIN assigned, Registration, Pre-Test, Post-Test, and Completion Status.


6) The User Course material Tab will show the PIN and the Course material Format selected for evaluation by the respective participant.

7) The Registration Database is only for use by the US Extramural Investigator to upload PINs for the evaluation.
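The plan does not say how the Summary Table's unique IDs (distinct from the PINs, per item 3 above) are produced. One simple approach, sketched here with a hypothetical layout of (PIN, question, answer) tuples, is to assign a stable unique ID per PIN while summarizing the raw responses:

```python
def build_summary_table(responses):
    """Replace each PIN with a stable unique ID, preserving row order.

    responses: iterable of (pin, question, answer) tuples.
    """
    unique_ids = {}   # pin -> unique ID, assigned in order of first appearance
    summary = []
    for pin, question, answer in responses:
        uid = unique_ids.setdefault(pin, f"ID-{len(unique_ids) + 1:03d}")
        summary.append((uid, question, answer))
    return summary

raw = [("4821", "Q1", "25"), ("4821", "Q2", "Female"), ("9130", "Q1", "31")]
for row in build_summary_table(raw):
    print(row)
# ('ID-001', 'Q1', '25')
# ('ID-001', 'Q2', 'Female')
# ('ID-002', 'Q1', '31')
```

Keeping the PIN-to-ID mapping out of the distributed summary preserves the separation between the two identifiers that the plan requires.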

Figure 5-1 - MLS Data Collection Website

5.7 Analysis Summary Questions

a) What are the predominant trends for the particular training audience?
b) How can the training delivery formats be optimized for the particular training audience?

6 Reports

6.1 Final Report

The Final Report will be released after it has been approved by the MLS Project Manager. Once approved, the Lockheed Martin International Training Team (LMITT) will provide the report in both a standard reporting format and as an e-book.
