Data Gathering Tools & Processes for Action Research
Workshop Series Session 2 of 6



A Solution Tree Assessment Literacy Series

Cassandra Erkens
A PLC and Assessment Associate

[email protected]


Tools to Support Action Research

Guide to the document:

General Research Info:
- Notes
- Data Management Considerations
- Sources for more information

Qualitative:
- Qualitative Research
- Surveys/Questionnaires
- Interviews, Focus Groups, Group Discussions, and Panel Presentations
- Journal Entries / Anecdotal Data

Quantitative:
- Quantitative Research
- Likert Scales, Proficiency Scales and Rubrics
- Comparability Tools

Notes:

Action Research can employ both qualitative and quantitative research methodologies. Social science methods (studying human conditions or phenomena) typically rely on multiple methods to balance perspectives.

Traditional experimental models that employ the controlled study approach (one group receives a treatment and the other group does not) are generally less acceptable in the social science arena. It would be considered at best bad practice and at worst malpractice to withhold a positive practice or impose a negative practice on humans – especially minors without legal rights to consent or withhold consent.

Qualitative data can be quantified (e.g. 82% of the students agreed that tracking their own results increased their attention to detail . . . . ) and quantitative data can be qualified (e.g. although the numbers show little gain in student achievement in a semester’s time frame, teachers observed a significant increase in motivation that they believe will eventually result in more substantial gains).
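The quantifying move described above can be sketched in a few lines of Python; the responses and the agree/disagree coding below are hypothetical, invented purely for illustration:

```python
# Quantifying qualitative data: percent of respondents agreeing
# (hypothetical responses, for illustration only)
responses = ["agree", "agree", "disagree", "agree", "neutral",
             "agree", "agree", "disagree", "agree", "agree"]

agree_count = sum(1 for r in responses if r == "agree")
percent_agree = 100 * agree_count / len(responses)
print(f"{percent_agree:.0f}% of the respondents agreed")  # prints "70% of the respondents agreed"
```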


Assessment Literacy Action Research Tools, © Erkens, 2014


Qualitative Research

What is it? Qualitative research is an investigatory process that emphasizes the importance of examining variables and phenomena in the natural settings in which they are found. Open-ended data gathering tools and methods lead to qualitative data in the form of direct quotes from the researchers and/or from the stakeholders in the study. The resulting data are subjective.

Qualitative Data Is Valuable

Produces more in-depth, comprehensive information that fully describes the context or background of a given phenomenon. It provides perceptual data that can describe context, setting, politics, variables, key players, and dynamics or interactions in and among all components.

Engages input in a wide variety of formats: observations, interviews, surveys, questionnaires, focus groups, group discussions, and the like.

Considerations for gathering qualitative data

- Maintain the big picture: always know how each question supports the overall data required.
- Identify the purpose behind every question asked and ensure that the question remains focused on its targeted expectation (e.g. do not accidentally measure unintended concepts).
- Anticipate how the information generated from each question will need to be organized and categorized.
- Avoid gathering unnecessary data.
- Ask questions that respondents would be willing to answer.
- Establish guidelines/norms/protocols that will protect participants and maintain anonymity when needed.
- Create directions, prompts, or questions that are easily understood and would be accurately interpreted by all participants.

Qualitative tools/processes:

- Surveys / Questionnaires
- Interviews / Focus Groups / Group Discussions / Panel Discussions
- Journal Entries / Anecdotal Data


Surveys / Questionnaires


Surveys and questionnaires are tools (usually in paper or online forms) created to elicit the feelings, beliefs, experiences, perceptions, or attitudes of some sample of individuals. As data gathering instruments, the tools could generate closed (choose from a list of options or rate your experience using a scale) or open-ended (provide your personal thoughts/contributions with no forced choices or rating scales) results.

For best results, such tools are most frequently designed to be brief, with a carefully preplanned set of questions aimed at generating specific information to meet a particular research need about a pertinent topic. Data are gathered from a particular population or a sampling of the population, so demographic data are most often requested at the beginning of the tool.

Developing good questions:

- Avoid questions that have more than one part (watch for words like 'and' or 'or'; e.g. "Do you favor or oppose changing grading policies and grading software?"). Respondents cannot clarify if they favor one but oppose the other.
- Avoid embedding assumptions in the question (e.g. "When you graduate from high school, will you go to college or launch directly into your career?" What about going to technical school, heading into the military, or taking a year to travel abroad? All other possibilities have been eliminated in the assumption that there are only two options).
- Clarify all terminology. Be as specific as possible with 'educationese' like academic or technical terms (e.g. not everyone understands what 'formative assessment' is or means).
- Break challenging questions / complex sentences into smaller parts.
- Avoid using emotional, leading, or evocative language that can bias responses (e.g. "Do you believe it's time to abandon our failing grading systems?").
- Balance questions to allow for positive or negative responses, making either option safe for respondents (e.g. "Do you support or oppose giving students options to retake tests?").
- Identify early how you will sort and interpret the results (e.g. if you are seeking to find the negative impact of something, but half of your questions are framed in the positive, interpreting the results will be much harder).


Interviews, Focus Groups, Group Discussions and Panel Presentations

These qualitative tools are used to gather personal input from individuals or groups. They can be quantified (Likert scales for self-rating, proficiency scales for observations, rubric scores for responses, etc.). Much thought must be given to the design of the questions or prompts, the recording processes (note taking, video or audio recording, observations, etc.), and the quantifying tools or features if needed.

When the data are strictly qualitative, the researchers must cut/paste and organize all of the comments based on themes or patterns that emerge across the respondents.
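The cut-and-paste organizing step can be approximated with simple keyword grouping; the comments and theme keywords below are hypothetical, for illustration only:

```python
# Organize open-ended comments under emergent themes by keyword
# (hypothetical comments and theme keywords)
comments = [
    "I liked tracking my own scores each week",
    "Retakes helped me learn from my mistakes",
    "Tracking my progress kept me motivated",
    "The retake policy reduced my test anxiety",
]
theme_keywords = {"self-tracking": ["tracking"], "retakes": ["retake"]}

themes = {theme: [] for theme in theme_keywords}
for comment in comments:
    for theme, keywords in theme_keywords.items():
        if any(k in comment.lower() for k in keywords):
            themes[theme].append(comment)

for theme, quotes in themes.items():
    print(theme, "->", len(quotes), "comments")
```

In practice the themes are not known in advance; they emerge from a first read of the data, after which a pass like this sorts the full set of quotes.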


When asking questions or providing prompts, keep them focused and few. Create tools, such as the following, that highlight and maintain the original purpose behind each question so the interviewer can keep the interviewee focused if needed.

Interviews: An oral process that involves one-on-one questions and answers. The process may be conducted face to face or remotely through technological connections. The interviewer generally records the interviewee's responses in some manner and may have a proficiency scale or rubric to qualify the answers provided. Interviews provide a great method to collect required data in systematic ways for targeted questions. Unlike surveys or questionnaires, interviews allow for nonverbal communication, which can add additional meaning to the sender's message. Interviews also extend the opportunity for the interviewer to ask follow-up questions to clarify provided answers as needed.

Focus Group: A collective interviewing process used with a small group of individuals who are provided with questions to answer. Respondents answer as they are moved to answer, and when it seems a question has been sufficiently answered, another question is asked. Focus groups are often gathered based on commonalities among group members, and the goal is to narrow the findings based on common themes that emerge. Focus groups are great research-based tools that enable a group of observers to gather perceptual, preference, or opinion-based data quickly. In focus groups, as in interviews, the interviewers will use recording tools. Typically, the data are managed by seeking common themes that emerge for individual questions.

Group Discussions: A structured, interactive oral process in which the exchange of ideas takes place in a systematic way. Group dynamics may hinder or enhance the process. As a research method, group discussions require the use of protocols and roles to minimize the negative impacts that group dynamics may have on the results. Unlike focus groups, group discussions will not have an interviewer providing a stream of questions; rather, a particular problem, topic, issue, or situation is posed at the beginning, and then the group works collectively to solve the problem, generate new understanding, develop an insight, or create alternatives.

Panel Discussions: A structured, oral process in which a select sample of experts in a particular field, or knowledgeable participants in a given situation, are gathered to provide input on a prompt or phenomenon. Panel discussions are more formal than focus groups, and observers can strategically target questions to specific participants on the panel. While focus groups tend to gather groups with like characteristics and perspectives, panel discussions tend to gather a broad array of perspectives to broaden the understandings that emerge from the questions.

A few examples follow:

Sample Interview (or Focus Group) Protocol
Understanding Leadership in a PLC Setting

Name of Interviewer: Date of Interview:

Name of Interviewee: Role of Interviewee:

1. First, we're interviewing educators we consider to be strong leaders in the work of PLCs. Do you believe there is a difference in leading within a PLC from leading within a traditional system? If so, what would that be? If not, what's the same about it?
Purpose: Identify distinctions of PLC leadership

2. If you were to isolate the 3 leadership practices that most help you be effective in leading PLC work, what would those 3 practices be and why is each important?
   Practice 1
   Practice 2
   Practice 3
Purpose: Identify key leadership practices in leading PLC work and establish the value of each strategy

3. How did you identify that those three practices were needed for yourself, and what did you have to do to 'grow' them in your own leadership work? Please be specific about any/all strategies you used to develop these practices:
   Practice 1
   Practice 2
   Practice 3
   Dig deeper exploration:
   - How did you know you needed each? What indicators were you watching as you developed each to make sure you had it?
   - Would your colleagues agree that these practices are in place for you? Why or why not?
   - What books, mentors, events, etc. were most supportive to your learning curve?
Purpose: Identify ability to see and develop a leadership practice in order to lead effectively

4. Are the practices you have been describing practices that should be encouraged at all levels of a PLC organization?
   Dig deeper questions:
   - If so, how do you establish / nurture / empower that practice in others with whom you work?
   - If not, why not?
Purpose: Transferability of practice

5. You have talked about how you identified and developed these practices for yourself as a leader. How do you 'grow' these same three practices in your colleagues?
   Dig deeper questions:
   - What specific strategies have you used to teach the practice, to monitor for the practice, to celebrate the practice? How do you know when it's in place as a practice?
   - Is it transferable as a practice? Why or why not?
Purpose: Transferability of practice

6. Describe some of the setbacks and mistakes that you might have experienced in your PLC journey when the practices you have defined were not in place or were not fully mastered by you in advance. Be specific with examples in your explanation.
Purpose: Setbacks of the practice(s) when not in use

7. Describe some of the greatest successes or triumphs that you might have experienced as a result of the practices you employ. Be specific with examples in your explanation.
Purpose: Celebration of the practice(s) used well

8. Is there anything else that you'd like to tell me about leading in a PLC? Any question I've not asked or any important idea I might not have discovered during our time together?
Purpose: Open option to fill in for that which we might not have thought to ask


Student Panel

Purpose


Educators seek information from a broad-based student population regarding what works in school and what doesn't. Student input will be used to help educators make improved decisions that support continued student achievement and social/emotional well-being.

Guidelines

- We would like to hear from everyone, so we invite everyone to participate at multiple points during the panel discussion.
- You have the right to pass.
- Ask questions if you aren't sure what we mean in our initial question to you.
- We would like you to be really honest because we want to learn from what you have to say.
- Avoid personal comments about teachers, and please don't use names.
- Everything you say here today will be kept confidential. We might reference the comments we heard today as we think about what we should be doing to help students, but we will not reference who made the comments.
- Nothing you say here today will affect your grades or your relationships with teachers.

Questions

Talk about what’s important to you – why do you think school is important? What purpose does school hold for you?

If you were the teacher, what would you do to help students be successful in school?

Describe a project/unit/lesson where you learned a lot…what was it that helped you learn?

Describe a project/unit/lesson where you learned little or nothing…what was it that got in the way?

When you don’t understand something, what helps you understand? What do you do?

What gets in the way of you being successful?

Besides grades, how do you know when you have learned something?

If a teacher is working with a student who doesn’t get it, or isn’t connecting with the teacher, what recommendations would you make for that teacher?

What advice would you give to younger students?

What plans do you have after high school? Who or what helped you make these plans?

Do you have any other questions or comments to make before we wrap up?

Journal Entries / Anecdotal Notes

Journal entries are formal written contributions that are often used to record the researchers' actions, beliefs, presuppositions, choices, or evolving insights during the research process. Journal entries can record quotes from participants being studied, direct observation notes, specific details or alterations in plans, etc. Researchers can use their own journals or review the journals of those they are studying (with prior permission, of course).

Teacher-generated observations can provide insight into what's happening in the classroom as well as what's happening behind the scenes (instructionally). The connection between the two spaces (classroom observations and the teacher's thinking) can clarify the effectiveness of the strategies employed. Observations must be recorded immediately, before the moment is lost and 'past' memories alter the reality. And, because observational data / anecdotal note taking is at risk of being lost, it's important to employ a consistent process (e.g. randomly select a handful of students to watch for 10 minutes a day; select a different group to watch each Monday, and so on) with a formal note-taking mechanism (a paper notebook with a color-coding system, or a handheld device with a note-taking option that uploads pictures, video, audio, etc.) in which to record all of the quotes and observations made. Begin each entry with the date and the students' initials, and then add any other pertinent information relative to the context or problem at hand.


Quantitative Research

What is it? Quantitative research is employed to find relationships between independent and dependent variables within a provided context. The research attempts to gather data by objective methods to reveal findings that involve relationships, predictions, and comparisons. The findings are reported in the form of numbers, logic, and convergent reasoning. The process is considered to be more objective than qualitative research in that the research methods attempt to remove the investigator and the feelings or biases of the human condition from the investigation.

Quantitative Data Is Valuable

Produces 'clean' data (free from subjectivity) that are logical; findings should be replicable.

Classifies features and evaluates phenomena numerically, resulting in frameworks or models that explain the phenomena.

The research study can usually be replicated or repeated, given its high reliability.


Considerations for gathering quantitative data

Gather data using structured tools with numerical features (e.g. Likert scales, rubrics, proficiency scales, etc.) to measure the observations or comments and create a basis for comparison.

Practice scoring / employing tools with consistency when working in a team to generate reliable and valid data.

Avoid inferring causality, particularly in nonrandomized designs or without further experimentation.

Explore anomalies and trends with persistence to find root causes.

Report surprises in findings or changes in research protocols that occurred along the way.

Provide a context that describes assumptions, gaps, or exclusions made or discovered along the way.

Quantitative tools/processes:

Use scales / proficiency levels to 'measure' the observations or comments.

When working in a team, use tools with consistent questions and processes for gathering data.

Arrange data in tables, charts, figures, or other non-textual forms for comparability features. Sample tools include:

- Data tables
- Run charts
- Bar graphs
- Pie charts
- Affinity diagrams / wordles (word clouds)


Likert Scales, Proficiency Scales and Rubrics

A Likert Scale is a continuum of numbers used to assign a quantitative value to a qualitative response (e.g. 1 = disagree and 5 = agree: on a scale of 1 to 5, how much do you agree that homework should be used for practice?). Likert scales can vary in range (1–100; 1–10; 1–5), but 1–5 is the most common range because the tighter the range, the more consistent the results.
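Summarizing a Likert item then reduces to simple arithmetic; the ratings below are hypothetical responses on a 1–5 agreement scale:

```python
# Summarize a 1-5 Likert item (hypothetical ratings)
ratings = [4, 5, 3, 4, 5, 2, 4]
mean_rating = sum(ratings) / len(ratings)
agree_share = sum(1 for r in ratings if r >= 4) / len(ratings)
print(round(mean_rating, 2))              # prints 3.86
print(f"{agree_share:.0%} rated 4 or 5")  # prints "71% rated 4 or 5"
```

Reporting both the mean and the share of top-box responses guards against a middling mean hiding a polarized response pattern.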

Patterned after the design of a Likert Scale, a Proficiency Scale is an assessment instrument used to measure a learner's ability to demonstrate competency against a given standard or set of standards. It is comprised of a set of descriptions of what a learner can do in a specific knowledge or skill set. Each level in the scale describes a stage in the learner's development of competence, from insufficient to mastery, as he/she works toward the standard. Scales can be used to score summative assessments so that a single number results for the entire standard. They can also be used to score single selected-response or short constructed-response assessment items.

A Rubric is an assessment instrument used to measure a learner's ability (proficiency level) to demonstrate competencies against a given set of criteria. A rubric is comprised of descriptive language that communicates expectations of quality for a performance task or tasks, which will have multiple criteria for success. Rubrics are used to reduce complexity and subjectivity by describing clear and specific expectations for quality in the performance.

Both scales and rubrics require the following features:

- A focus on measuring learning descriptively and objectively
- Quality criteria based on the standard expectations and the field of expertise for that standard
- A stated and consistent range of knowledge or skill that outlines developmental sophistication from insufficient to masterful
- Descriptions of the specific performance characteristics arranged in the levels of developmental sophistication

Samples follow:

A Likert Scale can be used more than once for a single question. For example, a researcher could measure for both how much something is valued and how much something is practiced (values/practice inventory):

PROFICIENCY SCALE FOR SUMMARIZE

Level 3
___ I can determine key ideas or events/points of a topic from a text.
___ I am concise (to the point) when stating the key ideas/points.
___ I can utilize key vocabulary (when appropriate) in my summary.
___ I can organize my summary in a way that aligns with the text structure.

Level 2
___ I can determine few key ideas or events/points of a topic from a text.
___ I can be somewhat concise (to the point) when stating the key ideas/points.
___ I can utilize few key vocabulary words (when appropriate) in my summary.
___ I can somewhat organize my summary.

Level 1
___ I cannot determine key ideas or events/points of a topic from a text.
___ I cannot be concise (to the point) when stating the key ideas/points.
___ I cannot utilize key vocabulary (when appropriate) in my summary.
___ I cannot organize my summary.
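One way to turn such a checklist into a single score, sketched under the assumption that the reported level is the highest level whose descriptors were all observed (the handout does not prescribe a scoring rule, so this is illustrative only):

```python
# Hypothetical scoring rule: report the highest level whose
# descriptors were all checked during the observation
checked = {
    3: [True, True, False, True],  # one level-3 descriptor not yet observed
    2: [True, True, True, True],
    1: [True, True, True, True],
}
level = max((lvl for lvl, boxes in checked.items() if all(boxes)), default=0)
print(level)  # prints 2
```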

RUBRIC FOR SUMMARIZE


Scales or rubrics can be used to monitor student work or evaluate observations. Scales like the following can be used to observe student discussions:

Rubric 1: Engineering Engaged Thinking

Level 4: Almost all students, almost all of the time, are verbally engaged in the high cognitive demand tasks of strategic thinking and extended thinking. It is clear that the students are demonstrating a deep understanding as exhibited through planning, using evidence, and cognitive reasoning (Assess, Revise, Critique, Draw Conclusions, Differentiate, Formulate, Hypothesize, Cite Evidence). It is clear that students have engaged in investigation that requires time to think and process multiple conditions of the problem (Synthesize, Analyze, Prove, Connect, Design, Apply Concepts). All conversations are sustained at a high level of challenge or difficulty.

Level 3: The majority of the students are engaged in at least one major activity during the lesson in which they are verbally engaged in the high cognitive demand task of strategic thinking. There is a large body of evidence during the activity that students are demonstrating a deep understanding as exhibited through planning, using evidence, and cognitive reasoning (Assess, Revise, Critique, Draw Conclusions, Differentiate, Formulate, Hypothesize, Cite Evidence). Most of the conversations are sustained at a high level of challenge or difficulty, but some of the conversations are sustained at a level of medium difficulty.

Level 2: Students are primarily engaged in skills and concepts. A little of their conversation might require students to engage in strategic thinking (one or two of the following: Assess, Revise, Critique, Draw Conclusions, Differentiate, Formulate, Hypothesize, Cite Evidence), but the majority of the time they are processing knowledge and skills at a level of application (Infer, Identify Patterns, Modify, Predict, Distinguish, Compare). Most of the conversations are sustained at a medium level of challenge or difficulty, but some of the conversations are sustained at a low level of difficulty.

Level 1: Students are beginning to demonstrate a shift from basic recall of concepts, facts, definitions, and processes (Recite, Recall, Label, Name, Define, Identify, Match, List, Draw, Calculate) to engaging in skills and concepts that draw upon recalled knowledge (Infer, Identify Patterns, Modify, Predict, Distinguish, Compare). Most of the conversations are sustained at a medium level of challenge or difficulty, but some of the conversations are sustained at a low level of difficulty.

© C Erkens, 2014



Rubric 2: Activating Learners as Collaborative Resources

Level 4: In addition to all of Level 3, students engage in productive, collaborative group work. They are synergistic, capitalizing on and blending each other's strengths and weaknesses:

a. The dialogue builds coherently on participants' ideas to promote improved collective understanding of a theme or topic.

b. Students expand their collective insights and repertoire of skills and strategies to address errors and gaps in understanding.

c. Students establish a social norm of excellence for all, relying on social pressure and collective support to motivate and encourage all learners to achieve mastery.

Level 3: Students collaboratively increase understanding by engaging in disciplined inquiry, rich dialogue, and active debate. In doing so, they challenge each other's thinking, extend current thinking, and create new possibilities. The conversation involves the sharing of ideas and is not completely scripted or controlled by one party (as in teacher recitation). In addition, students provide formal and informal peer feedback in the moment that aligns with teacher expectations and clarifies gaps in understanding, misconceptions in concepts, or errors in reasoning.

Level 2: Students are beginning to engage in dialogue and debate without depending on provided protocols. Some of the learners have begun challenging each other's thinking, extending current thinking, or creating new possibilities, but not all learners are fully invested in the learning component of the conversations. The mechanics (protocols and procedures) of the discussion are still focal points for many of the learners. Students can provide formal and informal peer feedback in the moment that aligns with teacher expectations but may not go so deep as to clarify gaps in understanding, misconceptions in concepts, or errors in reasoning.

Level 1: Students require guidance and instructional steps in order to participate in a dialogue or debate. The instructional focus is on the activity rather than the learning that might emerge because of the activity. The conversation is completely scripted or controlled by one party (as in teacher recitation). Any feedback that is offered is orchestrated by the teacher, and the conversations are isolated to a specific set of criteria. The quality of any feedback that is offered is contingent upon the personal criteria, or interpretation of the teacher's criteria, of the individual offering the information.

© C Erkens, 2014


Comparability Tools

Data Tables: Simple tables can be used to demonstrate comparable results. Data tables present numerical data inside a grid format comprised of labeled columns and rows. Sample 1 shows comparative data from 3 different classrooms (how many students scored at each proficiency level per classroom), and Sample 2 shows one classroom's set of data by student. Each table provides data in comparison from which inferences can be drawn:

Sample 1:

Level       1    2    3    4   Total
Teacher A   3   14   24    0      41
Teacher B   7   20   44    3      74
Teacher C  12   24   57   10     103
Total      22   58  125   13     218
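Row and column totals in a table like Sample 1 can be cross-checked in a few lines (the counts are taken from the table above; Teacher A's blank level-4 cell is read as 0, which the totals confirm):

```python
# Cross-check Sample 1: students per proficiency level (1-4) per teacher
counts = {
    "Teacher A": [3, 14, 24, 0],
    "Teacher B": [7, 20, 44, 3],
    "Teacher C": [12, 24, 57, 10],
}
row_totals = {teacher: sum(levels) for teacher, levels in counts.items()}
column_totals = [sum(col) for col in zip(*counts.values())]
grand_total = sum(row_totals.values())
print(row_totals)     # {'Teacher A': 41, 'Teacher B': 74, 'Teacher C': 103}
print(column_totals)  # [22, 58, 125, 13]
print(grand_total)    # 218
```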

Sample 2:


Run Charts: Run charts are linear graphs that allow you to track improvements by displaying data in a time sequence. Time is generally displayed on the horizontal (x) axis, and the measure that you are tracking is displayed on the vertical (y) axis. Using a run chart will allow you to see if improvement is really taking place by displaying a pattern of data that you can observe as you make changes to your process. A run chart could be used to display the progress of a single learner over time, or it could be used to display class averages over time.
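The data behind a run chart is just a measure per time period; a minimal sketch with hypothetical weekly class scores (a plotting library such as matplotlib could then draw the weeks on the x axis and the averages on the y axis):

```python
# Run-chart data: class average score per week (hypothetical scores)
weekly_scores = {
    "Week 1": [62, 70, 58, 66],
    "Week 2": [68, 74, 61, 70],
    "Week 3": [72, 78, 66, 75],
}
weeks = list(weekly_scores)
averages = [sum(s) / len(s) for s in weekly_scores.values()]
print(list(zip(weeks, averages)))  # averages rise: 64.0, 68.25, 72.75
```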

Bar graphs: Bar graphs display data using bars of differing heights to reveal comparable values. Bar graphs can be displayed in multiple formats:


Pie chart: A pie chart is a circular chart divided into sectors to illustrate numerical proportion; each sector's size shows its value's share of the whole.

Data can be compared within a chart and between charts.
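Computing each sector's share is simple proportion arithmetic; the counts below are hypothetical:

```python
# Hypothetical counts per proficiency level for one classroom.
sectors = {"Level 1": 3, "Level 2": 14, "Level 3": 24}

total = sum(sectors.values())
# Each sector's percentage of the whole pie, rounded to one decimal place.
shares = {label: round(100 * n / total, 1) for label, n in sectors.items()}
for label, pct in shares.items():
    print(f"{label}: {pct}% of the class ({sectors[label]} of {total} students)")
```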


[Sample pie chart: After Reading Intervention 1]


Affinity diagrams / Wordles (word clouds): An Affinity Diagram is a tool that gathers large amounts of language data (ideas, opinions, issues) and organizes them into categories based on their natural relationships and popularity/frequency or levels of importance. The Affinity process is often used to group ideas generated by brainstorming.

If researchers gather a large amount of qualitative data, the affinity diagram can be used to highlight themes, anomalies, and the frequency of ideas found. The process involves the following steps:

Step 1: Gather the data / generate the ideas.
Step 2: Display the ideas, usually using color-coded highlighters in text or small sticky notes outside of the larger body of text.
Step 3: Sort the ideas into categories.
Step 4: Create a label for each visible cluster.
Step 5: Create a diagram that reveals relationships between categories and frequency or popularity within a category.

A Wordle is a word cloud that quickly highlights which words appear most frequently in a given text. The most prevalent words (themes) are displayed in the largest text size in the cloud image.
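Under the hood, a word cloud is just a word-frequency count that drives text size. A minimal sketch, using a hypothetical excerpt of student comments and a small stop-word list:

```python
import re
from collections import Counter

# Hypothetical excerpt of student comments.
text = """Students said tracking their own data helped. Tracking results
made goals visible, and visible goals kept students motivated."""

words = re.findall(r"[a-z']+", text.lower())  # split into lowercase words
stopwords = {"their", "own", "and", "the", "said", "made", "kept"}
freq = Counter(w for w in words if w not in stopwords)

# A cloud tool would scale each word's text size by its count.
for word, count in freq.most_common(4):
    print(f"{word}: {count}")
```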


Data Management Considerations

Sampling

Sampling is a research process used to limit the scope of an investigation by selecting and studying a smaller number of the items or individuals that are part of a relatively large group. Sampling can be random or non-random (in non-random sampling, human judgment is used to identify items/individuals for study purposes).
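Random sampling can be sketched with Python's standard library; the roster below is hypothetical, and the fixed seed just makes the draw repeatable:

```python
import random

# Hypothetical class roster of 30 students.
roster = [f"Student {i}" for i in range(1, 31)]

random.seed(7)                        # fixed seed so the example is repeatable
sample = random.sample(roster, k=6)   # each student is equally likely to be chosen
print(sample)
```

`random.sample` draws without replacement, so no student appears twice; a non-random sample would instead be hand-picked by the researcher.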

Forms such as the following can be used for random sampling during observations.

Interval Recording Observation Form
For a single student / isolated target behavior

Student’s Name: [Click here and type name]

Environment: [Click here and type name]

Target Behavior: [Click here and type name]


Start Time: [Click here and type time]

Observer: [Click here and type name]

Date: [Click here and type date]

10-minute observation period: 30-second intervals (two intervals per minute)

Time:      9:00   9:01   9:02   9:03   9:04   9:05   9:06   9:07   9:08   9:09
Interval:  1  2   3  4   5  6   7  8   9 10  11 12  13 14  15 16  17 18  19 20

Key: + is marked for each occurrence of target behavior; - is marked if target behavior did not occur
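Summarizing a completed form is a simple tally; the marks string below is a hypothetical 20-interval record:

```python
# Hypothetical 10-minute record: 20 thirty-second intervals,
# '+' = target behavior occurred, '-' = it did not.
marks = "++-+--+++-+--+-++--+"

occurrences = marks.count("+")
rate = 100 * occurrences / len(marks)  # percent of intervals with the behavior
print(f"Target behavior in {occurrences} of {len(marks)} intervals ({rate:.0f}%)")
```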

Frequency Recording Observation Form
For multiple students

Environment: [Click here and type name]

Target Behavior: [Click here and type name]

Start Time: [Click here and type name]

Observer: [Click here and type name]

Date: [Click here and type name]

5-minute observation period: 30-second intervals

Names   | 1 (+/-) | Note | 2 (+/-) | Note | 3 (+/-) | Note | 4 (+/-) | Note | 5 (+/-) | Note
Name:   |         |      |         |      |         |      |         |      |         |
Name:   |         |      |         |      |         |      |         |      |         |
Name:   |         |      |         |      |         |      |         |      |         |

Key: + is marked for each occurrence of the target behavior; - is marked if the target behavior did not occur. Provide notes regarding what behavior was occurring at the time of observation.


Corroboration

Corroboration is a data process employed to assure accuracy in data findings. Data are corroborated (from source to source, from researcher to researcher, from question to question) to ensure that the research findings accurately reflect people's perceptions, whatever they may be. Corroborating findings does not mean determining whether or not people's perceptions are accurate or truthful. Engaging in the process helps researchers increase their collective understanding and increases the likelihood that their findings will be credible and valuable in a larger context.

Triangulation

Triangulation is a research process used to corroborate a finding; it generally means multiple data sources are gathered to validate the finding. There are several variations of triangulation:

One type involves the convergence of multiple data sources (e.g., semester grades, test results, homework scores, and large-scale assessment data).

Another type is methodological triangulation, which involves the convergence of data from multiple data collection methods.

A third triangulation procedure is investigator triangulation, in which multiple researchers are involved in an investigation (e.g., three classrooms experienced the same phenomenon).

Related to investigator triangulation is researcher-participant corroboration, which has also been referred to as cross-examination.

Other procedures can be used to improve understanding and/or the credibility of a study. These include a research or inquiry audit, peer debriefing, and the seeking of negative cases in the field that might disconfirm interpretations.

Possible resources for more information:

OK State support for academic research:
http://www.okstate.edu/ag/agedcm4h/academic/aged5980a/5980/newpage16.htm

Rubrics and Assessment Strategies:
http://ccar.wikispaces.com/Rubrics-+Action+Research

Marzano Research Laboratory Proficiency Scale Bank (must create a profile to access the scales):
http://www.marzanoresearch.com/resources/proficiency-scale-bank

U.S. Department of Health and Human Services, Health Resources and Services Administration:
http://www.hrsa.gov/quality/toolbox/methodology/performanceimprovement/part2.html

USC Libraries:
http://libguides.usc.edu/content.php?pid=83009&sid=615867

Wordles: http://www.wordle.net/
