Category Archives: Peer Review

Measures of Effective Listening

Thirty-two years ago, Donald E. Powers wrote Considerations for Developing Measures of Speaking & Listening. It was published by the College Board, which stresses how important these skills are to a student’s academic success, particularly in its Advanced Placement programs, yet has not validated any standardized tests to measure them. This synthesis of some of the research on listening offers advice to teachers enrolled in our MOOC, Teaching Speaking & Listening Skills.

Research shows that students can comprehend spoken language two to three grade levels above what they can read. Listening while reading helps people have successful reading experiences, in which they read with enjoyment and accuracy, and has been shown to support decoding, a fundamental part of reading. The average person talks at a rate of about 125 to 175 words per minute, while we can listen to and comprehend up to 450 words per minute (Carver, Johnson, & Friedman, 1970).

Listening has been identified as one of the top skills employers seek, both in entry-level employees and in those being promoted. Even though most of us spend the majority of our day listening, it is the communication activity that receives the least instruction in school (Coakley & Wolvin, 1997). On average, viewers who have just watched and listened to the evening news can recall only 17.2 percent of its content.

Listening is critical to academic success. Conaway (1982) examined an entire freshman class of more than 400 students, who were given a listening test at the beginning of their first semester. After the first year of college, 49 percent of students who scored low on the listening test were on academic probation, compared with only 4.42 percent of those who scored high. Conversely, 68.5 percent of high scorers were honors students after the first year, while only 4.17 percent of low scorers attained the same success.

Students do not have a clear concept of listening as an active process that they can control. Students find it easier to criticize the speaker as opposed to the speaker’s message (Imhof, 1998). Students report greater listening comprehension when they use the metacognitive strategies of asking pre-questions, interest management, and elaboration strategies (Imhof, 2001). Listening and nonverbal communication training significantly influences multicultural sensitivity (Timm & Schroeder, 2000).

Understanding is the goal of listening. Our friend Erik Palmer suggests that before students engage in purposeful listening, their teachers should tell them what to attend to. We need to teach students what to respond to, how to respond, and when to respond. For example: “Today we are going to listen to five speeches. For each speech, we are listening only for LIFE. After each speaker finishes, clap, then take a minute to evaluate the level of passion they put into their speech. After that, write down three suggestions for how they could improve the LIFE in their speech (e.g., by stressing a different word in ‘you stole my red hat’ to change its effect).”

A classroom teacher who reads Powers’s (1984) College Board study will understand that speaking, listening, reading, and writing are all tightly correlated. Empirically measuring oral communication skills requires many hours of assessment on small, controlled populations, the opposite of what we experience in public schools, where it is not feasible to measure each skill precisely. The important takeaway is that teachers need to prepare their students to listen actively and avoid distractions, and to teach listening and speaking alongside core academic content by training students to evaluate how well their classmates accomplish various speaking functions. While there are reliability issues with classroom peer review models, the benefits of “learning by evaluation” far outweigh the negatives.

References

http://www.listen.org/WhitePaper

http://www.skillsyouneed.com/ips/listening-skills.html

http://d1025403.site.myhosting.com/files.listen.org/Facts.htm

http://www.csun.edu/~hcpas003/effective.html

#TeachWriting Coaching Student Writers

Corbin Moore and I taught an online class called Improving Historical Reading and Writing over the summer. We learned that one of the major barriers to non-ELA teachers assigning writing in their classes is simply that they don’t feel comfortable providing feedback on that writing. They are also concerned about increasing their workload. Our experiences as classroom teachers have led us to include more writing in our daily practices. We hope this chat encourages other teachers to do the same.

Q1 With the recent emphasis on increasing writing in all subjects, how has your job as a teacher changed?
Goal-setting strategies are terrific. Here is a longer paper Scott wrote about using goal-setting strategies as formative assessment.
Shorter, more frequent, focused skill-building writing tasks show great promise in increasing positive attitudes toward writing. They can be graded quickly or used for peer review.
Q2 What is your definition of effective feedback?
This John Hattie article demonstrates that feedback has a strong effect on student learning. Unfortunately, that effect is not always positive.
Turnitin has done some extensive research on feedback and discovered a gap between teacher and student perceptions about what constitutes effective feedback.
Q3 What strategies/tools have you found valuable in providing feedback and/or peer review?
Google Docs
Rubrics/Criteria Charts
Q4 How is coaching student writers different from teaching writing? What are the advantages to coaching versus teaching writing?
Q5  What are the best writing tools, strategies, and frameworks for teaching writing and coaching students through the writing process?
Q6 What would happen if you stopped evaluating writing and switched to coaching?
Q7 How can teaching speaking and listening skills help improve student writing?
Extra Credit

Role of Robo-Readers

Grammarly

I have increased the amount of writing in my high school World History classes over the last five years. At first, I required two DBQs per semester; then I increased that to four DBQs per semester. Next, I added a five-page research paper at the end of the school year. Now, I assign research papers during each semester. If I were to allot ten minutes of reading and grading time for each DBQ, that would be 80 minutes of grading per student, which, multiplied by last year’s total student load of 197, comes to 263 hours of reading and grading. Assuming I spent 30 minutes correcting each research paper, an additional 197 hours of grading would be added to my workload. Where do I find those extra 460 hours per year? Do I neglect my family and grade non-stop every weekend? No. I use a combination of robo-readers, or automated essay scoring (AES) tools, and structured peer review protocols to help my students improve their writing.
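
The arithmetic behind that estimate is easy to reproduce. Here is a minimal sketch, using only the assumptions stated above (eight DBQs per student per year at ten minutes each, two research papers at thirty minutes each, and 197 students):

```python
# Back-of-the-envelope grading workload, using the figures from this post.
# Every number here is an assumption from the text, not a measurement.

STUDENTS = 197            # last year's total student load
DBQS_PER_YEAR = 8         # four DBQs per semester
MIN_PER_DBQ = 10          # reading/grading time per DBQ
PAPERS_PER_YEAR = 2       # one research paper per semester
MIN_PER_PAPER = 30        # correcting time per research paper

dbq_hours = STUDENTS * DBQS_PER_YEAR * MIN_PER_DBQ / 60        # ~263 hours
paper_hours = STUDENTS * PAPERS_PER_YEAR * MIN_PER_PAPER / 60  # ~197 hours

print(f"DBQ grading:   {dbq_hours:.0f} hours")
print(f"Paper grading: {paper_hours:.0f} hours")
print(f"Total:         {dbq_hours + paper_hours:.0f} hours per year")
```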

Hemingway App

As AES has matured, a myriad of programs that are free to educators has proliferated. Grammarly claims to find and correct ten times more mistakes than a word processor. The Hemingway App makes writing bold and clear. PaperRater offers feedback by comparing a writer’s work to that of others at the same grade level. It ranks each paper on a percentile scale examining originality, grammar, spelling, phrasing, transitions, academic vocabulary, voice, and style, then provides students with an overall grade. My students use this trio of tools to improve their writing before I ever look at it.

PaperRater

David Salmanson, a fellow history teacher and scholar, questioned my reliance on technology. The purpose of these back-and-forth posts is to elaborate on the continuum of use that robo-readers may develop in the K-12 ecosystem. Murphy-Paul argues that a non-judgmental computer may motivate students to try, to fail, and to improve more than almost any human can. Research on a program called e-rater supports this, finding that students using it wrote almost three times as many words as their peers who did not. Perelman rebuts this by pointing out that robo-graders do not score by understanding meaning but by gross surface measures, especially length and pretentious language. He feels students should not be graded by machines making faulty assumptions with proprietary algorithms.
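
To make Perelman’s point concrete, here is a deliberately crude sketch, not any vendor’s actual algorithm, of the kind of surface-feature scoring he criticizes: it rewards length and long words and never reads for meaning.

```python
# A toy "robo-grader" that scores only on surface features -- the gross
# measures Perelman criticizes. An illustration only, not a real AES model.

def surface_score(essay: str) -> float:
    words = essay.split()
    if not words:
        return 0.0
    length_points = min(len(words) / 500, 1.0) * 60          # longer = better, capped
    big_words = sum(1 for w in words if len(w) >= 8)          # "pretentious" vocabulary
    vocab_points = min(big_words / len(words) * 5, 1.0) * 40
    return round(length_points + vocab_points, 1)             # 0-100, meaning never read

short_clear = "The treaty failed because neither side trusted the other."
long_padded = ("Notwithstanding multitudinous considerations, " * 40 +
               "the aforementioned circumstances eventuated accordingly.")
print(surface_score(short_clear))   # low score despite a clear claim
print(surface_score(long_padded))   # much higher score despite saying little
```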

Both of these writers make excellent points; however, classroom teachers, especially those of us in low-SES public schools, are going to have a difficult time improving discipline-specific writing instruction, increasing the amount of writing assigned, and providing feedback that motivates students to revise their work prior to a final evaluation. We will need to find an appropriate balance between computerized and human feedback for our students.

Mayfield maintains that automated assessment changes the locus of control, prompting students to enlist the teacher as an ally in addressing the feedback from the computer. I have found that students in my class reluctantly revise their writing on the advice of a robo-reader, but real growth happens when students discuss in small groups what works and what doesn’t. Asking students to write a revision memo detailing the changes they have made in each new draft helps them see writing as an iterative process instead of a one-and-done assignment.

Read David’s post and participate in our #sschat on this topic on July 13th at 7 p.m. ET/4 p.m. PT.

Robo-Readers

Student Perceptions of Writing Feedback

What Students Say About Instructor Feedback was a 2013 study that examined student perceptions of instructor feedback delivered through the Turnitin.com platform. Students wanted timely feedback but rarely received it: 28 percent of respondents reported that their instructors took 13 or more days to provide feedback on their papers. Students preferred feedback about grammar, mechanics, composition, and structure, and they found feedback on thesis development valuable. Despite high rates of electronic submission, students did not report receiving electronic feedback at nearly the same rate.

QuickMark Categories

From The Margins analyzed nearly 30 million marks left on student papers submitted to Turnitin.com’s service between January 2010 and May 2012. QuickMark comments are a preloaded set of 76 comments, covering four categories, that instructors can drag and drop onto students’ papers within the Turnitin online grading platform.

This study looked specifically at the frequency and trends of the margin comments teachers employed. The most common QuickMarks are listed below.

Top 10 QuickMarks

This 2014 follow-up study found a gap between how effective students considered face-to-face feedback and how often they received it: 77 percent of students viewed face-to-face comments as “very” or “extremely effective,” but only 30 percent received face-to-face feedback “very” or “extremely often.”

Students perceived general comments on their writing to be “very” or “extremely effective.” However, a smaller percentage of educators felt the same. Even though 68 percent of students reported receiving general comments “very” or “extremely often,” and 67 percent of students said this feedback type was “very” or “extremely effective,” only one-third of educators viewed general comments as “very” or “extremely effective.”

Students preferred “suggestions for improvement” over “praise or discouragement.” The greatest percentage of students found suggestions for improvement “very” or “extremely effective,” while the smallest percentage said the same for “praise or discouragement.”

Students and educators differed on what constituted effective feedback. The gap between educators and students was greater than 15 percentage points in the majority of areas examined. The biggest differences between educator and student responses occurred with “general, overall comments about the paper” and “specific notes written in the margins.”

Recommendations

Recorded voice or video comments may be a time-saving substitute for face-to-face feedback. Only five percent of student respondents reported receiving voice or video comments “very” or “extremely often,” compared with the 30 percent who reported receiving face-to-face feedback that often. As a way to negotiate time pressures while still providing personalized feedback, educators might consider recording voice or video comments on student work. Many grading platforms and learning management systems (LMS) offer this feature.

This study identified a clear relationship between exposure to feedback and perceived effectiveness of feedback: the more of a given type of feedback students receive, the more they come to value it. It is therefore important to provide students with different types of feedback and to evaluate what is helpful for them. Teachers should discuss with their classes the types of feedback they typically provide, then ask students to share which types they have found most helpful.

The definition of “effective feedback” will vary from course to course. Poll your class to find out what types of feedback students think would improve their writing.

Sources

http://go.turnitin.com/What-Instructors-Say-on-Student-Papers

http://go.turnitin.com/what-students-say-about-teacher-feedback

Closing the Gap infographic

CCSS Presentation Materials

Goal-setting approaches to student writing

I wrote a paper on the results of implementing this program at two high schools. I give this presentation to inspire teachers to consider alternative grading methods and to increase the number of writing assignments they require of their students. I have found that over the course of the year my students can double, if not triple, the number of words they put on a page in one class period, and that this level of competition really motivates them. The next trick is to partner with an English teacher, who can help them turn the quantity they are now proficient in into quality writing. This work has borrowed heavily from Chip Brady and the excellent curriculum at The DBQ Project, who provided inspiring professional development and encouraged me along the way.

Peer review with tech

Many high-quality studies influenced my decision to start evaluating student writing quantitatively: De La Paz (2005); De La Paz and Felton (2010); Monte-Sano (2008, 2011); and Monte-Sano and De La Paz (2012). I strongly feel that History/Social Science departments should report descriptive statistics about their students’ writing in order to derive a common set of writing expectations by age and grade level. Further, recent advances in automated essay scoring may make it possible for students to receive feedback from a computer before approaching the teacher to partner in improving the writing together. See this Lightside Labs Revision Assistant video, and feel free to expand on this annotated bibliography tracking the major players in the automated essay scoring market. K-12 teachers should provide input to the companies developing these products, and the lefty-liberal in me hopes all of these products will eventually be open source, like the PaperRater product my students recently used on a speech project.
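
What would reporting descriptive statistics look like in practice? Here is a minimal sketch, assuming a department keeps essay word counts in a simple spreadsheet export; the file name and column names are hypothetical, not a prescribed format.

```python
# Minimal sketch: descriptive statistics on student writing by grade level.
# Assumes a hypothetical CSV with columns: grade_level, word_count.
import csv
import statistics
from collections import defaultdict

counts_by_grade = defaultdict(list)
with open("essay_word_counts.csv", newline="") as f:   # hypothetical export file
    for row in csv.DictReader(f):
        counts_by_grade[row["grade_level"]].append(int(row["word_count"]))

for grade, counts in sorted(counts_by_grade.items()):
    mean = statistics.mean(counts)
    median = statistics.median(counts)
    spread = statistics.stdev(counts) if len(counts) > 1 else 0.0
    print(f"Grade {grade}: n={len(counts)}, mean={mean:.0f} words, "
          f"median={median:.0f}, stdev={spread:.0f}")
```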

Peer review without tech

Most of the work I reference here comes from O’Toole (2013), Brookhart (2013), and Bardine and Fulton (2008). Learning by evaluation has long been used by English teachers; it is time for history teachers to embrace the practice. If the CCSS truly get us off the breadth-versus-depth historical coverage treadmill, History/Social Studies teachers are going to need tools and strategies to assess the writing they assign. Having students read each other’s writing gives them much-needed context; before I wrote my dissertation, I read dozens of others on the same subject. History teachers will need to learn how to use mentor texts and provide general feedback instead of making margin notations on every paper they receive. English teachers have long used peer rubrics and criteria charts to help students with their writing; it is time for history teachers to start incorporating those tools into their classrooms.

CCSS Presentation Resources

Free Plagiarism Tools

http://elearningindustry.com/top-10-free-plagiarism-detection-tools-for-teachers

Robo-Readers are Better than Human Readers

http://hechingerreport.org/content/robo-readers-arent-good-human-readers-theyre-better_17021/

Flunk the Robo-Readers

http://www.bostonglobe.com/opinion/2014/04/30/standardized-test-robo-graders-flunk/xYxc4fJPzDr42wlK6HETpO/story.html

Where Does Automated Essay Scoring Belong in K12 Education

https://www.edsurge.com/n/2014-09-22-where-does-automated-essay-scoring-belong-in-k-12-education

Free Robo-Graders

Hemingway App

http://www.hemingwayapp.com/

Revision Assistant

https://www.revisionassistant.com/#/

Paper Rater

http://www.paperrater.com/

Peer Review Tools

LDC Rubrics

http://ldc.org/how-ldc-works/modules/what-task/rubric

Sample Online Rubric

https://docs.google.com/forms/d/13tJkvBuoufBvNB885ZC8BMbGZF5o4XVuw929_i9Lrdc/edit#

Criteria Chart

https://docs.google.com/file/d/0B6qmQLlSxgMyWjRjekIwY3FkTFE/edit

Economic Systems Criteria Chart

https://drive.google.com/file/d/0B6qmQLlSxgMyZ3BnY3ZkX3VzV3M/view?usp=sharing

Peer Review Discussion Guide

https://docs.google.com/forms/d/13tJkvBuoufBvNB885ZC8BMbGZF5o4XVuw929_i9Lrdc/edit#

Economic Systems Argumentative Rubric

https://drive.google.com/file/d/0B6qmQLlSxgMyNWR6OGVlbk82OEk/view?usp=sharing

Providing Effective Feedback

https://historyrewriter.com/2014/11/10/providing-effective-feedback/

Video Tutorials

Changing Weak Writing into Strong Writing

https://www.youtube.com/watch?v=wJMyvwjpm-s

NCSS Presentation Resources

Let the live blogging from the 94th annual NCSS conference begin. I took the redeye out of LA last night, and on the plane I ran into an old high school friend whom I hadn’t seen in 29 years. It’s a small BIG world after all. I have fond memories of Boston’s Logan Airport, because that is where I met my wife during a snowstorm flight delay back in 1998.


Tomorrow at 9:00 a.m. EST, I present on Innovative Social Studies Strategies in Room 310 at the Hynes Convention Center. This post will house all of the documents and lessons that I reference in my presentation. Feel free to download and repurpose them; teachers are the best recyclers. You may view and use the slides from my presentation.

My topics are 1) goal-setting approaches to student writing, 2) peer review (with or without tech), and 3) using social media as a prewriting strategy. I should probably acknowledge that I have stolen everything here from smarter people.

Goal-setting approaches to student writing

I wrote a paper on the results of implementing this program at two high schools. I give this presentation to inspire teachers to consider alternative grading methods and to increase the number of writing assignments they require of their students. I have found that over the course of the year my students can double, if not triple, the number of words they put on a page in one class period, and that this level of competition really motivates them. The next trick is to partner with an English teacher, who can help them turn the quantity they are now proficient in into quality writing. This work has borrowed heavily from Chip Brady and the excellent curriculum at The DBQ Project, who provided inspiring professional development and encouraged me along the way.

Peer review with tech

Many high-quality studies influenced my decision to start evaluating student writing quantitatively: De La Paz (2005); De La Paz and Felton (2010); Monte-Sano (2008, 2011); and Monte-Sano and De La Paz (2012). I strongly feel that History/Social Science departments should report descriptive statistics about their students’ writing in order to derive a common set of writing expectations by age and grade level. Further, recent advances in automated essay scoring may make it possible for students to receive feedback from a computer before approaching the teacher to partner in improving the writing together. See this Lightside Labs Revision Assistant video, and feel free to expand on this annotated bibliography tracking the major players in the automated essay scoring market. K-12 teachers should provide input to the companies developing these products, and the lefty-liberal in me hopes all of these products will eventually be open source.

Peer review without tech

Most of the work I reference here comes from O’Toole (2013), Brookhart (2013), and Bardine and Fulton (2008). Learning by evaluation has long been used by English teachers; it is time for history teachers to embrace the practice. If the CCSS truly get us off the breadth-versus-depth historical coverage treadmill, History/Social Studies teachers are going to need tools and strategies to assess the writing they assign. Having students read each other’s writing gives them much-needed context; before I wrote my dissertation, I read dozens of others on the same subject. History teachers will need to learn how to use mentor texts and provide general feedback instead of making margin notations on every paper they receive. English teachers have long used peer rubrics and criteria charts to help students with their writing; it is time for history teachers to start incorporating those tools into their classrooms.

Social media as a prewriting strategy

Back in August, I gave a full description of the Twittercide of Socrates. My students were extremely motivated by this assignment and turned in an average of 250 words per essay. I also created an assessment in which the tweets were mixed up and students had to put them back into sequential order according to what happened before, during, and after the trial.


Here are the materials for Dr. Margarita Jimenez-Silva and Mrs. Ruth Luevanos’ presentation on Bruce Springsteen’s “Sinaloa Cowboys”: the lyrics, the directions, and the slides.

Please check back, as I will add more resources to this page as readers share tips and best practices. Lastly, if you would like to serve as a member of the instructional community, please fill out this form.

Creating Peer Review Systems

With the implementation of the Common Core State Standards, writing instruction will become distributed throughout the school. Writing from sources requires students to respond to the ideas, events, facts, and arguments presented in the texts they are assigned. Teachers can improve student literacy skills by increasing writing assignments, yet some teachers are reluctant to assign more frequent writing tasks because they fear it will increase their workload.

Peer Review

Implementing an effective peer review program with free online polling tools like SurveyMonkey, Poll Everywhere, and Google Forms can transfer the burden of grading from teachers to students. The grading process becomes a student-centered, collaborative, learning-by-evaluation activity. O’Toole (2013) suggested that peer assessment should be structured, with a learning design that includes “phases of activity, peer assessment, reviewing and reflecting” (p. 5). Brookhart (2013) recommended student-generated rubrics as the basis for highly effective peer grading systems. Bardine and Fulton (2008) advocated using revision memos to have students explicitly address weaknesses in drafts and develop confidence in academic writing.
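
As one concrete way to run this with Google Forms, the sketch below aggregates peer ratings exported from a form into a per-author average and flags large disagreements for the teacher to re-read. The file name and column names are hypothetical, not a prescribed format.

```python
# Minimal sketch: aggregate peer-review ratings exported from a Google Form.
# Assumes a hypothetical CSV with columns: author, reviewer, thesis_score (1-4).
import csv
from collections import defaultdict

scores = defaultdict(list)
with open("peer_review_responses.csv", newline="") as f:   # hypothetical form export
    for row in csv.DictReader(f):
        scores[row["author"]].append(int(row["thesis_score"]))

for author, ratings in sorted(scores.items()):
    avg = sum(ratings) / len(ratings)
    spread = max(ratings) - min(ratings)
    flag = "  <-- reviewers disagree; teacher should re-read" if spread >= 2 else ""
    print(f"{author}: average {avg:.1f} from {len(ratings)} peer reviews{flag}")
```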

Peer review programs give students practice in developing the skills necessary to recognize effective thesis statements, use textual evidence, and refine arguments. Learning by evaluation significantly improves students’ self-assessment abilities and lays the groundwork for self-improvement. Learning-by-evaluation programs should therefore focus on one or two aspects of effective writing, include student discussion to drive reflection about writing as an iterative process, and allow increased instructional time for student revision.

I polled English teachers at my school and found that 39% were confident in their ability to teach students how to write a thesis. A survey of our students, however, found that only 9% were confident in their ability to develop a thesis statement. This gap suggests teachers need to give students more practice in developing, identifying, and assessing thesis statements. Further, teachers can showcase student exemplars and improve weak thesis statements via think-alouds. Once students gain more confidence and proficiency in writing thesis statements, teachers can move on to other factors in effective academic writing, such as claims, rebuttals, argumentative strategies, document usage, and citations.

References

Bardine, B., & Fulton, A. (2008). Analyzing the benefits of revision memos during the writing and revision process. The Clearing House, 81(4), 149-154.

Brookhart, S. M. (2013). How to create and use rubrics for formative assessment and grading. Teacher Librarian, 40(4), 52.

O’Toole, R. (2013). Pedagogical strategies and technologies for peer assessment in Massively Open Online Courses (MOOCs). Discussion paper. Coventry, UK: University of Warwick (unpublished).