#sschat on Robo-Readers

Rise of the Robo-Readers

July 13 @ 4pm PDT/7pm EDT #sschat
co-mods: @scottmpetri & @DavidSalmanson

A primer on auto essay scoring

https://historyrewriter.com/2015/07/03/role-of-robo-readers/

Q1 What is your definition of AES, robo-reading, or robo-grading? #sschat

Q2 What is your greatest hope and/or your worst fear about technology-assisted grading? #sschat

Q3 When is it ok for a computer to assign grades to student work? #sschat

Q4 How can classroom teachers test & evaluate a robograder without disrupting learning? #sschat

Q5 What would parents think if Ts required Ss to use robo-graders before submitting work? #sschat

Q6 What would school admins say if you used a robograder in your classes? #sschat

Q7 How would you use a robograder in your History-Social Science class? #sschat

Q8 How could robo-readers help teachers gamify the art and process of writing? #sschat

Shameless plug: https://www.canvas.net/browse/ncss/courses/improving-historical-writing has a module on writing feedback & AES. The course is free and open until Sept. 22. #sschat

Teaser Tweets (to promote the chat after Monday – 7/6).

Are robo-graders the future of assessment or worse than useless? http://wp.me/4SfVS #sschat

Robo-readers are called Automated Essay Scorers (AES) in education research. http://wp.me/4SfVS  #sschat

In one study, Ss using a robo-reader wrote 3X as many words as Ss not using the RR. http://wp.me/4SfVS #sschat

Robo-readers produce a change in Ss behavior from never revising to 100% revising. http://wp.me/4SfVS #sschat

Criticism from a human instructor has a negative effect on students’ attitudes about revisions. http://wp.me/4SfVS #sschat

Comments from the robo-reader produced overwhelmingly positive feelings for student writers. http://wp.me/4SfVS #sschat

Computer feedback stimulates reflectiveness in students, something instructors don’t always do. http://wp.me/4SfVS #sschat

Robo-graders are able to match human scores simply by over-valuing length compared to human readers. http://wp.me/4SfVS #sschat

None of the major testing companies allow open-ended demonstrations of their robo-graders. http://wp.me/4SfVS #sschat

Toasters sold at Walmart have more gov. oversight than robo-readers grading high stakes tests. http://wp.me/4SfVS #sschat

What is the difference between a robo-reader & a robo-grader? http://wp.me/4SfVS #sschat

To join the video chat, follow @ImpHRW and sign in at www.Nurph.com. Enter the #ImpHRW channel. Note that you will still need to add #sschat to your tweets.

Resources

https://www.grammarly.com/1

http://www.hemingwayapp.com/

http://paperrater.com/

http://elearningindustry.com/top-10-free-plagiarism-detection-tools-for-teachers

http://hechingerreport.org/content/robo-readers-arent-good-human-readers-theyre-better_17021/

http://www.bostonglobe.com/opinion/2014/04/30/standardized-test-robo-graders-flunk/xYxc4fJPzDr42wlK6HETpO/story.html#

http://www.newscientist.com/article/mg21128285.200-automated-marking-takes-teachers-out-of-the-loop.html#.VZYoNEZZVed

Promo Video for a forthcoming Turnitin.com product

https://www.youtube.com/watch?v=aMiB4TApZa8

A longer paper by Shermis & Hamner
www.scoreright.org/NCME_2012_Paper3_29_12.pdf

Perelman’s full-length critique of Shermis & Hamner

http://www.journalofwritingassessment.org/article.php?article=69

If you are really a hard-core stats & edu-research nerd

http://www.journalofwritingassessment.org/article.php?article=65

https://www.ets.org/research/policy_research_reports/publications/periodical/2013/jpdd

http://eric.ed.gov/?q=source%3a%22Applied+Measurement+in+Education%22&id=EJ1056804

National Council of Teachers of English Statement

http://www.ncte.org/positions/statements/machine_scoring

For Further Research

Williamson, D. M., Xi, X., & Breyer, F. J. (2012). A framework for evaluation and use of automated scoring. Educational Measurement: Issues and Practice, 31(1), 2-13.

Providing Feedback

Hi Everyone,  

This post is for the 411 participants in the MOOC Helping History Teachers Become Writing Teachers. Thanks to the many of you who filled out my TEO Survey. Here are the results so far.

Participants

We start module two on a high note, wrapping up some great conversations from Week 1. We had 119 introductions, 76 posts in the writing-about-writing conversation, 41 discussions about SRSD instruction, and about 51 discussions about detecting plagiarism. I use the number of active discussions to gauge how many of the 411 teachers enrolled in the class are actually participating.

This week, we will start with some videos on providing feedback. The videos are tagged elementary, secondary, and college; feel free to view the one most relevant to your students. Don’t feel obligated to watch all three.

We will continue our dialogue about the difficulty of providing effective feedback. There are some short pro and con articles about robo-graders, or automated essay scoring systems, which I hope will spark a spirited yet civil debate on our discussion boards. Our featured reading is followed by a short, 10-question quiz. Dr. Christian Schunn, who gives a guest lecture on a web-based peer-review program, has offered us complimentary access to Peerceptiv for the duration of the MOOC. However, in order to use it, we would need to generate some mentor texts and conduct multiple peer reviews as an assignment. I am worried about assigning too much work and scaring people off.

A couple of highlights on the discussion board came from Jennifer Brown, who is trying to wrap her head around why students plagiarize; from librarian Lorraine Saffidi, who asked, “How can students be expected to express in their own words a complex idea they only partially understand?”; and from Wesley Lohrman, who wrote: “History teachers can work to eliminate plagiarism by requiring students to incorporate a variety of text and push students to analyze what they are reading, compare and contrast text, and build opportunities with the classroom for students to discuss ideas and build their own concepts related to the current history learning targets.”

Overall, I am very impressed with the quality of the participation, and I have enjoyed chiming in. It is not too late to log in and join the Week One discussions. Remember, you need to participate in all of the course discussions to earn the certificate at the end of the course.

Tweeted in Class.

Here are some of the interesting items I found on Twitter this week and shared with MOOC participants under #HistRW. These pieces were authored or shared by course participants. Feel free to follow me @scottmpetri and connect with participants from the course.

Why do I have to teach writing in my 8th Grade American History?

http://historywithcj.blogspot.com/2015/01/why-do-i-have-to-teach-writing-in-my.html?spref=tw

World History Teachers Blog: M.A.I.N. Causes of WWI Video

http://worldhistoryeducatorsblog.blogspot.com/2015/01/main-causes-of-whi-video.html

How do students regard feedback from their teachers?

http://www.turnitin.com/assets/en_us/media/favorite-feedback/

Interesting Summer PD Seminars

https://www.gilderlehrman.org/programs-exhibitions/2015-teacher-seminars

Charts that help writers distinguish idea generation from idea execution

http://makewriting.com/2014/12/01/charts-that-help-writers-distinguish-idea-generation-from-idea-execution/

A non-freaked out approach to the core

http://www.teachingthecore.com/non-freaked-approach-common-core-01/

Revising bit by bit

http://makewriting.com/2014/11/23/revising-bit-by-bit/

That’s it for now. I hope you are enjoying the course.  Cheers. Dr. P.

Detecting Plagiarism

This post summarizes the Turnitin.com white paper Plagiarism and the Web, which will culminate in a lecture for participants in the MOOC Helping History Teachers Become Writing Teachers, running January 12 – February 24, 2015. Whether or not you are participating in the course, feel free to comment or click on the links to the original sources.

Turnitin Sources

Blum (2011) reported that more than 75 percent of college students have admitted to cheating, and 68 percent have admitted to cutting and pasting material from the internet without citing it. Over the last 15 years, almost 40 million student papers have been submitted to Turnitin. This study examined and classified 140 million content matches to discover which web sources students rely on for unoriginal content in their written work.

The vast majority of students with matched content in their work do not rely on cheat sites or paper mills. Many use legitimate homework, academic, and educational sites as research sources. Students also rely on social networks and user-generated content sites, such as content-sharing and question-and-answer (Q&A) sites, to find content for their papers. Turnitin detects patterns of matching text to help instructors determine whether plagiarism has occurred; matched text in a student’s paper may be properly cited, making it legitimate academic work.

Social and content-sharing web sites comprised the highest percentage of all matched content over the course of the study. Legitimate homework and academic help sites were second, followed closely by cheat sites/paper mills in third and news sites and portals in fourth. The fifth most popular category was encyclopedias. The top eight sites for matched content were:

1) www.en.wikipedia.org
2) www.answers.yahoo.com
3) http://www.answers.com
4) http://www.slideshare.net
5) http://www.oppapers.com
6) http://www.scribd.com
7) http://www.coursehero.com
8) http://www.medlibrary.org

Only one of the top eight sites is dedicated to helping students cheat by providing unoriginal content. Of the twenty-five most popular sites, fourteen are legitimate student resources. While close to fifteen percent of unoriginal content comes from cheat sites and paper mills, the majority of students are frequenting legitimate academic or educational web sites.
Educators can guide students in proper citation procedures. With digital tools, educators can show students how much of their paper lacks attribution; a detailed report on the originality of their written work creates a teachable moment. Turnitin offers a collection of white papers on student writing and plagiarism that teachers may find beneficial. Although most students understand that quoting word for word requires a citation, they are often confused about the need to cite someone else’s paraphrased ideas. Professional authors like Stephen Ambrose, Doris Kearns Goodwin, and Stephen Glass have had problems in this area too.

Turnitin claims that academic institutions adopting its service see a 30-35 percent reduction in unoriginal content in the first year; by the fourth year, many institutions see levels of unoriginality in student writing fall by 70 percent. Combined with the assertion that the rate of serious cheating on written work remained stable between 1963 and 1993 (Blum, 2011, p. 2), this suggests that electronic plagiarism detection tools could benefit teachers and help increase the amount of writing assigned in high school and college.

I am interested in hearing about your experiences using plagiarism detection tools. Do you think students are genuinely confused about the rules of paraphrasing and citing? Or are the vast majority of students deliberately copying other writers’ work? What motivates this? Does cutting and pasting stem from poor research skills or laziness? How can you create assignments that reduce plagiarism? Please leave a comment, or send me your questions via Twitter to @scottmpetri #HistRW. For those interested in experimenting with plagiarism detection tools, there are several free options.
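For the curious, the core idea behind text-matching tools is simple to sketch. The snippet below is an illustrative n-gram ("shingle") overlap check, not Turnitin's actual algorithm; the function names, sample texts, and five-word window are my own assumptions for demonstration.

```python
# Illustrative sketch of n-gram overlap matching, the basic idea behind
# plagiarism detectors. NOT Turnitin's real algorithm; names and the
# 5-word window are hypothetical choices for this example.

def shingles(text: str, n: int = 5) -> set:
    """Return the set of n-word sequences (shingles) in the text."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap_ratio(student: str, source: str, n: int = 5) -> float:
    """Fraction of the student's n-grams that also appear in the source."""
    student_shingles = shingles(student, n)
    if not student_shingles:
        return 0.0
    return len(student_shingles & shingles(source, n)) / len(student_shingles)

student = "the main causes of world war one were militarism alliances imperialism and nationalism"
source = "the main causes of the first world war were militarism alliances imperialism and nationalism"
ratio = overlap_ratio(student, source)
print(f"{ratio:.0%} of the student's 5-grams match the source")  # prints "22% ..."
```

A real system would also have to ignore properly quoted and cited passages, which is why the matched-text report, not the raw percentage, is what makes the teachable moment.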

References

Blum, S. D. (2011). My word: Plagiarism and college culture. Ithaca, NY: Cornell University Press.

Turnitin. Plagiarism and the web: Myths and realities. An analytical study on where students find unoriginal content on the internet. Retrieved November 25, 2014, from http://turnitin.com/static/resources/documentation/turnitin/company/Turnitin_Whitepaper_Plagiarism_Web.pdf