Dartmouth Cheating Debacle: 3 Initial Observations

Update 6/11: Dartmouth Medical School’s dean sent out an email yesterday dropping all honor code charges. This was the right decision given the flawed process.

Over the past month a story has been developing at Dartmouth College’s Geisel School of Medicine about an alleged cheating scandal, and with the New York Times coverage yesterday, the story is getting much bigger. The NYT coverage largely takes the students’ viewpoint while explaining the situation fairly well and providing ample quoted explanations from Dartmouth’s administration (in particular the dean of the school). It’s a depressing story that ties together the real need to validate academic integrity with the unintended consequences of capturing student clickstream data, overlaid with an apparent lack of reasonable due process.

The Boston Globe covered the story in mid April:

A Dartmouth medical student was halfway through a timed practice test for a high-stakes board exam last month when an e-mail flashed on her phone. She was startled to find a formal message on her medical school’s letterhead.

The e-mail accused her — and, she later learned, more than a dozen other students — of cheating by accessing online course materials while taking a test on a different software platform. The school said that it had electronic evidence of misconduct, and that she was invited to make a brief statement defending herself at a tribunal to be held over Zoom in a week.

“I sat there for a few minutes, just looking at it completely in shock,” said the student, who, like the other students who spoke for this story, maintains her innocence. “It was just like, ‘Here is this infallible data. You could face expulsion if it’s verified.’”

For the past month the cheating allegations have engulfed the medical school, a small, close-knit community on an Ivy League campus in New Hampshire’s Upper Valley. At least nine of the accused say that they are innocent, and that Dartmouth doesn’t understand how its own software platforms work. Medical school officials say the situation is unfortunate but the ongoing investigation is necessary to preserve academic integrity.

The New York Times described the core role of the Canvas LMS in its investigation. 1

At the heart of the accusations is Dartmouth’s use of the Canvas system to retroactively track student activity during remote exams without their knowledge. In the process, the medical school may have overstepped by using certain online activity data to try to pinpoint cheating, leading to some erroneous accusations, according to independent technology experts, a review of the software code and school documents obtained by The New York Times.

This story has not fully played out, but I have three initial observations to share based on existing coverage.

Student Clickstream Data: Not designed for forensics but used for it nevertheless

Many people have been calling out the limitations of using clickstream data to improve student learning, but at least it is the stated goal of learning analytics to do just that. The New York Times describes a use of system data outside of its design scope.

At Dartmouth, the use of Canvas in the cheating investigation was unusual because the software was not designed as a forensic tool. Instead, professors post assignments on it, and students submit their homework through it. [snip]

Seven of the 17 accused students have had their cases dismissed. In at least one of those cases, administrators said, “automated Canvas processes are likely to have created the data that was seen rather than deliberate activity by the user,” according to a school email that students made public.

In fact, this limitation has been specifically called out in Instructure's own documentation on the Canvas Community forums.

You can view quiz logs to view the status of your student quizzes. This feature is designed to help you investigate problems that a student may have in the quiz and gain insight into your quiz questions. Quiz logs are not intended to validate academic integrity or identify cheating for a quiz.
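For context, it helps to see what this underlying data actually looks like. Below is a minimal sketch of pulling a user's clickstream through the Canvas REST API's documented page-views endpoint. The instance URL, token, and user ID are placeholders, and this is my illustration of the data source, not Dartmouth's actual tooling. The key point is that each record is just a timestamped URL request; nothing in it says whether a human deliberately clicked.

```python
# Minimal sketch: pull the raw page-view records an administrator could
# see for one user, via the Canvas REST API. BASE_URL, TOKEN, and USER_ID
# are hypothetical placeholders.
import requests

BASE_URL = "https://canvas.example.edu"   # hypothetical instance
TOKEN = "YOUR_API_TOKEN"                  # placeholder
USER_ID = 12345                           # placeholder

def fetch_page_views(user_id, start_time, end_time):
    """Return raw page-view records for one user in a time window."""
    url = f"{BASE_URL}/api/v1/users/{user_id}/page_views"
    params = {"start_time": start_time, "end_time": end_time, "per_page": 100}
    headers = {"Authorization": f"Bearer {TOKEN}"}
    views = []
    while url:
        resp = requests.get(url, params=params, headers=headers)
        resp.raise_for_status()
        views.extend(resp.json())
        # Canvas paginates via Link headers; follow the "next" link if present
        url = resp.links.get("next", {}).get("url")
        params = None  # the "next" URL already carries the query string
    return views

# Nothing in a record says *why* the request happened: a deliberate click,
# a prefetch, or a background sync from the mobile app can all appear here.
for view in fetch_page_views(USER_ID, "2021-03-01T09:00:00Z", "2021-03-01T11:00:00Z"):
    print(view["created_at"], view["url"], view.get("user_agent", ""))
```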

Capturing this type of clickstream data can lead to its being used to validate academic integrity and catch cheating, whether or not that was the intention. Dartmouth is not the only case, just the most visible. A recent Reddit thread provided first-hand descriptions of this usage by educators.

A recent post made me realize this may not be common knowledge so I’m just throwing this out there as I can verify this as a TA myself: anyone who has instructor privileges on a canvas page (professors, TAs, etc) can easily see your activity levels on the canvas page. This includes dates, times, every link you click and when, etc. If it’s on their Canvas page, they can see it if given reason to look (for example, if you have an exam that is not open note but you go clicking around on canvas while taking the exam, or you have a canvas ate your homework scenario).

One reply extended this to exams.

To add to this, exam questions are time stamped with every time you look at a question, answer a question, flag a question, etc. I’ve caught a half dozen students cheating that way. It isn’t coincidence when a whole group is answering questions in synch…
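To make the kind of heuristic that reply describes concrete, here is a hypothetical sketch of a timestamp-synchrony check. The event format, the 30-second window, and the sample data are all my invention for illustration; this does not reproduce any instructor's actual method.

```python
# Hypothetical sketch of a timestamp-synchrony check. Input format is
# assumed: per-student lists of (question_id, answered_at_seconds) events.
from itertools import combinations

def synchrony_score(events_a, events_b, window=30):
    """Fraction of shared questions two students answered within
    `window` seconds of each other."""
    times_a = dict(events_a)
    times_b = dict(events_b)
    shared = set(times_a) & set(times_b)
    if not shared:
        return 0.0
    close = sum(1 for q in shared if abs(times_a[q] - times_b[q]) <= window)
    return close / len(shared)

students = {
    "A": [(1, 60), (2, 150), (3, 240)],
    "B": [(1, 65), (2, 155), (3, 250)],   # answers land close to A's
    "C": [(1, 300), (2, 90), (3, 500)],   # answers in a different order
}

for s1, s2 in combinations(students, 2):
    print(s1, s2, f"{synchrony_score(students[s1], students[s2]):.0%}")

# Caveat: two diligent students working through questions in order at a
# similar pace will also score high, which is exactly why this kind of
# signal needs corroboration before it becomes an accusation.
```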

What makes the Dartmouth case much bigger is that its investigation carried the full weight of the School of Medicine and could directly affect students’ careers, not just produce a poor test grade.

Dartmouth, for its part, insists that it filtered out automated page access cases, but the school never seems to acknowledge its decision to use a system not designed for forensics for exactly that purpose.

Due Process: Presumption of guilt and two-minute defenses are not that

The questionable use of LMS data is only part of the problem, however. The bigger issue to me is the overall lack of due process, assuming that the reporting is accurate. From the NYT:

Some accused students said Dartmouth had hamstrung their ability to defend themselves. They said they had less than 48 hours to respond to the charges, were not provided complete data logs for the exams, were advised to plead guilty though they denied cheating or were given just two minutes to make their case in online hearings, according to six of the students and a review of documents.

Five of the students declined to be named for fear of reprisals by Dartmouth.

Duane Compton, dean of the Geisel School, said in an interview that its methods for identifying possible cheating cases were fair and valid. Administrators investigated carefully, he said, and provided accused students with all the data on which the cheating charges were based. He denied that the student affairs office had advised those who said they had not cheated to plead guilty.

Note that Dean Compton did not deny the short preparation time or the two-minute limit on students’ time to defend themselves. It seems that this was more of a Star Chamber, where a committee investigated the data and convinced itself that guilt was proven, with only minimal opportunity for alternative explanations.

Geisel’s Committee on Student Performance and Conduct, a faculty group with student members that investigates academic integrity cases, then asked the school’s technology staff to audit Canvas activity during 18 remote exams that all first- and second-year students had taken during the academic year. The review looked at more than 3,000 exams since last fall.

The tech staff then developed a system to recognize online activity patterns that might signal cheating, said Sean McNamara, Dartmouth’s senior director of information security.

We’re talking about a serious issue here, and Dartmouth did not have a system designed for this type of forensics. So a committee of faculty and students got the IT department to develop a new system using messy clickstream data. It takes a fair amount of hubris to develop unproven methods on questionable data and conclude that the process was fair and accurate.
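To see how easily this goes wrong, consider a hypothetical flagging rule of the sort such a system might use: flag any access to course materials during the exam window. The records and user-agent strings below are invented for illustration; Dartmouth's actual criteria have not been published.

```python
# Hypothetical illustration of how a naive flagging rule can misread
# automated traffic. All records below are invented.
from datetime import datetime

EXAM_START = datetime(2021, 3, 1, 9, 0)
EXAM_END = datetime(2021, 3, 1, 11, 0)

page_views = [
    {"created_at": datetime(2021, 3, 1, 9, 42),
     "url": "/courses/101/pages/course-notes",
     "user_agent": "Mozilla/5.0 (Macintosh)"},        # could be a human click
    {"created_at": datetime(2021, 3, 1, 10, 15),
     "url": "/courses/101/pages/course-notes",
     "user_agent": "CanvasMobile/6.9 (sync)"},        # could be an app refresh
]

def naive_flag(view):
    """Flag any course-material access during the exam window."""
    return EXAM_START <= view["created_at"] <= EXAM_END

# Both records get flagged, even though the second plausibly came from an
# automated background refresh rather than a student opening the page.
for view in filter(naive_flag, page_views):
    print(view["created_at"], view["url"], view["user_agent"])
```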

School officials said that their analysis, which they hired a legal consulting firm to validate, discounted automated activity and that accused students had been given all necessary data in their cases.

But at least two students told the committee in March that the audit had misinterpreted automated Canvas activity as human cheating. The committee dismissed the charges against them.

In another case, a professor notified the committee that the Canvas pages used as evidence contained no information related to the exam questions his student was accused of cheating on, according to an analysis submitted to the committee. The student has appealed.

The proper response, in my opinion, would be to acknowledge that Dartmouth was on shaky ground and to proceed carefully with the analysis. Instead, Dean Compton claimed that these dismissals validated the school’s methodology. He does not seem to believe that being careful should include letting students defend themselves in a reasonable manner prior to appeal, or acknowledging that some of the investigation’s original assumptions were faulty at best.

Academic Integrity: A real need that is not going away

While I believe this situation at Dartmouth is a debacle, it is not due to bad intentions or hasty judgment. Dartmouth spent a lot of time and effort investigating the case, hired outside counsel to review the analysis, and worked with an existing faculty and student governance group.

The guiding principle called out by Dean Compton is important and is not going away, nor should it.

“We take academic integrity very seriously,” he said. “We wouldn’t want people to be able to be eligible for a medical license without really having the appropriate training.”

It is important for schools to validate academic integrity, particularly for accreditation and professional licensing purposes, and the shift to remote learning during the pandemic has only intensified this need. The problem often comes from a poor understanding of, or an overreliance on, technology and data solutions to validate academic integrity.

Some of the details are not yet clear, particularly regarding the student appeal processes, but based on these articles this sounds like a story of good intentions and hard work gone awry in a high-pressure environment – both because of the stakes involved and the pandemic’s shift to remote teaching and remote hearings.

1 Disclosure: Instructure and several of its competitors are subscribers to the MindWires EdTech Market Analysis service.