Going Beyond the Bubble: Stanford’s New Approach to History Assessments


Credit: http://beyondthebubble.stanford.edu/

History can be a difficult subject to assess at the pre-college level. Multiple choice questions (MCQs) are quick and easy to administer, but they do not always accurately reflect a student’s understanding of the material. At the other end of the spectrum, the document-based questions (DBQs) used in AP History exams can be too advanced for students unfamiliar with primary-source analysis, and they are time-consuming to administer.

The Stanford History Education Group (SHEG), a team that includes Stanford faculty, graduate students, post-docs, and visiting scholars, may have developed a happy medium between these two methods with their innovative “Beyond the Bubble” assessments. Working as part of the Library of Congress’s Teaching with Primary Sources (TPS) Educational Curriculum and in close alignment with the new Common Core State Standards, the Beyond the Bubble (or BtB) assessments have been in development since 2009 and are now being released to the public.

“We looked at what the available assessments were, and we were struck by how far apart they were,” say Mark Smith and Joel Breakstone, both assistant directors of the TPS program. MCQs lead students to answer by process of elimination, which “doesn’t look much like history” and takes historical facts out of context, hindering a student’s ability to develop real historical understanding. DBQs, on the other hand, though often considered the “gold standard” of history testing, do not always give teachers enough specific information about how to develop that understanding.

The developers at SHEG believe that neither approach fully offers the kind of assessment the subject of history requires. As they put it on their website, there is an “absence of creativity” in the testing industry, and it fails both teachers and students. Professor Sam Wineburg, who heads SHEG, believes the reason lies in the emphasis placed on testing in the United States and the form that testing takes. Driven in part by what he calls the “American predilection for scorecards” and in part by the profit motive, testing companies want “the cheapest scorecard who [sic] will reap them the biggest profits.”

This, he believes, is why MCQs are the most frequent form of assessment in schools. “But,” he says, “nobody believes MC testing is useful.”

Wineburg is optimistic that the need for an alternative method of assessment is finally being recognized on a national level. Educators are calling for something more effective than MCQs but less time-consuming and advanced than DBQs -- and something that can be used for all students. That is where BtB steps in, offering not just a middle ground, but a return of creativity to testing.

Their exercises are called “History Assessments of Thinking,” or HATs. Each HAT requires students to go beyond simple factual recall and apply the information they have learned to a specific historical context. Individual HATs cover a range of topics from the first Thanksgiving to the civil rights movement, and each comes with a combination of historical facts and primary documents. “We need to teach people basic skills on how to determine the credibility of facts on the internet,” Wineburg observes. For example, the First Thanksgiving HAT shows students a picture from 1932 that purports to capture the actual 1621 event.

Similarly, in the civil rights HAT, students are tested on their understanding of the broader narrative of the movement. Smith and Breakstone found that many students, lacking sufficient factual knowledge of the events that took place, made logical but historically inaccurate assumptions which undermined their grasp of the movement as a whole. “This assessment,” they say, “proves it’s incredibly important to have that factual knowledge.”

But BtB also seeks to go beyond mere factual knowledge. As the website states, SHEG believes that “historical thinking is about cultivating habits of mind,” and it is these habits of mind that the assessments seek to instill in students. The assessments focus on three specific aspects of historical thinking: evaluation of evidence, historical knowledge, and historical argumentation. Each provides a more formative evaluation than MCQs, yet most can be administered in minutes rather than the hour or more a DBQ requires. SHEG hopes to “provide a window to what students think,” which will allow the assessments to keep adapting and improving based on what works and what doesn’t.

They have good reason to be optimistic about this approach, because it mirrors how the assessments were created in the first place. SHEG used a meticulous research and development process, selecting documents from the Library of Congress’s online archives around which to build each HAT. The team also worked with student focus groups, conducting “think aloud interviews” in which students were asked to talk through their reasoning as they worked. This provided the “window” into students’ thought processes that the developers sought, allowing them to verify that the assessments were actually encouraging historical thinking.

SHEG did not start this process from scratch. It already had a highly successful curriculum in use called Reading Like a Historian (RLH). Like BtB, RLH was developed to teach students how to analyze and understand historical documents and to develop historical thinking. It consists of 75 stand-alone lessons designed to supplement what teachers were already doing in the classroom. Its flexibility has made it easy for teachers to adopt, and with the public release of BtB materials, teachers now have even more options for the classroom.

Smith and Breakstone say that the response from teachers has generally been quite positive, particularly among older teachers. “For many history teachers, the challenge has been getting documents accessible to their students,” they explain, and both RLH and BtB help them meet that challenge.

While teachers who are already using the RLH curriculum can begin using BtB assessments right away, Smith and Breakstone add that they are already working to make the two blend more seamlessly through the development of “more modular curricular materials, where there are lessons that have integrated assessments.” The goal is to move towards “an even closer relationship between curriculum and assessment.”

Valerie Ziegler, a teacher from San Francisco, has used RLH since its beginning and is an enthusiastic supporter of the project. She used to give students individual oral assessments to evaluate their historical thinking, using, for example, the first Thanksgiving picture. “In two minutes, I know very quickly if they know what I want them to know,” she says. Now that BtB assessments are available, often taking as little as 7-10 minutes, her task is even easier and more reliable. “They’re short, easy to administer, easy to grade.”

Ziegler sees both the curriculum and the assessments as “stepping stones” to DBQs, introducing students to primary source analysis and historical thinking without overwhelming them. “If you don’t have these skills,” she says, “you won’t be able to do a DBQ.”

The assessments help teachers, too. “If I don’t know if they can do a DBQ,” she notes, “I can’t get them to that next level.” She also believes that the BtB assessments, which favor a skills-based approach over the content-based MCQs, are better for students in the long run, because “the skills that are tested are broadly applicable to their future.”

She adds that it is not just teachers who have responded positively to the RLH/BtB approach. While the assessments give teachers much-needed background information, they also engage students with videos, online resources, and a wide variety of primary materials. Rubrics and examples make the work easier on both sides. She says that students also “love the idea [that] they’re answering a debatable question.”

This is right in line with the goals of the BtB assessments, as stated on the website: to balance historical knowledge and historical thinking, to ask students to apply knowledge instead of merely reproducing it, and, perhaps most importantly, to ask students to “consider content in ways that require thought, judgment, and deep understanding.”

So far, Ziegler considers the assessments a success. Of the 35 students who took BtB assessments alongside RLH lessons last year, she reports, 31 answered correctly. This is good news for everyone, since she, like Professor Wineburg, predicts that state multiple-choice tests will eventually incorporate critical thinking assessments as the American educational system faces mounting pressure to keep up with the rest of the world. The Beyond the Bubble approach is, in her opinion, “ahead of the curve on that.”

Could SHEG’s new, acronym-laden approach be just what the American educational system – and the assessment of history in particular – needs? It might be too soon to say for sure, but there is no denying that something has to change, and soon. Perhaps the developers at SHEG are right: perhaps it’s time to go beyond the bubble.