
In the age of Google, why are we still so focused on testing facts?

New assessment strategies are required to promote the types of learning needed in the 21st century, says Danny Oppenheimer
March 25, 2021
[Image: a Google search on a home computer. With such easy access to all of human knowledge at our fingertips, methods of assessment need to be updated. Source: iStock]

Back when I was in college, the internet was in its infancy; information was hard to find online and even less reliable than it is today. If I wanted to know basic facts ("What is the fundamental attribution error?"; "What are the stages of meiosis?"; "What was the Treaty of Westphalia?"), I had to either ask an expert or head to the library for a gruelling search. Information was valuable, which is why so much of college was focused on acquiring information.

Today, things are very different. Students have all sorts of cognition-enhancing technologies, from Google to Grammarly. Basic facts are easy to find online, and with smartphones, students typically have access to the bulk of human knowledge anywhere and at any time.

Now, the problem isn't information scarcity, but rather information overload. The challenge is no longer in finding information, but in evaluating information, synthesising information, applying information and creating new knowledge – what some psychologists refer to as cognitive skills or habits of thinking.




And yet so much of higher education is still content-based – many courses are still focused on teaching students to memorise large bodies of facts rather than how to think judiciously about those facts, which can easily be acquired online.


Many faculty believe they are already teaching students to think rather than to memorise, and in fairness, some probably are. But a brief look at any college course catalogue will quickly reveal that most courses are described and organised around the content being conveyed ("The molecular basis of addiction"; "19th-century Russian masterpieces"; "Nuclear regulatory policy" and so on) rather than the cognitive skills and habits of thinking that the course hopes to instil.

Still, faculty often say that their lectures, labs and discussion sections not only convey facts but also discuss the evidence for and implications of those facts. Indeed, many classes are structured around the premise that students, by virtue of observing faculty engaging in the types of analysis, synthesis and application that the instructor hopes to convey, will naturally develop those skills themselves – a highly questionable assumption given the psychological literature on transfer and generalisability of knowledge and skills.


Even in classes that explicitly cover how to engage in critical thinking, much of what students are tested on would be classified as part of the lowest level of Bloom's taxonomy: factual knowledge and remembering. When professors test facts and minutiae, it encourages students to focus on learning facts and minutiae to the exclusion of the higher cognitive skills that faculty are trying to teach. Resolving these problems will involve overhauling how we think about course design and student assessment.




First, courses and curricula need to be designed around the notion of explicitly teaching the kind of thinking we want to encourage, rather than assuming that students will pick up those cognitive skills through incidental exposure.

That includes creating courses that specifically signal the nature of the skills that students are expected to learn (for example, including courses in the catalogue such as "Computer-assisted literature review"; "Analysing externalities"; "Engaging in civil discourse with somebody you disagree with") and aligning syllabi and lessons with those goals.

Second, knowing that students will focus on what is tested, we need to develop assessments that test the skills we want students to learn. One way to do this is to use open book (and open internet) exams. If faculty know that students will have access to Google during the exam, there is little point in asking factual questions with easily searchable answers. Instead, questions will naturally focus on how to evaluate, synthesise or apply information.


An example question might be: "Find three thinktank reports that provide different estimates of how many jobs will be lost if the minimum wage is increased to $15 an hour and explain how the different data or assumptions used by the three reports cause them to come to different conclusions."

Alternatively, we might design practicum assessments in which students are forced to actually use skills rather than regurgitate descriptions of how to use those skills (for example, "use the techniques learned in your chemistry lab to identify a mystery compound" or "use the techniques of your human-computer interaction class to design a graphical user interface (GUI) for a banking website").

Such assessment strategies are more difficult and time-consuming to develop and grade but are necessary to promote the types of learning that are essential for the internet age.

The needs of an educated society are changing, and higher education needs to adapt to be responsive to those changes. Universities must invest the resources needed to develop new learning priorities and techniques to deliver on those priorities in light of advancing technology, lest we become obsolete in a world of information overload.


Danny Oppenheimer is a professor at Carnegie Mellon University jointly appointed in psychology and decision sciences. He studies judgement, decision-making, metacognition, learning and causal reasoning.

Reader's comments (1)
I agree with many of the points, and certainly agree that we need a paradigm shift in how we do assessment and in how that informs how we teach (or indeed, if we're lucky, forced change to assessment driven by the age of Google then drives forced change in teaching to adapt). However, there are numerous subjectivities to consider in changing the way that "we" as a collective teach. Teaching style and requirements vary with class size, discipline, access to technology and the skills of the academics themselves (both pedagogical and digital). Honestly, if you are a good lecturer then you expertly deliver a blend of content that promotes discussion and encourages the association of fact and application (and if you can keep students' attention and interest throughout the session by changing things up on a dime based on their interactions and reactions in the moment, then you stand a much better chance of all of that coming together). If you're just lecturing in a dry style, then yes, I agree it is more about delivering facts and hoping that some kind of evaluative skill emerges from the cocoon of knowledge absorption. At the moment we are at risk of a large-scale shift based on a reflex rather than an adaptation and an evolution. Higher education is not one thing to all people; it is many things to many people. We need to keep that in mind as we go through these changes, and address which areas need them, at what scale, and what support may need to be put in place to make them a reality.

There is also, of course, the student factor to consider. Not only should we actually ask students what they feel they would most benefit from, having gone through a year of content delivery radically different from normal, if not an increase in active learning intended to make them feel part of a community. We should also ask whether, for all our proclamations that students want the opportunity to design their own learning, discuss everything, do small-group learning and actually use their knowledge, any more than the 5 per cent of high-flyers engage and participate when we try it. When a large class is asked to consider a question, whether for delivery to the whole group or for discussion in smaller groups, how many actually take part and volunteer information?

I'm certainly not against change; in fact, I spend most of my time trying to convince others to adapt and evolve. I just think that we need to include voices on the ground from academic and student perspectives, and actually test that which we assume to be the way things should be. All that said, I had been running online open-book exams in the final year for several years before the pandemic, have done continuous digital and skills assessment for large first-year modules for almost eight years, and help colleagues design practical assessments that encourage students to use what they have already learnt by doing and apply it to a hypothetical future situation. Learning by doing, and succeeding by engaging. So I'd like to think that I'm half on board and rather joyous about being on the "Change Train". In the age of Google, it no longer matters what you know but what you do with what you know. We need to embrace that.