My Scientific Training Brought Me to My New Favorite Book Genre

‘Tis the season of failed New Year’s resolutions

It was about this time last year that I found myself falling flat on the admirable New Year’s resolutions I had set. My daily yoga routine had devolved into a 30-second toe-touch in the morning, and my new eco-friendly products were still sitting in a shopping list on Amazon. As a 4th year graduate student all too familiar with failed experiments, I couldn’t stomach an unsuccessful project at home as well. Luckily, pursuing a PhD comes with a strong problem-solving skill set. What is Step 1 of The Scientist’s Plan of Attack when faced with a problem? Turn to the literature to see if a solution has already been found. This is how I found myself in the societally constructed land where sad, desperate singles with three too many cats hang out: the self-help aisle at the bookstore.

Rather than using SciFinder to tease out quality sources, I turned to Goodreads to sort through books that could help troubleshoot my failed resolutions. My first self-help purchase was “The Willpower Instinct” by Kelly McGonigal, a health psychologist. Professor McGonigal (Harry Potter fans rejoice) wrote the book based on a course she taught at Stanford on the science of willpower. After a few years of teaching and feedback from her students, she wrote this book outlining the currently accepted research on willpower, peppered with anecdotes from her students about how they applied the conclusions of these published studies to their own habits. Between the author’s credentials, solid reviews, and over 20 pages of references in the back of the book, this seemed like a good place to start.

It’s a year later, and I am now one of those people who go around flinging self-help book recommendations at anyone who mentions a personal problem and constantly saying things like “this book changed my life.” But… this book changed my life.

“Self-Help” has taken on such a negative connotation that many bookstores re-brand these aisles with names such as “Personal Growth”

McGonigal walks us through a series of published rat and human studies that identify two separate parts of the brain: one that experiences the anticipation of a reward and one that experiences the pleasure of the reward itself. In other words, while the anticipation of binging Netflix sure feels more rewarding than anticipating 20 minutes of yoga, this book taught me to pay attention to how much pleasure each activity actually brought me. I immediately found that I felt happier and less anxious during and after yoga, while watching TV left me feeling less refreshed. After learning to be mindful of what I was anticipating versus actually enjoying, I had a better shot at ignoring the distracting anticipation of a television reward.

I also learned that I was using what psychologists call “moral licensing” to give myself permission not to adopt green habits because of the little things I did, like making a saved shopping list on Amazon or sharing articles on social media. She cites many studies demonstrating the ways consumers use small green actions to assuage their guilt for participating in more harmful behaviors. After realizing what I was doing, I took concrete steps to switch to reusable grocery bags and decline plastic straws at restaurants. However, I wouldn’t let those actions give me permission to make more wasteful decisions later in the day.

My new identity as a self-help book reader is a year old, and I am still learning new cheat codes for navigating adult life with each addition to my library. However, the stigma of the genre means my recommendations often fall on the ears of unwilling readers. My 2019 New Year’s resolution? Convince you, the reader, that this genre deserves a re-brand and a chance. As scientists we are trained to find reputable sources of information, and as graduate students we have a remarkably high chance of suffering from mental illness. Let’s use our skill set to find quality self-help books and add them to our arsenal of problem-solving tactics, easing some of the stresses that make it home with us after a long day in the lab. I hope to one day walk into the self-help aisle and see lonely, desperate singles with three too many cats alongside anyone else who wants to be a proactive researcher, perusing the literature for personal solutions to the minor problems they need better resources to solve.

Peer edited by Rashmi Kumar.

Follow us on social media and never miss an article:

Where Motivation Hides

Recently, I couldn’t find my keys. They weren’t where I usually keep them. Turns out, I was so distracted when I came home that I left them dangling in the lock.

Other days, I’m looking for my phone. Where could I have put it? Oh yeah, there it is, on top of my refrigerator where I left it while I was cooking.

Sometimes though, I’m looking for motivation. Unfortunately, I can’t find that in a lock or on top of the refrigerator. But where can you find motivation? As elusive as motivation can seem, psychologists and neuroscientists have identified strategies and parts of your brain that contribute to feeling motivated.

According to psychologists, there are three main factors that contribute to motivation: autonomy, value, and competence. For many of us, feeling like you have to do something kills any motivation to complete that task. However, shifting your mindset from “have to” to “choose to” can energize you and remind you of the benefits of completing that task.

Similarly, aligning a task with your values can give you a sense of autonomy and increase your investment in it. Even so, it can be difficult to start something if you feel like you do not have the skills to do it. In this case, remembering that “practice makes perfect” can help you see how putting in the effort now will help you improve in the future.

Neuroscientists are also investigating a growing link between dopamine and motivation. While dopamine is commonly associated with pleasure, movement, and focus, research in rats and humans suggests it also contributes to motivation.

Researchers at the University of Connecticut found that rats with low dopamine levels were more likely to choose a nearby pile of food rather than an equally close pile with twice as much food that required jumping over a small fence. The scientists concluded that lower dopamine levels in rats are linked to lower motivation.
High levels of dopamine associated with motivation were found in parts of the frontal lobe (right), while high levels of dopamine in the anterior insula (blue) did not contribute to motivation. Image courtesy of Shappelle/Wikipedia

Using brain mapping, scientists at Vanderbilt University saw that self-described “go-getters” had high levels of dopamine in parts of the brain associated with reward and motivation, the striatum and ventromedial prefrontal cortex. However, “slackers” displayed high levels of dopamine in the anterior insula, which is important for emotion and risk perception.

These studies highlight how not just dopamine levels, but also where in the brain the dopamine is, can influence your feelings of motivation. So next time you’re looking for motivation, focus on your sense of autonomy, values, and competence or even try some natural ways to boost your dopamine levels. Just getting started can go a long way.

Peer edited by David Abraham.


If everyone jumps off a bridge, would you too?

For better or for worse, some of our most vivid memories are the ones we made as a teenager. Memories of questionable fashion choices, high school cliques, and many faux pas certainly reinforce just how tumultuous the adolescent years were. With the newfound importance of peer and romantic relationships, a key motivation underlying most teenage behaviors is the desire to “fit in.” Do they want to hang out with me? Does he like me? Will this make me look cool? Although these questions may arise at any age, the motivation toward social belonging is perhaps most salient and emotionally evocative during adolescence.


In some situations, individuals will shift their own attitudes and behaviors to be more like others, a phenomenon known as social influence. Although conforming to social influences can increase feelings of social belonging, most research has focused on the negative consequences of being affected by, or susceptible to, social influences, especially during adolescence. For example, relative to adults, adolescents take more risks in the presence of peers than when they are alone. Thus, a widely popular belief shared by parents, educators, and policymakers is that peers steer youth toward engaging in negative behaviors that they otherwise would not participate in. While adolescents are notorious for “hanging out with the wrong crowd,” researchers wanted to know if adults are susceptible to social influence too.

Scientists recently showed that the effect of social influence on decision making changes significantly from late childhood to adulthood. In two studies, participants (8-59 years old) rated the perceived riskiness of everyday situations (e.g., crossing a street on a red light). Then, they were shown the ratings of a social influence group (either teenagers or adults) on the same situations before being asked to rate the everyday situations again. Although all participants changed their risk perception in the direction of the social influence group, younger participants were more susceptible to social influence on risk perception than older participants. In other words, risk attitudes are most likely to be shaped by social influence during childhood and early adolescence, an effect that wanes but persists in late adolescence and adulthood.

The source of social influence matters too. The authors found that early adolescents (12-14 years) were the only age group to change their perceptions of risk more in the direction of other teenagers’ perceptions. Children (8-11 years), late adolescents (15-18 years), young adults (19-25 years), and adults (26-59 years) showed the opposite effect, being more influenced by the risk perceptions of other adults relative to other teenagers. Overall, these findings highlight the profound impact other people have in shaping risk attitudes, even beyond the teenage years. Whereas peers are most influential in shaping early adolescents’ risk attitudes, adults play a stronger role in changing risk attitudes at earlier and later ages. One possibility is that most individuals incorporate the advice of adults when forming their risk attitudes because adults are considered experienced and trustworthy. In contrast, early adolescents may value the opinions of other teenagers more than the opinions of adults to inform their risk perceptions, potentially due to the heightened importance of peer acceptance and social belonging during this time. While the desire to fit in may push everyone to give in to social pressures, even beyond the teenage years, the type of consequences that arise from adopting others’ risk attitudes depends on the source of that social influence. Thus, perhaps the more appropriate question to have posed at the beginning of this post is whether those around you would jump or not.

Peer edited by Kathryn Weatherford and Breanna Truman.


Fight Fire with…(Why it’s good you’re already finishing this title)

The elderly woman exhaled loudly as she pushed up from sitting at the kitchen table. She’d heard a knocking from the front porch and wondered if her son had forgotten something earlier. She walked to the kitchen door and looked out across the porch, only to see giant orange flames licking up the siding of the house. Her breath caught in her throat. She fumbled pulling her phone from her pocket and her fingers shook as she punched in 9-1-1. Her voice trembled as she almost screamed at the operator – “There’s a fire on the front porch!” Then, in her hurry to leave, she put the phone down as she picked up her purse and rushed towards the side door. Just as she made it out into the yard, she saw that the flames had already come around the porch and soon the entire side of the house was on fire. Now safely outside the burning house, she suddenly wondered why the operator had said they were sending the police. Why weren’t they sending the firetrucks? I need firefighters!  

Minutes later a neighbor drove by, saw the flames and stopped to help. The woman had the presence of mind to borrow his phone to call 911 back and clarify that her house was on fire and that she needed firefighters to be sent. But in those critical moments, the fire had grown in intensity and the house seemed already engulfed. Somehow, the first 911 operator had heard “there’s a fight” instead of “there’s a fire”.

The old adage, learn from your mistakes, applies not just to trying to improve yourself, but to how the different kinds of mistakes we make can teach us about how the world works. For example, understanding communication mistakes like the one above can help us to better understand human cognition and the mechanisms behind how our minds comprehend language, and these lessons can then be broadly applied in everything from improving education to making your Google/Amazon/Apple AI assistant work better. So, what might have led to the mistake in our story?

You might be thinking, well, “fight” and “fire” sound somewhat alike. The distinction between the two may be even less obvious depending on the speaker’s accent, rate of speech, and degree of emphasis and articulation. Additionally, maybe the clarity of the audio was degraded over the cellular signal or through the phone’s speaker, and all of this may have been exacerbated by the stress and intensity of the emergency. Perhaps what the 911 operator heard simply sounded more like “fight” than “fire.”

A less obvious possibility is that the 911 operator’s mind made a sort of calculated guess – a prediction – about the word or words it might hear, given the context of an emergency call and the phrase “There’s a…”, and that this prediction influenced what they thought they heard. It might seem strange to think that our minds make predictions about what we’re about to hear or read, because if we waited just a few moments there probably wouldn’t be a need to predict at all. However, we know that human minds make lots of other generally beneficial predictions. You may try to predict how your opponent will move when playing basketball, where the ball will land in a game of catch, or how the drivers around you will behave to better plan your own movements. You probably aren’t even fully aware that you’re doing it. If you think of language use like these other joint activities, predictions of what a speaker might say next could allow better coordination of turn-taking, faster comprehension, and better planning of your own responses. When you add in the additional ambiguity of spoken language, from all the words that sound alike to all the different ways the same word can be articulated to just how unintelligible speech can sometimes be, making calculated guesses – when you’re right – could be very beneficial for efficient comprehension.

If our minds are really making predictions during language comprehension, what specifically is being predicted and what information is used to make those predictions? These are questions that are still being actively investigated and debated across the levels of language. There is evidence that one source of information that people can use to make predictions is knowledge about the world, and specifically about what is likely to happen in a given context, to make predictions about upcoming language. For example, how would you complete the following sentences?

Getting himself and his car to work on the neighboring island was time consuming.

Every morning he drove for a few minutes, and then boarded the…

If you said ferry you agreed with most people in the classic study by Federmeier & Kutas where people were able to use knowledge about what can be boarded (not a bridge) and how you can travel to islands with a car (not on just a regular boat) in order to predict the next word in sentence pairs like these.  

But asking people to complete sentences isn’t necessarily the same as predicting language in real time as it’s being produced. How do we know that people are making predictions early and throughout language comprehension? One way is to follow their eyes. People attend to what they are looking at, and thus following their gaze (or eye-tracking) as they comprehend sentences can allow you to determine how people are processing information in real time, and specifically what they are thinking about. This is often done using the visual world paradigm, developed by Michael Tanenhaus and colleagues. People are asked to look at objects (or pictures of objects) while their eye movements are measured with a special eye-tracking device. In a seminal study using this paradigm, Altmann & Kamide found that people looked more at a picture of a cake than a train or ball while hearing the sentence “The boy ate the cake” after the verb ate but importantly before they even heard the word cake. Thus, people were using their knowledge of what can be eaten to restrict and predict what could be talked about before it was even mentioned.

Another way language prediction can be seen is by measuring how the brain responds to specific linguistic stimuli, like words, using non-invasive EEG (those fun head caps with all the wires sticking out everywhere!). A neural response to a specific stimulus, or event, is called an Event-Related Potential (ERP).

A centro-parietal, negative-going event-related brain potential that occurs about 300-500 ms after a word is encountered is commonly referred to as the N400 (because it is negative and occurs around 400 ms). A large-amplitude N400 seems to be the “default” response of the brain to words, with reductions occurring for words that are easier to access because of the prior context or because they are semantically related or part of a predictable continuation. So you might have a smaller N400 to “fire” after hearing “Harry Potter and the Goblet of…” and a larger N400 to “fire” after hearing “Harry went to the circus and ate…” In both cases, it seems your mind uses your own real-world knowledge (about what is typically eaten) and experiences (enjoying the Harry Potter series) to make predictions about what the next word might be. (See Kutas & Hillyard, 1980 for foundational N400 work or Troyer & Kutas, 2018 for a more recent example of work in this area.)

The same might be true of our 911 operator. Perhaps they typically have more calls for fights than fires, or perhaps they had just had another similar call that was about a fight. Perhaps, over the course of the operator’s experiences with the language that people use in emergency calls, people tended to say “My _____ is on fire”, whereas they tended to end phrases like “There’s a…” with words like fight or car accident. (In fact, a quick check of the Google Ngram corpus of literature and periodicals finds that “there’s a fire” is used less frequently than “is on fire.”) It would take more research to understand exactly why our operator heard “fight” over “fire,” but this example illustrates the importance of understanding the cognitive mechanisms behind language prediction and comprehension in general. In the majority of cases, predictions like this might not even lead to mistakes, and in fact could lead to better, more efficient responses to a variety of communicative situations. However, understanding more about how the mind makes predictions in language comprehension, both in the mistakes and the successes, can help us to have a greater understanding of the human cognition of language and could be vital to improving any human endeavor that depends on successful communication.
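The operator’s frequency-based guess can be illustrated with a toy model: given counts of continuations from a hypothetical corpus of emergency-call transcripts (the phrase counts below are invented for illustration, not real data), a simple frequency model “predicts” the most likely next word after “There’s a…”. A minimal sketch in Python:

```python
from collections import Counter

# Hypothetical counts of what follows "there's a ..." in a made-up
# corpus of emergency-call transcripts (illustrative numbers only).
next_word_counts = Counter({"fight": 120, "car accident": 90, "fire": 15})

def predict(counts):
    """Return the most likely continuation and its probability."""
    word, n = counts.most_common(1)[0]
    return word, n / sum(counts.values())

word, p = predict(next_word_counts)
print(word, round(p, 2))  # → fight 0.53
```

Under these invented counts the model guesses “fight” with just over 50% probability, mirroring how a listener’s experience-based expectations could override an ambiguous acoustic signal.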

Opening Our Minds to “Outsiders”

Who I am today is a reflection of all the sacrifices my immigrant parents made to achieve the American Dream. In the late 1970s, my parents fled the Communist takeover of Vietnam, leaving behind family and friends and spending weeks traveling by boat to come to the U.S. for a better future.

Vietnamese refugees traveling via boat.

Having arrived with little money and limited English fluency, my father worked long hours at a blue-collar job while my mother stayed home to take care of my younger sister and me. My parents always found ways to provide for my sister and me with what little resources they had, using their own hardships to inspire us to achieve more than they could have. I could not be prouder to be the daughter of “boat people,” refugees, and immigrants, a sentiment I hope the refugees and immigrants being turned away at our borders today due to the targeted travel ban will eventually share.

It may be easy for me to empathize with these affected refugees and immigrants because our shared experiences categorize us as part of the same in-group. In social psychology, an in-group is a social group arbitrarily defined based on similarities among its members (e.g., citizenship). And if you’re not an in-group member, then you’re likely to be denigrated as an out-group member, simply for your dissimilarities. Importantly, while there are often no objective differences between in-groups and out-groups, classic social psychology experiments show that minimally defined groups, such as being on a meaningless “blue” or “yellow” team, are sufficient for eliciting out-group bias, even in children as young as 6 years old. This “Us vs. Them” mentality results in people being more likely to help in-groups and discriminate against helping out-groups. While helping in-groups may promote social connection, choosing not to help out-groups may cultivate feelings of rejection or exclusion, reinforcing group boundaries in society.

During a time when Americans’ attitudes and behaviors are especially rife with out-group prejudices, how can we encourage aid and support for those less similar to us?

A recent study by Dr. Grit Hein and colleagues used neuroimaging methods to probe whether out-group biases that emerge implicitly in the brain can be changed through experiencing more positive interactions with out-groups. The researchers used a learning intervention in adults to examine whether attitudes and empathy toward out-group members would change after receiving help from an out-group member (experimental condition) just as often as an in-group member (control condition). Because it is more unexpected to receive help from an out-group member relative to an in-group member, the researchers hypothesized that experiencing more of this unexpectedly positive outcome would increase positive associations with out-group members.

Indeed, Hein and colleagues found that experiencing unexpectedly positive out-group interactions led adults to develop more positive attitudes towards out-group members, which in turn increased empathy-related processing in the brain (i.e., greater neural activation in the anterior insula) to out-group members. Especially promising, increases in out-group empathy were achieved after only two positive learning experiences with out-group members!

Thus, while our perceptions of out-group members — like refugees and immigrants — are often biased and may lead to negative societal consequences (e.g., intergroup conflict), the results of this study highlight just how malleable these arbitrary intergroup distinctions can be. By increasing how often we interact with people less similar to us — whether those differences are by race, citizenship, or whatever arbitrary feature we think divides us — we can learn to be more accepting of every person’s unique and important contribution to the fabric of our nation. After all, we are each united in our pursuit of the American Dream.

Peer edited by Alissa Brown and Christine Lee. 


Is Your Impostor Syndrome Showing?

Image by Kelsey Brereton

I was sitting at my kitchen table with a scattered mess of textbooks and notes studying for my first graduate school final. The white board was filled with incoherent scribbles of chemical structures and electron arrows. I had hit a wall, and all the thoughts of self-doubt and inadequacy played on an endless loop through my brain: “I’m not as smart as everyone else, I don’t deserve to be here, and now they will really know I’m a fraud.” I ended up passing the courses, so clearly I am smart enough to be at UNC. However, those negative feelings kept creeping back up to the surface over the years, no matter how many P’s I earned in my courses, positive reviews I received from my PI, or even fellowship awards I won. The nagging feeling of inadequacy remained! It turns out this emotion has a name: impostor phenomenon, aka impostor syndrome.

Impostor syndrome is described as feelings of perceived inadequacy even when there is plenty of evidence to the contrary. Many people struggle silently with chronic self-doubt, feelings of intellectual fraudulence, anxiety, and depression. These feelings can become so severe that they stifle performance in graduate school or the workplace. Ironically, it’s usually successful people who suffer from impostor syndrome; even acclaimed celebrities and high-profile business executives are not immune. They attribute their success to luck or the generosity of others – anything but their own hard work and skills. This is extremely common among professionals and graduate students. These high achievers set unsustainably high expectations for their work, and when those standards are not met, feelings of deficiency creep in.

The impostor phenomenon was first studied by psychologists Pauline Clance and Suzanne Imes, who saw a strong link between impostor syndrome and perfectionism. Imes posits that impostor syndrome could result from growing up in households that place an excessive emphasis on achievement. This resonates with my personal experience. In my family, we got A’s. Not A-’s. Only A’s were acceptable, and before long, I became overly self-critical when I didn’t get an A on a test or even a homework assignment. It’s no wonder that self-worth becomes directly tied to achievement. Students either fear their work won’t be perfect and procrastinate, or they develop obsessive work habits, spending more energy than necessary. These unhealthy study habits, developed in high school, can become firmly seeded during undergraduate careers. In intensive graduate programs, that sense of achievement, once drawn from grades, gets drawn from the success of research projects.

Graduate and professional students spend years struggling to learn extremely difficult material, overcoming failed experiments, and (hopefully) becoming experts in highly specialized fields.  It is easy for students to lose their sense of personal identity as they become expert scientists. Perfectionism and attention to detail are described as skills by many successful scientists, but these skills can also be holding us back in many ways if we succumb to the Impostor Monster.

Do others see your impostor syndrome?

Most of the time, people suffering from impostor syndrome hide their symptoms extremely well because they are afraid of being exposed as frauds. If impostor syndrome becomes severe enough, however, others will start to notice your lack of self-confidence and increased self-doubt. This can be problematic during performance evaluations, job interviews, and committee meetings, and can start to affect your life beyond your emotional well-being.

How could your impostor syndrome be holding you back?

Impacts on graduate school:

  1. The fear of being exposed as a fraud can lead students to take fewer risks in lab. Students may avoid experiments that could fail for fear of having to tell their boss, over time leading to less creative research and lower productivity.
  2. Holding back on submitting publications or proposals because they might not be absolutely perfect.
  3. Negative self-talk can lead you to think you’re not good enough; then you don’t do your best work, which in turn reinforces the negative thoughts.
  4. Constantly comparing yourself to others only feeds the impostor monster and wastes energy that could be better spent elsewhere.

Impacts on professional career:

  1. Not taking ownership of personal accomplishments can result in not getting promotions, awards, and recognition.
  2. Missing opportunities for new experiences: lowering career goals to match feelings of being unqualified. For example, deciding not to pursue a tenure-track position at a research-intensive university because of feelings of fraudulence and inadequacy.
  3. Working too hard to make up for your “deficiencies,” which can make you more likely to burn out.

So what can we do about this impostor syndrome?

Most people struggle with this their whole lives, but there are ways to keep it from running the show.  

Put your impostor syndrome in perspective: Identify your feelings and do a reality check.  Assess whether your feelings of incompetence are exaggerated.

Remind yourself what you are good at: Determine what your skills are and what you have accomplished so far.  Remember, getting into graduate school itself is a major accomplishment!

Write down the compliments that you receive: On days when you’re struggling with negative thoughts, reading the positive thoughts others have about you will boost your self-confidence. Also, try to accept compliments with a simple “thank you” rather than discounting them.

Build a support group of trusted friends and family: There’s a good chance that most of your peers are going through this too and think that they are also the only one suffering.  

It’s been a few years now since my first graduate school finals, but my impostor syndrome still resurfaces from time to time. I still struggle every day with setting realistic standards for the amount of work I can accomplish and with avoiding my excessive perfectionist tendencies. When I start to think “I’m not good enough,” I stop and remind myself what I’m good at and how much I have improved as a scientist since starting graduate school. Don’t let your impostor syndrome run wild and limit your future success in life.

Peer edited by Tom Gilliss and Kelsey Noll. 


Grossed Out? It’s a Grave Matter in Moral Psychology

Halloween is a time of year when we hanker for the horrific, ogle at the ugly, and revel in the rotten. And in this election year, we’re just as likely to overhear conversations about repugnant costumes (like gory zombies or bloody brides) as we are comments on disgusting (or “nasty”?) politicians.

An Apple Logo a Day Means Your Memory’s Okay, But Not Perfect

Immediately close your eyes and draw the Apple logo from memory. How confident are you that your drawing is accurate? Keep reading to see how well you did!

Companies change logos frequently. Google, Uber, and Instagram all rebranded in the past year or so. But how well do we actually know what these logos look like? Consider this: which way is Lincoln facing on the penny? Don’t dig into your pocket. Think about it for a second.

In a now classic study, people could not accurately identify the details of pennies. Features were confused, misplaced, or left out altogether. We’ve seen pennies many times throughout our lives, yet we are remarkably poor at remembering their design.

The penny result may seem surprising. But when we are not specifically asked to learn something – and even when we are – our memory doesn’t always come through. Our memories are not perfect. They are a combination of the actual object or event that we experienced and our own expectations and knowledge.  “But okay,” you’re thinking, “I rarely use pennies. Surely I would remember something more meaningful.”

Since the penny study, researchers have continued to explore the connection between our exposure to objects and our memory for them. In one recent study, researchers looked at the Apple logo, a symbol that is everywhere; it’s on TV, on billboards, and in the hands of the person sitting next to you. Logos are also made to be recognizable, so we must know what they look like, right?

Once again, the answer is “Not really.” Participants were, understandably, very confident in their ability to draw the logo from memory, but only one (out of 85!) did so perfectly. The drawings tended to include parts of apples that are not in the logo, such as stems, consistent with the idea that our memory is based on our expectations (in this case, expectations of what an apple should look like). Participants even struggled to pick out the logo from eight different variations: fewer than half chose the correct one.

Source: By Apple, Inc. [Public domain], via Wikimedia Commons

How well did you draw the Apple logo from memory?

A similar pattern is found for locating potentially life-saving objects. Here, researchers asked university faculty, staff, and students about the location of the nearest fire extinguisher in their building. These participants had worked in the building for about 5 years, yet most (about 75%) could not report the location.

Even frequent physical interaction with objects does not guarantee success: We can have poor memory for a well-traveled elevator, and self-identified Apple users were just as overconfident in their memory for the logo as were non-users.

Here’s what is happening. Passively encountering or interacting with something does not ensure we’ll remember it. We have little trouble remembering the gist of an event or object, but picking out particular features or locations is difficult.

Cognitive psychologist Alan Castel and colleagues differentiate between “seeing” and “noticing.” Being exposed to information frequently (or “seeing” it) does not guarantee we will notice and remember it. Actually, all that exposure may encourage us to stop paying attention to the details, particularly when they do not hold any real benefit, which is the case for specifics of a penny or the Apple logo. We know it’s a penny because of its size and color, not because of the direction Lincoln faces (it’s to the right).

But these memory lapses are not necessarily a bad thing. Not keeping track of every detail leaves us room to process other important things in the world around us. We can also rely on our expectations for where objects will be, such as that a fire extinguisher will be in an easily accessible place, or that a random piece of trivia can be found on Google.

Here’s a good place to start if you want to remember something: think about it. Cognitive scientist Daniel Willingham says that memory is the residue of thought. So when you want to remember the layout of that nickel in your pocket, spend some time mulling it over.

Peer edited by Amy Rydeen


Why Oreos Are Not As Addictive As Cocaine

They had to go. Their cream-filled indifference stared back at me as I decided their final seconds were nigh. The Oreos. They would all disappear into my face tonight. All I knew was that by the next day, I wanted to be released from their siren song. You might imagine that my reasoning is similar to that of a cocaine addict, vowing to dispose of the last of their stash to aid going clean the next day.

Yet I can assure you, I am not addicted to Oreos.

You may have been concerned about your beloved cookie after reading articles that poured forth from the internet in 2013 making bold claims about Oreo addictiveness. Even though research suggests that sugary foods like the Oreo have negative health consequences, rest assured, there is little evidence to support sensationalist headlines like “Rats find Oreos as addictive as cocaine”.

Is My Professor’s Lecture Style Affecting My Learning?

You’re sitting in class as your professor rambles on. The material is interesting, but the lecture is choppy. The professor stops and starts frequently, sounding uncertain, and you’re counting the number of times he says, “um.” Meanwhile, your friend is taking the same class with a different instructor known for his confident and clear style.

The content of the courses is the same, but the delivery of that content must be affecting your ability to learn the material, right?


Is the way your professor presents the material affecting your learning?

Maybe not. A recent study out of Iowa State University found that the lecture style of an instructor had no consistent effect on learning.

In the experiment, participants watched a 22-minute presentation of a scientific concept. Half of the participants heard the presentation narrated by an instructor who sounded hesitant, disengaged, and awkward. The other half heard the information from the same instructor but who now spoke in a calm and fluid manner. The actual material covered in the two presentations was identical.

Participants’ confidence in their learning did not differ between the two instructors either. In other words, those who learned from the awkward instructor thought they would perform just as well as those who learned from the confident instructor.

This result may sound surprising, but classrooms have a lot going on. Lectures are typically accompanied by a presentation, graphics, and demonstrations. If the material itself is generally considered hard, it might not matter how it is presented.

That’s what the researchers found: participants generally based their memory confidence on the material being learned and on their own learning abilities instead of on the instructor’s delivery.

But even if the clarity of presentation does not influence learning or confidence, it could affect other outcomes. Teaching evaluations hold a lot of weight in how instructors are perceived – by themselves, their students, and the institutions they work for. Therefore, instructors may look to improve these evaluations. One way to do this? Work on how you sound. Participants rated the clear and confident instructor as more organized, knowledgeable, prepared, and effective.

This study has takeaways for both teachers and students. First, instructors, if you value the perceptions of your students, try to present your lectures in an engaging and fluid manner. Students notice presentation style and judge their professors on how the material is given. But we all have rough days. There might be a time when you can’t prepare and rehearse as much as you’d like, and that’s okay. Your students’ comprehension may not be any worse off.

Now students, your understanding of the material may not be influenced by your instructor’s delivery. Actually, having an overly confident instructor could hurt your learning. A related study found that students learning from such an instructor thought that they understood the information much better than they actually did. So maybe you don’t have to be too envious the next time you hear about your friend’s awesome professor.

Peer edited by Salma Azam, Sara Duncan, and Lindsay Walton.
