Underground Science at SNOLAB

The best models of how our world works are incomplete. Though they accurately describe much of what Mother Nature has thrown at us, models represent just the tip of the iceberg, and a deeper understanding awaits the endeavoring scientist. Peeling back the layers of the natural world is how we physicists pursue that understanding. This search pushes existing technology to its limits and fuels the innovation seen in modern-day nuclear and particle physics experiments.


This is a map of the SNOLAB facility. It’s 2 km (~1.2 miles) underground and is the deepest clean room facility in the world!

Today, many of these experiments search for new physics beyond the Standard Model, the accepted theory describing the behavior of particles. Some physical phenomena have proven difficult to reconcile with the Standard Model, and research seeks to resolve those conundrums. Two stand out: the properties of neutrinos, elusive particles with very little mass and no electric charge, and dark matter, a mysterious cosmic ingredient that holds galaxies together but whose form is not known. The experiments pursuing these phenomena each take a different approach toward the same unknowns, resulting in an impressive diversity of techniques geared toward the same goal.

On one side of the experimental spectrum, the Large Hadron Collider smashes together high-energy protons at a rate of one billion collisions per second. These collisions have the potential to create dark matter particles or spawn interactions between particles that break expected laws of nature. On the other side of the spectrum is a complementary set of experiments that quietly observe their environments, patiently waiting to detect rare signals of dark matter and other new physical processes outside the realm of behavior described by the Standard Model. As the signals from new physics are expected to be rare (~1 event per year, compared to the LHC’s billion events per second), the patient experiments must be exceedingly sensitive and avoid any imposter signals, or “background”, that would mimic or obscure the true signal.

The quest to decrease background interference has pushed experiments underground to cleanroom laboratories set up in mine caverns. While cleanrooms reduce the chances of unwanted radioactive isotopes, like radon-222, wandering into one’s experiment, mines provide a mile-thick shield from interference that would be present at the surface of Earth: particles called cosmic rays constantly pepper the Earth’s surface, but very few of them survive the long journey to an underground lab.

Figure reproduced with permission from Michel Sorel from La Rivista del Nuovo Cimento, 02/2012, Volume 35, Issue 2, “The search for neutrinoless double beta decay”, J. J. Gómez-Cadenas, J. Martin-Albo, M. Mezzetto, F. Monrabal, M. Sorel, all rights reserved, with kind permission of Società Italiana di Fisica.

The rate at which muons, particles produced by cosmic rays, pass through underground labs decreases with the depth of the lab. At the SNOLAB facility, shown in the lower right, approximately one muon passes through a square centimeter of the lab every 100 years.

The form and function of modern underground experiments emerged from the collective insights and discoveries of the scientific community studying rare physical processes. As in any field of science, this community has progressed through decades of experimentation, with results being communicated, critiqued, and validated. Scientific conferences have played an essential role in this process by bringing the community together to take stock of progress and share new ideas. The recent conference on Topics in Astroparticle and Underground Physics (TAUP) was a forum for scientists working to detect dark matter and study the properties of neutrinos. Suitably, the conference was held in the historic mining town of Sudbury, Ontario, home to the Creighton Mine, at the bottom of which lies SNOLAB, a world-class underground physics laboratory that notably housed the 2015 Nobel Prize-winning SNO experiment. SNO, along with the Super-Kamiokande experiment in Japan’s Kamioka mine, was awarded the prize “for the discovery of neutrino oscillations, which shows that neutrinos have mass.”

There is a natural excitement upon entering an active nickel mine, donning a set of coveralls, and catching a cage ride down into the depths; this was our entrance into the Creighton Mine during the TAUP conference. After descending an ear-popping 6800 feet in four minutes, we stepped out of the cage into tunnels, known as drifts, of raw rock. From there, we followed the path taken every day by SNOLAB scientists, walking approximately one kilometer through the drifts to the SNOLAB campus. At SNOLAB, we prepared to enter the clean laboratory space by removing our coveralls, showering, and donning cleansuits. Inside, the rock walls are finished with concrete and epoxy paint, and we walked through well-lit hallways to a number of experiments that occupy impressively large caverns, some ~100 feet high.

Photo credit: Tom Gilliss

Physicists visiting SNOLAB get a close-up view of the DEAP-3600 and MiniCLEAN dark matter experiments. Shown here are large tanks of water that shield sensitive liquid argon detectors located within.

Our tour of SNOLAB included visits to several dark matter experiments, including DEAP-3600 and MiniCLEAN, which attempt to catch the faint glimmer of light produced by the potential interaction of dark matter particles with liquid argon. A stop by PICO-60 educated visitors on another captivating experiment, which monitors a volume of a superheated chemical fluid for bubbles that would indicate the interaction of a dark matter particle and a nucleus. The tour also included the SNO+ experiment, offering glimpses of the search for a rare nuclear transformation of the isotope tellurium-130; because this transformation depends on the nature of neutrinos, its observation would further our understanding of these particles.

SNOLAB is also home to underground experiments from other fields. The HALO experiment, for instance, monitors the galaxy for supernovae by capturing neutrinos that are emitted by stellar explosions; neutrinos may provide the first warnings of supernovae as they are able to escape the confines of a dying star prior to any other species of particle. Additionally, the REPAIR experiment studies the DNA of fish kept underground, away from the natural levels of radiation experienced by all life on the surface of Earth.

The search for rare signals from new physical phenomena pushed physicists far underground and required the development of new technologies that have been adapted by other scientific disciplines. The SNOLAB facility, in particular, has played a key role in helping physics revise its best model of the universe, and it can be expected that similar underground facilities around the world will continue to help scientists of many stripes reveal new facets of the natural world.

Peer edited by JoEllen McBride and Tamara Vital.

Follow us on social media and never miss an article:

One in a Million: The Importance of Cellular Heterogeneity and the Power of Single Cell Sequencing

One of the most overwhelming aspects of modern-day biomedical research is the heterogeneity that pervades all realms of biology. From cell to cell and human to human, we have become increasingly aware of the important differences that drive divergent responses to therapeutics and biological stimuli. The complexity of cancer is one such example.

A landmark paper published by Gerlinger et al. in The New England Journal of Medicine demonstrated that analyzing multiple biopsies from a single patient’s tumor gives a much different picture of the biology driving that tumor than examining a single biopsy alone. In addition, many studies characterizing the cellular heterogeneity of cancer have revealed that a tumor is much more than a mass of identical cells all growing out of control. Rather, a tumor comprises cancer cells at different stages of the cell cycle, engaged in different cell signaling pathways, as well as other cell types, including immune cells and endothelial cells (the cells lining blood and lymphatic vessels).


Figure 1: A schematic showing the multiple biopsy sites in a single patient tumor. The study found that the number of private mutations (i.e., mutations found only at a single biopsy site and not at any of the other sites) is immense, suggesting that examining a single biopsy alone greatly underestimates the mutational diversity of a given patient’s tumor.

New technologies are emerging to enable us to better dissect the heterogeneity of disease and biology. One such technique is single cell sequencing. Sequencing is a technique that gives a readout of the entire genetic composition of a cell. In its most basic form, sequencing can be performed to get a readout of DNA or RNA, two types of molecules involved in different levels of genetic regulation. Sequencing has traditionally been performed on bulk samples composed of hundreds to millions of cells in aggregate. While we have made remarkable advances in our understanding of biology using bulk sequencing, the emergence of single cell sequencing has now allowed us to begin to probe the genetic profile of a single cell at varying levels of complexity. The technology often takes advantage of molecular indexing, a technique whereby individual mRNA or DNA molecules are labeled with a molecular tag associated with a single cell; the cells are then pooled together into one tube and sequenced. During the post-sequencing analysis, the molecular tags are re-associated with each single cell, and the profiles of the single cells are compared to one another.
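The pooling-and-demultiplexing idea described above can be sketched in a few lines of code. This is a toy illustration only, not a real pipeline; the read triples, barcodes, and gene names are all hypothetical, and real tools parse these tags out of raw sequencer output.

```python
# Toy sketch of molecular indexing and demultiplexing. Each "read" is a
# hypothetical (cell_barcode, umi, gene) triple.
from collections import defaultdict

def demultiplex(reads):
    """Re-associate molecular tags with their cell of origin and count
    unique molecules (UMIs) per gene in each cell."""
    umis_per_gene = defaultdict(lambda: defaultdict(set))
    for cell_barcode, umi, gene in reads:
        # PCR copies of one original mRNA share a UMI, so a set
        # collapses duplicates back to a single molecule.
        umis_per_gene[cell_barcode][gene].add(umi)
    return {cell: {gene: len(umis) for gene, umis in genes.items()}
            for cell, genes in umis_per_gene.items()}

reads = [
    ("AAAC", "u1", "TP53"),  # molecule 1, cell AAAC
    ("AAAC", "u1", "TP53"),  # PCR duplicate of molecule 1
    ("AAAC", "u2", "TP53"),  # molecule 2, same gene, same cell
    ("TTTG", "u1", "MYC"),   # a different cell
]
print(demultiplex(reads))  # {'AAAC': {'TP53': 2}, 'TTTG': {'MYC': 1}}
```

Once the reads are grouped this way, comparing the per-cell profiles to one another is an ordinary data analysis problem.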

This novel technique has allowed us to begin investigating and understanding biology with a higher degree of resolution than we ever could have imagined before, and will undoubtedly lead to the discovery of many exciting new realms of biological regulation. For example, the biomedical company Becton Dickinson has performed a research study analyzing single cells from tumors of mouse models of cancer. The company found that within each tumor sample there were distinct populations of cells with unique gene expression profiles, and these profiles were associated with vastly different biological functions. Understanding how these different populations work together as a community to promote tumor growth may help us develop new treatments for cancer.

However, like all cutting-edge technologies, single cell sequencing has limitations that still need to be overcome. One concern is inefficient sample collection, because the amount of genetic material in a single cell is much smaller than the amount from a large group of cells. An additional confounding variable is uncertainty about whether a low yield of genetic information from a single cell is the result of technical error, inefficiency of small sample collection, or simply the lack of expression of a gene in that particular cell. Discerning the difference between a true biological negative result and a technical deficiency is often difficult.

Especially in fields such as cancer biology, we have increasingly begun to realize that heterogeneity has largely been an obstacle in our ability to develop effective therapies and to truly understand the mechanisms of regulation of biological processes. Advances in single cell sequencing research have allowed us to further realize that there is not one function or process driving disease progression, but rather a network of cells with distinct roles. Uncoupling the forces dictating the progression of this heterogeneity is what will help us to make the next great advances in therapeutic development in cancer and many other diseases.

Peer edited by Chiungwei Huang and Richard Hodge.


The Ethics of Open Access: Is Pirating the Best Path?


Alexandra Elbakyan will go down in history as the mastermind of Sci-Hub and perhaps as a champion for open access research. Sci-Hub is an online repository of pirated research articles that enables scholars to access millions of articles free of charge.  Elbakyan founded Sci-Hub in 2011 in response to the paywalls guarding many of the articles that she needed for her neuroscience graduate studies. The pirating website provides research article access for a community of scholars who could not afford to access the articles through traditional channels.

As you might imagine, Sci-Hub is surrounded by controversy. The Sci-Hub repository relies on hacked publishing websites or donated institutional login credentials to obtain research articles. You might consider Elbakyan a modern-day Robin Hood, robbing the rich publishing giants to provide research articles to those without access, and many scientists have praised (and even donated to) her efforts to advance open-access research. However, many others (including publishers) view Sci-Hub and research article piracy as ethically wrong, and they have condemned her efforts.

Apneet Jolly - https://www.flickr.com/photos/ajolly/4696604402/, CC BY 2.0, https://commons.wikimedia.org/w/index.php?curid=56109793

Alexandra Elbakyan, founder of Sci-Hub, speaks at a conference at Harvard in 2010.

A recent PeerJ article from a group at the University of Pennsylvania investigated how extensively Sci-Hub has infiltrated the databases of publishing agencies. According to Daniel Himmelstein, the lead author of the study, Sci-Hub contains 69% of all research articles in existence (based on an estimated 81.6 million articles in total), with coverage approaching 93% for disciplines such as chemistry. What’s particularly fascinating is that the 56 million or so articles are all located within one repository, easily accessed via a DOI (digital object identifier) search bar. The PeerJ article contains more about the extent of Sci-Hub’s reach than can be covered in this briefing, but you can also explore an interactive website about Sci-Hub’s coverage that is associated with the study.
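The "56 million or so" follows directly from the coverage estimate; a quick back-of-envelope check (using the study's reported figures, rounded):

```python
# Sanity check of the numbers cited above; inputs are the study's
# reported estimates, rounded.
total_articles = 81.6e6   # estimated research articles in existence
coverage = 0.69           # fraction reportedly held by Sci-Hub
in_scihub = total_articles * coverage
print(f"~{in_scihub / 1e6:.1f} million articles")  # ~56.3 million articles
```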

Over the past two years, Elsevier and other major publishing companies such as Springer and the American Chemical Society have been scrambling to counteract the growing influence of Sci-Hub. In June of 2017, a U.S. court ruling was finalized that ordered Elbakyan to pay Elsevier $15 million in damages after a judge ruled that Sci-Hub had violated copyright laws by providing free access to 97% of Elsevier’s articles. The American Chemical Society (ACS) has also filed a complaint against Sci-Hub, which has actually mirrored the ACS website to provide easier access to pirated ACS publications.

Open-access research is a hot topic, and the recent lawsuits against Sci-Hub have only added fuel to the fire. While Sci-Hub has increased publicity for open-access research, the ethics behind Sci-Hub’s article piracy has clouded the open-access conversation. Many would agree that open-access research is important, but at what cost? Does the end result that all people have equal access to research data justify the ethical quandary of article piracy? Alexandra Elbakyan believes that it does. Only time will tell if she is right.

If you found this article interesting, check out these other articles for more information on the evolution of scientific publishing, open access and Sci-Hub, and Sci-Hub worldwide usage stats.

Please note: Accessing Sci-Hub is illegal in the United States. The author of this article and the editors of The Pipettepen do not condone the use of Sci-Hub to access research articles. Rather, this article is only intended to provide current scientific news on open-access research.


Continue the conversation with Tyler on Twitter: @Farnsworthtw


Peer edited by Kaylee Helfrich.


Nanotechnology in Your Sunscreen: Doing More Harm than Good?

Soaking in the sunshine may feel good, and you may have heard about the harm of solar ultraviolet (UV) radiation, but you may not be aware of what’s in your sunscreen. Lee Hong explored the benefits of sunscreen in his post on The Pipettepen, and today we dive deeper into a smaller world: the nanotechnology in our sunscreen.

The two minerals available to sunscreens in the form of nanosized particles (NPs) are zinc oxide (ZnO) and titanium dioxide (TiO2). These particles are less than 1/1000th the width of a human hair. The bulkier minerals in traditional sunscreens reflect visible light, making them opaque and cakey on your skin. NPs, on the other hand, scatter light instead of reflecting it, resulting in a sunscreen that disappears on the skin and feels lighter.


Traditional sunscreens block out UV rays but often leave a ghostly white cast. Nanotechnology makes them disappear on your face!

While the resulting nanoproduct can be a big help, people have raised concerns over the safety of NP-based cosmetic sunscreens. With their smaller size, NPs could in theory be absorbed into the skin at a higher level than their bulkier counterparts. The real question is whether these tiny particles do more harm if absorbed than good in protecting us from UV rays.

Studies are divided about whether NPs can pass through the skin. Paul Wright, a toxicology researcher at RMIT University, offered a few reassuring words to The Guardian: “There’s a negligible penetration of sunscreen particles. They don’t get past the outermost dead layer of human skin cells.” In 2017, the Australian Therapeutic Goods Administration (TGA) published a review concluding that NP absorption is unlikely, based on both in vitro (studies using isolated skin cells) and in vivo (studies on live skin tissue) evidence. It appears that we are in a safe zone!

Other scientists have tested the toxicity of these tiny metal oxides when exposed to UV light, simulating the real-life scenario of sunscreen use. Their results indicated that the metal oxides may generate reactive free-radical species, which can cause the DNA damage that leads to cancer. However, this alarming impact on human health depends on whether the NPs in sunscreen are absorbed into our skin. Providing some comfort, research associate Simon James at the Australian Synchrotron told The Guardian that “Our study demonstrates that the human immune system has the right equipment to remove any nanoparticles that somehow make it through the skin, assuming some do at all.” His team’s work showed that the body’s natural defenses can gather and destroy ZnO nanoparticles. Moreover, sunscreen manufacturers apply surface coatings to improve transparency, and these coatings can also reduce toxicity by lessening the particles’ reactivity to UV light.


It’s not a bad idea to shield your skin from the burning sun with an umbrella’s shade whenever you are up for outdoor activities.

With the increased popularity of nanotech-based products, another concern is the noxious effects caused by inhalation of NPs. The Environmental Working Group (EWG), based in Washington, D.C., has warned against spray sunscreens and loose powder cosmetics containing ZnO or TiO2 particles. The lungs have difficulty removing small particles, which can lead to organ damage, possibly in the same way that air pollution is linked to lung cancer.

Evidence suggests there is more harm from skipping the sunscreen than exposing your skin to nanoparticles, but, if you’re not comfortable with these tiny oxides, UV protection umbrellas are another option!

Peer edited by Bailey DeBarmore.


Getting the Whole Picture: Increasing Diversity in Medical Research


Doctor consults patients on healthcare. Photo credit: Rhoda Baer (Photographer)

Every medication for every ailment, from headaches to heart disease, goes through clinical trials before it becomes available to patients. Clinical research is necessary to determine whether potential new drugs are effective and safe to use; those that are receive the seal of approval from the Food and Drug Administration (FDA). During a clinical trial, health care providers give the new drug to patient volunteers who ideally represent a sample of the larger treatable population. However, the underrepresentation of racial and ethnic minorities in clinical trials has historically skewed the demographics.


Medications don’t have the same effect in all patients.

Why is it so important to have proper representation across all demographics? When the FDA approves a drug for a certain medical condition, it is approved for all patients with that condition. However, even for individuals with the same condition, the efficacy and side effects of certain medications may vary across demographics. Variation can occur due to socioeconomic, environmental, and genetic differences. If certain groups are underrepresented in clinical trials, then we have an incomplete picture of the particular health needs of those populations, potentially leading to inadequate treatment. For example, although white women are diagnosed more frequently with breast cancer, African-American women are more likely to die from the disease. This disparity indicates that there is an unmet need among African-Americans for effective breast cancer treatments.


The Tuskegee experiment is just one cause of bad blood between minorities and the medical community.

Although racial and ethnic minorities make up about 30% of the U.S. population, collectively they make up only about 20% of clinical trial participants. While this is an improvement from previous years, there’s still progress to be made. Unfortunately, there are numerous barriers to increasing minority inclusion in medical research, including lack of access to healthcare and education. Additionally, there have been unethical medical practices that targeted minority populations in the past, including the Tuskegee experiment, a four-decade-long study during which African-American men were intentionally denied treatment for syphilis in favor of studying the disease. Such offenses have caused many minority communities to distrust healthcare officials. Overcoming these obstacles requires effort on the part of healthcare officials to improve communication and build connections with those communities.

Though clinical research is crucial for patients to benefit from new and more effective treatments, not all populations have benefited to the same degree. Improving representation of minorities in clinical trials is an important task that requires a change in perspective from healthcare providers and patients alike. Ultimately having diverse patient groups representative of the population will allow clinical research to yield more effective medications and improve the health of the broader population.


Peer edited by Christina Marvin.


Think Before You Drink


Pregnant women should not consume alcohol.

You see them at restaurants, grocery stores, and parties. In your fridge, in your cupboard, and on TV. In city streets and in the trash. And most likely, you’ve never paid attention to them.

Image credit: Kaylee Helfrich

Warning label on an alcoholic beverage bottle; this label did not exist before 1988.

Every bottle of alcohol you’ve seen since 1988 has had a sticker on it, saying, “GOVERNMENT WARNING: According to the Surgeon General, women should not drink alcoholic beverages during pregnancy because of the risk of birth defects.”

Even if you’ve never noticed the sticker on alcoholic beverages, you’ve probably heard that pregnant women shouldn’t drink alcohol. Why should they avoid alcohol, and what happens if they do drink?

Unfortunately, alcohol consumption during pregnancy has many negative impacts on the developing baby. Affected children tend to be smaller, have altered facial features, develop more slowly, and have issues with memory, learning, decision-making, self-regulation, and social behavior. And these are just some of the most common outcomes; many other significant problems can also occur.


A child with severe FASD.

A child with symptoms resulting from prenatal alcohol exposure is diagnosed with a fetal alcohol spectrum disorder (FASD). Sadly, there is no cure for FASD, and the effects last throughout the child’s lifetime.

Numbers help us better understand the scope and impact of FASD:

  • 0: the amount of alcohol considered safe during pregnancy. Research has conclusively determined that heavy alcohol consumption (binge drinking) causes FASD, but research on light to moderate drinking is less clear. However, since maternal genetics, age, size, nutrition, timing of alcohol consumption, and many other factors can influence alcohol’s effects, recommendations state that no pregnant woman should consume alcohol.
  • 3,000,000: the number of women in the U.S. at risk of having a child with FASD. These women are sexually active, do not use birth control, and consume alcohol.
  • 50: the percentage of pregnancies that are unplanned. Unplanned pregnancies contribute to the difficulty of asking women to avoid alcohol completely during pregnancy since often they did not intend to become pregnant.
  • 4-8: the number of weeks it usually takes for women to realize that they are pregnant. This makes avoiding alcohol during pregnancy difficult, since a woman who doesn’t realize she is pregnant doesn’t know to avoid alcohol. Unfortunately, alcohol can harm the fetus from conception onward.
  • 10: the percentage of women who continue to drink alcohol even after discovering their pregnancy.
  • 2015: the year that the CDC issued a report recommending that women who were of reproductive age and who wanted to drink alcohol should be on birth control. This report led to a large backlash from people saying that the government was trying to control their lives.
  • 2-4: the percentage of children in the U.S. who are affected by FASD. In the U.S. today, the number of FASD cases rivals the number of autism cases.
  • 13-20: the percentage of children in some South African populations who are affected by FASD; these populations have some of the highest rates in the world.
  • 60-80: the percentage of children with FASD worldwide growing up in institutional settings and/or foster care.
  • 6,000,000,000: the cost of FASD-related medical bills every year in the U.S.
  • 61: the percentage of teenagers with FASD who have been in some sort of legal trouble. Prenatal alcohol exposure disrupts the brain centers that regulate impulses.

    Image credit: Kaylee Helfrich. Data from: https://www.ncbi.nlm.nih.gov/pubmed/25349310 and https://www.cdc.gov/mmwr/volumes/65/ss/ss6503a1.htm

    Current estimates show that more children in the US have FASD than autism. Data adapted from CDC and Pediatrics Journal.

These numbers paint a stark picture of FASD. Although there is no cure for FASD, early intervention and treatments can improve a child’s cognitive and social development. However, the emphasis is on “early.” Without intervention, children with FASD struggle greatly throughout life, and the people around them struggle too.

Fortunately, avoiding having a child with FASD is as simple as avoiding alcohol, unless you (or your partner) are on birth control. The next time you see a bottle of alcohol, you can look for the Surgeon General’s sticker and remember the shocking FASD statistics it tries to prevent!


Peer edited by Lindsay Walton


H What N What? A Designer Protein Hits the Science Runway

Image ID: 10073 https://phil.cdc.gov/phil/details.asp

TEM Image of Influenza Virion. Content Providers: CDC/ Erskine. L. Palmer, Ph.D.; M. L. Martin, 1981.  Photo Credit: Frederick Murphy

Influenza is a virus that straddles two worlds: that of the past and that of the future. Responsible for more deaths than HIV/AIDS in the past century, the flu is one of the world’s most dangerous infectious diseases, though it may not seem so, especially in the United States. However, the flu is responsible for millions of cases of severe illness and approximately 250,000 to 500,000 deaths worldwide each year.

Influenza Pandemics
Influenza A and B circulate each flu season, but it is the emergence of new influenza A strains that has been responsible for worldwide epidemics, or pandemics, such as the 1918 ‘Spanish Flu’ pandemic and the 2009 H1N1 pandemic. There are two ways a new influenza virus can emerge. Every time the virus replicates, small genetic changes occur that result in non-identical but similar flu viruses; this is called “antigenic drift”. If you are infected with a certain flu virus, or get a vaccine targeting a certain flu virus, your body develops antibodies to that virus. As changes accumulate, these antibodies no longer work against the changed virus, and a person can be infected again. The other source of change is “antigenic shift”, which results in a virus with a different type of hemagglutinin and/or neuraminidase, such as a change from H3N2 to H1N1. The 2009 H1N1 virus is cited by some as a result of antigenic shift, because the virus was so different from previous H1N1 subtypes; however, as there was no change in the actual hemagglutinin or neuraminidase proteins, it was technically a case of antigenic drift.

Image ID: 13469 https://phil.cdc.gov/Phil/details.asp

This diagram depicts how the human cases of swine-origin H3N2 influenza virus resulted from the reassortment of two different influenza viruses. The diagram shows three influenza viruses placed side by side, with eight color-coded RNA segments inside each virus. The virus from the 2009 pandemic (right) has HA/NA proteins and RNA from Eurasian and North American swine instead of from humans, as in previous years (left two viruses).
Content Provider: CDC/Douglas Jordan, M.A. 2011.


Challenges in Studying the Flu
Scientists and policymakers face many challenges when studying the influenza virus. For instance, the virus can be transmitted by people not showing symptoms, and a cough, sneeze, or handshake can spread infectious droplets from someone who doesn’t know they’re sick. Scientific study is further complicated by the virus itself: there are three antigenic types of influenza virus that infect humans (A, B, and C), with various subtypes and strains. Each year, government agencies work with scientists to decide which strains to target in that year’s vaccine manufacturing. The lag time between production in the spring and the flu season in the winter leaves time for unexpected strains to emerge.

Image ID: 17345 https://phil.cdc.gov/Phil/details.asp

This is a 3D illustration of a generic Influenza virion’s fine structure. The panel on the right identifies the virion’s surface protein constituents. Content Provider: CDC/ Douglas Jordan, Dr. Ruben Donis, Dr. James Stevens, Dr. Jerry Tokars, Influenza Division. 2014. Illustrator: Dan Higgins.

The H#N# nomenclature for influenza A subtypes refers to the hemagglutinin (H) and neuraminidase (N) proteins that sit on the surface of the virus. There are 18 types of hemagglutinin and 11 types of neuraminidase. Hemagglutinin aids the virus in fusing with host cells and emptying the virus’ contents inside. Neuraminidase is an enzyme embedded in the virus membrane that helps newly synthesized viruses be released from the host cell, spreading the infection from one cell to another.
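Since a subtype name is just one hemagglutinin type paired with one neuraminidase type, the space of possible names is easy to enumerate; note that only a fraction of these combinations has ever been observed circulating in nature:

```python
# With 18 hemagglutinin (H) and 11 neuraminidase (N) types, the H#N#
# naming scheme admits 18 x 11 combinations.
subtypes = [f"H{h}N{n}" for h in range(1, 19) for n in range(1, 12)]
print(len(subtypes))                           # 198
print("H1N1" in subtypes, "H3N2" in subtypes)  # True True
```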

Targeted Therapy
In studying ways to prevent and battle influenza, research scientists have focused their efforts on blocking the actions of neuraminidase and hemagglutinin. Antiviral drugs such as oseltamivir (Tamiflu®) and zanamivir (Relenza®) bind neuraminidase at sites crucial for its activity, rendering the virus incapable of self-propagating. David Baker, a computational biologist at the University of Washington in Seattle, and his team know the hemagglutinin protein well. In 2011, they took a cue from nature, studying antibodies that bind hemagglutinin in order to design a protein that targets the glycoprotein’s stem in H1 subtype flu viruses and prevents the virion from infecting the host cell. However, antiviral resistance, driven in part by antigenic drift, is a serious issue; researchers must constantly develop new drugs to keep up with changes in the virus.

David Baker and his team now focus their research on the hemagglutinin protein. Using a computational biology approach, they designed a protein that fits snugly into hemagglutinin’s binding sites. They tested their designer protein in 10 mice exposed to the H3N2 influenza virus and found that it worked both as a preventative measure and as a treatment. Though there is a long road to human testing, this binding protein shows promise for bedside influenza diagnosis as well as a model for possible treatments.

Want to know more? 

Image ID: 8675 https://phil.cdc.gov/phil/details.asp

This photograph depicts a microbiologist in what had been the Influenza Branch at the Centers for Disease Control and Prevention (CDC) while she was conducting an experiment.  Content Provider: CDC/Taronna Maines. 2006. Photo Credit: Greg Knobloch.

Learn how scientists monitor circulating influenza types and create new vaccines each year.

See flu activity and surveillance efforts with the CDC’s FluView and vaccination trends for the United States using the FluVaxView.


Peer edited by Richard Hodge and Tyler Farnsworth.


Spice is Nice (for Birds)

My labmate was having a problem one morning – a fuzzy, gluttonous problem. To help keep her indoor cat entertained during her time at work, she thought it would be a great idea to set up a bird feeder. Living so close to the woods, she figured it would surely bring all of the birds to the yard. Unfortunately, the local squirrels soon flocked to the free food source like a group of grad students and left not one seed for their colorful avian counterparts. My advice to her? Grab the hot sauce!


Capsaicin is the compound in peppers that makes them hot and spicy

Making the bird feeder a literal hot spot by sprinkling high-Scoville sauces on seeds is not meant to suggest that birds engage in the same machismo food challenges as humans. They simply are not irritated by the chemical in peppers that is “hot” to humans and other mammals: capsaicin.


There are biological differences between certain pain perception receptors in birds and mammals. The TRPV1 (transient receptor potential vanilloid subfamily, member 1) receptor, also known as the capsaicin receptor, is involved in the perception of unpleasant or harmful chemical, physical, and thermal stimuli. The molecular sequence of this receptor is only 68% similar between birds and mammals, compared to the >95% similarity for most other central nervous system receptors. This translates to an unusually high avian tolerance for the spice in hot peppers: while mammals are put off by concentrations of 10-100 ppm (up to about the heat level of a jalapeño pepper), birds are not even fazed by capsaicin levels >20,000 ppm (habanero pepper territory).
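To put those numbers side by side, here is a purely illustrative sketch (the thresholds come from the figures above; the tidy cutoff is my simplification, and real aversion varies by individual):

```python
# Rough deterrence model from the figures in the text:
# mammals are put off starting around 10 ppm capsaicin, while birds
# show no aversion even above 20,000 ppm — modeled here as "never".
DETER_THRESHOLD_PPM = {
    "mammal": 10,           # jalapeño-level heat already repels squirrels
    "bird": float("inf"),   # no observed aversion at any tested level
}

def is_deterred(animal, capsaicin_ppm):
    """Does this capsaicin concentration (ppm) put the animal off its food?"""
    return capsaicin_ppm >= DETER_THRESHOLD_PPM[animal]

print(is_deterred("mammal", 50))     # True  — squirrels give up
print(is_deterred("bird", 20000))    # False — birds eat unbothered
```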

Picture by Lindsay Walton

Birds are insensitive to the burn of capsaicin from hot peppers, making spicy bird food less appealing to squirrels.

Chili peppers take advantage of this natural disparity in receptor sensitivity, according to the directed deterrence hypothesis, which states that fruits produce noxious or toxic chemicals that deter organisms that would destroy their seeds while remaining palatable to organisms that will disperse them. This is especially handy because the regions in which hot peppers grow (for example, throughout Central and South America) are favorable places for seed-eating rodents to thrive. It is common knowledge to chefs and laymen alike that the seeds of chili peppers are much hotter than the surrounding fruit, which serves as extra insurance against seed destruction by mammals that may be more desensitized to capsaicin than the average bear, so to speak.


While sprinkling hot sauce on your bird feed or suet cakes stands a good chance of repelling squirrels, anybody with That One Friend Who Puts Sriracha on Everything can tell you that a taste for spicy food can be acquired easily enough. Indeed, while the initial sensitivity to capsaicin differs from individual to individual, mammals can become desensitized to the unpleasant sensations associated with TRPV1 receptor activation, and can actually develop a preference for pungency.


Becoming acclimated to capsaicin via desensitizing TRPV1 receptors can open up a new world of chili pepper-related culinary possibilities, but quieting these receptors can have other effects. These receptors are activated by higher temperatures, and studies have shown that capsaicin-desensitized animals have impaired body temperature-regulating behaviors, which can make them more prone to accidental overheating. Because these receptors also convey information regarding painful physical or chemical stimuli, exposure to capsaicin has been used to better understand different types of pain in both rodents and humans alike.


So as we go from summer into what Chapel Hill calls “winter,” and if you feel like feeding the birds instead of the squirrels, there are a number of spicy bird foods available or you could make your own. (For those of you interested in feeding squirrels exclusively, birds are repelled by the compound methyl anthranilate, which is strangely enough a component of artificial grape flavoring.) Just keep in mind that the bolder squirrels might end up taking a liking to the spicy birdseed, but you can probably identify them easily enough by the little bottles of Sriracha that they will bring.



Peer edited by Amanda Tapia.

