Epigenetics: The Software of the DNA Hardware


Scientists identified the genes of the human genome to understand how genes influence the function and physical characteristics of human beings.

The Human Genome Project (HGP) was an amazing endeavor to map the full human genome, an effort so intense that it required an international collaborative research team. One of the ultimate goals of this project was to shed light on human diseases and find the underlying genes causing these health issues. However, the HGP ended up creating more questions than it answered. One thing we found out is that most diseases are complex diseases, meaning that more than one gene contributes to the disease. Obesity is one such example of a complex disease. This is in contrast to cystic fibrosis, which is caused by a mutation in a single gene. To further complicate matters, there are gene-environment interactions to consider. A gene-environment interaction is a situation in which environmental factors affect individuals differently, depending on their genotype or genetic information. The possible number of gene-environment interactions involved in complex diseases is daunting, but the HGP has given us the information necessary to start better understanding these interactions.

Although the HGP did not end up giving us the answers we were looking for, it pointed us in the direction we needed to take: we needed to consider the role of environmental factors in human health and disease. Not only are complex diseases not fully explained by genetics alone, but another aspect of these diseases remains unexplained by genetics: the health disparities seen within diseases like obesity, diabetes, and cancer. The HGP showed us that not all diseases are caused by single mutations and that genetic diversity does not explain the differences in health outcomes. The environment plays a big part.


The epigenome is the software that controls gene expression to make the different cells that make up the human body.

Gene-environment interactions begin to answer why genetics alone cannot explain varying health outcomes by considering that the environment may have varying effects on our genetic data. However, there is another dimension to our genetic background that could better answer why genes often can’t be mapped directly to a disease. Imagine that our genome is our computer hardware, with all the information necessary to create the cells we are composed of. However, something needs to configure the genome to differentially express genes so as to make skin and heart cells from the same DNA. Skin and heart cells have the same information (DNA) in their nucleus but only express what’s necessary to function as a skin or heart cell. In other words, software is needed for the hardware. For us, that software is the epigenome. The epigenome consists of a collection of chemical compounds, or marks, that tell the genome what to do: how to make skin and heart cells from the same information. The epigenome, unlike the genome, is flexible. It can change at key points in development and even during the course of one’s lifetime. This flexibility makes the epigenome susceptible to environmental factors and could explain: (1) why our genome alone cannot explain the incidence of diseases such as obesity, (2) the health disparities within these complex diseases, and (3) the transgenerational inheritance of complex diseases like metabolic syndrome, defined as a cluster of conditions such as high blood pressure and high blood sugar that increase your risk for heart disease and diabetes.

Now of course, the more we find out, the more questions are left unanswered. As stated before, the epigenome can change due to lifestyle and environmental factors, which can prompt chemical responses. However, the mechanisms by which things like diet and smoking induce these chemical responses are unclear. But researchers have started to fill in the gap. For example, certain types of fats, like polyunsaturated fatty acids (corn oil is high in these), can generate highly reactive molecules and oxidative stress, which can cause epigenetic alterations. Tobacco smoke contains a mixture of chemicals whose epigenetic effects have been independently investigated, with mixed results. Psychological stress, more specifically child abuse, has been seen to cause increased methylation (a sort of mark on the genome) of the glucocorticoid receptor, a receptor for hormones involved in metabolism, in suicide victims. This has also been seen in mouse models, where higher maternal care of pups decreased methylation of the glucocorticoid receptor. Increased methylation usually decreases the expression of the glucocorticoid receptor, and decreased methylation increases its expression.

The HGP was an amazing endeavor of science and has given us deep insight into the structure, organization, and function of the complete set of human genes. It has also pointed us in a new direction to better understand chronic diseases and to find solutions that address the burden of disease.

Peer edited by Mejs Hasan and Emma Hinkle.

Follow us on social media and never miss an article:

Heart to Heart


Hearts and heart health are front and center throughout the month of February.

The month of February is a big month for hearts. Between Valentine’s Day and American Heart Month, you cannot escape heart-shaped decorations and reminders to exercise daily. And while many of us are fortunate that our heart health can be maintained through diet and exercise, there are some cases where that is not enough. Individuals with certain congenital heart defects, weakened heart muscles, or other types of heart disease may need a totally new heart. In the United States, about 2,300 heart transplants occur each year, with over 70% of those patients surviving for five years afterwards. This high survival rate is in stark contrast to the early days of heart transplants in the late 1960s and 1970s, and it is largely due to advances not in heart physiology, but in our understanding of the immune system.

Our immune systems are exceptionally good at identifying foreign invaders and attacking them. In the cases of bacterial or viral infections, the immune system’s voracious assault on foreigners keeps us healthy. However, in the case of a heart transplant, where foreign tissue is introduced to the body to save the patient’s life, such voracity is detrimental to survival. A distressing catch-22 emerged as early heart transplants were performed – doctors gave patients powerful immunosuppressants to prevent rejection of the heart, but these drugs left the immune system so weakened that it could not fight off post-surgical infections. Eventually, a breakthrough came from an unexpected place – a Norwegian soil fungus.


Sometimes medical breakthroughs come from unlikely places – like a Norwegian soil fungus.

While on vacation in Norway, a scientist collected a soil sample that would change the fate of organ transplants forever. The soil sample was taken to Sandoz Pharmaceutical Ltd., where Jean-Francois Borel worked diligently with a team of scientists to characterize an interesting compound found in the Norwegian soil: cyclosporine, a compound produced by a fungus.

Sandoz Pharmaceutical was interested in developing new antibiotics, but cyclosporine did not prove to be an effective antibiotic. Luckily for future recipients of heart transplants, cyclosporine did show promise as an immunosuppressant. Cyclosporine specifically inhibited white blood cells called T cells instead of killing them, thus preventing organ rejection while still allowing the immune system to fight off infections. Dr. Borel and his team faced several setbacks while studying cyclosporine, including pressure from Sandoz to discontinue the studies. However, they persisted until 1983, when the Food and Drug Administration approved cyclosporine as an immunosuppressant for all organ transplants. Many healthy hearts are beating today thanks to cyclosporine, and a heartfelt thanks goes out to the countless individuals who worked so hard to make these survival stories a reality.

Peer edited by Kaylee Helfrich.


Cloned Monkeys: Another Human Creation


First cloned non-human primates: Zhong Zhong and Hua Hua (Image credited to Qiang Sun and Mu-ming Poo, Institute of Neuroscience of the Chinese Academy of Sciences)

Cloned primates are here! More than two decades after the birth of Dolly the sheep, scientists have now tackled cloning mammals that are even closer to us on the evolutionary tree: macaque monkeys. What does this mean for a society that witnesses dramatic changes day by day? Computers are outperforming doctors in calling out heart abnormalities in patients; 3D-printed organs are bringing us one step closer to tissue restoration; genome sequencing has become an online product easily available to anyone curious about their ancestry, bodybuilding, or simply wine tastes. Breakthroughs in science and technology are so prevalent in our lives that by now, we probably shouldn’t be surprised by any new discovery. Yet when the two cute, little, cloned monkeys were born, the whole world was, once again, shaken.

Published in Cell on January 24th, 2018, a study from a group of scientists in China reported their method for generating two non-human primates that are genetically identical. To clone the two identical macaque monkeys, the scientists applied somatic cell nuclear transfer, the same method that generated Dolly in 1996. The key idea behind cloning is that a new organism, be it sheep or monkey, is generated without sexual reproduction. Asexual reproduction is not as uncommon as one would think; plenty of plants reproduce this way. For example, Bryophyllum sheds plantlets from the edges of its leaves to produce new plants. Some insects, such as ants and bees, also exploit asexual reproduction to clone a huge working-class army. Since asexual reproduction is essentially an organism duplicating itself, the offspring are all genetically identical. Evolution, however, doesn’t favor asexual reproduction, as identical offspring don’t prevail in a fast-changing environment. On the other hand, sexual reproduction combines different sperm and eggs to create diverse offspring, some of which may survive. To combat challenges from Mother Nature, higher organisms, such as mammals, almost exclusively reproduce sexually. This is why a cloned monkey, an anti-evolution human creation, is mind-blowing.


The succulent genus Kalanchoe uses asexual reproduction to produce plantlets.

To clone mammals, scientists came up with the idea of transferring the nucleus of a somatic cell into an enucleated egg (an egg that lacks a nucleus). Unlike germ cells (sperm and eggs), somatic cells are any cells that don’t get passed on to the next generation. These cells contain the full genome of an organism, which is split equally between germ cells during sexual reproduction. Each carrying half of the genome, sperm and egg need to fuse their genetic material to make one viable embryo. Technically, the nucleus of a somatic cell holds all the genetic information an organism needs. Thus, by inserting the somatic cell nucleus into an egg, scientists could generate a functional embryo. But why not into a sperm? Evolution has trimmed the mammalian sperm tremendously so that it can accomplish its only job better: swim faster to fertilize the egg. As a result, not much other than the sperm’s genetic information is incorporated into the fertilized egg, and the embryo relies on the cellular machinery of the egg to finish development. Using this technology, the scientists generated over 300 “fertilized” embryos. Of these embryos, 260 were transferred to 63 surrogate mothers to finish developing. Twenty-eight surrogate mothers became pregnant, and from those pregnancies, only two healthy monkey babies were born. Although they were carried by different surrogate mothers, every single piece of their genetic code is the same as that of the somatic nucleus donor, a real-life demonstration of primate cloning. Followed by millions of people since their debut to the world, these two macaque superstars are living proof of a revolutionary breakthrough in science and technology.
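To put those reported numbers in perspective, here is a quick back-of-the-envelope calculation (a sketch, not from the paper itself) of the per-embryo success rates implied by 260 transferred embryos, 28 pregnancies, and 2 live births:

```python
# Figures reported in the study: 260 embryos transferred to surrogates,
# 28 resulting pregnancies, and 2 healthy live births.
embryos_transferred = 260
pregnancies = 28
live_births = 2

# Rough per-embryo success rates.
pregnancy_rate = pregnancies / embryos_transferred   # about 10.8%
birth_rate = live_births / embryos_transferred       # under 1%

print(f"Pregnancy rate per transferred embryo: {pregnancy_rate:.1%}")
print(f"Live-birth rate per transferred embryo: {birth_rate:.1%}")
```

In other words, fewer than one in a hundred transferred embryos became a healthy monkey, which is why the next paragraph describes the success rate as extremely low.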


Despite the extremely low success rate, this technology erects another monument in the history of mankind’s creations. Carrying identical genetic information, cloned monkeys like these two can be a very powerful tool in biomedical research and disease studies. Co-author Mu-ming Poo, director of the Chinese Academy of Sciences’ Institute of Neuroscience in Shanghai, said that these monkeys could be used to study complicated genetic diseases in which environmental factors also play a significant role, such as Alzheimer’s and Parkinson’s diseases. While there are ethical concerns about this technology and how easily it could be applied to human cloning, it is worth noting that almost all human creations (explosives, GMO food, the internet, etc.) are double-edged swords. It is up to the hand that wields the sword to decide whether to do good or ill. It is wise to be cautious with the development of new technologies, but it’s also important not to constrain our creativity. After all, it is our creative minds that drive us toward creating a better life for everyone.

Peer edited by Cherise Glodowski.


Why is the Flu such a Big Deal?

With each flu season comes a bombardment of new advertisements reminding people to get a flu vaccine. The vaccine is free to most and widely available, yet almost half of the United States chooses to forgo the vaccine.

When Ebola emerged, there was 24-hour news coverage and widespread panic, but the influenza virus (the flu) feels more familiar and much less fear-inducing. This familiarity makes the flu’s threat easy to brush aside. Yet every flu season is met with stern resolve from the medical community. What’s the big deal with the flu?

What makes the flu such a threat?

Influenza is a globetrotting virus: flu season occurs from October to March in the northern hemisphere and from April to September in the southern hemisphere. This seasonality makes the flu a year-round battle. The virus also evolves at a blistering pace, making it difficult to handle.

To understand why the flu is able to evolve so rapidly, its structure must be understood. The graphic to the right shows an illustration of the ball-shaped flu virus.


Illustration of flu structure

On the outside of the ball are molecules that let the virus slip into a person’s cells. These molecules are called hemagglutinin and neuraminidase, simply referred to as HA and NA. HA and NA are also used by our body’s immune system to identify and attack the virus, similar to how a license plate identifies a car.

These HA and NA molecules on the surface can change through two processes. One such process is like changing one license plate number; this is known as antigenic drift. When the flu makes more of itself inside a person’s cells, the instructions for making the HA and NA molecules slightly change over time due to random mutations. When the instructions change, the way the molecules are constructed also changes. This allows the flu to sneak past our immune systems more easily by mixing up its license plate over time.

Another way the virus can evolve is known as antigenic shift. This type of evolution would be more like the virus license plate changing the state it’s from in addition to a majority of its numbers and letters, making the virus completely unidentifiable to our immune systems. Unlike antigenic drift, antigenic shift requires a few improbable factors to coalesce.  

Antigenic shift happens more regularly in the flu than in other viruses. For instance, one type of flu virus is able to jump from birds to pigs and then to people without the need for substantial change. This ability to jump between different animals enables antigenic shift to occur.


How antigenic shift occurs in the flu

This cross-species jumping raises the odds that two types of the virus will infect the same animal and then the same cell. When both types of flu virus are in that cell, they mix and match parts, as can be seen in the picture to the right. When the new mixed-up flu virus bursts out of the cell, it has completely scrambled its HA and NA molecules, generating a new strain of flu.

Antigenic shift is rare, but in the case of the swine flu outbreak in 2009, this mixing and matching occurred within a pig and gave rise to a new flu virus strain.

This rapid evolution means that many different types of the flu circulate at the same time, all constantly changing. This persistent evolution results in the previous year’s flu vaccine losing efficacy against the current viruses in circulation, which is why new flu vaccines are needed yearly. Sometimes the flu changes and becomes particularly tough to prevent, as was the case with swine flu. At its peak, the swine flu was classified by the World Health Organization (WHO) as a class 6 pandemic, which refers to how far it had spread rather than its severity. Swine flu was able to easily infect people; fortunately, it was rarely deadly. The constant concern of what the next flu mutation may hold keeps public health officials vigilant.

Why is there a flu season?

A paper by Eric Lofgren and colleagues from Tufts University grapples with the question “Why does a flu season happen?”. The authors highlight several prevailing theories that are believed to contribute to the ebb and flow of the flu.

One contributing factor to the existence of flu seasons is our reliance on air travel. When flu season in Australia is coming to an end in September, an infected person can fly to Canada and infect several people there, kickstarting the flu season in Canada. This raises the question: why is flu season tied to winter?

The authors touch on this question. During the winter months, people tend to gather in close proximity, allowing the flu access to many potential targets and limiting the distance the virus needs to cover before infecting another person. This gathering in confined areas likely contributes to the spread of flu during the winter, but another theory proposed in the paper is less obvious and centers around the impact of indoor heating.

Heating and recirculating dry air in homes and workplaces creates an ideal environment for viruses. The air is circulated throughout a building without the virus particles being removed, improving the chances of the virus infecting someone. The flu virus is so minuscule that air filters are unable to effectively remove it from the air. The authors conclude that the seasonality of the flu depends on many factors and that no single cause explains the complete picture.

What are people doing to fight the flu?

The flu is a global fight; fortunately, the WHO tracks the active versions of the flu across the world. This monitoring system relies on coordination from physicians worldwide. When a patient with the flu visits a health clinic, a medical provider performs a panel of tests to detect the type and subtype of flu present. This data is then submitted to the WHO flu database, which is publicly accessible.

This worldwide collaboration and data are invaluable to the WHO; they allow for flu tracking and informed decision-making when formulating a vaccine. Factor in the rapidly evolving nature of the flu, and making an effective vaccine seems like a monumental task. Yet, because of this worldwide collaboration, twice a year the WHO is able to issue changes to the formulation of the vaccine in an effort to best defend people from that year’s flu.

Peer edited by Rachel Cherney and Blaide Woodburn.


Bonnethead Shark: The Newest Veggie Lovers of the Sea

Vegetarian sharks.

If you love a cheesy sci-fi movie as much as I do, the word shark probably brings a few images to mind: swimmers rushing to shore, and a huge, hungry great white ready to devour anything in its sights. You may have even started humming the iconic Jaws theme. But you might be surprised to hear that off the big screen, not all sharks are out for blood. In fact, one shark prefers a leafy green salad.

We often think of sharks as strict meat eaters, but researchers at the University of California, Irvine are turning the meat-hungry shark stereotype on its head with their (mostly) vegetarian Bonnethead sharks. The Bonnethead shark is a small type of hammerhead shark often found in the warm, shallow waters of the Northern Hemisphere. Bonnetheads get their name from their distinct shovel-like head shape.


The Bonnethead shark’s unique head shape distinguishes it from its hammerhead cousins.


Though distinct in appearance, the characteristic that makes the Bonnethead shark truly unique is its diet. Sharks are infamous meat-eaters. The Bonnethead, however, prefers its meat with a side of veggies. Studies on the diet of the Bonnethead began in 2007, when large amounts of seagrass were found in the stomach of a shark in the Gulf of Mexico. For many years, it was thought the seagrass was indigestible and eaten by accident while the sharks hunted for shrimp, mollusks, and small fish in seagrass-filled waters. Recent research now suggests Bonnethead sharks can digest the seagrass they eat and could use it as a source of nutrients.

As the first seagrass-eating shark to be discovered, the Bonnethead still raises many questions. Does it eat seagrass on purpose? Or is the seagrass accidentally consumed while hunting for creatures on the ocean floor? Perhaps the most puzzling question is how the Bonnetheads are able to digest seagrass at all. Because Bonnethead sharks have the short intestines typical of a strict meat eater, scientists suspect bacteria living in the gut give the Bonnethead the ability to digest seagrass. More research is needed to discover which, if any, bacteria help the Bonnethead digest its dinner.

Although questions remain, one thing is certain: the Bonnethead shark is a unique and remarkable creature with much to teach its human neighbors about what constitutes a five-star meal under the sea.

Peer edited by Zhiyuan Liu.


Can We Make Tastier Tomatoes?

They can be eaten raw, made into countless stews and sauces, and make a tasty addition to nearly any dish. Tomatoes are practically indispensable in any modern kitchen and are one of the most economically valuable fruits. However, they were not always the big and meaty fruit we know today.


Tomatoes were not always the big and meaty fruit we know today.

Throughout history, humans have domesticated and improved plants, selectively breeding them until they bear little resemblance to their wild counterparts. Wild tomatoes are thought to have arisen in the Andean region of South America and were small, resembling cherry tomatoes. In fact, cherry tomatoes are thought to be the ancestor of the larger varieties. Conquistadors brought tomatoes from South America to Europe in the sixteenth century, and continued migration and selective breeding have massively changed tomato genetics. Some of these changes are responsible for tomatoes that are ~100 times larger than their ancestors, while others have produced the unique pink coloration of tomatoes popular in China and Japan. However, much less is known about how the tomato metabolome, its collection of small molecules (metabolites) such as amino acids, vitamins, and sugars, has changed through domestication and the later improvement of the fruit.

The metabolites in tomatoes not only affect their development but also play key roles in human health and are responsible for the fruit’s nutrition and taste. Today, the breeding of tomatoes has largely focused on increasing shelf life, yield, and disease resistance. Yet these changes may sometimes have negative impacts on the quality of tomatoes. Understanding how the metabolites in modern tomatoes have changed through selective breeding will help us understand how best to breed and design tomatoes in the future to maximize their nutrition and taste. In a recent study, Guangtao Zhu et al. identified the metabolites in a variety of tomato samples spanning the domestication and improvement stages of the species. Interestingly, the greatest amount of change in metabolites happened not during domestication but during the later improvement stage.

A notable change during the development of the modern tomato is the selection against steroidal glycoalkaloids (SGAs), which are responsible for the bitter taste of early tomatoes and common in the nightshade family of plants to which the tomato belongs. Presumably, humans selected against SGAs without any knowledge of them, simply breeding tomatoes that were less bitter than others. In addition to SGAs, other metabolites in modern tomatoes are very different from those of their ancestors. The authors also explored why the pink tomatoes popular in Asian countries are considered so much more flavorful than their red counterparts. The peel of red tomatoes contains a compound called naringenin chalcone that gives a yellow hue to the peel, and the absence of this compound in pink tomatoes results in their pink coloring. Yet why pink tomatoes are considered more delicious is unknown. This paper identified many metabolites that differ between red and pink tomatoes. This finding lays the foundation for further studies to determine which of these metabolites give pink tomatoes their unique, sweet taste and which might be incorporated into red tomatoes to make them more flavorful.

One question in the design of modern tomatoes is whether we can design equally large tomatoes that are both more flavorful and more nutritious. This paper suggests that through metabolomic changes in the tomato, we can. The authors propose that the changes in metabolites were likely not due directly to the fruit-weight genes that produced larger tomatoes, but rather to genes that were “linked” to these fruit-weight genes. Essentially, these genes hitched a ride with the fruit-weight genes and were passed on unintentionally. Nowadays, we have the capability to make more precise changes in DNA and to ensure that only the genes of interest are changed, not related or “linked” genes. Using modern genetic approaches such as CRISPR-Cas, we can now increase the nutritional value and improve the taste of tomatoes while avoiding the “linked” genes that likely brought about some of the negative changes in the modern tomato. So yes, we may be eating bigger, tastier, and healthier tomatoes in the future!

Peer edited by Laetitia Meyrueix.


Cambridge Researchers use Mouse Embryonic Stem Cells to Grow Artificial Mouse “Embryos”

Let’s start at the very beginning. When a mammalian egg is successfully fertilized by a single sperm, the result is a single cell called a zygote. A zygote has the potential to grow into a full-bodied organism. It is mind-boggling that this single cell, containing the genetic material from both parents, can divide itself to make two cells, then four cells, then eight cells, and so on, until it becomes a tiny ball of 300-400 stem cells.


Early Development and Stem Cell Diagram, modified by author to include ESC and TSC labels.

At these early stages, these stem cells are totipotent, meaning that they have the potential to become either embryonic stem cells (ESCs), which will eventually become the fetus itself, or extraembryonic trophoblast cells (TSCs), which go on to help form the placenta. That ball of ESCs and TSCs develops into a blastocyst with a tight ball of ESCs on the inside, and a layer of TSCs on the outside (See Figure 1).

You might imagine the blastocyst as a pomegranate, with the seeds representing the ESCs and the outer skin representing the TSCs. The ESCs have the potential to transform, or differentiate, into any type of cell in the entire body, including heart cells, brain cells, skin cells, etc., which will ultimately become a complete organism. The outer layer TSCs have the ability to differentiate into another type of cell that will ultimately attach itself to the wall of the uterus of the mother to become the placenta, which will provide the embryo with proper nutrients for growth. 

Scientists in the field of developmental biology are absolutely bonkers over this early stage of embryogenesis, or the process of embryo formation and development. How do the cells know to become ESCs or TSCs? What tells the ESCs to then differentiate into heart cells, or brain cells, or skin cells? What signals provide a blueprint for the embryos to continue growing into fully-fledged organisms? The questions are endless.

The challenge with studying embryogenesis is that it is incredibly difficult to find ways to visualize and research the development of mammalian embryos, as they generally do all of their growing, dividing, and differentiating inside the uterus of the mother. In recent years, there have been multiple attempts to grow artificial embryos in a dish from a single cell in order to study the early stages of development. However, previous attempts at growing artificial embryos from stem cells face the challenge that embryonic cells are exquisitely sensitive and require the right environment to properly coordinate with each other to form a functional embryo.

Enter stage right, several teams of researchers at the University of Cambridge are successfully conducting groundbreaking research on how to grow artificial mouse embryos, often called embryoids, in a dish.

In a paper published in Development last month, David Turner and colleagues in the Martinez-Arias laboratory report a unique step-by-step protocol developed in their lab that uses 300 mouse ESCs to form tiny balls that mimic early development.


Mouse embryonic stem cell aggregates with polarized gene expression in a dish (4 days in culture). Image courtesy of authors. doi.org/10.1101/051722.

These tiny balls of mouse ESCs are collectively termed “Gastruloids” and are able to self-organize and establish a coordinate system that allows the cells to go from a ball-shape to an early-embryo shape with a head-to-tail axis. The formation of an axis is a crucial step in the earliest stages of embryo development, and it is exciting that this new model system may allow scientists to better study the genes that are turned on and off in these early stages.

In a paper published in Science this past April, Sarah Harrison and her team in the Zernicka-Goetz laboratory (also at Cambridge) report another technique in which mouse ESCs and TSCs are grown together in a 3D scaffold instead of simply in a liquid media. The 3D scaffold appears to give the cells a support system that mimics the environment of the uterus and allows the cells to assemble properly and form a blastocyst-like structure. Using this artificial mouse embryo, the researchers are attempting to simulate the growth of a blastocyst and use genetic markers to confirm that the artificial embryo is expressing the same genes as a real embryo at any given stage.

The researchers found that when the two types of stem cells, ESCs and TSCs, were put together in the scaffold, the cells appear to communicate with each other and go through choreographed movement and growth that mimics the developmental stages of a normal developing embryo. This is enormously exciting, as models like this artificial embryo and the Gastruloid have the potential to be used as simplified models to study the earliest stages of embryo development, including how ESCs self-organize, how the ESCs and TSCs communicate with each other to pattern embryonic tissues, and when different genetic markers of development are expressed.

It is important to note that this artificial embryo is missing a third tissue type, called the endoderm, that would eventually form the yolk sac, which is important for providing blood supply to the fetus. Therefore, the artificial embryo does not have the potential to develop into a fetus if it is allowed to continue growing in the dish. The fact that these artificial embryos cannot develop into fully fledged organisms alleviates some of the ethical concerns around growing organisms in a dish, and will allow researchers to study critical stages of development in an artificial system.

These techniques and discoveries developed by these teams of researchers have the potential to be applied to studies of early human development. These models may prove especially useful in studying how the maternal environment around the embryo may contribute to fetal health, birth defects, or loss of pregnancy. In the future, artificial embryos, coupled with the not-so-futuristic gene editing techniques that are currently in development to fix disease genes, may prove key in the quest to ensure healthy offspring. 

Peer Edited by Nicole Smiddy and Megan Justice.

Follow us on social media and never miss an article:


Cinnamon, Bam!

Photo credit: https://www.kjokkenutstyr.net/, via https://commons.wikimedia.org/wiki/File:001-Cinnamon.jpg

Many of us associate the holiday season with the smell of cinnamon.

Well, the holiday season is upon us. Our calendars and days are now filled with shopping, travel, and social gatherings with friends, family, and loved ones. As the temperature outside turns cold, we turn to many of our favorite treats to fill our bellies and help keep us warm. Our mouths water as we think about all of the delectable items that line our kitchens and tables. I can picture it now… a warm fire keeping the room nice and toasty, a glass of wine in hand, friends and relatives conversing and catching up, and, of course, avoiding awkward conversations with Uncle Gary. All while hovering around various piles of unknown cheeses, meats, and delicious stacks of sweets. And if you’re lucky, you may even find a warm, sticky stack of homemade cinnamon buns. As it turns out, these may be just the thing to reach for to help burn off some of that unwanted extra “padding” that comes with all of those holiday favorites.

What’s that you say? Cinnamon buns burn fat? Well, before you go eating the whole tray, it’s not really the cinnamon buns themselves that may help burn fat, but the cinnamon for which they are named. It tastes great, you can use it in all sorts of dishes, and it may accelerate fat loss. I’m a fan of all of those things. Now, you probably find yourself asking, where can I learn more about this awesome spice? Well, look no further, my friend. I am about to lay enough cinnamon-spiced knowledge on you to guarantee that you can bore your friends and family to tears with your cinnamon information stream at your holiday gathering. You’ll be less popular than Uncle Gary.

Cinnamon contains a compound known as cinnamaldehyde. Cinnamaldehyde is a naturally occurring chemical found in the bark of cinnamon trees that gives cinnamon both its characteristic flavor and odor. A recent study shows that cinnamaldehyde may even help burn fat by increasing metabolism and your body’s ability to break down fat! I know, it’s pretty magical. Now, before you go running around stabbing cinnamon trees with a spout, there are a few things you should know. Primarily that you have to fly to Sri Lanka, which is expensive but totally worth it since it’s a beautiful tropical island in the Indian Ocean. And you can even stay at a place called Cinnamon Bey, which looks amazing in the pictures I found of it on the interweb. Pretty sweet, huh? (See what I did there!)


Sri Lanka is located off the southeast coast of India.

Anyway, the purest source of cinnamon-derived cinnamaldehyde is the Ceylon cinnamon tree (say that several times fast while jamming a sticky bun in your face!), also known as the “true” cinnamon tree, named after the historical moniker of its native country, Sri Lanka (formerly Ceylon). The country still produces and exports up to 90% of the world’s true cinnamon. The other 10% comes from the Seychelles and Madagascar, which are equally far and equally awesome as travel destinations. However, there are six species of cinnamon sold commercially around the world. So if you prefer the regular stuff found cheaply at most grocery stores, then you will have to head to China or Southeast Asia for the most common variant, cassia, which is considered to be less, um, “top-shelf.”

The cassia variant is cultivated on a larger scale and is coarser than Ceylon cinnamon. It also has a higher oil content and contains more cinnamaldehyde, which gives it a harsher, stronger, spicier flavor than Ceylon cinnamon. Huh? Wait, you thought more cinnamaldehyde might equal more fat loss? You are correct, my friend, but before you book that ticket to Guangdong and attempt the cinnamon challenge for the thirtieth time, you should know that the cassia variety also contains coumarin, which is not found in the Ceylon variety. Coumarin is a naturally occurring blood thinner that can cause damage to the liver in high doses. So, take your pick, though if you really want that good, pure cinnamaldehyde, the “true” kind, then you better hustle it to Sri Lanka.

However, getting there is only part of the story. Isolating cinnamaldehyde from the bark of the cinnamon tree is a slightly tricky process that involves some rather unsavory chemicals, the potential for explosions, and distillation to pull the oil out of the bark (plus a few fancy science machines, like a mass spectrometer, to confirm what you’ve actually got), leaving you with that tasty, cinnamoney goodness. What? You thought you could just grab a tree and squeeze really hard? No, no, no. That might work for your lemongrasses, aloes and coconuts, but not cinnamon.

Actually, I’m guessing from your weird tree-squeezing thoughts that you take cinnamon for granted. I mean…your cinnamon disrespect is understandable, since you can buy it pretty much everywhere and it’s almost as prolific as pumpkin spice, but this wasn’t always the case. In fact, for most of history true cinnamon was extremely rare, since there were no planes or cars…or Amazon, well the internet really…and it only came from one relatively small island in the Indian Ocean. As such, until the 1500s cinnamon was highly valued, given to kings and offered as tribute to gods. Eventually, during the colonial period, the East India Company (the original Amazon) began distributing the spice to the rest of the world and cultivating it on a large scale.

So, cinnamon has been around forever, you say, since remote antiquity and what-not. Great. But what about this cinnamon-burns-fat thing? First off, settle down. We have arrived, so here are the details. A recent study from Jun Wu at the University of Michigan Life Sciences Institute showed that cinnamaldehyde increases thermogenesis, which is the process the body uses to create heat. Thermogenesis can burn a lot of calories and accelerate metabolism, which results in the breakdown of fat. In addition, cinnamaldehyde can decrease and stabilize fasting blood sugar. What’s even more interesting is that chronic treatment with cinnamaldehyde may reprogram your body’s metabolism, which may serve as protection from diet-induced obesity.


Cinnamon is used in a variety of holiday treats including cinnamon rolls and apple pies.

So, cinnamon can burn fat and protect you from gaining it back! Now that is a magical spice. Well, there you go. I’m pretty sure that should be just enough information to cause awkward emotional discomfort to those within earshot at your holiday festivities. Your shining personality may keep you from being the next Uncle Gary, but at least your cinnamon tales will have him running for the eggnog, which contains cinnamon. Bam! Take that, Uncle Gary. No one cares about the length of your ear hair!

And while you’re enjoying your holidays, eating those cinnamon packed delicacies, remember the reason for the season! Be good to each other and have some fun, safe, and cinnamon filled holidays! Cheers!


Peer edited by David Abraham.


Tardigrades! The Super-animal of the Animal Kingdom


Tardigrade (aka waterbear or moss piglet)

Tardigrades, also known as waterbears or moss piglets, are microscopic invertebrates that “resemble a cross between a caterpillar and a naked mole rat,” according to science writer Jason Bittel. Since their discovery almost 250 years ago, over 1,000 species of tardigrade have been identified, and they can be found in almost every habitat throughout the world – from the depths of the ocean, to the tops of mountains, to your own backyard. As long as there is a little bit of moisture, you can find them. They are small and chubby, with most species being less than one millimeter in length. Their unique, usually transparent bodies are simple, with four pairs of legs ending in claws. Tardigrades can reproduce sexually or asexually via self-fertilization. Like regular bears, tardigrades eat a variety of foods, such as plant cells, animal cells, and bacteria.

Despite being small, adorable microorganisms, tardigrades are fascinating creatures that have recently garnered the attention of scientists around the world due to their adaptability and resilience in the most extreme environmental conditions. They have been observed to survive in a vacuum (an environment devoid of air and matter) for up to eight days, without water for years to decades, at temperatures ranging from under -200˚C to almost 100˚C, and under heavy ionizing radiation. Tardigrades survive these conditions through a reversible response to desiccation (extreme drying), in which an organism loses most of the water in its body – in tardigrades, as much as 97%. This is especially important in freezing temperatures, where water frozen into ice crystals can pierce and rupture the cells in the tardigrade’s body. During desiccation, the metabolic rate slows to as low as 0.01% of normal function, allowing survival under the harshest of conditions for years.


Scanning Electron Microscopy image of a Tardigrade (Hypsibius dujardini)

In a 2016 Nature paper, scientists sought to answer the question of how a certain species of tardigrade, Ramazzottius varieornatus, is so tolerant of extreme environmental conditions. They found an expansion of several stress-related gene families, such as the superoxide dismutases (SODs). Most multicellular animals have fewer than ten SODs; the study identified sixteen in this tardigrade species. They also found an increased copy number of MRE11, a gene known to play an important role in repairing DNA double-stranded breaks. R. varieornatus had four copies of MRE11, while most other animals have only one. Aside from having improved mechanisms for handling stress and DNA damage, the scientists were able to identify a waterbear-specific gene that seemed to explain tardigrades’ radiotolerance, or resistance to radiation. They were curious about whether this tardigrade-specific gene had any effect on DNA protection and radiotolerance in human cells. To their surprise, this gene, called Dsup for DNA damage suppressor, was able to decrease DNA damage in cultured human cells by 40%, reducing both double- and single-stranded DNA breaks.

At the University of North Carolina at Chapel Hill, Dr. Bob Goldstein studies animal development and cellular mapping during development in C. elegans and, more recently, in tardigrades as well. He is also working to develop tardigrades into a new model system while studying their body development! His lab website has a section dedicated to tardigrades, with resources about them along with pictures and videos of tardigrades in motion.

The environmental resilience of tardigrades is incredible, making the tardigrade the super-animal of the animal kingdom (in my opinion). Who knows what other fascinating creatures we have yet to discover that may have characteristics as interesting and unbelievable as those of the tardigrade?

Peer edited by Nick Martinez.



Superbug Super Problem: The Emerging Age of Untreatable Infections

You’ve heard of MRSA. You may even have heard of XDR-TB and CRE. The rise of antibiotic-resistant infections in our communities has been both swift and alarming. But how did these once easily treated infections become the scourges of the healthcare world, and what can we do to stop them?

Antibiotic-resistant bacteria pose an alarming threat to global public health and result in higher mortality, increased medical costs, and longer hospital stays. Disease surveillance shows that infections which were once easily cured, including tuberculosis, pneumonia, gonorrhea, and blood poisoning, are becoming harder and harder to treat. According to the CDC, we are entering the “post-antibiotic era”, where bacterial infections could once again mean a death sentence because no treatment is available. Methicillin-resistant Staphylococcus aureus, or MRSA, kills more Americans every year than emphysema, HIV/AIDS, Parkinson’s disease, and homicide combined. The most serious antibiotic-resistant infections arise in healthcare settings and put particularly vulnerable populations, such as immunosuppressed and elderly patients, at risk. Of the 99,000 Americans per year who die from hospital-acquired infections, the vast majority die due to antibiotic-resistant pathogens.


Cartoon by Nick D Kim, scienceandink.com. Used by permission

Bacteria become resistant to antibiotics through their inherent biology. Through natural selection and genetic adaptation, they can acquire mutations that make them less susceptible to antimicrobial intervention. An example of this could be a bacterium acquiring a mutation that up-regulates the expression of a membrane efflux pump, a transport protein that removes toxic substances from the cell. If the gene encoding the transporter is up-regulated or a repressor gene is down-regulated, the pump is overexpressed, allowing the bacterium to pump the antibiotic back out of the cell before it can kill the organism. Bacteria can also alter the active sites of antibacterial targets, decreasing the rate at which these drugs can effectively kill the bacteria and requiring higher and higher doses for efficacy. Much of the research on antibiotic resistance is dedicated to better understanding these mutations and developing new and better therapies that can overcome existing resistance mechanisms.


While bacteria naturally acquire mutations in their genome that allow them to evolve and survive, the rapid rise of antibiotic resistance in the last few decades has been accelerated by human actions. Antibiotic drugs are overprescribed, used incorrectly, and applied in the wrong context, exposing bacteria to more opportunities to acquire resistance mechanisms. This starts with healthcare professionals, who often prescribe and dispense antibiotics without ensuring they are required. This could include prescribing antibiotics to someone with a viral infection, such as rhinovirus, as well as prescribing a broad-spectrum antibiotic without performing the appropriate tests to confirm which bacterial species they are targeting. The blame is also on patients, not only for seeking out antibiotics as a “cure-all” when it’s not necessarily appropriate, but also for poor adherence and improper disposal. It’s absolutely imperative that patients follow the advice of a qualified healthcare professional and finish antibiotics as prescribed. If a patient stops dosing early, they may have only cleared out the antibiotic-susceptible bacteria and enabled the stronger, resistant bacteria to thrive in that void. Additionally, if a patient incorrectly disposes of leftover antibiotics, the drugs may end up in the water supply and present new opportunities for bacteria to develop resistance.
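For the quantitatively inclined, the "stopping early selects for resistance" logic above can be sketched with a toy simulation. Every number in it (kill rates, growth rate, carrying capacity, starting populations) is a made-up illustrative value, not pharmacological data, and the model is deliberately crude: the drug kills susceptible cells much faster than resistant ones, and survivors double each day up to a fixed carrying capacity.

```python
# Toy model of why finishing a full antibiotic course matters.
# All parameters are illustrative placeholders, not real drug data.

def simulate(days_on_drug, total_days=10,
             susceptible=1_000_000.0, resistant=100.0,
             kill_s=0.9, kill_r=0.7, growth=2.0, cap=2_000_000.0):
    """Return (susceptible, resistant) cell counts after total_days."""
    for day in range(total_days):
        if day < days_on_drug:
            # Drug present: both populations are killed, but unevenly.
            susceptible *= 1 - kill_s
            resistant *= 1 - kill_r
        # Survivors regrow, limited by nutrients/space (carrying capacity).
        total = susceptible + resistant
        if total > 0:
            scale = min(growth, cap / total)
            susceptible *= scale
            resistant *= scale
    return susceptible, resistant

full = simulate(days_on_drug=10)   # finished the prescription
early = simulate(days_on_drug=3)   # stopped when symptoms faded
```

With these illustrative numbers, completing the course drives both populations to essentially zero, while stopping at day three lets the infection rebound toward the carrying capacity with the resistant fraction enriched more than 25-fold over its starting share.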



Overuse of antibiotics in the agricultural sector also aggravates this problem, because antibiotics are often obtained without veterinary supervision and used without sufficient medical reasons in livestock, crops, and aquaculture, which can spread the drugs into the environment and food supply. These contributing factors to the rise of antibiotic resistance can be mitigated by proper prescriber and patient education and by limiting unnecessary antibiotic use. Policy makers also hold the power to control the spread of resistance by implementing surveillance of treatment failures, strengthening infection prevention, incentivizing antibiotic development in industry, and promoting proper public outreach and education.


While the pharmaceutical industry desperately needs to research and develop new antimicrobials to combat the rising number of antibiotic-resistant infections, the onus is also on every member of society to promote appropriate use of antibiotics and to ensure safe practices. The World Health Organization has issued guidelines that could help prevent the spread of infection and antibiotic resistance. In addition, World Antibiotic Awareness Week is November 13-19, 2017, and could be used as an opportunity to educate others about the risks associated with antibiotic resistance. These actions could significantly slow the spread and development of resistant infections and encourage the drug development industry to develop new antibiotics, vaccines, and diagnostics that can effectively treat and reduce antibiotic-resistant bacteria.

Peer edited by Sara Musetti 
