Pharmacies of the Future: Chemical Lego Towers

Chemists and engineers are working to move on-demand production of pharmaceuticals out of the realm of science fiction and toward a viable option for situations where medicines are not easily accessible.

Imagine taking a vacation to an isolated rainforest resort. You explore your adventurous side, hiking through the lush vegetation with a knowledgeable guide. Less than ten minutes after arriving back at the hotel, an uncontrollable itch begins on your forearms. It travels up your arms, across your chest, and starts rising up your neck. Was it from a bug or a plant you encountered during the hike? At this point, you are unconcerned about the cause; you just want a solution. The closest drug store is hours away: when booking the trip, it seemed like a great idea to pick the most isolated resort for your dream vacation. Even if the drug store were closer, there would be no guarantee it had anything to help you. In the US, there were over 200 instances of drug shortages between 2011 and 2014, and there is no telling how difficult it would be to get medicine to this remote location.

You head to the front desk of the hotel, hoping they have something to give you for relief. They lead you down the hall and into a small room. There are a few chairs and an appliance similar in size and shape to a refrigerator. The employee enters a few commands on a keyboard and the machine starts working. Fifteen minutes later, the employee hands you two tablets of diphenhydramine hydrochloride, more commonly known as Benadryl®.

https://www.flickr.com/photos/mindonfire/3249070405

Diphenhydramine, better known by the brand name Benadryl, is one of the four medications that can be synthesized by the original compact, reconfigurable pharmaceutical production system.

While this scenario is not plausible today, it may be in the near future. In a 2016 Science article, researchers from around the world introduced a refrigerator-sized machine that could make four common medicines. More recently, a second-generation prototype was released; the new model is 25% smaller and contains enhanced features needed to synthesize four additional drugs that meet US Pharmacopeia standards. This is made possible by a technology known as flow chemistry, in which chemicals are pumped through tiny tubes. When two tubes merge, the two chemicals react, producing a new molecule. Compared to traditional batch reactions (stirring two chemicals together in a flask), flow reactions are generally safer and faster.
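To get a feel for the scale involved, the key quantity in flow chemistry is residence time: how long the reagents spend reacting inside the tubing, which is simply the tube's internal volume divided by the volumetric flow rate. The sketch below computes it for hypothetical tube dimensions and flow rates (the specific numbers are illustrative, not taken from the paper):

```python
import math

def residence_time_s(tube_id_mm, tube_length_m, flow_rate_ml_min):
    """Residence time = tube volume / volumetric flow rate.

    tube_id_mm:       inner diameter of the tubing, in millimeters
    tube_length_m:    length of the tubing, in meters
    flow_rate_ml_min: combined flow rate of all reagent streams, in mL/min
    """
    radius_cm = (tube_id_mm / 10.0) / 2.0
    volume_ml = math.pi * radius_cm ** 2 * (tube_length_m * 100.0)  # 1 cm^3 = 1 mL
    return volume_ml / (flow_rate_ml_min / 60.0)  # seconds

# Hypothetical example: 1 mm ID tubing, 5 m long, 1 mL/min total flow
t = residence_time_s(1.0, 5.0, 1.0)  # roughly four minutes of reaction time
```

Doubling the flow rate halves the residence time, which is one reason flow reactors give chemists such fine control over reaction conditions.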

In this new machine, there are different “synthesis modules,” small boxes that each contain the equipment for a single chemical reaction. Much like a car on an assembly line, a pharmaceutical molecule starts as something very simple, and pieces are added and manipulated until it becomes something useful; here, the assembly line consists of molecules and reactions. The modules can be rearranged to run the chemical reactions in whatever order a desired medicine requires, so switching to a different medicine is simply a matter of rearranging them. Researchers can use the original prototype to make Benadryl, lidocaine (a local anesthetic), Valium (an anti-anxiety medication), and Prozac (an antidepressant) using different combinations of the exact same modules. As of July 2018, the FDA reported that both diazepam (Valium) and lidocaine were in shortage due to manufacturing delays.
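One way to picture this reconfigurability is to treat each medicine as an ordered recipe of module names, so that "retooling" the machine for a new drug just means executing the same shared modules in a different sequence. The sketch below is purely illustrative; the module names and recipes are invented, not taken from the actual system:

```python
# Hypothetical recipes: each medicine is an ordered sequence of reusable
# synthesis modules (names invented for illustration only).
RECIPES = {
    "diphenhydramine": ["reaction", "workup", "purification", "formulation"],
    "lidocaine":       ["reaction", "reaction", "workup", "formulation"],
}

def configure_machine(drug):
    """Return the module sequence the machine would assemble for a drug."""
    if drug not in RECIPES:
        raise ValueError(f"no recipe for {drug!r}")
    return list(RECIPES[drug])

# The same small pool of modules covers every drug; only the order changes.
module_pool = {m for recipe in RECIPES.values() for m in recipe}
```

The design payoff is that adding a new medicine means writing a new recipe, not building new hardware, as long as its reactions are covered by the existing module pool.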

http://www.columbus.af.mil/News/Photos/igphoto/2000125986/

On demand pharmaceutical production would allow access to medicines in rural locations and war zones.

In the future, this technology could be used by anyone. A user would simply input the medicine they want, computers would rearrange the modules and select the correct starting chemicals, and in about 15 minutes the desired medicine would be ready. The applications are vast. It could help alleviate the aforementioned drug shortages. It could also provide access to medicine in places that are difficult to ship to, including rural areas and war zones, often the places that need medicines most. In such places, delivery may be difficult, and some medicines spoil quickly. With this technology, perishable medicines would not need to be stockpiled; they could simply be made as soon as they are needed. This would also prevent waste from medicines that expire before they are used. These developments could revolutionize the pharmaceutical industry, and I look forward to seeing the good that these advances can lead to.

Peer edited by Nicholas Martinez

Follow us on social media and never miss an article:

What Jellyfish Taught us About Microgravity

Think for a minute about your grandkid’s grandkids. Where are they living? Perhaps you momentarily considered the possibility of your intrepid descendants dwelling in outer space. You’re not alone: since 1991, when the Space Life Sciences 1 (SLS-1) mission was launched, there has been intensive research into the physiological effects of microgravity and space travel on the human body. However, studying the effects of space travel on human development and physiology can be expensive and dangerous. For example, NASA cannot send babies to space to study human development in microgravity (despite the fact that this might be one giant gurgle for mankind). To circumvent the challenges associated with rigorously studying physiology in space, innovators like Dorothy B. Spangenberg and her research team found a way to address whether growing up in space changes how we sense gravity. How? Jellyfish.

Jellyfish at the New England Aquarium living its best life. Photo Credit: Nicholas Payne


Jellyfish aren’t actually fish at all; they are simple invertebrates, found in the same phylum as sea anemones and corals. Jellyfish are such weak swimmers that they are often at the mercy of the ocean currents they rely on to move them around. However, like most organisms, jellyfish need a way to orient themselves spatially, especially with respect to Earth’s gravitational field. To sense which way is up, maturing jellyfish develop sensory structures called rhopalia at the base of their bells. These sensory organs contain heavy calcium sulfate statolith crystals. As the jellyfish rotates with respect to the force of gravity, the heavy crystals tumble in the direction of the gravitational force, a movement that is sensed and interpreted by sensory cells in the rhopalia.

These jellyfish gravity sensors are not so different from our own. Our ability to orient ourselves is governed by the vestibular system, located within our inner ears. Similarly to the jellyfish, we sense linear acceleration (such as the acceleration due to the force of gravity) through an otolithic membrane. This membrane picks up the movements of otoconia, small protein/calcium-carbonate particles, in response to gravity. Though the human vestibular system develops during late embryonic stages, jellyfish develop their rhopalia over only five days. This makes jellies a useful organism for studying the effects of microgravity on the development of gravity sensors. Information about the development of gravity sensors in jellyfish in space could give us insight into an astronaut’s otoconia and even how our grandkid’s grandkid’s vestibular system would develop in response to growing up in microgravity.

To perform this experiment, Spangenberg and colleagues sent 2,478 immature jellyfish polyps into space in containers of artificial seawater. By injecting hormones into the seawater bags, the researchers could force the jellyfish to advance to the next step of development: the ephyra phase, when the rhopalia (gravity sensors) develop. They created two populations of ephyrae: jellies induced to develop their gravity sensors on Earth and jellies induced to develop them in space. The physiology of the statoliths and the movements of these two populations of astronaut jellyfish were then compared with jellies that developed normally on Earth. Spangenberg and her team found that the jellyfish that developed gravity sensors on Earth and were subsequently sent to space lost statoliths more rapidly than the jellies that never went to space, which may have implications for Earth-born astronauts. Jellyfish induced to develop gravity sensors once they were already in space had no trouble pulsing and swimming there, and had typical numbers of statoliths. What happened to the space-developed jellies when they came back down to Earth? The researchers reported that 20% of the microgravity jellyfish had trouble pulsing and swimming once back on the Blue Planet, despite having seemingly normal statolith development. Therefore, we should proceed with caution when considering how other organisms, including human beings, might develop in space.

Although more experiments are needed to determine whether the findings in jellyfish translate to human development in space, these studies indicate the potential impact space travel can have on how we sense gravity. Jellyfish that developed in space appeared to experience intense vertigo once they were back on Earth, so don't be too jelly of their all-expenses-paid trip into space!

Authors note: I found out while writing this that a group of jellyfish is called a “smack” of jellyfish, a fact which is far too cute not to share here.

Peer edited by Bailey DeBarmore

Follow us on social media and never miss an article:

Will dogs save us from allergies?

https://www.flickr.com/photos/tomsaint/16730323546

Picture from: Rennett Stowe

Dog is man’s best friend. Man is dog’s…predictor for allergies?

A recent study showed that dogs with owners who suffer from allergies are more likely to suffer from allergies themselves. Researchers also found that dogs living in urban environments are more likely to have allergies than dogs in rural environments. The same correlation between urban environments and allergies is found in humans. Humans in rural environments come in contact with more species of microbes than their urban counterparts, and it is thought that contact with many microbes early in life may protect humans from developing allergies. The same phenomenon is thought to occur in dogs. It appears man and man’s best friend have more in common than originally thought.

Allergies in humans and dogs have been on the rise in the western world. There have been many studies on the causes of these allergies in humans, but few have looked into the causes in dogs. Researchers at the University of Helsinki in Finland wanted to change this. We already know that humans who live in urban environments are more likely to have allergies than humans who live in rural environments. Hakanen and colleagues wanted to know if the same is true in dogs.

Researchers sent surveys to almost 6,000 dog owners in Finland. The survey asked about each dog’s breed, current environment (urban vs. rural), environment at birth, and allergies, as well as the owner’s allergies. When analyzing the data, the researchers removed dog breeds known to be genetically prone to allergies so they could focus on environmental factors. After compiling the data, Hakanen et al. concluded that dogs living in urban environments are more likely to have allergies than their rural counterparts. It is important to note that the data are influenced by how much time a dog spends outside and how much contact it has with farm animals. Intriguingly, living with a larger human family also appears to protect dogs from developing allergies. This suggests that we might protect our dogs from allergies, much as they are thought to protect us.

https://www.flickr.com/photos/dani0010/537522266

Allergy symptoms in dogs can include itchiness, sneezing, hives, constant licking, itchy ears, and itchy, runny eyes. Picture from: Dani

Researchers cannot be certain of the cause of the differences in allergies between urban and rural dogs, or between dogs in smaller and larger families, but they do have some theories. In humans, the microbiota, the microbes that live in our bodies without causing illness, is an important factor in allergy development. It is thought that humans who grow up in rural areas come in contact with and are colonized by environmental microbes that protect them from allergies. The microbiota is also thought to be important for the development of allergies in dogs. Dogs living in rural environments may come into contact with more environmental microbes that protect them from allergies. Furthermore, dogs in larger families likely come into contact with more species of microbes because each family member harbors a unique microbiota.

Though many of the study’s findings are similar between dogs and humans, one difference between human and dog allergies seems to be the impact of birthplace. A dog’s birthplace is not a predictor of allergies the way it is in humans. Researchers think a dog’s birthplace may be less important because dogs are usually removed from their birth environment fairly early (7-8 weeks), compared to humans (18 years).

Hakanen and colleagues were able to identify multiple environmental factors important for predicting whether a dog will develop allergies. However, the most striking finding of the study was actually in the dog owners: dogs with owners who have allergies are more likely to have allergies themselves. Though this is not a new finding, it suggests that the factors driving allergy development in humans and dogs may be the same. This idea is particularly intriguing considering that dogs suffer mostly skin and food allergies with few respiratory symptoms, while respiratory symptoms from pollen allergies are among the most common in humans. Furthermore, the immune responses that cause allergic symptoms in dogs and humans are different. This suggests the factors influencing allergy development may be important for all mammals despite differences in their immune systems.

There is still more research to be done to determine the factors that lead to allergies in dogs and humans. However, the studies of Hakanen et al. and others suggest that if we can determine the factors important for developing allergies in dogs, for which it is easier to gather environmental and health information, we may be able to apply these findings to humans. So in addition to being the best listeners, best cuddlers, and our best friends, dogs may just be our best chance to cure our allergies.

Peer edited by Christina Parker

Follow us on social media and never miss an article:

How Reliable Is Our Memory?

How memories are formed, stored, and modified has been one of the key topics in neuroscience. It’s fascinating to realize that not only can we enhance our memory through constant practice and exercise, but we can also alter or eliminate existing memories in some trauma cases. Within the past few years, neuroscientists have even found ways to create fake memories or artificially manipulate real ones. All of this leads to a huge question: how reliable is our memory?

Picture adapted from: https://pixabay.com/en/nerves-cells-dendrites-sepia-346928/


Most of us have heard the analogy that our brain functions like a computer. Information we perceive through our sensory organs is transduced and processed by various neurons, and experiences accumulate to create memories stored in different regions of the brain. Memories constitute our identities and, to a certain extent, determine who we are as human beings. Scientifically, the formation of memories is facilitated by the dynamic generation and deterioration of synapses that connect different neurons to produce different responses. Put simply, a new memory is created when neurons that weren’t connected before are wired together in the brain, forming new contacts and synapses. The memory will be strong if it is constantly revisited and the relevant synapses are strengthened by reactivating this particular group of neurons. On the other hand, the memory will gradually fade if these neurons stop firing and the synapses disassemble. This is similar to how we maintain friendships at the macro level. Throughout our lives we have many friends; we make new ones every day and lose some from time to time. Just as some friendships stay close because the friends spend more time together, some memories stay lucid because of repeated reminders. Understanding which neurons and synapses store which memory is thus like deciphering a codebook. By genetically labeling neurons and tracing their activities, scientists have made huge progress in decoding this mystery, revealing the possibility of Inception in real life.

A research group led by Dr. Steve Ramirez, now at Boston University, has studied memory for years. In 2013, they made a breakthrough by implanting fake memories into the brains of mice. Published in Science, this study demonstrated that when neurons in a particular region of the hippocampus are activated artificially through optogenetics, a pain-related memory can be recalled without an actual pain stimulus. By labeling different neurons in the brain, the researchers were able to identify neurons activated by a particular pain stimulus, a foot shock in this case. Mice exposed to a foot shock once were separated into two groups: one would receive a real foot shock again to trigger this pain memory, and the other would be stimulated with light for artificial neuron activation. Using optogenetics, the researchers activated the particular pain-responsive neurons with light and compared the response to that of mice that had received a real foot shock. Interestingly, light activation of these neurons produced a response similar to the normal pain stimulation, meaning both groups of mice were “reminded” of the pain even though only one group actually received it. Dr. Ramirez’s group cleverly bypassed the sensory neurons for pain and created this fake memory in their mice. In other words, the researchers tricked the mice into remembering a pain they were never exposed to, simply by shining light on them.

Similarly, in a follow-up study published in Nature in 2015, the same group managed to rescue stress-induced behavioral disorders in mice by optogenetically activating reward-related neurons to trigger a good memory. Although the human brain is vastly more complicated than a mouse brain, the success in manipulating mouse memory suggests we may eventually have the capacity to interfere with human memories. While advancements in this field could help treat neurodegenerative disorders such as Alzheimer’s, memory-altering technologies also raise ethical issues and concerns about how reliable our memories are. Without a doubt, positive effects in response to rewarding memories could be extremely beneficial to the continuously growing population fighting depression. On the other hand, if our lives are composed of a mixture of fake and authentic memories, how would we know what to trust, and who we are? As fully conscious human beings, maybe it’s time to ponder how reliable we want our memories to be.

Picture adapted from: https://pxhere.com/en/photo/654163


Peer edited by Nicole Fleming.

Follow us on social media and never miss an article:

Heat Waves and Training Gains

http://www.hill.af.mil/News/Features/Display/Article/838518/ask-a-badwater-finisherif-you-can-find-one/

Elite athletes and weekend warriors alike understand the struggles of training during the hot, humid summer months in many parts of the United States. One of the main problems is that higher temperatures cause your body to sweat more in an effort to cool itself, while the increased moisture in the air makes sweating less efficient and breathing difficult. While sweating profusely and struggling to breathe through a seemingly simple workout can feel miserable, athletes will reap the physiological benefits of heat training even after the summer months have passed.

https://commons.wikimedia.org/wiki/File:Blood-centrifugation-scheme.png

One of the main benefits of heat training is an increase in blood plasma volume. Blood plasma accounts for about 55% of blood volume and is mostly water, but it also contains electrolytes, hormones, and proteins. Its main purpose is to carry these nutrients to the parts of the body that need them. An increase in blood plasma volume therefore allows the heart to pump more blood, and more nutrients, throughout the body.

Studies on elite athletes have shown that after only five or ten days of heat training, blood plasma volume increased by 4.5% (five days) or 6.5% (ten days). One study showed that the increase in blood plasma volume was accompanied by an increase in VO2max (the maximum amount of oxygen someone can use during intense exercise). Another study reported improved rowing performance two weeks after the heat training occurred.
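As a quick back-of-the-envelope check of what those percentages mean in absolute terms, here is a small calculation assuming a typical total blood volume of about 5 L (an assumed round figure, not taken from the studies):

```python
PLASMA_FRACTION = 0.55  # plasma is ~55% of total blood volume

def plasma_volume_after(blood_volume_l, pct_increase):
    """Baseline plasma volume plus the reported training-induced increase."""
    baseline = PLASMA_FRACTION * blood_volume_l
    return baseline * (1.0 + pct_increase / 100.0)

baseline = PLASMA_FRACTION * 5.0                  # ~2.75 L of plasma
after_five_days = plasma_volume_after(5.0, 4.5)   # ~2.87 L
after_ten_days = plasma_volume_after(5.0, 6.5)    # ~2.93 L
```

In other words, under these assumptions the reported gains amount to roughly an extra 120-180 mL of plasma circulating through the body.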

In addition to increased blood plasma volume, heat training can help athletes reduce overall core temperature and blood lactate, which have both been linked to improved athletic performance. Overall, these adaptations not only make the athlete better acclimated to the heat, but allow them to perform better in cool temperatures as well.  While many athletes have long utilized altitude training as a natural way to increase red blood cells, heat training also has its own specific set of physiological benefits. So if you cannot escape the heat this summer, at least take solace that you cannot escape its benefits either.

Peer edited by Caitlyn Molloy

Follow us on social media and never miss an article:

Time to quit “Ordinary Smoking”

For thousands of years, the tobacco plant has been used for various purposes, ranging from general enjoyment to medicinal uses. Apart from the sniffing and chewing of tobacco, the more common means of tobacco use today is smoking. Nicotine, the drug in question, can have positive psychiatric effects, but it is also very addictive, leading to dependence and other severe health effects. Although the harmful effects of tobacco have been known for decades, nicotine addiction continues to be one of the major causes of noncommunicable diseases and mortality worldwide.

Despite the availability of therapeutic alternatives, nicotine’s inherently addictive nature makes smoking cessation a real challenge for users. Traditionally, combustible products have ruled the tobacco market; the smoke from burning tobacco delivers nicotine to users along with thousands of toxic and carcinogenic compounds. These products mainly include cigarettes, little cigars, cigarillos, cigars, and hookah. In addition, flavors are added to mask the harsh taste of tobacco and facilitate new user recruitment. There was also an attempt to manufacture safer cigarettes, known as light cigarettes, which ultimately failed to deliver on the promise of reduced harm. Products with lower nicotine content have also been adopted to reduce the extent of addiction. Overall, the central effort has remained the same: develop products that deliver nicotine efficiently without the toxic compound load.

Tobacco products have evolved significantly over the decades


In this quest, e-cigarettes were introduced in the mid-2000s and have gained rapid popularity among existing and naive users. Simply put, these products consist of nicotine in an organic solvent that creates a visible cloud of vapor, with enticing flavors and attractive packaging used to draw in users. Originally conceptualized as an aid for smoking cessation, e-cigarettes quickly became a topic of debate dividing health professionals and regulatory authorities. On one hand, vaping does reduce exposure to harmful compounds; on the other, it is suspected to act as a “gateway” to nicotine addiction in youths because of its tempting flavors. Additionally, there is grave concern about long-term health hazards unique to inhaling e-liquid and flavor compounds. Thousands of different flavored e-liquids are currently on the market, and the ubiquity of use was even acknowledged by Oxford Dictionaries, which made “vape” the word of the year in 2014. The Internet is full of vaping videos, a practice called “cloud chasing.” However, e-cigarettes often fail to provide a satisfactory nicotine “kick” and pose dangers of their own.

So, what’s next? The next big thing for the US tobacco market is heat-not-burn products, strategically named “iQOS,” for I Quit Ordinary Smoking. The principle is to heat rather than burn the tobacco, delivering nicotine while generating fewer toxic chemical compounds. Already launched in different parts of the world, iQOS devices, also known as “heat sticks,” promise nicotine delivery with reduced exposure to harmful compounds; research from the tobacco industry supports these claims. iQOS is currently under FDA review in the US and, if approved, has the potential to completely replace combustible tobacco as we know it. However, with the addition of flavors and selective marketing strategies, these products may well be the next big concern for health professionals.

Acknowledgements: Drs. Robert Tarran and Boris Reidel for their support.

Edited by Nicole Smiddy

Follow us on social media and never miss an article: