How a Cancer’s Genome Can Tell Us How to Treat It

By Gesa Junge, PhD


Any drug that is approved by the FDA has to have completed a series of clinical trials showing that the drug is safe to use and brings a therapeutic benefit – usually longer responses, better disease control, or fewer toxicities.

Generally, a phase I study of a potential cancer drug will include fewer than a hundred patients with advanced disease who have no remaining treatment options, and often spans many (or all) types of cancer. The focus in phase I studies is on safety, and on finding the best dose of the drug to use in subsequent trials. Phase II studies involve larger patient groups (around 100 to 300) and aim to show that the treatment works and is safe in the target patient population, while phase III trials can involve thousands of patients across several hospitals (or even countries) and aim to show a clinical benefit compared to existing therapies. Choosing the right patient population to test a drug in can make the difference between a successful and a failed drug. Traditionally, phase II and III trial populations are based on tumour site (e.g. lung or skin) and/or histology, i.e. the tissue where the cancer originates (e.g. carcinomas are cancers arising from epithelial tissues, while sarcomas develop in connective tissue).

However, as our understanding of cancer biology improves, it is becoming increasingly clear that the molecular basis of a tumour may be more relevant to therapy choice than where in the body it develops. For example, about half of all cutaneous melanoma cases (the most aggressive form of skin cancer) have a mutation in a signalling protein called B-Raf (BRAF V600). B-Raf is usually responsible for transmitting growth signals to cells, but while the normal, unmutated protein does this in a very controlled manner, the mutated version provides a constant growth signal, causing the cell to grow even when it shouldn’t, which leads to the formation of a tumour. A drug that specifically targets and inhibits the mutated version of B-Raf, vemurafenib, was approved for the treatment of skin cancer in 2011, after trials showed it led to longer survival and better response rates compared to the standard therapy at the time. It worked so well that patients in the comparator group were switched to the vemurafenib group halfway through the trial.

While B-Raf V600 mutations are especially common in skin cancer, they also occur in various other cancers, although at much lower frequencies (often less than 5%), for example in lung and colorectal cancer. And since inhibition of B-Raf V600 works so well in B-Raf-mutant skin cancer, should it not work just as well in lung or colon cancer with the same mutation? As the incidence of B-Raf V600 mutations is so low in most cancers, it would be difficult to find enough people to conduct a traditional trial and answer this question. However, a recently published study at Memorial Sloan Kettering Cancer Center took a different approach: it included 122 patients with non-melanoma cancers positive for B-Raf V600 and showed that lung cancer patients with the mutation responded well to vemurafenib, but colorectal cancer patients did not. This suggests that the importance of the mutated B-Raf protein for the survival of the tumour cells is not the same across cancer types, although at this point there is no explanation as to why.

Trials in which the patient population is chosen based on tumour genetics are called basket trials, and they are a great way to study the effect of a given mutation across different cancer types, even if only very few cases carry it. A major factor here is that DNA sequencing has come a long way and is now relatively cheap and quick. While the first genome sequenced as part of the Human Genome Project cost about $2.7bn and took over a decade to complete, a tumour genome can now be sequenced for around $1,000 in a matter of days. This technological advance may make it possible to routinely determine the DNA sequence of a patient’s tumour and assign them to a therapy (or a study) based on this information.

The National Cancer Institute is currently running a trial that aims to evaluate this model of therapy. In the Molecular Analysis for Therapy Choice (MATCH) trial, patients are assigned to a therapy based on their tumour genome. Initially only ten treatments were included and the study is still ongoing, but an interim analysis after the 500th patient was recruited in October 2015 showed that 9% of patients could be assigned to a therapy based on mutations in their tumour – a fraction expected to increase as the trial expands to include more treatments.

This approach may be especially important for newer types of cancer drugs, which are targeted to a tumour-specific mutation – often the very change that caused a healthy cell to become a cancer cell in the first place – as opposed to older-generation chemotherapeutic drugs, which target rapidly dividing cells and are far less selective. Targeted therapies may only work in a smaller number of patients, but they are usually much better tolerated and often more effective, and molecular-based treatment decisions could give more patients faster access to effective therapies.

The Danger of Absolutes in Science Communication


By Rebecca Delker, PhD

Complementarity, born out of quantum theory, is the idea that two different ways of looking at reality can both be true, although not at the same time. In other words, the opposite of a truth is not necessarily a falsehood. The best-known example of this in the physical world is light, which can be both a particle and a wave, depending on how we measure it. Fundamentally, this principle allows for, and even encourages, the presence of multiple perspectives to gain knowledge.


This is something I found myself thinking about as I witnessed the twitter feud-turned-blog post-turned-actual news story centered around the factuality of physician-scientist Siddhartha Mukherjee’s essay, “Same but Different,” published recently in The New Yorker. Weaving personal stories of his mother and her identical twin sister with experimental evidence, Mukherjee presents the influence of the epigenome – the modifications overlaying the genome – in regulating gene expression. From this perspective, the genome encodes the set of all possible phenotypes, while the epigenome shrinks this set down to one. At the cellular level – where much of the evidence for the influence of epigenetic marks resides – this is demonstrated by the phenomenon that a single genome encodes the vastly different phenotypes of cells in a multicellular organism. A neuron is different from a lymphocyte, which is different from a skin cell, not because their genomes differ but because their transcriptomes (the complete set of genes expressed at any given time) differ. Epigenetic marks play a role here.


While many have problems with the buzzword status of epigenetics and the use of the phrase to explain away the many unknowns in biology, the central critique of Mukherjee’s essay was the extent to which he emphasized the role of epigenetic mechanisms in gene regulation over other well-characterized players, namely transcription factors – DNA-binding proteins that are undeniably critical for gene expression. However, debating whether the well-studied transcription factors or the less well-established epigenetic marks are more important is no different from the classic chicken-or-egg scenario: impossible to assign order in a hierarchy, let alone separate one from the other.


But whether we embrace epigenetics in all of its glory or we couch the term in quotation marks – “epigenetics” – in an attempt to dilute its impact, it is still worth pausing to dissect why a public exchange brimming with such negativity occurred in the first place.

“Humans are a strange lot,” remarked primatologist Frans de Waal. “We have the power to analyze and explore the world around us, yet panic as soon as evidence threatens to violate our expectations” (de Waal, 2016, p. 113). This inclination is evident in the above debate, but it also hints at a more ubiquitous theme: bias stemming from one’s group identity. Though de Waal deals with expectations that cross species lines, even within our own species group identity plays a powerful role in dictating relationships and guiding one’s perspective on controversial issues. Studies have shown that political identities, for example, can supplant information during decision-making. Pew surveys reveal that views on climate change divide sharply along partisan lines. When asked whether humans are at fault for changing climate patterns, a much larger percentage of Democrats (66%) than Republicans (24%) answered yes; however, when asked what the main contributor to climate change is (CO2), the two groups converged (Democrats: 56%, Republicans: 58%; taken from Field Notes From a Catastrophe, pp. 199-200). This illustrates the potential divide between one’s objective understanding of an issue and one’s subjective position on it – the latter greatly influenced by the prevailing opinion of one’s allied group.


Along with group identity is the tendency to eschew uncertainty and nuance, choosing solid footing no matter how shaky the turf, effectively demolishing the middle ground. This tendency has grown stronger in recent years, it seems, likely in response to an increase in the sheer amount of information available. This increased complexity, while important in allowing access to numerous perspectives on an issue, also triggers our innate response to minimize cost during decision-making by taking “cognitive shortcuts” and receiving cues from trusted authorities, including news outlets. This is exacerbated by the rise in the use of social media and shrinking attention spans, which quench our taste for nuance in favor of extremes. The constant awareness of one’s (online) identity in relation to that of a larger group encourages consolidation around these extremes. The result is the transformation of ideas into ideologies and the polarization of the people involved.


These phenomena are evident in the response to Mukherjee’s New Yorker article, but they can be spotted in many other areas of scientific discourse. This, unfortunately, is due in large part to a culture that rewards results, promotes an I-know-the-answer mentality, and encourages its members to adopt a binary vision of the world where there is a right and a wrong answer. Those who critiqued Mukherjee for placing too great an emphasis on the role of epigenetic mechanisms responded by placing the emphasis on transcription factors, trivializing the role of epigenetics. What got lost in this battle of extremes was a discussion of the complementary nature of both sets of discoveries – a discussion that would bridge, rather than divide, generations and perspectives.


While intra-academic squabbles are unproductive, the real danger of arguments fought in absolutes and along group-identity lines lies at the interface of science and society. The world we live in is fraught with complex problems, and Science, humanity’s vessel of ingenuity, is called upon to provide clean, definitive solutions. This is an impossible task in many instances, as important global challenges are not purely scientific in nature: they each contain a very deep human element. Political, historical, religious, and cultural views act as filters through which information is perceived and guide one’s stance on complex issues. When these issues include a scientific angle, confidence in the institution of science as a (trustworthy) authority plays a huge role.


One of the most divisive of such issues is that of genetically modified crops (GMOs). GMOs are crops produced by the introduction or modification of DNA sequence to incorporate a new trait or alter an existing one. While the debate spans concerns about the safety of GMOs for human and environmental health as well as economic concerns over the potentially disparate benefits to large agribusiness and small farmers, these details are lost in the conversation. Instead, the debate is reduced to a binary: pro-GMO equals pro-science, anti-GMO equals anti-science. Again, the group with which one identifies, scientists included, plays a tremendous role in determining one’s stance on the issue. Polling of public opinion reveals a pattern similar to that of climate change: even though awareness of genetic engineering in crops has remained consistently low over the years, the belief that GMOs pose a serious health hazard has increased. What’s worse, these debates treat all GMO crops the same simply because they are produced with the same methodology. While the opposition maintains a blanket disapproval of all engineered crops, the proponents fare no better, responding with indiscriminate approval.


Last month The National Academy of Sciences released a comprehensive, 420-page report addressing concerns about GMOs and presenting an analysis of two decades of research on the subject. While the conclusions drawn largely support the idea that GMOs pose no significant danger to human and environmental health, the authors make certain to address the caveats associated with these conclusions. Though prompted by many to provide the public with “a simple, general, authoritative answer about GE (GMO) crops,” the committee refused to participate in “popular binary arguments.” As important as the scientific analysis is this element of the report, which serves to push the scientific community away from a culture of absolutes. While the evidence at hand shows no cause-and-effect relationship between GMOs and human health problems, for example, our ability to assess this is limited to short-term effects, as well as by our current ability to know what to look for and to develop assays to do so. The presence of these unknowns is a reality in all scientific research, and to ignore them, especially with regard to complex societal issues, only serves to strengthen the growing mistrust of science in our community and broaden the divide between people with differing opinions. As one review of the report states, “trust is not built on sweeping decrees.”


GMO crops, though, are only one of many issues of this sort; climate change and vaccine safety, for example, have been similarly fraught. And, unfortunately, our world is promising to get a whole lot more complicated. With the reduced cost of high-throughput DNA sequencing and the relative ease of genome editing, it is becoming possible to modify not just crops, but farmed animals, as well as the wild flora and fauna that we share this planet with. Like the other issues discussed, these are not purely scientific problems. In fact, the rapid rate at which technology is developing creates a scenario in which the science is the easy part; understanding the consequences and the ethics of our actions yields the complications. This is exemplified by the potential use of CRISPR-driven gene drives to eradicate mosquito species that serve as vectors for devastating diseases (malaria, dengue, and Zika, for example). In 2015, 214 million people were affected by malaria and, of those, approximately half a million died. It is a moral imperative to address this problem, and gene drives (or other genome-modification techniques) may be the best solution at this time. But the situation is much more complex than here-today, gone-tomorrow. For starters, the rise in the prevalence of mosquito-borne diseases has its own complex portfolio, likely involving climate change and human-caused habitat destruction and deforestation. With limited understanding of the interconnectedness of ecosystems, it is challenging to predict the effects of eradicating mosquito species on the environment or on the rise of new vectors of human disease. And, finally, this issue raises questions about the role of humans on this planet and the ethics of modifying the world around us. The fact is that we are operating within a space replete with unknowns, and the path forward is not to ignore these nuances or to approach these problems with an absolutist’s mindset. This only encourages an equal and opposite reaction in others and obliterates all hope of collective insight.


It is becoming ever more common for us to run away from uncertainty and nuance in search of simple truths. It is within the shelter of each of our groups and within the language of absolutes that we convince ourselves these truths can be found; but this is a misconception. Just as embracing complementarity in our understanding of the physical world can lead to greater insight, an awareness that no single approach can necessarily answer our world’s most pressing problems can actually push science and progress forward. When thinking about the relationship of science with society, gaining trust is certainly important but not the only consideration. It is also about cultivating an understanding that in the complex world in which we live there can exist multiple, mutually incompatible truths. It is our job as scientists and as citizens of the world to navigate toward, rather than away from, this terrain to gain a richer understanding of problems and thus best be able to provide a solution. Borrowing the words of physicist Frank Wilczek, “Complementarity is both a feature of physical reality and a lesson in wisdom.”


Leaving Your Mark on the World

By Danielle Gerhard


The idea that salient life experiences can be inherited across generations has only recently entered the world of experimental research. The French scientist Jean-Baptiste Lamarck proposed that traits acquired during an organism’s life could be passed along to its offspring. This theory of inheritance was originally disregarded in favor of Mendelian genetics, in which a phenotypic trait is inherited not as a blend of parental traits but as a specific combination of alleles. However, inheritance is much more complicated than either theory allows for. While Lamarckian inheritance has largely been negated by modern genetics, recent findings in the field have caused some to revisit l’influence des circonstances – the influence of circumstances.


Over the past decade, efforts have shifted towards understanding the mechanisms underlying the non-Mendelian inheritance of experience-dependent information. While still conserving most of the rules of Mendelian inheritance, new discoveries like epigenetics and prions challenge the central dogma of molecular biology. Epigenetics is the study of heritable changes in gene activity as a result of environmental factors. These changes do not affect DNA sequences directly but instead impact processes that regulate gene activity such as DNA methylation and histone acetylation.


Epigenetics has transformed how psychologists approach understanding the development of psychological disorders. The first study to report epigenetic effects on behavior came from the labs of Michael Meaney and Moshe Szyf at McGill University in the early 2000s. In a 2004 Nature Neuroscience paper they reported differential DNA methylation in pups raised by high licking-and-grooming mothers compared to pups raised by low licking-and-grooming mothers. Following these initial findings, neuroscientists have begun using epigenetic techniques to better understand how parental life experiences, such as stress and depression, can shape the epigenome of their offspring.


Recent research from the lab of Tracy Bale at the University of Pennsylvania has investigated the heritability of behavioral phenotypes. A 2013 Journal of Neuroscience paper found that stressed males went on to produce offspring with blunted hypothalamic-pituitary-adrenal (HPA) axis responsivity. In simpler terms, when the offspring were presented with a brief, stressful event they produced less of the stress hormone corticosterone (cortisol in humans), symptomatic of a predisposition to psychopathology. In contrast, an adaptive response to acute stressors is a transient increase in corticosterone, followed by a negative feedback loop that silences the stress response.


The other key finding from this prior study was the identification of nine sperm microRNAs (miRs) – small non-coding RNAs – that were increased in stressed sires. These findings begin to delve into how paternal experience can influence germ-cell transmission, but they do not explain how selective increases in these sperm miRs might affect oocyte development to cause the phenotypic and hormonal deficits seen in adult offspring.


A recent study from the lab, published in PNAS, builds on these initial findings to further investigate the mechanisms underlying transgenerational effects of paternal stress. Using the previously identified nine sperm miRs, the researchers injected all nine miRs into single-cell mouse zygotes, which were then implanted into healthy surrogate females. To test whether all nine sperm miRs were required to recapitulate the stress phenotype, another set of single-cell mouse zygotes was microinjected with a single sperm miR, and a final set of zygotes received none of the sperm miRs. Following a normal rearing schedule, the adult offspring were briefly exposed to an acute stressor and blood was collected to analyze changes in stress hormones. As hypothesized, male and female adult offspring from the multi-miR group had a blunted stress response relative to both controls.


To further investigate potential effects on neural development, the researchers dissected out the paraventricular nucleus (PVN) of the hypothalamus, a region of the brain that the group had previously identified as being involved in regulation of the stress response. Using RNA sequencing and gene set enrichment analysis (GSEA), they found decreased expression of genes involved in collagen formation and extracellular matrix organization, which the authors hypothesize could modify cerebral circulation and blood-brain barrier integrity.


The final experiment in the study examined the postfertilization effects of multi-miR injected zygotes. Specifically, the investigators were interested in the direct, combined effect of the nine identified sperm miRs on stored maternal mRNA. Using a similar design as the initial experiment, the zygote mRNA was collected and amplified 24 hours after miR injection in order to examine differential gene expression. The researchers found that microinjection of the nine sperm miRs reduced stored maternal mRNA of candidate genes.


This study is significant because it is the first to show that paternally derived miRs play a regulatory role in the degradation of stored maternal mRNA in the zygote. In simpler terms, these findings contradict the conventional belief that early zygote development is solely maternally driven. Paternal models of transgenerational inheritance of salient life experiences are useful because they avoid confounding maternal influences on development. Studies investigating the effects of paternal drug use, malnutrition, and psychopathology are ongoing.


Not only do early life experiences influence the epigenome passed down to offspring, but recent work out of the University of Copenhagen suggests that our diet may also have long-lasting, transgenerational effects. A study to be published in Cell Metabolism next year examined the effects of obesity on the epigenome, reporting differential small non-coding RNA expression and DNA methylation of genes involved in central nervous system development in the spermatozoa of obese men compared to lean controls. Before you start feeling guilty about the 15 jelly donuts you ate this morning, there is hope that epigenetics can also work in our favor: the authors present data on obese men who underwent bariatric surgery-induced weight loss, showing a remodeling of DNA methylation in their spermatozoa.


Although still a nascent field, epigenetics holds promise for better understanding the intergenerational transmission of risk for developing a psychopathology or disease. The ultimate goal of treatment is to identify patterns of epigenetic alterations across susceptible or diagnosed individuals and to develop agents that modify the epigenetic processes regulating genes of interest. I would argue that it will one day be necessary for epigenetics and pharmacogenetics, another burgeoning field, to come into cahoots with one another, not only to identify the epigenetic markers of a disease but to identify those markers on a person-by-person basis. However, because both fields are still in their early stages, they are limited by the tools and techniques currently available. As a result, researchers can extract correlations in many of their studies but cannot determine causality. Therefore, longitudinal, transgenerational studies like those from the labs of Tracy Bale and others are necessary to provide insight into the lability of our epigenome in response to lifelong experiences.

Lethal Weapon: How Many Lethal Mutations Do We Carry?


By John McLaughlin

Many human genetic disorders, such as cystic fibrosis and sickle cell anemia, are caused by recessive mutations with a predictable pattern of inheritance. Tracking hereditary disorders such as these is an important part of genetic counseling, for example when planning a family. In fact, there exists an online database dedicated to medical genetics, Online Mendelian Inheritance in Man (OMIM), which contains information on most human genetic disorders and their associated phenotypes.


The authors of a new paper in Genetics set out to estimate the number of recessive lethal mutations carried in the average human’s genome. The researchers focused specifically on recessive mutations because of their higher potential impact on human health: because recessive deleterious mutations are less likely to be purged by selection, they can be maintained in heterozygotes with little impact on fitness and therefore occur at greater frequency. For the purposes of the analysis, recessive lethal disorders (i.e. those caused by a recessive lethal mutation) were defined by two main criteria: first, when homozygous for its causative mutation, the disease leads to the death or effective sterility of its carrier before reproductive age, and second, mutant heterozygotes do not display any disease symptoms.


For this study, the researchers had access to an excellent sample population, a religious community known as the Hutterian Brethren. This South Dakotan community of ~1600 individuals is one of three closely related groups that migrated from Europe to North America in the 19th century. Importantly, the community has maintained a detailed genealogical record tracing back to the original 64 founders, which also contains information on individuals affected by genetic disorders since 1950. An additional bonus is that the Hutterites practice a communal lifestyle in which there is no private property; this helps to reduce the impact of confounding socioeconomic factors on the analysis.


Four recessive lethal genetic disorders have been identified in the Hutterite pedigree since their more detailed records began: cystic fibrosis, nonsyndromic mental retardation, restrictive dermopathy, and myopathy. To estimate the number of recessive lethal mutations carried by the original founders, the team used both the Hutterite pedigree and a type of computational simulation known as “gene dropping”. In a typical gene dropping simulation, alleles are assigned to a founder population, the Mendelian segregation and inheritance of these alleles across generations is simulated, and the output is compared with the known pedigree. One simplifying assumption made during the analysis is that no de novo lethal mutations had arisen in the population since its founding; therefore, any disorders arising in the pedigree are attributed to mutations carried by the original founder population.
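The gene-dropping idea can be sketched in a few lines of code. The following is a toy illustration under simplified assumptions (one lethal allele, random mating, fixed family size), not the authors' actual pipeline: a lethal allele is "dropped" into one founder, transmitted down the generations by Mendelian segregation, and the simulated count of affected homozygotes can then be compared with what the real pedigree shows.

```python
import random

def drop_gene(founders, generations, children_per_pair):
    """Toy gene-dropping simulation: track one allele ('L' = lethal,
    '+' = wild type) through a randomly mating population."""
    pop = founders[:]
    affected = 0
    for _ in range(generations):
        random.shuffle(pop)          # random mating: shuffle, then pair off
        next_gen = []
        for i in range(0, len(pop) - 1, 2):
            mom, dad = pop[i], pop[i + 1]
            for _ in range(children_per_pair):
                # Mendelian segregation: each parent passes one random allele
                child = (random.choice(mom), random.choice(dad))
                if child == ('L', 'L'):
                    affected += 1    # homozygous lethal: counted, not passed on
                else:
                    next_gen.append(child)
        pop = next_gen
    return affected

# 64 founders, one of whom carries the lethal allele heterozygously
founders = [('L', '+')] + [('+', '+')] * 63
random.seed(1)
counts = [drop_gene(founders, generations=5, children_per_pair=2)
          for _ in range(1000)]
print(sum(counts) / len(counts))     # mean affected individuals per simulated pedigree
```

Repeating the simulation thousands of times for different assumed founder mutation loads, and asking which load best reproduces the observed number of affected individuals, is the essence of the estimation strategy described above.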


After combining the results of many thousands of such simulations with the Hutterite pedigree, the authors arrive at a final estimate of roughly one or two recessive lethal mutations carried per human genome (the exact figure is ~0.58). What are the implications of this estimate for human health? Although mating between closely related individuals has long been known to increase the probability of recessive mutations becoming homozygous in offspring, this study’s mutation estimate yields a more precise risk factor. In the discussion section it is noted that mating between first cousins, although fairly rare today in the United States, is expected to increase the chance of a recessive lethal disorder in offspring by ~1.8%.
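A back-of-envelope check shows where a figure of this magnitude can come from (this is an illustration of the logic, not the paper's actual calculation, and it assumes the ~0.58 estimate applies per diploid genome, i.e. ~0.29 per transmitted gamete): offspring of first cousins have an inbreeding coefficient of 1/16, the probability that a given allele ends up homozygous by descent.

```python
# Back-of-envelope estimate (assumptions: ~0.58 recessive lethals per
# diploid genome, so ~0.29 per gamete; F = 1/16 for first-cousin matings)
lethals_per_gamete = 0.58 / 2
F_first_cousins = 1 / 16          # probability an allele is identical by descent
extra_risk = lethals_per_gamete * F_first_cousins
print(f"{extra_risk:.1%}")        # prints 1.8%
```

Multiplying the per-gamete lethal load by the inbreeding coefficient recovers the ~1.8% excess risk quoted in the discussion.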


Perhaps the most interesting finding from this paper was the consistency of the predicted lethal mutation load across the genomes of different animal species. The authors compared their estimates for human recessive lethal mutation number to those from previous studies examining this same question in fruit fly and zebrafish genomes, and observed a similar value of one or two mutations per genome. Of course, the many simplifying assumptions made during their analyses should be kept in mind; the estimates are considered tentative and will most likely be followed up with similar future work in other human populations. It will certainly be interesting to see how large-scale studies such as this one will impact human medical genetics in the future.


Predicting Suicide


By Jesica Levingston Mac leod, PhD


The play “Suicide Is Forbidden in Spring” (Prohibido suicidarse en primavera), written by Alejandro Casona, describes an organization that helps potential suicide patients to end their lives – but the truth is that the doctors really want to avoid that sad end, and… they actually save the patients. They work with the leitmotiv that if you really want to end your life, you will simply do it; seeking help is itself an alert signal, an expression of the will to survive and a call for attention.

Worldwide, around 1 million suicides are reported per year – one death every 40 seconds. According to the CDC, in the United States the suicide rate is around 0.012% (about 12 per 100,000 people), and suicide is the 10th leading cause of death; North America sees one suicide every 13 minutes. The suicide capital of the world is Greenland, with a rate of 108.1 per 100,000, followed by South Korea with 31.7. China is in seventh place; it accounts for almost one third of all suicides and, unlike the other countries, it is the only one where women have a higher suicide rate than men. Indeed, three years ago came the terrible news that some factories making sought-after Apple iPads and iPhones, such as Foxconn’s, were forcing staff to sign pledges not to commit suicide. By 2013, at least 14 workers at Foxconn factories had chosen to end their lives, escaping horrendous working and housing conditions.


Attempting against one’s own life has been related to mental illness (in almost 50% of cases) and to metabolic disorders. The most common method is firearms, followed by suffocation/hanging and falls. The alarming fact is that rates of suicide have increased by 60% in the last 30 years, especially in developed countries. Also, consider that for every suicide that results in death there are between 10 and 40 attempted suicides. But what brings a human being to the edge… and pushes them to jump?

New research suggests that part of the answer may be the lack of correct expression of a single gene. Yes, downregulation of SKA2, the culprit gene, could serve as a biomarker for detecting suicidal behavior. SKA2 stands for spindle and kinetochore associated complex subunit 2; the protein encoded by this gene is part of a microtubule-binding complex that is essential for proper chromosome segregation.

When the researchers examined postmortem brain samples from 3 independent cohorts (around 29 suicide decedents and 29 controls per group), they found that SKA2 expression was lower in the suicide cases than in the controls, and that its expression was negatively associated with DNA methylation. The chemical addition of a methyl group, which is considered an epigenetic modification, can activate or negatively modulate a gene.

I guess you are thinking: these are “Frankenstein” samples, so how can this gene be related to living human beings? Well, apparently the Johns Hopkins researchers asked themselves the same question. To answer it, they collected blood samples from 3 further independent cohorts of subjects with suicidal ideation plus controls (22, 51, and 327 subjects each). In this study, expression of the SKA2 gene was significantly reduced in subjects with suicidal ideation. Furthermore, they analyzed salivary cortisol levels; cortisol is implicated in glucocorticoid receptor transactivation and the stress response. The results suggested that SKA2 epigenetic and genetic variation may modulate cortisol levels. The most important discovery was that a model generated from these data allowed them to predict subjects' suicidal ideation using blood samples alone: the methylation status of the SKA2 gene correlated with suicide attempts.
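The paper's actual statistical model isn't reproduced here, but the general idea, predicting a binary outcome (suicidal ideation) from SKA2 methylation together with a stress covariate such as cortisol, can be sketched as a simple logistic model. Everything below is illustrative: the coefficients are invented, not the study's fitted values.

```python
import math

def predicted_risk(ska2_methylation, cortisol_z):
    """Toy logistic model of suicidal-ideation risk.

    ska2_methylation: fraction methylated at the SKA2 CpG of interest (0-1);
        higher methylation tracks with lower SKA2 expression, hence higher risk.
    cortisol_z: salivary cortisol as a z-score (stress covariate).
    """
    # Invented coefficients for illustration only.
    intercept, b_meth, b_cort = -1.0, 4.0, 0.8
    logit = intercept + b_meth * ska2_methylation + b_cort * cortisol_z
    return 1 / (1 + math.exp(-logit))  # logistic link -> probability

# Direction check: more methylation (less SKA2 expression) -> higher risk
low = predicted_risk(ska2_methylation=0.2, cortisol_z=0.5)   # ~0.55
high = predicted_risk(ska2_methylation=0.8, cortisol_z=0.5)  # ~0.93
print(high > low)  # True
```

The real study combined methylation status, genotype, and stress/anxiety measures; this sketch only conveys the general shape of such a blood-based predictor.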

The great thinker Albert Camus once called attention to this issue when he said: “There is but one truly serious philosophical problem, and that is suicide.” For some risk groups, such as soldiers coming home with trauma after war, the possibility of an attempt on their own lives is a ghost that has taken many lives. A simple blood test could point out which individuals are at risk, so that they may get proper follow-up and treatment that might end up preventing catastrophe. Some high-pressure jobs could also implement this analysis to avoid the loss of lives, giving proper care to people who test positive. And, even closer to home: would you like to know if you have this tendency printed on your DNA? Or your partner? Or your kids?

While you think about this, let me leave you with a comforting quote: “If I had no sense of humor, I would long ago have committed suicide.” Perhaps you would be surprised to learn that the wise man who said this was Mahatma Gandhi, who nearly killed himself in hunger strikes while fighting for India's independence.


The Force Beyond Genetics


By Katherine Peng

From humans, to Wookiees, to Jar Jar Binks. With over 20 million species in the Star Wars galaxy, one can assume that not even this fictional universe is exempt from the laws of evolution. A conversation about how diverse environments pushed them to look so strange (or like cute little Ewoks!) through rare mutations would be a discussion of the “hard inheritance” of DNA sequences, but what about soft inheritance? Can the surrounding environment in one lifespan change DNA in ways that can be passed on to offspring?

You’re shaking your head thinking DUH… NO, as your thoughts jump back to the image of that giraffe in your Biology 101 textbook. Remember that a once-accepted theory of evolution proposed by Jean-Baptiste Lamarck suggested that acquired traits were inherited (e.g. a giraffe constantly stretching its neck to reach tall trees will give birth to giraffes with longer necks)? We now know that this theory is preposterous. What if Yoda lost his favorite toe in battle and then decided to procreate? Would all future Yodas be afflicted with the curse of having only five toes?

BUT WAIT. Lamarck might have been onto something after all! Molecular biology has found that environmental factors CAN affect DNA without altering the sequence, and that these changes can be passed on to future generations (though not often beneficially). Welcome to the field of epigenetics!

In all eukaryotic cells, in our galaxy and the Star Wars galaxy alike, DNA is packaged around histone proteins. This DNA can be methylated and/or the histones can be modified to silence gene expression. People nowadays are becoming increasingly interested in how environmental factors produce these epigenetic changes and affect disease patterns. Genome-wide epigenetic studies are most commonly done on identical twins, where differences between individuals must be environmental. While Luke and Leia Skywalker are fraternal twins, sharing ~50% of their genes as regular siblings would, they were separated at birth to be raised in different environments. So, let’s take a look at how they became more different than their genetics would have us believe.


In 2004, researchers from McGill University discovered that early nurturing from rat mothers removes epigenetic repression of the glucocorticoid receptor gene in rat pups. Consequently, rats that were not well nurtured became more sensitive to stress as adults.


Though childhood scenes of Luke and Leia are basically nonexistent, we do know that Leia was adopted into a very loving royal family who could not have children of their own. Luke was adopted by his step-uncle and his step-uncle’s wife. His step-uncle did not approve of his adventurous tendencies, which created tension.

VERDICT: Though both twins are at risk of higher stress responses due to Amidala’s likely depression during pregnancy, hints of a less nurturing environment suggest that Luke may be more sensitive to stress as an adult than Leia.


The Swedes are on a roll in this category. A recent Swedish study showed that 20 minutes of exercise can reverse DNA methylation of genes in muscle that show lowered expression in type 2 diabetes. Another showed that 6 months of exercise changed the methylation pattern of genes in fat cells implicated in the risk of developing obesity or diabetes.

While Luke is working the moisture farm on Tatooine and having adventures, Leia is a palace princess on Alderaan.

VERDICT: While it seems that both Luke and Leia are fit later on, Luke appeared more active as a child and may be at a lower risk for type 2 diabetes. Sorry Leia.


Bioactive food components (in tea, soybeans, etc.) might beneficially reduce DNA hypermethylation of cancer-associated genes. On the flip side, folate, found in fresh produce, is required for DNA methylation, and its deficiency in pregnant mothers may cause disease or cancer in children. You are also what your father eats: a mouse study showed that a paternal low-protein diet led to upregulation of lipid biosynthesis in offspring.

Unfortunately, there’s no real information out there on the diet of the Skywalkers so…

VERDICT: Inconclusive.


So what have we learned here today? Leia needs to ramp up her training, and Luke should control that anxiety before he becomes like his father. But really, epigenetic changes in twins aren’t too different until later in life, so I guess it’s all speculation until Disney releases the first installment of the sequel trilogy.

The Most Scizzling Papers of 2013


The Scizzle Team

Bacteriophage/animal symbiosis at mucosal surfaces

The mucosal surfaces of animals, which are the major entry points for pathogenic bacteria, are also known to harbor bacteriophages. In this study, Barr et al. characterized the role of these mucus-associated phages. Phages were more commonly found on mucosal surfaces than in other environments and adhere to mucin glycoproteins via hypervariable immunoglobulin-like domains. Bacteriophage pre-treatment of mucus-producing cells provided protection from bacteria-induced death, but this was not the case for cells that did not produce mucus. These studies show that there may be a symbiotic relationship between bacteriophages and multicellular organisms, which provides bacterial prey for the phages and antimicrobial protection for the animals.


Interlocking gear system discovered in jumping insects

Champion jumping insects need to move their powerful hind legs in synchrony to prevent spinning. Burrows and Sutton studied the mechanism of high speed jumping in Issus coleoptratus juveniles and found the first ever example in nature of an interlocking gear system. The gears are located on the trochantera (leg segments close to the body’s midline) and ensure both hind legs move together when Issus is preparing and jumping. As the insect matures, the gear system is lost, leaving the adults to rely on friction between trochantera for leg synchronization.


HIV-1 capsid hides virus from immune system

Of the two strains of HIV, HIV-1 is the more virulent and can avoid the human immune response, whereas HIV-2 is susceptible. This may be due to the fact that HIV-2 infects dendritic cells, which detect the virus and induce an innate immune response. HIV-1 cannot infect dendritic cells unless it is complexed with the HIV-2 protein Vpx, and even then the immune response isn’t induced until late in the viral life cycle, after integration into the host genome. Lahaye et al. found that only viral cDNA synthesis is required for viral detection by dendritic cells, not genome integration. Mutating the capsid proteins of HIV-1 showed that the capsid prevents sensing of HIV-1 cDNA until after the integration step. This new insight into how HIV-1 escapes immune detection may help HIV vaccine development.


Transcription factor binding in exons affects protein evolution

Many amino acids are specified by multiple codons that are not present in equal frequencies in nature. Organisms display biases towards particular codons, and in this study Stamatoyannopoulos et al. reveal one explanation. They find that transcription factors bind within exonic coding sequences, providing a selective pressure determining which codon is used for that particular amino acid. These codons are called duons for their function as both an amino acid code and a transcription factor binding site.


Chromosome silencing

Down syndrome is caused by the most common chromosomal abnormality in live-born humans: Trisomy 21. The association of the syndrome with an extra (or partial extra) copy of chromosome 21 was established in 1959. In the subsequent fifty years a number of advances have been made using mouse models, but there are still many unanswered questions about exactly why the presence of this extra chromosome should lead to the observed defects. An ideal experimental strategy would be to turn off the extra chromosome in human trisomy 21 cells and compare the “corrected” version of these cells with the original trisomic cells. This is exactly what a team led by Jeanne Lawrence at the University of Massachusetts Medical School has done. Down syndrome is not the only human trisomy disorder: trisomy 13 (Patau syndrome) and trisomy 18 (Edward’s syndrome), for example, produce even more severe effects, with life expectancy usually under one to two years. Inducible chromosome silencing of cells from affected individuals could therefore also provide insights into the molecular and cellular etiology of these diseases.


Grow your own brain

By growing organs in a dish researchers can easily monitor and manipulate the organs’ development, gaining valuable insights into how they normally develop and which genes are involved. Now, however, a team of scientists from Vienna and Edinburgh have found a way to grow embryonic “brains” in culture, opening up a whole world of research possibilities. Their technique, published in Nature, has also already provided a new insight into the etiology of microcephaly, a severe brain defect.

[box style=”rounded”]Scizzling extra: In general, 2013 was a great year for growing your own organs: kidneys, potentially a limb, and a liver. What organ will be next? [/box]


Sparking metastatic cell growth

A somewhat controversial paper published in Nature Cell Biology this year showed that the perivascular niche regulates breast tumor cell dormancy. The paper showed how disseminated breast tumor cells (DTCs) are kept dormant and how they can be activated to metastasize aggressively. According to the paper, this depends on interaction with the microvasculature, where thrombospondin-1 (TSP-1) induces quiescence in DTCs while TGF-beta1 and periostin induce DTC growth. This work opens the door for potential therapeutics against tumor relapse.


Fear memories inherited epigenetically

Scientists showed that behavioral experiences can shape mice epigenetically in a way that is transmittable to offspring.  Male mice conditioned to fear an odor showed hypomethylation for the respective odor receptor in their sperm; offspring of these mice showed both increased expression of this receptor, and increased sensitivity to the odor that their fathers had been conditioned on.  Does this suggest that memories can be inherited?


Grid cells found in humans

Scientists have long studied rats in mazes, but what about humans?  An exciting paper last August demonstrated that we, like our rodent counterparts, navigate in part using hippocampal grid cells.  Initially identified in the entorhinal cortex of rats back in 2005, grid cells have the interesting property of firing in a hexagonal grid across the spatial environment and as such are thought to underlie the activity of place cells.  Since then, grid cells have been found in mice, rats, and monkeys, and fMRI data had suggested the existence of grid cells in humans.  This paper used electrophysiological recordings to document grid cell activity in humans.


Sleep facilitates metabolic clearance

Sleep is vital to our health, but researchers have never been entirely sure why.  It turns out part of the function of sleep may be washing waste products from the brain, leaving it clean and refreshed for a new day of use.  Exchange of cerebral spinal fluid (CSF), which is the primary means of washing waste products from the brain, was shown to be significantly higher when animals were asleep compared to waking.  This improved flow was traced back to increased interstitial space during sleep, and resulted in much more efficient clearance of waste products.  Thus, sleep may be crucial to flushing metabolites from the brain, leaving it fresh and ready for another day’s work.

[box style = “rounded”] Robert adds: As a college student, my friends and I always had discussions about sleep, and why we actually need it was always this mysterious black box. Studies could show the effects of lack of sleep, such as poor cognition and worse memory, but this paper linked it to an actual mechanism by which this happens. First of all, I found it very impressive that the researchers trained mice to sleep under the microscope. On top of that, showing the shrinkage of the neurons and the flow of cerebrospinal fluid that cleans out metabolites finally linked the cognitive aspects of sleep deprivation to the physical brain. [/box]


Poverty impedes cognitive function

People who are struggling financially often find it difficult to escape poverty, in part due to apparently poor decision making.  Investigators demonstrated that part of this vicious cycle may arise from cognitive impairment as a direct result of financial pressures.  The researchers found that thinking about finances reduced performance on cognitive tasks in participants who were struggling, but not in those who were financially comfortable.  Furthermore, farmers demonstrated poorer cognitive performance before harvest, at a time of relative poverty, compared to after harvest when money was more abundant.


Gut Behavior

2013 has definitely been the year of the gut microbiome! Studies have shown that diet affects the composition of trillions of microorganisms in the human gut, and there is also a great deal of evidence pointing towards the gut microbiome affecting an individual’s susceptibility to a number of diseases. Recently published in Cell, Hsiao and colleagues report that gut microbiota also affect behavior, specifically in autism spectrum disorder (ASD). Using a mouse model displaying ASD behavioral features, the researchers saw that probiotic treatment not only altered microbial composition, but also corrected gastrointestinal epithelial barrier defects and reduced leakage of metabolites, as demonstrated by an altered serum metabolomic profile. Additionally, a number of ASD behaviors were improved, including communication, anxiety, and sensorimotor behaviors. The researchers further showed that a particular metabolite abundant in ASD mice but lowered with probiotic treatment is the cause of certain behavioral abnormalities, indicating that gut bacteria-specific effects on the mammalian metabolome influence host behavior.

Your skin – their home

A paper published in Nature examined the diversity of the fungal and bacterial communities that call our skin home. The analysis done in this study revealed that the physiologic attributes and topography of skin differentially shape these two microbial communities. The study opens the door for studying how the pathogenic and commensal fungal and bacterial communities interact with each other and how it affects the maintenance of human health.


Discovery of new male-female interaction can help control malaria

A study published in PLOS Biology provided the first demonstration of an interaction between a male allohormone and a female protein in insects.  The identification of a previously uncharacterized reproductive pathway in A. gambiae holds promise for the development of tools to control malaria-transmitting mosquito populations and to interfere with the mating-induced pathway of oogenesis, which may affect the development of Plasmodium malaria parasites.

[box style = “rounded”]Chris adds: “My friend chose this paper to present at journal club one week, because he thought it was well written, interesting etc etc. Unbeknownst to him, one of the paper’s authors was visiting us at the time. We sit down for journal club and one of the PIs comes in, sees the guy and exclaims (with mock exasperation) “you can’t be here!” Me and the presenter look at one another, confused. He presents journal club, and luckily enough, the paper is so well written, that he can’t really criticize it!” [/box]


Using grapefruit to deliver chemotherapy

Published in Nature Communications, this paper describes how nanoparticles can be made from edible grapefruit lipids and used to deliver different types of therapeutic agents, including medicinal compounds, short interfering RNA, DNA expression vectors, and proteins to different types of cells. Grapefruit-derived nanovectors demonstrated the ability to inhibit tumor growth in two tumor animal models. Moreover, the grapefruit nanoparticles used in this study had no detectable toxic effects, could be manipulated or modified to target specific cells/ tissues, and were economical to create. Grapefruits may have a bad reputation for interfering with drugs, but perhaps in the future we will be using grapefruit products to deliver drugs more effectively!



In May, a new technique called CLARITY, which effectively makes tissue transparent through a novel fixation process, was published in Nature. The process has allowed researchers to clearly see networks of neuronal connections that were not visible before, because the networks can now be viewed in thicker tissue sections. This advancement will help researchers better map the brain, and the technology can also be used to create 3-D images of other tissues, such as cancers. This new ability gives us better insight into the macroscopic networks within a specific tissue type.


Crispier genome-editing

This year, the CRISPR technique was developed into an efficient gene-targeting method. The benefit of this method over TALENs or zinc-finger nucleases is that it allows the rapid generation of mice carrying multiple genetic mutations in a single step. The following review gives even more information on this new technique and compares its usefulness to that of TALENs and zinc-finger nucleases. Further, just a couple of weeks ago, two back-to-back studies in Cell Stem Cell used the CRISPR-Cas9 system to cure diseases in mice and in human stem cells.  In the first study, the system was used in mice to correct the Crygc gene that causes cataracts; in the second, the CRISPR-Cas9 system was used to correct the CFTR locus in cultured intestinal stem cells of CF patients. These findings serve as proof-of-concept that diseases caused by a single mutation can be “fixed” with genome editing using the CRISPR-Cas9 system.

What was your favorite paper this year? Let us know! And of course – use Scizzle to stay on top of your favorite topics and authors.

Family Fear?



All You Need To Know About Epigenetic Inheritance of Fear Conditioning


Celine Cammarata

A new paper has exploded into the neuroscience world this week with the claim that conditioned fear of an odor can be passed down from father to son – remember Lamarck’s idea of how giraffes pass on their elongated necks?  The research has investigators’ knickers in a twist – some are excited, some are skeptical, and many are both.  So, what’s the scoop on this paper?


The Research

Emory investigators Brian Dias and Kerry Ressler showed intriguing evidence that fear memories can be inherited.  Sexually naive adult male mice were conditioned, using shock paired with odor exposure, to be fearful of one of two scents or, as a control, were left in their home cages.  Subsequently, these males were bred to naive females, and lo and behold, the male offspring showed increased sensitivity specifically to the odor their fathers had been conditioned with, but not to others.  The same held true for the conditioned males’ grandchildren, for pups raised by other parents, and for pups born from IVF, strengthening the argument that genetics underlay this inherited fear.  The offspring of odor-conditioned males also showed enlargement of the glomerulus corresponding to the sensory neurons that detect the conditioned odor, and increased numbers of these neurons.  In the conditioned males’ sperm, the gene for the receptor of the conditioned odor showed reduced methylation in some regions, suggesting that this hypomethylation might underlie the increased receptor expression in offspring.


The Context

This isn’t the first suggestion of heritable epigenetics.  A Science paper earlier this year revealed how methylation, specifically, is “erased” in precursor cells that will become gametes, but also demonstrated that some epigenetic changes are able to escape erasure. Nor is this the first indication that parental experiences can shape offspring.  Researchers at the Mount Sinai School of Medicine showed that [mouse] fathers who experience social defeat leading to learned helplessness can pass on depression to their children, and a 2012 paper from U Penn and Mass General argued for heritable resistance to cocaine.


The Buzz

Of course, the paper leaves some crucial questions open, most notably: how do the conditioned male’s sperm cells know what’s happening in the olfactory bulb?  The authors make some guesses, but largely leave this blank.  The investigators also never directly correlate the enlarged glomerulus in offspring with their enhanced sensitivity to their fathers’ conditioned odor.  And while the work is exciting, not everyone is ready to jump on board.  Some are dubious that the methylation seen in sperm would have the kind of effect shown in offspring, or reject the notion of such a high level of malleability in the genome; others point out that it’s difficult to determine whether it’s truly a fearful memory that’s been inherited, or just an increased sensitivity.  Nonetheless, nearly everyone has something to say on the topic – so take your newfound knowledge and get ready to join the conversation!

Leafing Through The Literature

Thalyana Smith-Vikos

Highlighting recently published articles in molecular biology, genetics, and other hot topics

Battle of the Small RNAs

Sarkies et al. have analyzed gene expression in the nematode C. elegans upon infection with the positive-strand RNA virus Orsay. As RNAi is required for the immune response, the Argonaute protein RDE-1 represses its endogenous small RNA targets less in order to focus on its exogenous target. The authors also showed that a wild C. elegans isolate exhibits a reduction in miRNA expression, and a consequent increase in miRNA target levels, upon viral infection.

Continue reading “Leafing Through The Literature”


Tara Burke

Bromodomain inhibitors show potential as a treatment for heart failure

Heart failure is the leading cause of hospitalizations, healthcare expenditures, and death in America today. Heart failure occurs when the heart can no longer pump efficiently enough to accommodate the body’s needs. To date, most heart failure medications target hormonal signaling pathways, a process that initiates at the cell’s outer surface and eventually converges on specific transcription factors that control heart failure pathogenesis. These current treatments have improved patient survival but are far from optimal.  Although it is well established that chromatin and transcriptional changes drive heart failure pathogenesis in cardiomyocytes, there is currently no treatment that directly blocks these detrimental nuclear changes and targets them at their source. A number of transcription factors and epigenetic changes are known to be important in heart failure pathogenesis but, as is the case for numerous other diseases, drug design for transcription factors and specific epigenetic marks proves difficult. Continue reading “BROMODOMAIN-IA!”