Lethal Weapon: How Many Lethal Mutations Do We Carry?

 

By John McLaughlin

Many human genetic disorders, such as cystic fibrosis and sickle cell anemia, are caused by recessive mutations with a predictable pattern of inheritance. Tracking hereditary disorders such as these is an important part of genetic counseling, for example when planning a family. In fact, there is an online database dedicated to medical genetics, Online Mendelian Inheritance in Man (OMIM), which contains information on most human genetic disorders and their associated phenotypes.

 

The authors of a new paper in Genetics set out to estimate the number of recessive lethal mutations carried in the average human’s genome. The researchers’ rationale for focusing specifically on recessive mutations is their higher potential impact on human health: because recessive deleterious mutations are less likely to be purged by selection, they can be maintained in heterozygotes with little impact on fitness and therefore occur at greater frequency. For the purposes of the analysis, recessive lethal disorders (i.e., those caused by a recessive lethal mutation) were defined by two main criteria: first, an individual homozygous for the causative mutation dies or is effectively sterile before reproductive age; and second, mutant heterozygotes display no disease symptoms.

 

For this study, the researchers had access to an excellent sample population, a religious community known as the Hutterian Brethren. This South Dakotan community of ~1600 individuals is one of three closely related groups that migrated from Europe to North America in the 19th century. Importantly, the community has maintained a detailed genealogical record tracing back to the original 64 founders, which also contains information on individuals affected by genetic disorders since 1950. An additional bonus is that the Hutterites practice a communal lifestyle in which there is no private property; this helps to reduce the impact of confounding socioeconomic factors on the analysis.

 

Four recessive lethal genetic disorders have been identified in the Hutterite pedigree since their more detailed records began: cystic fibrosis, nonsyndromic mental retardation, restrictive dermopathy, and myopathy. To estimate the number of recessive lethal mutations carried by the original founders, the team used both the Hutterite pedigree and a type of computational simulation known as “gene dropping”. In a typical gene dropping simulation, alleles are assigned to a founder population, the Mendelian segregation and inheritance of these alleles across generations is simulated, and the output is compared with the known pedigree. One simplifying assumption made during the analysis is that no de novo lethal mutations had arisen in the population since its founding; therefore, any disorders arising in the pedigree are attributed to mutations carried by the original founder population.
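To make the gene-dropping idea concrete, here is a minimal sketch in Python. This is not the authors' code: the two-founder pedigree, allele labels, and trial count below are hypothetical, and the real analysis ran over the full Hutterite genealogy.

```python
import random

def gene_drop(founder_genotypes, pedigree, trials=20000):
    """Toy gene-dropping simulation.

    founder_genotypes: {id: (allele, allele)} for the founders.
    pedigree: list of (child, mother, father), parents listed before children.
    Returns the fraction of trials in which at least one individual is
    homozygous for the recessive lethal allele 'a'.
    """
    affected_trials = 0
    for _ in range(trials):
        genotypes = dict(founder_genotypes)
        affected = False
        for child, mother, father in pedigree:
            # Mendelian segregation: each parent passes one allele at random
            genotypes[child] = (random.choice(genotypes[mother]),
                                random.choice(genotypes[father]))
            if genotypes[child] == ('a', 'a'):
                affected = True
        if affected:
            affected_trials += 1
    return affected_trials / trials

# Hypothetical example: two carrier founders and their two children.
random.seed(0)
founders = {'F1': ('A', 'a'), 'F2': ('A', 'a')}
children = [('C1', 'F1', 'F2'), ('C2', 'F1', 'F2')]
print(gene_drop(founders, children))  # should land near 1 - (3/4)**2 = 0.4375
```

In the actual study, simulated outputs like this are compared against the observed pedigree to infer how many lethal alleles the 64 founders must have carried.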

 

After combining the results from many thousands of such simulations with the Hutterite pedigree, the authors arrive at a final estimate of roughly one or two recessive lethal mutations carried per human genome (the exact figure is ~0.58). What are the implications of this estimate for human health? Although mating between closely related individuals has long been known to increase the probability that recessive mutations become homozygous in offspring, this study’s mutation estimate allows a more precise risk factor to be calculated. The discussion notes that mating between first cousins, although fairly rare in the United States today, is expected to increase the chance of a recessive lethal disorder in offspring by ~1.8%.
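The ~1.8% figure can be sanity-checked with back-of-the-envelope Mendelian arithmetic. The sketch below assumes, per the study's estimate, that each of a first-cousin couple's two shared grandparents carries ~0.58 recessive lethal mutations; everything else is standard transmission probability.

```python
# Back-of-envelope check of the ~1.8% figure.
# A lethal allele in one shared grandparent ends up homozygous in a
# first-cousin couple's child only if it passes through six independent
# meioses: grandparent -> each of two children (1/2 each), each child ->
# one cousin (1/2 each), each cousin -> the offspring (1/2 each).
lethals_per_genome = 0.58          # study's estimate, per diploid genome
p_homozygous_path = (1 / 2) ** 6   # = 1/64
shared_grandparents = 2

extra_risk = shared_grandparents * lethals_per_genome * p_homozygous_path
print(f"{extra_risk:.1%}")  # -> 1.8%
```

Reassuringly, 2 × 0.58 × 1/64 ≈ 0.018, matching the risk increase quoted in the paper's discussion.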

 

Perhaps the most interesting finding from this paper was the consistency of the predicted lethal mutation load across the genomes of different animal species. The authors compared their estimates for human recessive lethal mutation number to those from previous studies examining this same question in fruit fly and zebrafish genomes, and observed a similar value of one or two mutations per genome. Of course, the many simplifying assumptions made during their analyses should be kept in mind; the estimates are considered tentative and will most likely be followed up with similar future work in other human populations. It will certainly be interesting to see how large-scale studies such as this one will impact human medical genetics in the future.

 

Darwin’s Finches Revisited

 

By John McLaughlin

In 1859, Charles Darwin published the now famous “On the Origin of Species,” containing the first presentation of his theory of the common origin of all life forms and their diversification by means of natural selection. One aim of this theory was to explain the diversity of traits found in nature as a result of the gradual adaptation of populations to their environments. This point is elegantly summarized in the third chapter:

 

[quote style=”boxed”]Owing to this struggle for life, any variation, however slight and from whatever cause proceeding, if it be in any degree profitable to an individual of any species, in its infinitely complex relations to other organic beings and to external nature, will tend to the preservation of that individual, and will generally be inherited by its offspring.[/quote]

 

A large contribution to this theory resulted from his five-year voyage aboard the HMS Beagle, during which he traveled in South and Central America, Africa, and Australia. Darwin collected a huge volume of notes on various plant and animal species, perhaps most famously the finch species inhabiting the Galápagos islands to the west of Ecuador. Although his finch studies were only briefly mentioned in one of his journals, “Darwin’s finches” are now a popular example of microevolution and adaptation for both students and the general public. One striking feature of these finch species is their diversity of beak shape; finches with larger, blunt beaks feed mainly on seeds from the ground while those with longer, thin beaks tend to have a diet of insects or seeds from fruit.

 

A recent study published in Nature examines the evolution of fifteen finch species that Darwin studied during his time in the Galápagos. Although previous work has helped construct phylogenetic trees based on mitochondrial and microsatellite DNA sequences from these same specimens, this is the first study to perform whole genome sequencing of all fifteen species. In addition to a more accurate phylogeny, these genome sequences allowed for new types of analyses to be performed.

 

First, the authors assessed the amount of interspecies hybridization that has taken place among the finches in their recent evolutionary history, and found evidence for both recent and more ancient hybridization between finch species on different islands. The authors then looked for specific genomic regions that could be driving the differences in beak morphology among the different finch species. To perform this analysis, they divided closely related finch species into either “pointed” or “blunt” groups on the basis of beak shape; the genomes from each group were then searched for differentially fixed sequences. Several of the most significant regions uncovered included genes known to be involved in mammalian and bird craniofacial development. The top hit, ALX1, is a homeobox gene with previously established roles in vertebrate cranial development. Interestingly, almost all of the blunt-beaked finches shared a specific ALX1 haplotype (“type B”), while their pointed-beaked counterparts shared a distinct haplotype (“type P”). Based on the distribution of the “P” and “B” haplotypes, the authors estimated that these two groups of finches diverged approximately 900,000 years ago.
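A toy version of that "differentially fixed sequences" search is easy to sketch: scan each position and keep those where one group is fixed for one allele and the other group is fixed for a different allele. This is an illustration only; the string encoding and sequences below are made up, and the actual study worked from whole-genome variant calls across many birds.

```python
def fixed_differences(blunt_haplotypes, pointed_haplotypes):
    """Return positions where the two groups are fixed for different alleles.

    Each argument: list of equal-length haplotype strings, one per bird
    (a toy encoding; real data would come from VCF genotype calls).
    """
    hits = []
    for pos in range(len(blunt_haplotypes[0])):
        blunt_alleles = {h[pos] for h in blunt_haplotypes}
        pointed_alleles = {h[pos] for h in pointed_haplotypes}
        # A fixed difference: each group monomorphic, for different alleles
        if len(blunt_alleles) == 1 == len(pointed_alleles) \
                and blunt_alleles != pointed_alleles:
            hits.append(pos)
    return hits

blunt = ["ACGTA", "ACGTA", "ACGTT"]
pointed = ["TCGAA", "TCGAA", "TCGAT"]
print(fixed_differences(blunt, pointed))  # -> [0, 3]
```

Position 4 varies within both groups, so it is ignored; only positions 0 and 3 separate the groups cleanly, which is the signature the ALX1 scan was looking for.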

 

By applying genome-sequencing technologies, these labs were able to shed new light on a classic story in biology. Until fairly recently, phylogenetic relationships such as those described in the article could only be inferred from external morphology. In a Nature News piece commenting on this study, one of the co-authors remarked on what Darwin would think of the results: “We would have to give him a crash course in genetics, but then he would be delighted. The results are entirely consistent with his ideas.”

Predicting Suicide

 

By Jesica Levingston Mac leod, PhD

 

The play “Suicide Is Forbidden in Spring,” written by Alejandro Casona, describes an organization that helps potential suicide patients end their lives; but in truth the doctors want to prevent that sad end, and they actually save the patients. Their guiding idea, their “leitmotiv,” is that someone truly determined to end their life will simply do it; seeking help is a signal of a will to survive and a call for attention.

As reported by health research organizations worldwide, 1 million suicides are committed per year. That means one death every 40 seconds. According to the CDC, suicide is the 10th leading cause of death in the United States, with an annual rate of around 0.012% of the population; North America has one suicide every 13 minutes. The suicide capital of the world is Greenland, with a rate of 108.1 per 100,000 people, followed by South Korea at 31.7. China, in seventh place, accounts for almost one third of all suicides and, unlike the other countries, is the only one where women have a higher suicide rate than men. Indeed, three years ago came the terrible news that some factories making sought-after Apple iPads and iPhones, such as Foxconn’s, were forcing staff to sign pledges not to commit suicide. In 2013, at least 14 workers at Foxconn factories chose to end the horrendous working and housing conditions by ending their lives.

 

Attempts against one’s own life have been related to mental illness (in almost 50% of cases) and to metabolic disorders. The most common method is firearms, followed by suffocation/hanging and falls. The alarming fact is that suicide rates have increased by 60% over the last 30 years, especially in developed countries. Also consider that for every suicide resulting in death there are between 10 and 40 attempts. But what brings a human being to the edge… and pushes them to jump?

New research has found that the answer may be the loss of correct expression of a single gene. Yes, downregulation of SKA2, the guilty gene, could serve as a biomarker for detecting suicidal behavior. SKA2 stands for spindle and kinetochore associated complex subunit 2; the protein encoded by this gene is part of a microtubule-binding complex that is essential for proper chromosome segregation.

When the researchers examined postmortem brain samples from 3 independent cohorts (around 29 suicide cases and 29 controls per group), they found that SKA2 was expressed at lower levels in the suicide cases than in the controls, and that its expression was negatively associated with DNA methylation. The chemical addition of a methyl group, which is considered an epigenetic modification, can activate or negatively modulate a gene.

I guess you are thinking: these are “Frankenstein” samples, so how can this gene be related to living human beings? Well, apparently the Johns Hopkins researchers asked the same question. To answer it, they collected blood samples from 3 further independent cohorts with suicidal ideation, plus controls (22, 51, and 327 subjects each). In these cohorts too, expression of the SKA2 gene was significantly reduced in those with suicidal ideation. Furthermore, they analyzed levels of salivary cortisol, a hormone implicated in glucocorticoid receptor transactivation and the stress response; the results suggested that SKA2 epigenetic and genetic variation may modulate cortisol levels. The most important finding was that a model built from these data allowed them to predict subjects’ suicidal ideation using blood samples alone: the methylation status of the SKA2 gene correlated with suicide attempts.
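As a rough illustration of how such a blood-based prediction could work, here is a toy logistic risk score combining SKA2 methylation with a stress measure. All of the weights below are hypothetical placeholders for illustration, not the published model's coefficients.

```python
import math

def risk_score(ska2_methylation, stress_score,
               intercept=-3.0, w_meth=4.0, w_stress=0.8):
    """Toy logistic risk score. The intercept and weights are invented
    placeholders, NOT values from the Johns Hopkins model."""
    z = intercept + w_meth * ska2_methylation + w_stress * stress_score
    return 1 / (1 + math.exp(-z))  # probability between 0 and 1

low = risk_score(ska2_methylation=0.1, stress_score=1)
high = risk_score(ska2_methylation=0.7, stress_score=3)
print(low, high)  # higher methylation plus higher stress -> higher score
```

The point is only the shape of the idea: a handful of blood-measurable quantities feed a fitted model that outputs a probability, which clinicians could use for triage and follow-up.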

The great thinker Albert Camus once called attention to this issue when he said: “There is but one truly serious philosophical problem, and that is suicide.” For some risk groups, such as soldiers coming home with trauma after war, the possibility of an attempt on their own lives is a ghost that has taken many lives. A simple blood test could point out which individuals are at risk, so that they can get proper follow-up and treatment that might end up preventing catastrophe. Some high-pressure employers could also implement this analysis to avoid the loss of lives, giving correct care to people who test positive. And even closer to home: would you like to know if you have this tendency printed in your DNA? Or your partner? Or your kids?

While you think about that, let me leave you with a lighter quote: “If I had no sense of humor, I would long ago have committed suicide.” Perhaps you would be surprised to know that the wise man who said this was Mahatma Gandhi, who almost killed himself in hunger strikes while fighting for India’s independence.

 

The Force Beyond Genetics

 

By Katherine Peng

From humans, to Wookiees, to Jar Jar Binks. With over 20 million species in the Star Wars galaxy, one can assume that not even this fictional universe is exempt from the laws of evolution. A conversation about how diverse environments pushed these species to look so strange (or like cute little Ewoks!) through rare mutations would be a discussion of the “hard inheritance” of DNA sequences, but what about soft inheritance? Can the surrounding environment, within one lifespan, change DNA in ways that can be passed on to offspring?

You’re shaking your head thinking DUH… NO, as your thoughts jump back to the image of that giraffe in your Biology 101 textbook. Remember that a once-accepted theory of evolution proposed by Jean-Baptiste Lamarck suggested that acquired traits were inherited (e.g., a giraffe constantly stretching its neck to reach tall trees will give birth to giraffes with longer necks)? We now know that this theory is preposterous. What if Yoda lost his favorite toe in battle and decided to procreate? Would all future Yodas be afflicted with the curse of having only 5 toes?

BUT WAIT. Lamarck might have been onto something after all! Molecular biology has found that environmental factors CAN affect DNA without altering the sequence, and that these changes can be passed on to future generations (though not often beneficially). Welcome to the field of epigenetics!

In all eukaryotic cells, in our galaxy and in the Star Wars galaxy alike, DNA is packaged around histone proteins. This DNA can be methylated and/or the histones can be modified to silence gene expression. Researchers are becoming increasingly interested in how environmental factors produce these epigenetic changes and affect disease patterns. Genome-wide epigenetic studies are most commonly done on identical twins, where differences between individuals must be environmental. While Luke and Leia Skywalker are fraternal twins, sharing ~50% of their genes as regular siblings would, they were separated at birth and raised in different environments. So, let’s take a look at how they became more different than their genetics would have us believe.

STRESS

In 2004, researchers from McGill University discovered that early nurturing by rat mothers removes epigenetic repression of the glucocorticoid receptor gene in rat pups. Consequently, pups that were not well nurtured became more sensitive to stress as adults.

 

Though childhood scenes of Luke and Leia are basically nonexistent, we do know that Leia was adopted into a very loving royal family who could not have children of their own. Luke was adopted by his step-uncle and step-uncle’s wife. His step-uncle did not approve of his adventurous tendencies, which created tension.

VERDICT: Though both twins are at risk of higher stress responses due to Amidala’s likely depression during pregnancy, hints of a less nurturing environment suggest that Luke may be more sensitive to stress as an adult than Leia.

EXERCISE

The Swedes are on a roll in this category. A recent Swedish study shows that 20 minutes of exercise can reverse DNA methylation of genes in muscle that show lowered expression in type 2 diabetes. Another shows that 6 months of exercise changed the methylation pattern of genes in fat cells implicated in the risk of developing obesity or diabetes.

While Luke is working the moisture farm on Tatooine and having adventures, Leia is a palace princess on Alderaan.

VERDICT: While it seems that both Luke and Leia are fit later on, Luke appeared more active as a child and may be at a lower risk for type 2 diabetes. Sorry Leia.

DIET

Bioactive food components (in tea, soybean, etc.) might beneficially reduce DNA hypermethylation of cancer associated genes. On the flip side, folate found in fresh produce is required for DNA methylation, and its deficiency in pregnant moms may cause disease or cancer in children. You are also what your father eats. A mouse study showed that a paternal low-protein diet created upregulation of lipid biosynthesis in offspring.

Unfortunately, there’s no real information out there on the diet of the Skywalkers so…

VERDICT: Inconclusive.

 

So what have we learned here today? Leia needs to ramp up her training, and Luke should control that anxiety before he becomes like his father. But really, epigenetic changes in twins aren’t too different until later in life so I guess it’s all speculation until Disney releases the first installment of the sequel trilogy.

Clones In Space, I Have Placed (Infographic)

 

By Brent Wells, PhD

 

Did Lucasfilm Ltd. direct an explosion in cloning efforts at the first rumors of the storyline for Episode II, Attack of the Clones? Or did scientists’ unstoppable desire to achieve the impossible instruct the fate of the Empire? We may never know. But a happy coincidence and a recently christened holiday have brought you science in pictures, so don’t think about it too much and enjoy.

 

Credit: Brent Wells, PhD
Click on the image and then expand to full screen.

 

If I’ve managed to assemble this infographic even half as well as I imagine George Lucas can assemble a sandwich, you probably command a decent understanding of the history of cloning technology by now. Like the special effects technologies developed at Industrial Light and Magic (ILM), cloning has advanced from its humble, yet provocative beginnings, into something awe-inspiring and useful at once. Unlike ILM special effects, each subsequent step in the maturation of cloning tech brings something more impressive than before.

 

A new study published just last week in the journal Nature describes the creation of a human, diploid, embryonic stem cell population using SCNT (somatic cell nuclear transfer) from an adult with Type 1 Diabetes. This is huge for a number of reasons:

1) They were able to use tissue from an adult, which avoids the ethical concerns surrounding the use of embryonic or fetal tissue.

2) They created diploid cells that can be used in treating human disease. Similar embryonic stem cells were generated in 2011, but those were triploid, meaning they contained three sets of chromosomes instead of the normal two found in humans, making them incompatible and therefore unusable for disease treatment.

3) The stem cells, cloned from an individual with Type 1 Diabetes, can give rise to the very cells lost to the disease, and since they are clones of the affected individual, the patient’s body will not reject a treatment that introduces new cells to replace those lost.

 

This advancement in cloning technology is a significant step forward in creating stem cell banks that can actually be used in the study and treatment of disease on a case-by-case basis and will extend well beyond Diabetes. It also furthers efforts in the growth of complete replacement organs for those in need of matching donors – after all, there’s no better match for you than you.

 

If you want to learn more about cloning, *waves hand in front of face, uses weird voice inflection* You want to learn more about cloning. You’re going to look into the following resources. I am not the droid you’ve been looking for.

 

Wikipedia, of course

The Basic Science Partnership at Harvard Medical School

The Animal Biotechnology Resource at UCDavis

The Genetic Science Learning Center at the University of Utah Health Sciences

Or just Google it…

 

May the 4th be with you.

Clone wars – GMOs: Jedis or Siths?

 

By Jesica Levingston Mac leod, PhD

In any molecular biology lab, cloning is a daily procedure; getting those clones outside of the lab is the huge issue. Genetically modified organisms, or GMOs, carry specific gene alterations and are then cloned to obtain a larger number of identical organisms. Here, I would like to compare the two faces of this technology and its impact on nature.

 

GMOs as Jedis, the good use of the force:

Since this technology was introduced to the field, pesticide spraying has been reduced by 499 million kg (-8.7%), and this has decreased the environmental impact associated with herbicide and insecticide use on crops by 18.6% (as measured by the Environmental Impact Quotient, EIQ). Furthermore, a significant reduction in greenhouse gas emissions from this cropping area has been reported, which in 2012 was equivalent to removing 11.88 million cars from the streets.

Economically, GMOs bring a big advantage to farmers, allowing them to compete while generating more product at lower expense.

GMOs are helping to supply resources to an ever-growing world population. They could therefore be a solution to the doomsday prediction made by the economist Malthus more than 200 years ago: that we would run out of resources and be unable to feed an exponentially rising population. At the beginning of 2000, Science published a breakthrough piece of research: golden rice. This special GMO carries three added beta-carotene biosynthesis genes, which add nutritional value to the rice because beta-carotene is a precursor of vitamin A. The project was led by Drs. Ingo Potrykus (ETH Zurich) and Peter Beyer (University of Freiburg), who aimed to introduce the enriched rice into African, Latin American, and Asian markets, where deficiency in this vitamin causes terrible health problems. At the time of publication, golden rice was considered a significant breakthrough in biotechnology, as the researchers had engineered an entire biosynthetic pathway. Five years later, a new version of golden rice producing up to 23 times more beta-carotene than the original was announced.

GMOs on the dark side of the force:

When introducing an artificial organism into nature, we can only try to predict how it will impact the environment. In some cases the ecological effects of this artificial selection have been predicted to be catastrophic; for example, for the soy harvest in Argentina, excessive use of a single GMO leaves the soil depleted of nutrients, effectively “dead” and unable to support any other crop. This exhaustion of the fields may have a negative impact in the future.

According to the Center for Food Safety, GMO products make up about 90 percent of cash crops like cotton, corn, and soybeans nationwide. As Monsanto holds 80% and 90% of American corn and soybeans, respectively, along with the licenses to them, concerns about monopoly have begun to rise. Neither farmers nor scientists are allowed to do research on Monsanto’s GMOs without legal permission. This prevents independent safety testing, and some scientists have raised the issue all the way to the US Supreme Court.

Furthermore, farmers must pay ever-increasing prices for seeds that they can buy from only a few companies. Indeed, between 1995 and 2011, the per-acre cost of corn and soybean seed increased 259% and 325%, respectively (US Department of Agriculture). Under this strict license policy, an increasing number of small farmers have gone bankrupt as a consequence of the accidental presence of GMOs on their fields (through wind dispersal, spilled seed, or cross-contamination). It is not surprising that, in this situation, the idea of switching seeds and buying a non-GMO variety scares farmers.

 

The labeling topic is even more sensitive. The Food and Drug Administration favors voluntary labeling and says GMO products must meet the same safety standards as other foods. On the other hand, the Center for Food Safety supports mandatory labeling. GMO producers prefer to avoid labeling, as it brings unwanted attention and bad publicity to the product. Pro-labeling organizations claim that consumers have the right to know exactly what they are eating. My favorite comment in this regard was made by Gene Hall, a spokesman for the Texas Farm Bureau: “We don’t need to label something that is absolutely safe.”

Like Anakin Skywalker in his early years trying to decide which side of the Force to join, GMOs are a great technology in development, with both advantages and disadvantages; but without correct guidance, like Yoda’s, this technology could end up joining the dark side of the Force.

Building Better Beer, One Nucleotide at a Time

 

By Brent Wells, PhD

An international team of scientists headed by a group at the New York University Medical Center has created the world’s first synthetic eukaryotic chromosome, meaning they have literally engineered life from its smallest units. Intrigued? You should be.

 

This work, recently published in the journal Science, was achieved in the common budding yeast, Saccharomyces cerevisiae; the very same little fellow that makes your bread rise and your beer ferment.

 

A chromosome, in case you were wondering, is a continuous grouping of a subset of your genes. All 20,000+ human genes are spread across 23 distinct chromosomes while yeast genes, about 6,000 of them, are spread across 16. The goal of this study was to choose one of those naturally occurring yeast chromosomes and replace it with one synthesized, from scratch, in the lab.

 

How do you synthesize a chromosome? The answer is bit-by-bit-by-bit and with plenty of cheap help.

 

The group started with small, overlapping oligonucleotides, which are very short pieces of DNA – about 70 base pairs in this case. Next, you need an army of undergrads trying to earn an A grade in their Building-A-Genome class, and whose parents are unknowingly paying for your research, to stitch all of these small pieces together into increasingly larger fragments. This is what I imagine building a weave for Rapunzel would be like. Final assembly is completed in the yeast cell where the natural chromosome is replaced, one chunk at a time, with corresponding pieces of synthetic chromosome via a process called homologous recombination.
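The stitching logic itself, finding where one fragment's end overlaps the next fragment's start, can be sketched in a few lines. This is a cartoon of the assembly, assuming the fragments arrive in order and each overlaps its neighbor; the sequences are invented, and real oligo assembly chemistry and error correction are far messier.

```python
def stitch(fragments, min_overlap=3):
    """Greedily merge DNA fragments whose ends overlap.

    Assumes fragments are given in assembly order, each overlapping
    the previous one -- a simplification of real oligo assembly.
    """
    assembly = fragments[0]
    for frag in fragments[1:]:
        # find the longest suffix of `assembly` that prefixes `frag`
        for k in range(min(len(assembly), len(frag)), min_overlap - 1, -1):
            if assembly.endswith(frag[:k]):
                assembly += frag[k:]
                break
        else:
            raise ValueError("no overlap found")
    return assembly

oligos = ["ATGCGT", "CGTTAA", "TAAGGC"]
print(stitch(oligos))  # -> ATGCGTTAAGGC
```

Scale that up from three 6-mers to thousands of ~70-mers and you have, conceptually, what that army of undergrads was doing.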

 

This was not, however, a Gus Van Sant-Psycho-shot-for-shot remake of the original. The chromosome lost a little weight in the process, trimming down to 272,817 base pairs from 316,617. Remarkably, the synthetic yeast were just as viable as the naturally occurring strain, suggesting that there’s a lot of useless DNA floating around in our cells. Among the discarded bits were regions of non-coding DNA called introns as well as transposons. Transposons are DNA sequences that can actually jump around the genome carrying other pieces of DNA with them and which are thought to be a major driving force in evolution.

 

Speaking of evolution, the group also engineered in sequences that would allow them to randomly alter the genome by taking out non-essential genes in a process they call SCRaMbLE-ing. The removal of these genes allows the team to look at the effects of variable-scale genome size reduction on viability. In other words, they can induce a genome ‘scramble’ in millions of yeast cells at once, which will remove different subsets of genes in each, and look at which genes are gone in the ones that survive. This mimics genetic deletion events that can happen naturally during evolution and will help us understand how evolution may occur and the pressures that can lead to the traits it eventually fixes. You can also really speed up a notoriously slow process.
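A toy simulation captures the SCRaMbLE idea: delete each recombination-flanked segment at random in many cells, discard the "cells" that lost an essential gene, and ask which genes every survivor still carries. The gene names, deletion probability, and essential set below are all invented for illustration, not taken from the paper.

```python
import random

def scramble(genes, essential, keep_prob=0.7):
    """One SCRaMbLE-like event: each segment independently survives
    with probability keep_prob; losing any essential gene is lethal.
    keep_prob is an arbitrary illustration value."""
    survivors = [g for g in genes if random.random() < keep_prob]
    return survivors if essential <= set(survivors) else None  # None = dead

random.seed(42)
genome = [f"gene{i}" for i in range(10)]
essential = {"gene0", "gene5"}

# Scramble 1000 cells and keep only the viable ones
viable = [s for s in (scramble(genome, essential) for _ in range(1000)) if s]
always_kept = set.intersection(*(set(v) for v in viable))
print(sorted(always_kept))  # with enough cells, only the essential genes remain
```

Inverting the question, the genes that go missing in viable survivors are the dispensable ones, which is exactly the kind of readout the SCRaMbLE system gives in millions of yeast cells at once.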

 

This is not the first time a synthetic genome has been attempted, or completed. Groups have had success with viral and bacterial genomes in the past, but this is the first instance of something on this scale. Other groups are currently working on more of the 16 yeast chromosomes with the goal of eventually creating a completely synthetic yeast cell.

 

Beyond the potential to understand mechanisms of evolution, and just seeing whether we can actually do it, the generation of synthetic organisms has far-reaching commercial potential. Synthetic yeast could be used to generate more efficient biofuels, rare medicines for malaria and hepatitis, and more. And it would be cheap, at least in principle; did I mention they are calling these ‘designer’ chromosomes? I can only assume the synthetic strain was code-named Fendi or Prada.

 

So, should you be worried about ingesting some synthetic yeast during your next trip to Dunkin’ Donuts or Subway? Hardly. Scientists have engineered fail-safes into the synthetic chromosomes that make it impossible for the yeast to live outside of special conditions provided only in the lab. Of course, they did the same thing on Isla Nublar in Jurassic Park, and anyone who’s seen Jurassic Park II knows that Jeff Goldblum nailed it when he declared ‘Life finds a way’. But to those alarmist naysayers asking ‘What about the potential for environmental catastrophe?’, let me offer this response: ‘What about the potential for better beer!’

Can a Mutation Protect You From Diabetes?

 

By Evelyn Litwinoff

For the first time in diabetes research history, researchers have found mutations in a gene that are associated with a 65% decrease in the risk of developing type 2 diabetes (T2D).  What’s even more astounding is that only one copy of the gene has to be mutated to show this protection.  The gene of interest is SLC30A8, which encodes a zinc transporter in pancreatic islet cells.  (A quick brush-up on your cellular anatomy: pancreatic islet cells produce insulin, which the body uses to take up glucose into cells.  Zinc plays an important role in the uptake, secretion, and structure of insulin.)  This study found not 1, not 2, but 12(!) different loss-of-function mutations, all in SLC30A8 and all predicted to result in a shortened protein, that associate with protection from T2D.

 

Most of this study is based upon sequencing genes that were previously associated with a risk of developing T2D.  Overall, the authors looked at about 150,000 individuals from various ethnic populations in order to obtain statistical significance for their associations.  Their results are surprising since previous studies had linked mutations in SLC30A8 with an increased risk of T2D.

 

However, this study does not address how a decrease in the function of the zinc transporter, named ZnT8, could protect against a disease state.  The authors did conduct one mechanistic-ish experiment, but only to see whether the mutations in ZnT8 actually affect the activity of the protein.  To this end, they overexpressed 4 different mutated versions of ZnT8 in HeLa cells and saw a decrease in protein levels for 2 of the 4 versions.  Furthermore, they showed that increased protein degradation could be part of the reason for the observed decrease in the amount of protein.  Their main conclusion from these cell experiments is that some of the mutations in ZnT8 result in an unstable protein, which helps us understand how the zinc transporter stops working, but it does not explain why the dysfunctional protein protects against T2D.  Hopefully, this paper will spark others to investigate a mechanism for the associated protection.

 

Meanwhile, Pfizer and Amgen are beginning to develop drugs that mimic these mutations to see if they can replicate the protection.  Although a new diabetes drug based on this study could be 10-20 years down the road, the finding has already made a big splash in the diabetes research community.

Can miRNA Measure Your Fitness?

 

By Maggie Kuo

We are constantly encouraged, or nagged, to exercise to promote good health.  But how much exercise is enough?  We finally might be able to precisely measure this threshold.

 

The research group led by Thomas Thum at Hannover Medical School in Hannover, Germany, has been studying the use of microRNAs (miRs), noncoding RNAs that repress gene expression, to treat cardiovascular diseases.  As a side project, the group wondered whether miRs could serve as an indicator of physical fitness.  The researchers found that a single session of endurance exercise increased the amount of skeletal muscle-associated miRs in the blood.  Moreover, the amount of these miRs correlated with fitness level.

 

Physical endurance is generally measured by maximum oxygen uptake (VO2max), the amount of oxygen used by the skeletal muscles during maximum-effort exercise.  VO2max increases with fitness level.  How much an exercise regimen will raise VO2max depends greatly on the individual's genes, so the outcomes in one person cannot necessarily be applied to another.  As a result, researchers in the exercise physiology field have been exploring the effects of exercise directly on gene expression.

 

miRs can be specific to the organ, like the heart, or to the biological process, like inflammation.  miRs are also stable outside of the cell and can therefore be detected in the blood plasma.  Several studies reported changes in plasma miR levels after exercise.  The miRs corresponded with skeletal and heart muscle adaptations and the levels were influenced by the intensity, type, and duration of the exercise protocol.

 

Thum and colleagues measured the amount of heart and skeletal muscle-associated miRs in blood samples from experienced marathon runners.  Blood was drawn two days before the race, immediately after the runner finished the course, and 24 hours later.  The researchers also recorded VO2max and anaerobic lactate threshold (VIAS), another physical fitness indicator, for each runner.  They found that the skeletal muscle miRs, miR-1, -133a, and -206, were higher shortly after the run and remained elevated 24 hours after the race.  The heart muscle miRs, miR-208b and -499, were also higher immediately after the run but returned to pre-race levels after 24 hours.

What intrigued the researchers the most was that the levels of all three skeletal muscle miRs trended with both VO2max and VIAS.  Runners with greater VO2max and VIAS, meaning they were more fit, tended to have higher amounts of skeletal muscle miRs in their blood plasma.  The researchers then compared the gene targets of the five miRs they measured against a list of 56 genes associated with physical endurance.  The heart miRs modulated only one gene on the list.  However, the skeletal muscle miR-1 regulated 8 genes and -206 regulated 10 genes, suggesting that enhanced fitness was due to increased regulation of physical endurance genes from the higher miR levels.  The researchers concluded that these skeletal muscle miRs could potentially be used as biomarkers of physical fitness.
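The "trended with VO2max" claim boils down to a correlation coefficient across runners.  Here is a minimal sketch of that computation with entirely invented numbers (the study's raw data are not reproduced here); the hypothetical miR-1 levels are in arbitrary relative units.

```python
# Sketch of the kind of correlation the authors report: plasma
# skeletal-muscle miR level vs. VO2max across runners. All values
# are invented for illustration only.
import statistics

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Invented data for eight hypothetical runners: relative plasma miR-1
# level and VO2max (ml/kg/min); fitter runners carry more miR-1.
mir1   = [1.1, 1.4, 1.3, 1.8, 2.0, 2.2, 2.6, 2.9]
vo2max = [48, 50, 52, 55, 57, 60, 63, 66]
print(round(pearson_r(mir1, vo2max), 2))
```

A coefficient near +1 means the miR level rises almost in lockstep with fitness; real biological data would of course be noisier than this tidy toy example.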

 

More work needs to be done to confirm the accuracy of miRs as biomarkers, including measuring the amount of the miRs directly in the skeletal muscle, studying how the miR levels change with different exercise protocols, and determining if other miRs are influenced by endurance.  Nevertheless, this study provides a potential new metric to evaluate physical fitness.  This metric could be used to establish more precise recommendations on exercise routines to promote better health.

Mitochondrial Clues for a Long Life

 

By Thalyana Smith-Vikos

Biological clocks that can predict an individual’s lifespan more accurately than chronological time alone have been proposed in multiple molecular, cellular and genetic contexts, but a single clock has yet to be identified. Mitochondria, however, have been identified as promising candidates for a biological aging clock in many organisms. Dong and colleagues report that mitochondrial function in Caenorhabditis elegans young adults provides a highly accurate predictive measure of eventual longevity of individual nematodes.

By visualizing quantal mitochondrial flashes, or mitoflashes, in vivo, the authors were able to show that this optical readout was specific to free-radical production and metabolic rate at the single-mitochondrion level. These mitoflashes exhibited a strong correlation with C. elegans aging and had similar attributes in a mammalian system. Mitoflash measurements in pharyngeal muscles peaked during active reproduction and when the first nematodes began dying off. The mitoflash activity on day 3 of adulthood during active reproduction explained up to 59% of lifespan variation. Day 3 mitoflash frequency was negatively correlated with future lifespan of individual C. elegans, and this negative correlation persisted in the face of various genetic and environmental alterations that extend or shorten lifespan. The authors further showed that day 3 mitoflash frequency was due to glyoxylate cycle activity, and they propose that mitochondrial activity not only predicts but also determines lifespan, as the lifespan of long-lived insulin receptor mutants was at least partially explained by decreased mitochondrial production of superoxide.
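The statement that day-3 mitoflash activity "explained up to 59% of lifespan variation" refers to R², the fraction of variance captured by a regression of lifespan on mitoflash frequency.  The sketch below shows that computation on invented data mimicking the reported negative correlation; because the toy data are much tidier than real worms, the R² it produces is higher than the study's 59%.

```python
# Sketch of "variance explained": fit lifespan ~ day-3 mitoflash
# frequency by least squares and compute R^2. All numbers are
# invented for illustration, not taken from the paper.
import statistics

def r_squared(xs, ys):
    """R^2 of a simple least-squares fit of ys on xs."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return 1 - ss_res / ss_tot

# Invented data for ten hypothetical worms: day-3 mitoflash frequency
# (flashes/hour) and eventual lifespan (days); more flashes, shorter life.
flashes  = [2, 3, 3, 4, 5, 5, 6, 7, 8, 9]
lifespan = [26, 25, 23, 24, 21, 22, 20, 18, 17, 15]
print(round(r_squared(flashes, lifespan), 2))  # 0.96
```

In the study's terms, an R² of 0.59 would mean that knowing a worm's day-3 mitoflash frequency alone accounts for 59% of the worm-to-worm differences in how long they live.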

These findings indicate that mitochondria can function as a biological clock that predicts lifespan of individual C. elegans in various contexts. Importantly, this clock has already begun ticking very early in life, as mitochondrial flashes in early adulthood during active reproduction have been shown to be most potent predictors of future longevity.