Escape from Exhausting Learning and Insufferable Experiences: Sleep May Do the Trick


By Yue Liu

The 2004 film 50 First Dates depicted a romantic story about a hero who had to win the heroine’s love anew every day, because she suffered from a fictional form of anterograde amnesia and lost the previous day’s memories after every single night. In reality, a strikingly similar form of human amnesia was reported in 2010: after a car accident, the patient FL could recall events that had happened before the accident and could remember things that happened earlier the same day, but after a night’s sleep the previous day’s memories were gone. What had happened to those memories during her sleep?

The role of sleep in memory has been described in a “two-stage” model: During the day, we temporarily store a remarkable amount of information in the hippocampus, a brain area named for its structural resemblance to a seahorse. While we sleep, the hippocampus gradually disengages, and memories are handed over to the neocortex for long-term storage. In brief, we consolidate our reminiscences during sleep by transferring them from the hippocampus to the neocortex. If this transfer during sleep is disrupted, as may be the case for patient FL, temporary memories will be lost, whereas permanent memories already stored in the neocortex will remain intact.

In this month’s issue of Nature Neuroscience, Michaël Zugaro’s lab in France provided the first direct evidence for this two-stage model of memory. They observed a fine temporal coupling of oscillating activities between the hippocampus and neocortex in animals during deep sleep. When the animals’ learning periods (20 minutes) were long enough to trigger memory consolidation, the oscillatory coupling between the hippocampus and neocortex during sleep became stronger. However, when the learning periods (3 minutes) were too short, the strength of the hippocampo-cortical coupling did not increase; thus, the memories could not be consolidated. Interestingly, in the latter animals, boosting the hippocampo-cortical dialogue during sleep promoted memory consolidation, which otherwise would not have happened due to the short learning period.

This study offered the first causal link between the hippocampo-cortical dialogue during sleep and memory consolidation. It may also fuel a fantasy: Can we learn much more quickly (in 3 rather than 20 minutes)? Can we study less during the day and receive a special electrical therapy at night that selectively enhances the hippocampo-cortical oscillatory coupling? Someday, an electronic device may be hooked up to a human brain to monitor and record the electrical activity associated with various experiences. We may program the device to tighten the hippocampo-cortical coupling during the night for a specific experience, thereby strengthening that particular memory.

How about erasing a particular memory during sleep? In another 2004 film, Eternal Sunshine of the Spotless Mind, a couple whose relationship did not work out turned to a special procedure that wiped out their memories of each other during sleep, while their romantic episodes replayed. The basis of this fantasy procedure may be the vulnerability of memories while they are replayed during sleep. Human imagination may propel scientists to develop a strategy that makes erasing memories possible in reality. Someday, we may ease some insufferable emotional pain, such as that resulting from posttraumatic stress disorder (PTSD), by disrupting the replay of fear-, stress-, or anxiety-associated memories.

When science and technology make it possible to easily save and delete our memories, we may escape from laborious learning and unpleasant memories just by clicking “save” or “delete” on electronic devices connected to our brains. But remember, our memories sculpt who we are. After this technological intervention, will you still be you?



Beyond Neuromania

By Celine Cammarata

As someone within the field, I have the sense that neuroscience – in some form or another – appears in the media nearly every day. Indeed, the term “neuromania”, originally coined by Raymond Tallis, has come into use to describe both the lofty claims made about the power of neuroscience to answer nearly every question and the general mainstream media frenzy surrounding the field. Scholars have paid increasing attention to this phenomenon, and it is often regarded as a problem, but more recent work suggests that despite the mania, neuroscience is still not widely understood, or even considered, by the public at large. So does all the hype conceal a true lack of public interest?

It’s undeniable that neuroscience is the target of extensive and potentially problematic media attention. In a 2012 Neuron editorial, O’Connor, Rees and Joffe examined the coverage of neuroscience-related topics in six UK newspapers from 2000-2010 and found that not only did the number of articles related to brain research nearly double from the early to the late 2000s, but the topics also changed and the implications of neuroscience research were often exaggerated. Whereas in the past neuroscience was generally reported on in relation to physical pathology, the authors found that in their sample the most common context for discussing neuroscience was that of brain enhancement and protection – topics that are both more widely applicable to a broad audience and that suggest a newly emerging sense of ownership over one’s brain. As O’Connor et al. put it, “although clinical applications retained an important position in our sample, neuroscience was more commonly represented as a domain of knowledge relevant to ‘ordinary’ thought and behavior and immediate social concerns. Brain science has been incorporated into the ordinary conceptual repertoire of the media, influencing public understanding of a broad range of events and phenomena.”

Such issues are also highlighted in Satel and Lilienfeld’s 2013 book Brainwashed: the Seductive Appeal of Mindless Neuroscience, in which the authors explore – and lament – the at times unrestrained application of functional magnetic resonance imaging (fMRI) to answer questions from “Pepsi or Coke?” to “does free will exist?”. The tantalizing ability to see the brain has carried neuroscience into the realms of marketing, politics, law and more, not to mention changing the way we think about more standard brain research topics such as addiction. But, the authors point out, pictures of a brain alone cannot address every level of analysis and are not inherently of greater scientific value than other research methodologies. Tracking the physical footprint of a desire, attitude, or propensity in the brain does not in and of itself tell you why or how these things emerged, nor can it necessarily be used to assign guilt, decide what is a disease and what is not, or determine how people choose their politicians – and yet this is precisely what neuroscience is often touted to do.

Both of these works, and many others, are based on the premise that neuroscience has become markedly pervasive, nearly omnipresent. Fascinatingly, though, the brain craze seems to stop short of making the final leap from the media into public consciousness. To be sure, public interest in neuroscience does exist – someone must be buying the growing number of brain-centered books popping up at Barnes and Noble, right? – but a 2014 paper by O’Connor and Joffe, authors of the Neuron piece, found that the public in general is not nearly as interested in neuroscience as the media frenzy and the emergence of the brain in societal matters might suggest.

To probe how everyday citizens think about neuroscience, the authors conducted open-ended interviews in which a sample of Londoners, chosen to span age, gender and socioeconomic divides, were asked to share what came to mind when they considered research on the brain. The interviews were then analyzed and the themes they touched upon quantified, and the results clearly indicated that neuroscientific research has largely failed to penetrate the mindset of the public at large. Participants consistently indicated that they thought of brain research as a distant enterprise quite removed from them, performed by some unknown “other” (consistently described as a man in a white lab coat). Brain research was widely conflated with neurological medicine and brain surgery, and was almost entirely assumed to focus on medical applications – the concept of basic science on cognition, emotion, or other mental phenomena appeared nearly unheard of.

Consistent with this, although most participants were quick to tag brain research as “interesting,” they also reported that it was not of particular interest to them specifically except in the context of illness. That is, above all the brain was something that might go wrong, and unless it did participants gave it little thought at all. The authors connect this to an earlier concept of “dys-appearance,” the idea that much of the body is inconspicuous and ignored so long as it is healthy, and only attracts attention when there is some kind of dysfunction.

Based on these findings, O’Connor and Joffe concluded that despite the rapid advancement and intrusion of neuroscience into more and more areas of inquiry, research on the brain nonetheless continues to have little relevance to the public’s daily lives. As they put it, “heightened public visibility should not be automatically equated with heightened personal engagement.”

So is neuroscience flooding our popular culture, or simply washing up then falling away like a rolling wave, never really lasting in our overall societal consciousness? For the moment, it appears to be both. Perhaps the concern over “neuromania” need not be so heated, but also perhaps we need to do more to understand how our work can take the extra step to become more relevant to those outside the lab.

A Micro Solution to a Macro Problem?


By Danielle Gerhard

Recent estimates by the National Institute of Mental Health (NIMH) indicate that approximately 25% of American adults will experience a mental illness in a given year. Individuals living with a serious mental illness are more likely to develop a chronic medical condition and to die earlier. In young adults, mental illness results in a higher high school dropout rate. A dearth of effective medications leaves many individuals unable to hold a job, costing America an estimated $193 billion in lost earnings per year. These sobering statistics shed light on the need for better drugs to treat mental illness.


Traditionally, treating a mental illness like depression, anxiety or schizophrenia involves a delicate and perpetually changing combination of drugs that target levels of neurotransmitters in the brain. Neurotransmitters are chemicals produced by the brain and used by cells to communicate with one another. Drugs used to treat mental illness either increase or decrease the release, reuptake or degradation of these chemicals. The current paradigm holds that mental illness results solely from neurotransmitter imbalance; therefore, research has predominantly focused on the specific types of cells that release these chemicals. However, neurons make up only approximately 50% of all cells in the human brain. The other 50% are glial cells, which are responsible for maintaining and protecting the neurons in the brain and body.


One type of glial cell, microglia, are specialized macrophage-like immune cells that migrate into the brain during development and reside there throughout life. Microglia are the brain’s primary immune cells and act as first responders, quickly mounting responses to foreign pathogens and promoting adaptive immune actions. Microglia can adapt to changes in their microenvironment, protracting or retracting their processes to maintain neuronal health and scavenging their surroundings for dead neurons and cellular debris. Moreover, microglia have been shown to be involved in the induction and maintenance of long-term potentiation, an event critical for the synaptic plasticity underlying learning and memory. Only in the past decade or so has this cell type begun to surface as a potential mediator in the development and continuation of mental illness. As a result of decades of neuron-focused experiments, the function of microglia has either been misunderstood or overlooked altogether. Two recently published experiments challenge our conventional understanding of the etiology of mental illness.


A new study published in the January 29th issue of the scientific journal Nature Communications by Dr. Jaime Grutzendler’s team at Yale University highlights a novel role for microglia in Alzheimer’s disease (AD). Late-onset AD is thought to result from the accumulation of the protein β-amyloid (Aβ) into plaques, a process of aggregation driven by reduced Aβ clearance. Because microglia with an activated morphology are found wrapped around areas of high Aβ accumulation, it has been hypothesized that they actually contribute to weakening neuronal projections by releasing cytokines, small neurotoxic proteins that affect cell communication. Aβ can exist as mature, inert fibrillar Aβ but can also revert to an intermediary state, protofibrillar Aβ, which is toxic to neurons.


Dr. Grutzendler’s lab set out to further investigate the role of microglia in Aβ plaque expansion with respect to the different forms of Aβ. Using two-photon imaging and high-resolution confocal microscopy, the team at Yale was able to show that, for the most part, microglia formed tight barriers around Aβ plaques with their processes, but in some instances microglia left plaque “hotspots” exposed. These plaque “hotspots” were associated with greater axonal and neuronal damage.


These findings indicate that microglia generate protective barriers around Aβ plaques that shield neurons from the neurotoxic effects of protofibrillar Aβ. Of note, studies using aged mice revealed that microglia were less effective at surrounding plaques, leading to increased neuronal damage. Microglial regulation thus decreases with age, rendering neurons more vulnerable to environmental insults. This cell type is therefore a likely key mediator of the neuronal death that leads to cognitive decline and emotional disturbances in patients suffering from AD and other neurodegenerative diseases.


Another recently published study, which highlights a novel role for microglia in addiction – a chronic disease that afflicts many individuals with mental illness – comes from Dr. Linda Watkins of the University of Colorado, Boulder. The study, published in the February 3rd issue of the scientific journal Molecular Psychiatry, examines the role of microglia in the rewarding and reinforcing effects of cocaine.


It has long been understood that drugs of abuse cause activation of the dopamine (DA) system in the brain, with increased DA release from the ventral tegmental area (VTA) to the nucleus accumbens (NAc), a brain region important for their rewarding effects. Cocaine achieves this effect by blocking dopamine transporters (DATs) on the cell, resulting in increased levels of synaptic DA and sustained neuronal activity. Therefore, efforts have focused on targeting DATs to prevent the rewarding effects of cocaine and ultimately reduce addiction.


In addition to this established dogma, recent studies have shown that cocaine also activates the brain’s immune system. Microglia express Toll-like receptor 4 (TLR4) and its binding protein MD-2, which are important for recognizing pathogens and triggering the release of pro-inflammatory molecules such as interleukin-1β (IL-1β). Using an animal model of addiction in combination with in silico and in vitro techniques, Dr. Watkins’ team found that cocaine activates the TLR4/MD-2 complex on microglia, resulting in an upregulation of IL-1β mRNA in the VTA and increased release of DA in the NAc. Administration of the selective TLR4 antagonist (+)-naloxone blocked cocaine-induced DA release and the rewarding effects of cocaine in rodent self-administration behavioral models. Overall, the study concludes that TLR4 activation on microglial cells contributes to the rewarding and reinforcing properties of cocaine. Thus, drugs targeting this system could prove successful in treating addiction.


Through these studies and similar reports, it is becoming apparent that mental illness is more than a chemical imbalance in the brain and therefore shouldn’t be studied as such. The two studies highlighted in this article show the diverse role of microglia in the development and maintenance of mental illnesses. A more in-depth understanding of how this cell type interacts with already identified neural systems underlying mental disorders could result in the development of better-tailored drug design.


Resolutions from the Bench


and some science to help you make your own!

Compiled and written by Evelyn Litwinoff and Katherine Peng

Like many of us out there, you may deem a New Year’s resolution a successful one if it lasts through January. To help create your own, the Scizzle staff is offering some tips backed by neuroscience (plus some science-y examples) that may help you to finally follow through in 2015.


Tip #1: Give yourself a pep-talk

Positive self-reflection boosts levels of serotonin, which is essential for proper functioning of the prefrontal cortex. The prefrontal cortex is our impulse control and decision-making center, and it also lends flexibility to habits ingrained in the basal ganglia. For example, subjects lulled into a conversation about their positive qualities prior to reading an informational packet developed a greater intention to quit smoking or eat more healthily.


For inspiration, see this positively worded resolution (“I am!” “I will!”):

[quote]I am going to work on becoming better at networking. I will go to more networking opportunities, and I will not spend all of my time talking only to people that I know.[/quote]

S.S., Industry Research Scientist


Tip #2: Focus on one or few goals

Baumeister et al. have shown over and again that willpower is a limited resource. The effort it takes to complete one goal may render us too exhausted for the next. In fact, willpower depends on glucose levels, and a good dose of glucose helps to counteract willpower depletion (though admittedly not so helpful if your resolution is a diet).

[quote]This year, I will focus on the “existing” rather than “imaginary” problems in science; and I will try to address those by my solutions. The focus of the year will change from “providing solutions” to “identifying the right problems.”[/quote]

Padideh Kamali-Zare, a new science entrepreneur and a Scizzle blogger


Tip #3: Give yourself a distraction

In a well-known series of marshmallow experiments, children were left in a room with one marshmallow and promised more if they could resist eating it. The most successful kids distracted themselves by singing or playing, and as a bonus had better SAT scores later in life.

[quote]For 2015, I will dedicate time every day to step away from the bench/paper I’m reading/experiment I’m designing to take a mental break, even if it’s only 5 or 10 minutes long. And I promise not to go on Facebook during those breaks![/quote]

E.L., Immunologist


Tip #4: Remind yourself that you’re in control

The feeling of being in control is inherently rewarding. Imaging has shown that subjects making choices in which they control the outcome show greater activation in brain structures involved in reward processing.


[quote]For New Year’s, my resolutions would be to 1) Actually finish one of those online statistics classes so I understand the statistical tests I will eventually be using to analyze my data (I keep starting the courses and then getting distracted and stopping about halfway), and 2) Come up with a better system to consolidate, organize and keep track of my paper reading/notes; currently things are spread across notes in PDF files, hard-copy notes, and Google Documents.[/quote]

– Susan Sheng, neuroscientist and a Scizzle blogger


[quote]Read a paper a day (or at least an abstract) and be more efficient.[/quote]

– K.Z., neuroscientist


[quote]My science New Year’s resolution is to learn tissue culture techniques. And also, to be more careful with the ethanol around an open flame so I light fewer things on fire.[/quote]

E.O., Postdoc


Have a wonderful happy new year!!!


The “Big Data” Future of Neuroscience


By John McLaughlin

In the scientific world, the increasingly popular trend towards “big data” has overtaken several disciplines, including many fields in biology. What exactly is “big data?” This buzz phrase usually signifies research with one or more key attributes: tackling problems with the use of large high-throughput data sets, large-scale “big-picture” projects involving collaborations among several labs, and heavy use of informatics and computational tools for data collection and analysis. Along with the big data revolution has come an exploding number of new “omics”: genomics, proteomics, regulomics, metabolomics, connectomics, and many others which promise to expand and integrate our understanding of biological systems.


The field of neuroscience is no exception to this trend, and has the added bonus of capturing the curiosity and enthusiasm of the public. In 2013, the United States’ BRAIN Initiative and the European Union’s Human Brain Project were both announced, each committing hundreds of millions of dollars over the next decade to funding a wide variety of projects, directed toward the ultimate goal of completely mapping the neuronal activity of the human brain. A sizeable portion of the funding will be directed towards informatics and computing projects for analyzing and integrating the collected data. Because grant funding will be distributed among many labs with differing expertise, these projects will be essential for biologists to compare and understand one another’s results.


In a recent “Focus on Big Data” issue, Nature Neuroscience featured editorials exploring some of the unique conceptual and technical challenges facing neuroscience today. For one, scientists seek to understand brain function at multiple levels of organization, from individual synapses up to the activity of whole brain regions, and each level of analysis requires its own set of tools with different spatial and temporal resolutions. For example, measuring the voltage inside single neurons will give us very different insights from an fMRI scan of a large brain region. How will the data acquired using disparate techniques become unified into a holistic understanding of the brain? New technologies have allowed us to observe tighter correlations between neural activity and organismal behavior. Understanding the causes underlying this behavior will require manipulating neuronal function, for example by using optogenetic tools that are now part of the big data toolkit.


Neuroscience has a relatively long history; the brain and nervous system have been studied in many different model systems which greatly range in complexity, from nematodes and fruit flies, to zebrafish, amphibians, mice, and humans. As another commentary points out, big data neuroscience will need to supplement the “vertical” reductionist approaches that have been successfully used to understand neuronal function, by integrating what has been learned across species into a unified account of the brain.


We should also wonder: will there be any negative consequences of the big data revolution? Although the costs of data acquisition and sharing are decreasing, putting the data to good use is still very complicated, and may require full-time computational biologists or software engineers in the lab. Will smaller labs, working at a more modest scale, be able to compete for funds in an academic climate dominated by large consortia? From a conceptual angle, the big data approach is sometimes criticized for not being “hypothesis-driven,” because it places emphasis on data collection rather than addressing smaller, individual questions. Will big data neuroscience help clarify the big-picture questions or end up muddling them?


If recent years are a reliable indicator, the coming decades in neuroscience promise to be very exciting. Hopefully we can continue navigating towards the big picture of the brain without drowning in a sea of data.

High Minded Science


By Alex Berardino


Words are important, especially those spoken by authorities on medical science. That’s why it was so disappointing to see so many loose and unfounded words tossed around in the popular press regarding a paper by Gilman et al. in the Journal of Neuroscience. The paper presents potential differences in the shape, size and density of specific brain areas, the Amygdala and Nucleus Accumbens, between people who use marijuana moderately and those who don’t use it at all. These brain areas have been shown to be important for reward processing and are also implicated in addiction. The paper also showed a weak correlation between the amount of marijuana use and the size of these differences. Notice that I specifically said differences between these groups, and correlation between these measures. The same care and effort was not taken in most reports about this paper. These results were reported as though they showed changes in the size of structures within marijuana users’ brains caused by the amount of marijuana they used, and subsequently that these early changes led to the forgetfulness and lack of focus common to long-term, dependent marijuana users. To be fair to the journalists, the paper and its authors are not exactly clear on which interpretation they subscribe to.


Perhaps the difference between these two interpretations seems unimportant, but it is vital to how we should act on this new information. In a time when our society is experimenting with the legalization of marijuana, it is important that we are accurately informed about the actual dangers and possible benefits of the drug and its effects on the brain. Before we break down the difference in interpretation, let’s first break down what the scientists actually did.


The authors collected structural MRI scans of 20 moderate marijuana users and 20 controls matched for age, sex and educational attainment. Structural MRI scans show a snapshot of the tissue that makes up the brain, which allows you to see the shape and size of the structures that comprise it. You can think of them as a picture of the brain itself, not of its activity. Despite many similarities, everyone’s brain is a little bit different. The sizes vary, the folds and protrusions don’t match up perfectly from person to person, and the borders between one area and another are not perfectly consistent. When we conduct studies like the one cited here, where we want to find variations across people’s brains, we first have to warp and squeeze each individual MRI scan so that it fits, as well as it can, inside a template brain scan. The template itself is generated by averaging across a large set of brain scans, so that what is left over is assumed to be a “representative” brain. That is exactly what the authors of this study did: they warped, or registered, each individual brain onto this same template. After doing so, they could compare any remaining variations across the two groups.
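To make the register-then-average idea concrete, here is a toy sketch in Python. It is my own illustration, not the study’s actual pipeline: each fake one-dimensional “scan” is just a bump at a slightly different position, each is shifted into alignment with a reference, and the aligned scans are averaged to form a “template.”

```python
import numpy as np

rng = np.random.default_rng(0)

def make_brain(shift, n=100):
    """A toy 1-D 'scan': a Gaussian bump whose position varies from person to person."""
    x = np.arange(n)
    return np.exp(-0.5 * ((x - 50 - shift) / 5.0) ** 2)

# Twenty 'subjects', each with the bump in a slightly different place.
scans = [make_brain(s) for s in rng.integers(-10, 11, size=20)]

def register(scan, reference):
    """Shift a scan so it best matches the reference (a crude stand-in for warping)."""
    corr = np.correlate(reference, scan, mode="full")
    lag = corr.argmax() - (len(scan) - 1)
    return np.roll(scan, lag)

# Build a template: register every scan to a reference, then average.
reference = scans[0]
aligned = [register(s, reference) for s in scans]
template = np.mean(aligned, axis=0)

# After registration, every bump lines up with the reference's peak.
print(template.argmax() == reference.argmax())  # → True
```

Real registration warps brains non-rigidly in three dimensions, but the logic is the same: only variation that survives alignment to the common template is treated as a genuine difference between people.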


The authors determined the average size, shape and neuron density of the Amygdala and Nucleus Accumbens of the moderate marijuana users and compared them to those of the controls. The strongest finding of the paper is that the left Nucleus Accumbens is, on average, larger and denser in marijuana users than in controls. The differences are reported as significant, but the standard errors are quite large and the distributions overlap substantially, suggesting that the differences are not entirely reliable.
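A quick simulation, with invented numbers rather than the study’s data, shows how a group difference can be statistically “significant” even while the two distributions overlap heavily:

```python
import numpy as np

rng = np.random.default_rng(7)

# Two made-up groups whose true means differ by half a standard deviation.
controls = rng.normal(loc=0.0, scale=1.0, size=200)
users = rng.normal(loc=0.5, scale=1.0, size=200)

# Welch's t-statistic, computed by hand.
n1, n2 = len(controls), len(users)
t = (users.mean() - controls.mean()) / np.sqrt(
    controls.var(ddof=1) / n1 + users.var(ddof=1) / n2
)

# Fraction of 'users' who fall below the control mean: the distributions overlap a lot.
overlap = np.mean(users < controls.mean())

print(f"t = {t:.2f} (comfortably past the usual ~1.97 cutoff)")
print(f"yet {overlap:.0%} of 'users' sit below the control average")
```

A significant t-statistic tells you the group means probably differ; it does not mean you could look at any one brain and say which group it came from.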


It behooves us to stop and ask ourselves just exactly what it means that this area has larger volume and is more dense with neurons. Does it mean that the area should show increased performance, or decreased? Is the architecture of this area scrambled, or arranged in an orderly fashion like that of the controls? The truthful answer is that we don’t know. There is no good accepted answer for what these differences mean, because the true machinery of the brain is built at a scale that is too fine for an MRI to resolve. That’s not to say that the finding isn’t important, just that we should be cautious in our interpretation of it.


Next, the authors binned the members of the marijuana group into subgroups by how many joints per day each member smoked. They report a correlation between the number of joints smoked and the size of the difference in the left Nucleus Accumbens. This is the tricky part, though, because our algorithms for registering these brains to the same template rely on identifying the very neural landmarks that vary from person to person and that can, by pure bad luck, be obscured or imprecisely located due to the limited resolution of the MRI. To account for any variations caused by errors in registration, or by misidentification of the boundaries of areas, it’s important to look at variations averaged across many subjects. This helps ensure that the only variations that remain are those present across the whole set of brains, not those due to these errors. Comparisons across small subsamples of these groups are generally unreliable.
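A toy simulation (again my own, not from the paper) illustrates why correlations computed on small subgroups are unreliable: even when two measures are completely unrelated, subsamples of ten subjects routinely produce sizeable spurious correlations.

```python
import numpy as np

rng = np.random.default_rng(42)

# Two completely unrelated measures for a large simulated 'population'.
joints_per_day = rng.normal(size=1000)
volume_change = rng.normal(size=1000)  # independent of joints_per_day by construction

def corr_of_subsample(n):
    """Correlation between the two measures in a random subsample of n subjects."""
    idx = rng.choice(1000, size=n, replace=False)
    return np.corrcoef(joints_per_day[idx], volume_change[idx])[0, 1]

small = np.array([abs(corr_of_subsample(10)) for _ in range(2000)])
large = np.array([abs(corr_of_subsample(500)) for _ in range(2000)])

# Tiny subgroups show sizeable 'correlations' by chance alone; big ones don't.
print(f"n=10:  |r| > 0.5 in {np.mean(small > 0.5):.0%} of subsamples")
print(f"n=500: |r| > 0.5 in {np.mean(large > 0.5):.0%} of subsamples")
```

With a true correlation of exactly zero, roughly one in seven ten-subject subsamples still yields |r| above 0.5, which is why binning twenty subjects into even smaller dose subgroups invites noise.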


This is important to keep in mind because this trend, between number of joints and size of difference, is used as justification for suggesting that marijuana is causally changing these areas. The fact remains that this study measured differences between groups. It was not a longitudinal study of one group of people who did not use marijuana and then began to use it. Interpreting these data as showing that marijuana use causes these changes, rather than simply being correlated with differences in this region, is a difficult interpretation to support. We hear it all the time: correlation is not causation. It remains an important point.


Leaving aside critiques of the science, unsubstantiated extrapolations of misinterpretations pepper many of the articles reporting on this paper. One of the authors of the study, Dr. Hans Breiter, was quoted in the Huffington Post saying, “We think [sic] we are seeing here is a very early indication of what becomes a problem later on with prolonged use, things like lack of focus and impaired judgment”. These are bold claims, unsubstantiated by anything presented in the paper. No attempt was made to show any behavioral evidence suggesting a tendency toward lack of focus or impaired judgment between the groups, or for the members of the marijuana group across time, nor were the results presented so impressive that they suggest major rewiring of the brain after exposure to marijuana.


It is important on a basic level to be wary of making claims we can’t support with evidence. Neuroskepticism is a widespread movement these days. Distrust for institutional authority doesn’t come from thin air. It comes from a sense that information is being withheld, or warped to fit an agenda, or handled by people who are incapable of handling it. Sometimes these claims are founded in truth, sometimes in conspiracy and misunderstanding. The former, at least, we can control by maintaining a strict relationship between evidence and statements of authority.

Neuroscience: Should We Be Worried?


By Celine Cammarata

By nature of focusing on that squishy, convoluted organ that the mind calls home, the field of neuroscience is prone to investigating topics, and producing data, that could be considered… personal.  Take defining what makes some of us smarter than others, decoding patterns of activity to reveal thoughts, or examining the mental effects of economic instability, for example, not to mention the controversies of working with non-human primates as is required for much higher-level cognitive research.  We must ask ourselves, then, what are the ethical considerations associated with performing such experiments?  What can we, and what should we, do with the information obtained?  How far is too far?

Such are the questions that the Presidential Commission for the Study of Bioethical Issues hopes to gain insight into, following a request from the President to investigate the ethical considerations of neuroscience research.  To do this, the Commission is turning to the public: in a request released in January, the Commission called on individuals, groups, and organizations to submit comments on the moral issues relating to both the process and results of research in the field.  The Commission, which has used similar approaches on other topics in the past, will then incorporate this commentary into its overall research, toward the final goal of crafting policy advice and determining and encouraging best practices.

And, to be fair, they’re pretty good questions.  Certainly neuroscientists, like other investigators, are generally self-regulating when it comes to ethical considerations.  But the Commission’s push gives us once again the impetus to ask the perennial questions: are there some things that should not be researched?  Are some things better left unknown?  While neither original nor easily answered, these questions bear repeating and consideration.

The Commission specifically requested input on several topics, including whether current codes regulating the use of human subjects are adequate for neuroscience experiments, concerns over potential implications of results and downstream effects on discrimination and concepts of moral responsibility, the proper place of neuroscience in the courtroom, and the potential moral issues associated with communication of neuroscience findings.  This last topic particularly caught my attention, for while clearly an important issue, communication is not always thought of in the light of morality.  Are researchers obligated to share some discoveries?  Are journalists being unethical when they trump up findings?  It’s certainly food for thought.

Those who want to see the committee in action can tune in to the live feed of their public meeting to discuss neuroethics, today and tomorrow in Washington D.C. and online.

Family Fear?



All You Need To Know About Epigenetic Inheritance of Fear Conditioning


Celine Cammarata

A new paper has exploded into the neuroscience world this week with the claim that conditioned fear of an odor can be passed down from father to son – remember Lamarck’s idea of how giraffes pass on their elongated necks?  The research has investigators’ knickers in a twist – some are excited, some are skeptical, and many are both.  So, what’s the scoop on this paper?


The Research

Emory investigators Brian Dias and Kerry Ressler showed intriguing evidence that fear memories can be inherited.  Sexually naive adult male mice were conditioned, using shock paired with odor exposure, to be fearful toward one of two scents, or, as a control, were left in their home cages.  Subsequently, these males were bred to naive females, and lo and behold, the male offspring showed increased sensitivity specifically to the odor that their fathers had been conditioned with, but not to others.  The same held true for the conditioned males’ grandchildren, for pups that were raised by other parents, and for pups born from IVF, strengthening the argument that genetics underlay this inherited fear.  The offspring of odor-conditioned males also showed enlargement of the glomerulus corresponding to the odor sensory neurons that detect the conditioned scent, along with increased numbers of these neurons.  In the conditioned males’ sperm, the gene for the odor receptor of the conditioned scent showed reduced methylation of some regions, suggesting that this hypomethylation might underlie the increased receptor expression in offspring.


The Context

This isn’t the first suggestion of heritable epigenetics.  A Science paper earlier this year revealed how methylation, specifically, is “erased” in precursor cells that will become gametes, but also demonstrated that some epigenetic changes are able to escape erasure.  Nor is this the first indication that parental experiences can shape offspring.  Researchers at the Mount Sinai School of Medicine showed that [mouse] fathers who experience social defeat leading to learned helplessness can pass on depression to their children, and a 2012 paper from U Penn and Mass General argued for heritable resistance to cocaine.


The Buzz

Of course, the paper leaves some crucial questions open, most notably: how do the conditioned males’ sperm cells know what’s happening in the olfactory bulb?  The authors make some guesses, but largely leave this blank.  The investigators also never directly correlate the enlarged glomerulus in offspring with their enhanced sensitivity to their fathers’ conditioned odor.  And while the work is exciting, not everyone is ready to jump on board.  Some are dubious that the methylation seen in sperm would have the kind of effect shown in offspring, or reject the notion of such a high level of malleability in the genome; others point out that it’s difficult to determine whether it’s truly a fearful memory that’s been inherited, or just an increased sensitivity.  Nonetheless, nearly everyone has something to say on the topic – so take your newfound knowledge and get ready to join the conversation!

Why Going Home for Thanksgiving Feels so Good



Celine Cammarata

Ahh, family – sure, they can drive you crazy sometimes, but once you’re all sitting around an amazing turkey spread it’s hard to deny the joy and comfort that loved ones bring to our lives.  And as scientists, we at Scizzle wanted to know why – what’s the biology underlying that great feeling of being surrounded by people you care about?  Watch our first Scizzling Video to find out!


Happy Thanksgiving!

Can Neuroscience Ride the Wave?


Celine Cammarata

The 1990s may have been the “Decade of the Brain”, but the time for neuroscience is now. Between President Obama ushering in the BRAIN project and the seemingly constant stream of books and headlines proclaiming new knowledge of the human mind, it’s easy to see that the field is booming, but what’s going on under the surface?

In a recent editorial, AAAS CEO Alan Leshner gives a compelling, straightforward discussion of the state of neuroscience today and what has changed. As Dr. Leshner points out, widely publicized initiatives in the 1990s actually did little to bring research funding into the field. But now, both American and European efforts are widening the financial pipelines and policy-makers are becoming more deeply involved with research. These changes have come together with rapidly advancing research technology, collaboration across fields, and an increased focus on translational research to create an unprecedented opportunity for neuroscience research.

But riding this wave could be tricky for investigators. Making the most of this confluence of events requires collaboration on a scale rarely seen in neuroscience, and a shift in focus from the “small science” goals of individual labs to “big science” end games that could dramatically change the field.

Most students and researchers in neuroscience today likely recognize that the field is in a remarkable state of growth and opportunity, and yet Leshner is one of few to address the topic head on. Arguably, all of us working in neuroscience have an obligation to try to understand the extraordinary situation we are in, and Leshner’s writing is a bold move toward doing so.