Environmental enrichment leads to leaner mice!

A recent paper published in Cell Metabolism has shown just how much our environment can affect our health. Environmental enrichment (EE) refers to living in a complex environment with physical and social stimulation, and is most often studied in laboratory rodents, where these factors can be controlled. The authors of this paper had previously found that mice living in EE showed increased neurogenesis (the birth of new neurons), enhanced learning and memory, and resistance to brain insults, but had also noticed that mice living in EE appeared leaner than those living in standard housing. This observation led the authors to further investigate the fat profile of these animals.

A little background. There are two types of fat, or adipose tissue, in mammals: white adipose tissue (WAT) and brown adipose tissue (BAT). WAT is likely what you think of when you think of fat. It stores energy, insulates, and cushions. BAT releases energy as heat and is crucial in body temperature regulation. For this reason it is abundant in newborns and in hibernating mammals. BAT cells appear brown due to the iron present in their numerous mitochondria. Just to confuse things a little more, there is a third type of adipocyte, called brown-in-white cells (or brite cells), which are thermogenically like brown adipocytes (they dissipate heat) but appear in WAT and are developmentally and molecularly different from brown adipocytes. The important thing about brite cells is that they appear to be associated with resistance to obesity and metabolic diseases.

So, on with this study. As mentioned, mice were housed in either standard housing (group housed in regular cages) or EE, which consisted of larger cages, running wheels, and regularly changed toys and mazes. All mice ate the same kind of food, to which they had free access. After 4 weeks, EE mice were found to have lower body weight as well as lower WAT mass. To determine whether the effects of EE were simply due to more exercise, a third group was introduced. This group had access to a running wheel but none of the other stimuli present in the EE group. While adiposity was decreased in the wheel-running group, it was not decreased to the same extent as in the EE group. This difference was not due to increased motor activity in the EE group, as these animals actually ran less total distance than the wheel-runners. Food intake measurements also ruled out appetite suppression as a reason for the loss of adiposity, as EE mice actually showed increased food intake.

To further investigate the effects of EE, the researchers looked at changes in gene expression in both WAT and BAT. While not many changes were found in BAT, 15 of 19 genes examined in WAT were altered by EE. Most interesting was the upregulation of Prdm16, which serves as a switch in the formation of brown adipocytes. Alongside this upregulation was increased induction of several genes typically functional in BAT. So here we have many indicators that a BAT-like phenotype is being induced in the WAT of animals exposed to EE. The authors propose that “EE induced a ‘browning’ molecular signature in white fat suggesting that an individual’s interaction with its immediate environment could switch a white fat energy storage phenotype to a brown fat-like energy expenditure phenotype and regulate adiposity.” To further test this, mice assigned to control or EE housing were fed a high-fat diet. After 4 weeks on this diet, EE mice gained significantly less weight and had increased body temperature, with no change in food intake. This suggested that increased energy expenditure was responsible for the resistance to obesity, as well as to its associated morbidities (hyperinsulinemia, hyperleptinemia, hyperglycemia, and dyslipidemia). And again, EE mice fed the high-fat diet also showed the “browning” molecular signature described above.
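
To make the energy-balance logic concrete, here is a minimal sketch in Python. The numbers are entirely my own invention (not data from the paper); the point is just that if intake rises but expenditure rises even more, the surplus available to be stored as fat shrinks.

```python
# Toy energy-balance sketch with invented numbers (not data from the paper):
# fat gain tracks the surplus of energy intake over energy expenditure.

def fat_gain_g(intake_kcal, expenditure_kcal, kcal_per_g_fat=9.4):
    """Grams of fat stored from an energy surplus (assumed conversion factor)."""
    surplus = intake_kcal - expenditure_kcal
    return max(surplus, 0.0) / kcal_per_g_fat

# Hypothetical weekly numbers: EE mice eat MORE but expend even more (as heat),
# so they store less fat despite the higher intake.
control = fat_gain_g(intake_kcal=100.0, expenditure_kcal=85.0)
enriched = fat_gain_g(intake_kcal=110.0, expenditure_kcal=105.0)
print(f"control: {control:.1f} g fat stored, EE: {enriched:.1f} g fat stored")
```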

Other findings included stronger WAT “browning” with longer exposure to EE (3 months), the involvement of the sympathetic nervous system in the changes in gene expression occurring in EE, and increased expression of the neurotrophin BDNF (which is involved in neuronal health and survival, brain plasticity, protection against insults, learning and memory, and the list goes on and on) in the hypothalamus. It is still not clear exactly what is occurring in this phenotypic switch within WAT: either transdifferentiation of white adipocytes to brown, or activation of the brite cells. In any case, the authors propose that the complex environmental stimuli experienced in EE induce BDNF in the hypothalamus, which then leads to increased sympathetic activation of WAT. A functional transformation from WAT to BAT follows, leading to the release of energy as heat, with subsequent benefits including decreased adiposity and resistance to obesity. The authors believe that with further investigation into the origin of these brown-like cells induced by EE, potential treatments for obesity could be developed.

Overall this was a really well-carried-out, comprehensive study which highlights how drastically the physical and social environment can affect our health. Would these findings hold true in humans? (check out this blog!). Obviously our environment is much more complex, with much greater physical and social stimuli. However, that brings negative stimuli with it as well, such as stress. While certain stressors can be a good thing (e.g. exercise is known to increase BDNF, as does caloric restriction), too much stress is definitely detrimental. So again, the take-home message would be: eat healthy, exercise, enjoy a social life, but decrease the negative stress.

Reference: Cao L, et al. White to brown fat phenotypic switch induced by genetic and environmental activation of a hypothalamic-adipocyte axis. Cell Metabolism 2011;14:324-338.

Guest Post on Scientific American!

Check out my guest post on the SciAm guest blog! I was asked to write about the nutritional differences between organic and conventionally-grown foods. Let me know what you think! http://blogs.scientificamerican.com/guest-blog/2011/08/11/nutritional-differences-in-organic-vs-conventional-foods-and-the-winner-is/

Folic Acid & Vitamin D: Deficiency as a Risk Factor for Autism?

Autism is one of three disorders that fall under the umbrella of Autism Spectrum Disorders (ASD). It is characterized by impaired communication and social skills and repetitive/restricted behaviours, all occurring prior to 3 years of age. The other two disorders falling under the ASD umbrella are Asperger’s syndrome, which lacks the cognitive impairments present in autism, and Pervasive Developmental Disorder-Not Otherwise Specified (PDD-NOS), which is diagnosed when not all of the characteristics of either autism or Asperger’s are present.

Autism rates worldwide have steadily increased since the 1980s, although there is controversy as to whether this represents an actual increase in the occurrence of the disorder or improved diagnostics. Nevertheless, the cause of autism remains unclear and is likely due to a multitude of genetic and environmental factors that differ between individuals. First, let’s address the elephant in the room. Because the symptoms of autism begin to present at approximately the same time as the child’s MMR vaccination, blame turned to the vaccine. Unfortunately, this theory was bolstered by Andrew Wakefield, who published a paper in The Lancet in 1998 suggesting that there was a link between the MMR vaccine and autism. This paper has since been retracted, and Wakefield’s work has been called by some “an elaborate fraud”, involving misreporting of data, unethical treatment of subjects (children in this case), and conflicts of interest. What’s sad is that all of this caused many parents to stop vaccinating their children, leading to a resurgence in the occurrence of measles, resulting in preventable deaths. ANYWAYS, this could be a whole post on its own, so I’ll move on.

It is well known that factors in our environment (be it pollution, nutritional intake, physical activity, etc.) can alter the normal functions of some of our genes, thereby producing phenotypic differences (i.e. traits that we can see, such as morphology, development, behavior, etc.). Nutritional factors play a huge role in the normal functioning of our genes, and therefore deficiencies or excesses can cause abnormal gene products to be produced. There is some indirect evidence that nutritional factors may play a role in the development of autism. The potential roles of two of these factors, folic acid and vitamin D, were the subject of a review paper (cited below), which also reviewed genetic abnormalities, the role of the immune system, and heavy metal effects.

Folic acid is a B vitamin important for many functions, including DNA synthesis and repair and the production of red blood cells. Perhaps its most well-known role is preventing neural tube defects (NTDs, specifically spina bifida) in the developing embryo. Approximately 20 years ago, health agencies began advising women of child-bearing age to take a folic acid supplement, and this has resulted in a 70% decrease in the incidence of NTDs as well as a decrease in the severity of defects when they do occur.

So, what does folic acid have to do with autism? Due to a genetic polymorphism (a difference in DNA sequence between individuals), autistic individuals tend to show 50% decreased activity of a certain enzyme (MTHFR) that is required to metabolize folate. So even if these children have a sufficient intake of folate in their diets, their ability to metabolize it is only 50% of normal, and therefore deficiency may occur. In a strange way, the push for women to supplement with folate pre-natally may have contributed to the increase in autism rates that seemed to begin around the same time. Without maternal folate supplementation, fetuses carrying the abnormal MTHFR enzyme would have been more likely to be miscarried; with supplementation, more children with the decreased ability to metabolize folate survived, which may be linked to the increased occurrence of autism. So the suggestion is to supplement children with folate to ensure that those with the abnormal MTHFR enzyme get enough to make up for their decreased ability to metabolize it.

Vitamin D is another factor that may play a role in the development of autism in some individuals. Vitamin D plays many roles in the body, such as the growth and remodeling of bone, neuromuscular function, decreasing inflammation, and affecting genes involved in cell survival and death. Vitamin D is also important for neural development as a regulator of cell division and an up-regulator of neurotrophins, which are crucial to the development, survival, and function of neurons. The majority, 90% actually, of our vitamin D supply comes from sun exposure rather than diet. UVB light causes the transformation of a precursor molecule in our skin into an inactive form of vitamin D, which is then further processed by the liver and kidneys into the active form (also called calcitriol).

Due to rising skin cancer rates in the late 1980s, sun avoidance was being recommended, and it was around this time that autism rates began to increase. There is a lot of indirect evidence suggesting that a decrease in vitamin D production may be linked to the development of autism. Estrogen can increase vitamin D metabolism into the active form, while testosterone cannot, possibly explaining the greater prevalence of autism in boys compared to girls (4:1). Autism rates are also higher in African Americans vs. Caucasians; darker skin requires a greater amount of UVB rays to produce sufficient amounts of vitamin D. Shockingly, one study carried out in the US found that only 37% of white women and 4% of black women had sufficient amounts of vitamin D during pregnancy. While possible mechanisms of vitamin D deficiency-induced autism have not yet been shown, it is likely to again be due to a genetic polymorphism present in certain individuals. A candidate gene is CYP27B1, which encodes an enzyme required for the transformation of inactive vitamin D into its active form. A polymorphism in this gene may lead to vitamin D deficiency; however, the role this may have in the development of autism has not yet been examined.

So, in summary, both folic acid and vitamin D deficiency may contribute to the increase in autism seen over the past 20 years or so. While potential mechanisms have not yet been verified, it is likely that some autistic individuals carry genetic polymorphisms resulting in decreased activity of enzymes involved in the normal functioning of these important vitamins. While the indirect evidence is very interesting and indeed warrants further investigation, it is important to note that as of yet there is no direct evidence for these theories. Even if one or both of these theories are verified, they will only explain a proportion of cases, since there are many autistic children who do not show, and never have shown, deficiencies of either of these vitamins. It has proven difficult to determine a single mechanism, gene, or environmental factor responsible for the development of autism. There is therefore likely a variety of different factors that, expressed alone or in combination, in the “right” embryonic and genetic environment, lead to children who are at higher risk of developing neurological disorders.


Citation: Currenti SA. Understanding and Determining the Etiology of Autism. Cell Mol Neurobiol (2010) 30:161–171. DOI 10.1007/s10571-009-9453-8

Hidden differences in the brains of Asperger’s individuals revealed?

Important Note: Please take a look at the update at the bottom of my “About” page. In short, I’ve decided to narrow my focus a bit to what I know best – nutrition and neuroscience. BUT I plan to have monthly treats that will cover a topic outside of these areas. Let me know what you think! On to your regularly scheduled programming…

Albert Einstein is believed to have had Asperger’s syndrome.
Image source: Wikipedia 

Asperger’s syndrome is one disorder falling under the umbrella of the autism spectrum, in which the affected individual may show obsessive attention to detail, social awkwardness, and difficulty relating to others. Repetitive behaviors and highly focused, restricted interests (e.g. an obsession with trains, horses, etc.) are also present. Unlike in other autism spectrum disorders, cognitive development is not usually affected, nor are early basic language skills. Higher-order language abilities (i.e. understanding inferences, nuance, and ambiguity), however, may be affected as the individual ages, and these deficits may be associated with the more classic deficits seen, such as social awkwardness.

Asperger’s individuals show anatomical abnormalities in a small region in the frontal lobe of the brain called Broca’s area. This region was named for Pierre Paul Broca, who discovered that impairment in language production – now called Broca’s aphasia – was associated with this region. Broca’s area is responsible for basic “lower-order” language skills, i.e. speech production, which, again, is not affected in Asperger’s individuals, yet the anatomical abnormalities persist. Confused yet? Keep reading…

Researchers at Beth Israel Deaconess Medical Center in Boston, MA set out to uncover hidden functional deficits that may be associated with Broca’s area in Asperger’s individuals. They examined naming ability (a basic language skill which consists of verbally identifying a picture on a card), a task on which Asperger’s individuals (referred to as ASP from here on) and neurotypical individuals (people whose neurological development, especially in terms of processing language and social cues, is considered “normal”; herein referred to as N) perform equally well. They then kind of ‘messed with’ Broca’s area using magnetic stimulation to see whether naming ability then differed between the two groups.

Ok, I should clarify the “messed with” part. Subjects underwent a number of “repetitive transcranial magnetic stimulations” (rTMS). rTMS involves producing weak electric currents in the brain by use of a magnetic field, causing activity (or suppressing activity, in this case) in a targeted brain area, thereby allowing the function and connections within and around that region to be studied. “Repetitive transcranial magnetic stimulations”?…you are probably envisioning something akin to the “shock shop” in One Flew Over the Cuckoo’s Nest. But not to worry. Basically, a “stimulator unit” is simply placed on the subject’s head (see image below), the position of which is guided by a fancy imaging technique (MRI – magnetic resonance imaging) to ensure the correct part of the brain is being stimulated.

Image source: http://thertmscenter.com

The electrical stimulation itself does not cause any damage or pain. Some side effects reported by subjects in this study were sleepiness, dizziness, trouble concentrating, stiff neck, and increased emotionality (that’s a real word, I checked); however, several of these were also reported by individuals undergoing sham stimulation, where the magnet is blocked, suggesting these effects may not be caused by the stimulation itself. Broca’s area, as shown below, is divided into two regions: the pars triangularis (which occupies the anterior portion of Broca’s area) and the pars opercularis (which occupies the posterior portion), and both of these were stimulated, on both the left and right sides of the brain.

(Side note: Wernicke’s area is also shown in this figure. It and Broca’s area make up the two main language centers of the cerebral cortex. Damage to Wernicke’s area causes Wernicke’s aphasia, in which the speech produced is normal (unlike in Broca’s aphasia) in terms of grammar, syntax, rate, intonation, and stress, but the words themselves are incorrect and may even be “made-up” words.)

Image source: Wikipedia

After these treatments, subjects were again tested on the naming task, and HERE is where the interesting part happens. The time it took for the ASP subjects to come up with the answer was affected, while N subjects showed no change. When the left pars triangularis was stimulated, ASP subjects performed better, i.e. were faster at the naming task, whereas performance decreased when the left pars opercularis was stimulated. While this study did not determine what this difference is due to, the authors speculate that perhaps in the unstimulated ASP brain, the pars triangularis has kind of a choke-hold on the pars opercularis, inhibiting its activity (this is not seen in individuals without Asperger’s). BUT, the stimulation used here actually causes a suppression of activity in the brain region it targets (I know, it’s called “stimulation” but actually decreases activity…that’s science for ya). So when the pars triangularis is stimulated, activity in this region is suppressed, causing it to release its choke-hold and allowing the pars opercularis to become active. This would also explain why naming ability decreased when the opercularis was stimulated (since the stimulation is actually suppressing its activity). Further testing is needed to confirm this proposed mechanism.
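
For the programmatically inclined, the proposed choke-hold mechanism can be captured in a toy model. This is entirely my own illustration with arbitrary numbers, not the authors’ model: rTMS is treated as suppressing whichever region it targets, the pars triangularis (PT) tonically inhibits the pars opercularis (PO), and naming latency falls as PO activity rises.

```python
# Toy model of the proposed mechanism (arbitrary numbers, my own illustration):
# in the ASP brain, PT tonically inhibits PO, and rTMS *suppresses* its target.

def po_activity(rtms_target=None):
    """Pars opercularis (PO) activity under the choke-hold model."""
    pt_drive = 1.0             # baseline PT inhibitory drive
    if rtms_target == "PT":    # suppressing PT releases the choke-hold
        pt_drive *= 0.2
    po = 1.0 - 0.8 * pt_drive  # PO baseline minus PT inhibition
    if rtms_target == "PO":    # suppressing PO directly lowers its activity
        po *= 0.2
    return max(po, 0.0)

def naming_latency(po):
    """Higher PO activity means faster naming (lower latency) in this toy."""
    return 1.0 / (0.1 + po)

for target in (None, "PT", "PO"):
    po = po_activity(target)
    print(f"rTMS target: {target}, PO activity: {po:.2f}, "
          f"naming latency: {naming_latency(po):.2f}")
# Pattern matches the study: suppress PT -> faster naming,
# suppress PO -> slower naming, relative to no stimulation.
```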

Ok, all well and good, but of course there are weaknesses to the study (all acknowledged by the authors, I should mention).

1) Limitations of both the stimulation and the imaging devices cast some doubt on whether the intended target areas were indeed specifically stimulated. Hmm, this seems like a big weakness, but I guess the researchers are limited here by the capabilities of their equipment. More sensitive imaging techniques would help get around this issue.

2) Potential changes in brain activity caused by the rTMS were not measured; therefore, the suppressive effect of rTMS may not have been identical between N and ASP individuals.

3) The two groups had significantly different average IQs (N = 111.2, ASP = 122.4). While I’m not sure how this would specifically interfere with the findings (my own ignorance), it is possible that different results would be found in IQ-matched groups.

4) As with many human studies, the sample size was small (10 in each group, although the groups were gender- and age-matched), and therefore generalizing these findings to the general population is not warranted.

These limitations are definitely important to keep in mind, as the results could actually be due to differences in stimulation target accuracy, rTMS-induced brain activity, and/or IQ between the two groups. Even so, the results are potentially interesting, and further work needs to be done to a) determine whether these findings are actually true, and b) figure out what they mean and the mechanisms behind them.

So let’s say for a moment that these findings are indeed real. The ability to name objects develops early in life and is considered an indicator of future reading ability. Since stimulation of the left pars triangularis seemed to improve naming ability in ASP subjects, one could suggest that this could lead to improved communication skills and possibly social interactions, which are impaired in ASP individuals. This type of stimulation protocol has been used in stroke patients suffering from Broca’s aphasia: stimulation led to enhanced language skills that actually improved even further over time, suggesting a permanent benefit. This study is also important in that it highlights the use of the stimulation technique (rTMS) itself for uncovering functional differences in neurotypical brains as well as those affected by disorders.

(Reference: Fecteau S, et al. European Journal of Neuroscience 2011;34:158-164.)

Art by Charley Harper

Art and science – 2 seemingly opposite worlds, yet I’ve often felt like I’d like to live in both. I love art…drawing, painting, photography…both producing it myself and appreciating the work of others. I love science…thinking up experiments and appreciating the great ideas others come up with. Career-wise, I obviously chose the science route (thought the job prospects would be better…BAHAHA), but always came back to art as a hobby and still feel a longing to incorporate it in my everyday life…including my work life. The interest I’m developing in science writing has brought me some peace and lit up my artistic-creative side again (which is in some desperate need of nurturing now that the PhD is said and done). While I never thought the form my artistic expression would take would be in words rather than a paint brush or pencil, it feels completely right and natural. Who knows…maybe one day when I am an accomplished science writer I’ll start a side career as a science illustrator. :)

On that note, when searching for some pretty pictures to use as a header for my blog, I came across these little treasures (just a small sample) by Charley Harper (1922-2007). Charley was an American wildlife artist whose illustrations can be found in “The Golden Book of Biology”, “The Animal Kingdom”, “Birds and Words”, and “Beguiled by the Wild: The Art of Charley Harper”. He called his style minimal realism, stating, “I don’t try to put everything in, I try to leave everything out…I never count the feathers in the wings, I just count the wings.” Isn’t that refreshing? Science can be SO detail-oriented – it’s nice to take a step back and just appreciate the big picture. While mostly wildlife, which is of course its own vast field of science, he does from time to time incorporate other (geekier ;)) forms of science, such as some physiology (note the squirrel print below), a hint of genetics (see the chromosomes lining up in the second print posted below), and even a shout-out to Darwin (see Finches below). I love it all.

Charley’s work reminds me of what illustrations in children’s books used to look like, yet given today’s minimalist style, he is still very relevant. I am so happy to have found Charley’s art and excited to realize that the merging of art and science is a real possibility – it gives me hope!

Check out more wonderful pieces at www.charleyharperartstudio.com. One of these prints might find a happy home on my wall!

Let sleeping fruitflies lie…they’ll be smarter for it

So this is my second post to do with insects…not a theme, they just happen to be amazing little creatures, both in their natural abilities (see my Honeybee post) and as research tools (see current post). The study I’m referring to below was published in Science on June 24, 2011. Fascinating stuff!

Aaahhh, sleep. We know we need it and can’t function properly without it. If you’ve been unfortunate enough to experience insomnia, you can attest to the importance of sleep. However, the purposes of sleep that we are all likely familiar with (such as restoration and memory processing) have not actually been proven. Currently, the only way for scientists to really study the function of sleep is to withhold it and observe the consequences. It is not yet possible to put subjects to sleep on demand and study the benefits of sleep…that is, UNLESS your subject is a fruitfly…in which case you are in luck.

Researchers out of Washington University have found a way to actually induce sleep on demand in Drosophila (a.k.a. fruitflies). With some molecular magic, the researchers were able to express a specific temperature-sensitive channel (TrpA1) in neurons that project to the part of the brain involved in sleep regulation. By then moving the flies to an environment with a temperature of 31°C, the flies basically fell asleep. Since the goal is to study the effects of natural sleep, the researchers had to be sure that this induced sleep was indeed molecularly and behaviourally similar to “real” sleep. Locomotion following induced sleep was not affected, flies could be woken with mechanical perturbation (I assume this to mean shaking of the tube the flies had been placed in), and feeding the flies caffeine prior to exposure to the 31°C environment weakened sleep induction. As well, genes that are normally down-regulated during real sleep were also down-regulated after the move to the 31°C environment. These functional and molecular findings are all characteristic of spontaneous sleep, suggesting the researchers were successful in inducing a real-type sleep.

Now convinced that they could induce sleep, the researchers set out to determine whether sleep had a function in long-term memory. First, a little background. There is something called the synaptic homeostasis model, which hypothesizes that synaptic connections (think of this as communication between neurons) are increased while we are awake and decreased during sleep. This downtime during sleep is believed to be necessary; otherwise, brain circuits would become overloaded, we would be unable to learn new things, and memory would start to be affected (just think of how your ability to learn or remember things suffers after even one night without sleep). When flies are placed in large social groups (~90 flies), synaptic connections are increased (overloading brain circuits), so without sleep, long-term memory in these flies is impaired. (BTW, long-term memory in flies is measured by something called courtship conditioning, in which male flies learn, via certain cues such as pheromones, which females are appropriate and receptive to mate with.) However, when flies expressing the TrpA1 channel were placed in a large social group (thereby overloading the brain circuits) and then moved to 31°C, where sleep was induced, long-term memory was restored. To take this a step further, induced sleep was even able to produce long-term memory of a behavioral protocol that normally only induces short-term memory. Wild!!
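
If it helps, here is the synaptic homeostasis idea as a little Python toy. This is my own sketch with arbitrary numbers, not a model from the paper: waking, especially in a crowd, scales synapses up toward a ceiling; sleep scales them back down; and forming a new long-term memory requires headroom below that ceiling.

```python
# Toy sketch of synaptic homeostasis (arbitrary numbers, my own illustration).

CEILING = 10.0        # maximum total synaptic strength the circuit supports
LEARNING_COST = 3.0   # headroom needed to encode a new long-term memory

def wake(strength, hours, social):
    """Waking potentiates synapses; a large social group potentiates more."""
    rate = 0.8 if social else 0.3
    return min(strength + rate * hours, CEILING)

def sleep(strength, hours):
    """Sleep downscales synaptic strength back toward baseline."""
    return max(strength - 0.5 * hours, 1.0)

def can_form_long_term_memory(strength):
    return CEILING - strength >= LEARNING_COST

s = 1.0
s = wake(s, hours=12, social=True)     # crowded vial: circuits hit the ceiling
print(can_form_long_term_memory(s))    # False -> long-term memory impaired
s = sleep(s, hours=6)                  # induced sleep restores headroom
print(can_form_long_term_memory(s))    # True  -> memory formation rescued
```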

So, here we are with the first steps in really uncovering the true functions of sleep. How can this research be extended to humans? Well, we are far from being able to induce natural sleep in people, but we can at least uncover mechanisms occurring in other species and glean from these findings the functions of sleep, as well as how sleep interacts with and affects other biological processes.

(Reference: Donlea JM et al. Science 2011;332:1571.)

A Career in Technical Science Writing?

“If you find you like doing the writing rather than the science aspect, you like interpreting the data rather than generating them, and you are the person everyone goes to because they need something written…this might be a good career for you.”

This quote describes me to a T. It was taken from this article in Nature about jobs in technical science writing. http://www.nature.com/naturejobs/2011/110714/full/nj7355-255a.html.

While I’ve been thinking for a while now that science writing just may be the career for me, technical science writing actually never crossed my mind. These writers focus on providing information on various products and services that scientists, researchers, engineers, physicians, etc. may use. Yes, these are the people who write that instruction manual that came with your ELISA kit, but thanks to the online and social media age we currently live in, they are also involved in writing for websites, wikis, podcasts, and blogs.

To be honest, the writing part sounds like it would be a bit too tech-y for me, since it would involve writing strictly for the users of products and not the general public (as in science journalism), and therefore any personal flair or creativity may be out the window. BUT, having said that, there would still be opportunity for constant learning!

Something to think about anyways. I’m finding the idea of a freelance career very attractive…dabble in some technical writing here, do a little journalism there…any advice or thoughts on this?

Check out the wonderfully informative link above to the Nature article for more information on careers in technical writing.

Honeybees…doin’ dances, savin’ lives

Wow – my first post. This is nerve-wracking in a here-I-go-putting-myself-out there-please-oh-please-like-me kind of way. Here goes nothing!

In case you weren’t aware, honeybees are about the coolest little creatures out there. Admittedly, as a child I was TERRIFIED of anything with a stinger…ask the stranger who almost fell off his bike and over the bridge after I screamed bloody murder when a bee landed on me…sorry man. BUT now that I’m all grown-up and matured (ahem…) I have come to appreciate the little guys, and I have science to thank for that.

Honeybees are classified as eusocial insects, meaning they have reached the highest level of social organization. Their society consists of a queen, drones (all males, whose only purpose is to mate with the queen, after which they die), and worker bees (all females), who go through a sequence of duties in their lifetime: housekeeper, nursemaid, construction worker, grocer, undertaker, and guard, until finally, after 21 days, they become foragers, searching for food. [Insert your choice of gender role-related joke here].

I think foragers are just the most amazing little guys…uh, gals. Upon finding a food source, they return to the hive and must communicate to the other foragers where to find said food. In order to do this they perform one of two dances: a “round” dance or a “waggle” dance (giggle). The round dance is performed if the food source is less than ~50 m away; the waggle dance is performed if the food source is farther away. Since the foragers must travel further to find these faraway sources, they require more information – just how far away is it? In which direction? These details are actually incorporated into the waggle dance. The duration of the dance indicates the distance to the food source, while the angle at which the dance is performed represents the direction to the food source relative to the sun – AND the bees can even correct for the changing position of the sun!! AMAZING, no? Check out this video yanked from YouTube to see the waggle dance.
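
Just for fun, here is what a waggle dance “decoder” might look like in code. A hedged sketch: the duration-to-distance calibration constant is entirely made up, and the real encoding is not perfectly linear, but it captures the two pieces of information the dance carries.

```python
# Hypothetical waggle dance decoder (my own toy calibration, not real bee data):
# dance duration encodes distance, and the dance angle relative to vertical
# encodes the bearing of the food source relative to the sun's azimuth.

def decode_waggle(duration_s, dance_angle_deg, sun_azimuth_deg):
    """Return (distance_m, bearing_deg) encoded by a waggle dance."""
    METERS_PER_SECOND_OF_DANCE = 1000.0  # assumed, made-up calibration constant
    distance_m = duration_s * METERS_PER_SECOND_OF_DANCE
    bearing_deg = (sun_azimuth_deg + dance_angle_deg) % 360  # sun-relative angle
    return distance_m, bearing_deg

# A 2-second dance angled 40 degrees clockwise from vertical, with the sun at
# azimuth 180 degrees (due south), would point ~2 km away at bearing 220.
print(decode_waggle(duration_s=2.0, dance_angle_deg=40.0, sun_azimuth_deg=180.0))
```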

Just when you thought they couldn’t get any better, turns out honeybees can actually be trained to sniff out unexploded landmines…yes, you read that correctly…read on.

First, a little background. Unexploded ordnance devices (UXOs), better known as landmines, obviously result in humanitarian crises, not only due to the number of people who are killed or injured, but also due to the loss of agricultural land. Therefore, research into how these devices can be safely detected and removed is ongoing.

Currently, landmines can be detected using handheld metal detectors; however, since these require a person to operate them, this is obviously not a preferred method. Another method of detection is the use of dogs trained to detect the scent of chemicals associated with the landmines. Again, this method requires a human handler, putting both the handler and the dog at high risk.

Researchers out of Montana State University and the University of Montana have developed an amazing detection technique that avoids the need to put humans or other animals at risk. ENTER THE HONEYBEES. These researchers have actually trained honeybees to detect a specific chemical (2,4-DNT) associated with landmines. The bees are first conditioned to the chemical by providing a feeder near the honeybee hive that is pumped with syrup (jackpot for the bees!) carrying the scent of the DNT. Very quickly, the bees associate this smell with a rich food source. The feeders are then moved further away from the hive, forcing the bees to search out and forage for the food reward. Finally, many feeders are placed at long distances from the hive, only some of which are pumped with the scented syrup. This requires the bees to search even further and distinguish between feeders for the reward. Ultimately, the bees will spend greater periods of time where the chemical scent of the DNT is the strongest, which will presumably be where the landmines are located. By using a fancy-schmancy detector called a scanning lidar instrument, a spatial map of honeybee densities can then be developed, in turn mapping where landmines are present.
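
The mapping step boils down to finding hotspots in a spatial grid of bee counts. Here is a toy version (my own sketch with invented numbers, not the authors’ signal-processing pipeline): bin the bee detections into a grid, then flag cells whose density stands out from the background.

```python
# Toy honeybee density map (invented numbers, not the authors' pipeline):
# flag grid cells where bee counts stand out from the background, since
# bees linger where the DNT scent, and presumably a landmine, is strongest.
import numpy as np

rng = np.random.default_rng(0)
field = rng.poisson(lam=2.0, size=(20, 20)).astype(float)  # background bees
field[12:14, 5:7] += 15.0                                  # hotspot over DNT

threshold = field.mean() + 3 * field.std()  # simple outlier threshold
suspect_cells = np.argwhere(field > threshold)
print("Suspected mine locations (row, col):")
print(suspect_cells)
```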

The scanning lidar instrument was improved numerous times until it was able to distinguish between moving vegetation and honeybee densities. Field experiments successfully showed that the instrument can correctly detect densities of honeybees indicating the presence of the DNT chemical. While improvements to the instrument are still required, such as refinement to provide even finer differentiation between moving vegetation and honeybee densities, changes to allow a stronger signal to be elicited from the bees, and presets to automatically adjust for weather conditions, these findings are certainly promising.

While, of course, questions remain (e.g. what happens when the bees eventually realize the scent of DNT does not always mean food? How long after training can mapping the bees be considered accurate?), overall, the brilliance of the honeybees is matched by the scientists in this case. By using the natural ability of the honeybee to forage for desirable food sources, it may be possible to save tens of thousands of lives each year. The ingenuity in marrying the complex physical and engineering feat of the detection instrument (not to mention the processing of all the data collected) with a naturally occurring biological phenomenon is truly astounding. Can’t wait to see where this research goes!

(Reference: Carlsten ES, et al. Applied Optics 2011;15:2112-2123.)