
ERD

Examine.com

Research Digest

Issue 3

January 2015

Table of Contents

05  Heart benefits of alcohol may not apply to everyone
17  Type 2 diabetes: a preventable disease
21  Investigating a progression of carb and saturated fat intakes
33  Whence the hype?
43  Running on empty: can we chase the fat away?
51  Fitting into your genes: do genetic testing-based dietary recommendations work?
61  Combating obesity through intermittent fasting
70  How does a lifetime of marijuana use affect the brain?
79  A mouse's microbiome may cause its brain to leak
87  Ask the Researcher: Stuart M. Phillips, Ph.D., FACN, FACSM
92  INTERVIEW: Ramsey Nijem
From the Editor


"However, more research is needed ..."

Have you ever seen that line in a journal article? Of course you have. It's a part of almost every article that we review for ERD. Is more research ever not needed?

A fellow researcher and I would talk about how ubiquitous this phrase was, and whether it really meant anything. He eventually wrote a letter to the editor of an epidemiology journal, including some analysis of how often the phrase was used in major journals. Three years later, I still run across the phrase a dozen times a day. This may never change.

Why is this phrase important? Well, it ties in to one of the most important, yet least talked about issues in health research: when are new trials justified, and what exactly should new trials test? There's a field of research called value of information analysis, which places a dollar amount on the public health value of each unit of new research on a given topic.

There are only so many research dollars available. Not every topic can get funding for a large randomized trial, and many important topics go unresearched. I'd like to know whether taking vitamin D in the morning causes different effects than night-time ingestion. Will we see research on this topic? Probably not.

Major issues that have already been addressed by animal studies and observational trials often are next in line for randomized trials, such as the impact of [INSERT NUTRIENT OR DRUG HERE] on heart disease biomarkers in [INSERT POPULATION HERE]. Treatment is funded more often than prevention, and multimodal prevention is funded much less often than interventions investigating a single method or pharmaceutical.

Maybe that seems backwards. But it's not easy to test the combined impact of getting regular sleep, eating mostly unrefined foods, getting time outside in the sun, and carving out time to relax and get some perspective. Actually, it's pretty difficult to test even one of those interventions. Plus, there's much less money to be made on prevention, especially when it comes to free interventions, than there is to be made by selling treatments.

There's a phrase that refers to the inherent nature of human existence, including choices and difficulties: "The Human Condition". Sometimes, I think there is a counterpart in "The Research Condition". Health research is complex and shifting, and somewhat inherently flawed. Single trials can't conclusively answer questions. Subtle differences in methods and samples lead to different results. Research doesn't really flip-flop very often; it's just a much more iterative and grueling process than the public knows. And it's why more research is always needed.

Kamal Patel, Editor-in-Chief

Contributors

Researchers
Trevor Kashey, Ph.D(c)
Alex Leaf, M.S(c)
Courtney Silverthorn, Ph.D.
Pablo Sanchez Soria, Ph.D.
Kamal Patel, M.B.A., M.P.H., Ph.D(c)
Arya Sharma, Ph.D., M.D.
Natalie Muth, M.D., M.P.H., RD
Stephan Guyenet, Ph.D.
Mark Kern, Ph.D., RD
Gillian Mandich, Ph.D(c)
Margaret Wertheim, M.S., RD
Zach Bohannan, M.S.
Sarah Ballantyne, Ph.D.
Katherine Rizzone, M.D.

Editors
Gregory Lopez, Pharm.D.

Reviewers

Heart benefits of alcohol may not apply to everyone

CETP TaqIB genotype modifies the association between alcohol and coronary heart disease: The INTERGENE case-control study
Introduction

With advice coming from everyone from physicians to bartenders, a common message broadcast during the past couple of decades has been that moderate consumption of alcohol is not just allowable, but beneficial for heart health. Indeed, imbibing to the tune of one drink daily for women, or two drinks daily for men, has been associated with a lower risk of cardiovascular disease.

Proposed mechanisms for the protective effect of alcohol on coronary heart disease (CHD) include the potential benefits from the antioxidant effects of polyphenols in wine, and an increase in high density lipoprotein (HDL) levels. HDL's best-known function is to transport cholesterol from arteries throughout the body back to the liver, preventing cholesterol from being deposited in the arteries, where it would cause blockages.

Lipid-containing particles in the blood often gain and lose different types of lipids, such as cholesterol and triglycerides. The ability of HDL to transfer cholesterol into particles like VLDL is partially regulated by cholesteryl ester transfer protein (CETP). CETP promotes transfer of HDL cholesterol into VLDL, and in exchange HDL receives triglycerides. CETP is hence thought to reduce HDL cholesterol: less CETP in your blood means HDL particles would balloon up with more cholesterol, and more CETP would mean HDL particles would carry less cholesterol.

Hold on, isn't less HDL cholesterol a bad thing? Not necessarily, as HDL is more complex than just the "good cholesterol" moniker it has taken on in public parlance (and unfortunately physician office parlance as well). HDL also has a lesser known but important role in the immune system, performing a variety of functions, such as binding toxic substances in the blood. HDL can be anti-inflammatory or inflammatory, depending on the disease state of the body. HDL and LDL are markers of disease, but they each have physiological functions important to the body, and neither is an absolute determiner of or protector against heart disease.

Back to CETP. There is a known polymorphism in the gene that encodes CETP called CETP TaqIB. A polymorphism is when a particular gene has two or more relatively common possible nucleotide sequences at a given site in the DNA. Both versions of the DNA sequence would be considered normal, with neither likely to directly cause debilitating disease, like a rare mutation might. However, different polymorphisms may still influence susceptibility to disease.

This study looked at how two polymorphisms in the CETP gene affect the odds of having CHD at varying levels of alcohol intake. The two different alleles (gene variants) of CETP are called B1 and B2. B2 is associated with decreased CETP mass and increased HDL cholesterol. Given that we each have two copies of the gene, the three different genotype options in a given subject are B1B1, B1B2, or B2B2.

A previous study showed that men with the B2B2 genotype who had an ethanol intake of 50 grams (about three drinks) or more per day had about a 60% lower risk of heart attacks than men with lower or no alcohol intake. This protective effect of larger amounts of alcohol was not seen in people with the B1B1 or B1B2 genotypes. On the other hand, in a study in a Mediterranean cohort, no interaction between CETP TaqIB, alcohol intake, and CHD was observed.

Why is that? One reason could be simply different populations. As seen in Figure 1, different populations can have substantially different CETP genotype frequencies. Rodents such as mice have no CETP gene, and also have a lower risk of atherosclerosis, though many other factors may be responsible for this. Complete CETP deficiency is a rare mutation in humans, although it's much more frequent in one area of northern Japan. While the frequency of this mutation is higher in people with heart disease, at least in that area of Japan, recent studies have shown that the extremely cholesterol-rich HDL in these people still maintains its antioxidative function and its ability to move cholesterol out of areas of cholesterol buildup. So the impact of CETP on heart disease is still very much up in the air.

Figure 1: CETP B2B2 allele frequency in different populations

The aim of the current study was to re-examine the effect of alcohol intake and its interaction with the CETP Taq1B polymorphism on CHD odds.

Moderate alcohol intake is often encouraged to help ward off heart disease. This advice is largely based on HDL effects, but these effects may also be modified by your genotype.

Who and what was studied?

Population
This case-control study took place in Sweden as part of the INTERGENE research program, which aims to assess the interaction between genetic susceptibility and chronic disease in southwest Sweden. Cases with heart disease were compared against controls who didn't have heart disease, to assess how alcohol and genetic variation impacted disease prevalence.

The CHD cases were patients under age 75, admitted to three regional hospitals for acute coronary syndrome and diagnosed with myocardial infarction. Of the CHD patients who agreed to participate, 618 were included (453 men, 165 women). Of those, 209 men and 86 women had a first-time myocardial infarction, while the remaining 323 had an exacerbation of previously diagnosed CHD. The controls were 2,921 randomly selected individuals without CHD, aged 25-74 at the time of sampling.
Intervention
The data collected for analysis in this study was CETP genotype, as well as self-reported information about alcohol intake, including frequency of intake of different types of alcohol (low-alcohol beer, medium-strong beer, strong beer, wine, dessert wine, and spirits), with eight response categories ranging from "never" to "three or more times a day".

Alcohol intake information referred to intake over the previous one-year period for controls, and for the one-year period prior to the most recent coronary event for cases. Age- and sex-specific standard serving sizes for alcoholic beverages were used to calculate daily ethanol consumption. Daily alcohol intake was divided into three levels (low, medium, and high), and the odds ratio (OR) for having CHD was calculated based on genotype and alcohol intake. Abstainers were classified into a fourth group, though high/intermediate intake was compared to the low group, not to the abstainers. All models were adjusted for age, body mass index (BMI), HDL, sex, and smoking habits. The tertile cut-offs are shown in Table 1. In this study, high alcohol intake for men was considered about one drink or more daily, while low intake was less than about half a drink daily. For women, high alcohol intake was classified as about half a drink or more daily, while low alcohol intake was less than about a quarter of a drink daily.

Table 1: Tertiles of ethanol intake

          Men (g/day)    Women (g/day)
Low       < 6.5          < 3.2
Medium    6.5-13.1       3.2-6.3
High      > 13.1         > 6.3

One drink is 14 grams of ethanol, which is the equivalent of about 12 ounces of beer, five ounces of wine, or 1.5 ounces of 40%-alcohol spirits.
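As a rough arithmetic check, the gram count can be derived from serving volume, alcohol by volume, and the density of ethanol. This is a minimal sketch; the 5% and 12% ABV figures are typical values we are assuming, not numbers from the study:

    OZ_TO_ML = 29.57          # milliliters per US fluid ounce
    ETHANOL_DENSITY = 0.789   # grams per milliliter of ethanol

    def ethanol_grams(volume_oz, abv):
        """Grams of ethanol in a drink of a given volume and alcohol-by-volume."""
        return volume_oz * OZ_TO_ML * abv * ETHANOL_DENSITY

    print(round(ethanol_grams(12.0, 0.05), 1))  # 12 oz beer at 5% ABV -> ~14.0 g
    print(round(ethanol_grams(5.0, 0.12), 1))   # 5 oz wine at 12% ABV -> ~14.0 g
    print(round(ethanol_grams(1.5, 0.40), 1))   # 1.5 oz spirits at 40% -> ~14.0 g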
This study examined just over 600 cases of heart disease and almost 3,000 controls, and classified how much alcohol participants drank into three categories that differed based on sex.

What were the findings?


Characteristics of Cases and Controls
For both men and women, there was a smaller percentage of alcohol users among the cases than in the control groups. For women, 80% of CHD cases and 87% of controls reported using alcohol. For men, 89% of cases, compared to 93% of controls, reported drinking alcohol. People with CHD also had a lower average ethanol intake compared to controls. There were no significant differences in the distribution of CETP genotype (B1B1 versus B1B2 versus B2B2) between cases and controls.
The cases were older than the controls (around 62 years, compared to 51) and sicker. Almost 20% of the people with CHD had diabetes, compared to under 5% in the control group. In addition to being heavier, people with CHD were more likely to be smokers.
Alcohol Intake on CHD
In the entire cohort, intermediate drinkers had 35% lower odds of CHD compared to low drinkers, regardless of genotype. High drinkers had non-significantly lower odds (by 10%) compared to low drinkers.

Those who abstain from alcohol are often found in observational studies to have a higher risk of heart disease than moderate drinkers. In this study, however, both low drinkers and abstainers had increased odds compared to moderate drinkers, and low drinkers did not have lower odds than abstainers. This suggests that the factors typically attributed to abstainers that may impact heart disease (different social habits, higher rates of previous alcoholism, etc.) may not have had a large impact in this population.
Genotype on CHD
There were no significant effects of genotype on CHD odds in the whole cohort when researchers used B1B1 as a reference. For B2B2, the 10% lower CHD odds was not statistically significant. When the same logistic regression model was not adjusted for HDL cholesterol, the B2B2 genotype was associated with 29% lower CHD odds in the whole cohort. The fact that adjustment for HDL levels reduced the effect of B2B2 on CHD odds is not surprising, as the CETP gene is known to be involved in the regulation of HDL.
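To make that adjustment step concrete, here is a minimal sketch of this kind of logistic regression in Python using statsmodels. The data file and variable names are hypothetical stand-ins, not the actual INTERGENE dataset:

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical case-control data: one row per participant.
    # chd: 1 = case, 0 = control; genotype: B1B1/B1B2/B2B2; intake: low/medium/high.
    df = pd.read_csv("cetp_example.csv")  # hypothetical file

    # CHD odds modeled on genotype and alcohol intake, adjusted for age, BMI,
    # HDL, sex, and smoking, mirroring the covariates listed in the paper.
    model = smf.logit(
        "chd ~ C(genotype, Treatment('B1B1')) + C(intake, Treatment('low'))"
        " + age + bmi + hdl + C(sex) + C(smoking)",
        data=df,
    ).fit()

    # Exponentiated coefficients are odds ratios vs. the reference categories.
    print(np.exp(model.params))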
Alcohol Intake and Genotype on CHD
B2B2 homozygotes had a remarkable decrease in CHD odds when they were intermediate alcohol drinkers (79%) or high drinkers (52%), as compared to low drinkers. In B1 carriers (B1B1 or B1B2 genotypes), intermediate drinkers had 20% lower odds of CHD, though this was not statistically significant. B1 carriers who were high drinkers had essentially the same odds as low drinkers.

Why odds reduction instead of risk reduction?
You may have noticed the word "odds" popping up a lot in this review. The reason stems from this study not being a randomized trial. It didn't actively test interventions on different groups of people and see what develops over time. Nor did it observe participants and measure variables as time progresses, like a prospective observational trial does. Rather, at one slice in time it estimated previous alcohol intake and tested for CETP alleles in a group with heart disease and a group without heart disease.

Since the study was a case-control study, it can't use the simpler and more intuitive "risk" terminology. Randomized trials happen over time, hence you can be sure that giving the intervention preceded the outcome, and estimate the risk of the outcome based on what intervention was given. That isn't true of case-control studies such as this one, and hence you can only measure the odds of the outcome in one group versus another group. However, when a disease is rare, happening in around 10% or less of the population that's studied, the odds ratio and relative risk will be approximately the same, due to the mathematical formulas for each converging.
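To see that convergence numerically, here is a small illustration in Python; the incidence numbers are invented for the example, not taken from the study:

    # Relative risk vs. odds ratio, computed from incidence in two groups.
    def relative_risk(p_exposed, p_unexposed):
        return p_exposed / p_unexposed

    def odds_ratio(p_exposed, p_unexposed):
        odds = lambda p: p / (1.0 - p)
        return odds(p_exposed) / odds(p_unexposed)

    # Rare outcome (5% vs. 2.5% incidence): the two measures nearly agree.
    print(relative_risk(0.05, 0.025))  # 2.0
    print(odds_ratio(0.05, 0.025))     # ~2.05

    # Common outcome (50% vs. 25% incidence): they diverge.
    print(relative_risk(0.50, 0.25))   # 2.0
    print(odds_ratio(0.50, 0.25))      # 3.0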

B2B2 Genotype in Intermediate Drinkers
B2B2 intermediate drinkers had a substantial and significant 59% reduction in CHD odds compared to non-B2B2 intermediate drinkers.
Prevented Fraction
Based on the authors' calculation of the prevented fraction, this population would have had around 6% more cases of CHD if the combination of B2B2 and intermediate/high alcohol consumption had not existed.
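The paper's arithmetic isn't shown, but a standard population prevented fraction is exposure prevalence times (1 minus relative risk). The minimal sketch below uses assumed inputs: the 19% B2B2 share is reported later in this review, while the share of B2B2 carriers drinking at intermediate/high levels and the use of odds ratios as stand-ins for relative risks are our assumptions. It lands near the authors' 6% figure:

    # Prevented fraction: share of cases avoided thanks to a protective exposure.
    # PF = p_e * (1 - RR), where p_e is the prevalence of the exposure.
    def prevented_fraction(p_exposure, rr):
        return p_exposure * (1.0 - rr)

    p_b2b2 = 0.19       # share of the cohort with B2B2 (from this review)
    p_drinker = 2 / 3   # assumed share of B2B2 carriers at intermediate/high intake
    rr = 0.5            # assumed, roughly averaging the 79% and 52% odds reductions

    print(prevented_fraction(p_b2b2 * p_drinker, rr))  # ~0.063, i.e. about 6%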
While B1B1 and B1B2 genotypes weren't associated with lower heart disease odds, B2B2 intermediate drinkers had 79% lower odds than low drinkers, and B2B2 high drinkers had 52% lower odds. These numbers equate to an estimated 6% reduction in CHD for the overall population.

What does the study really tell us?

Based on the results of the current study, intermediate to high alcohol intake does not significantly reduce CHD odds in people with B1B1 or B1B2 genotypes. In B2B2 genotypes, intermediate alcohol intake was associated with a 79% reduction in CHD odds, while high alcohol intake was associated with a 52% odds reduction. These results also held up to a variety of sensitivity analyses, such as measuring alcohol intake in four cutoffs rather than three, including or excluding adjustment for HDL and various other potential confounders, or restricting the analysis to those age 60 or older or those who were enrolled at their first cardiac event.
One strength of this study was that different cut-offs of alcohol intake were taken into account, rather than just comparing low and high intake. The models were adjusted for age, BMI, HDL, sex, and smoking habits, to correct for common confounding factors. The authors also tested additional factors, like leisure-time physical activity, financial security, education level, marital status, and diabetes status, but these had no effect on the results. It could be surmised that intermediate drinkers have healthier behaviors than the high alcohol group, but at least for the factors mentioned, this was not the case. Thus, the protective effect of B2B2 at intermediate and higher alcohol intakes could not be explained by HDL cholesterol or other lifestyle and socioeconomic variables.
That being said, the cases and controls differed widely on a variety of characteristics associated with disease, such as age, weight, and diabetic status. It is possible that there were other important confounders that were not controlled for.

The study also didn't discuss potential mechanisms that may explain the results. Previous research in Norwegians showed that HDL may not be so important for the protective effect of alcohol on heart disease. However, this Swedish study looked specifically at CETP, a gene that appears to be involved only in the transfer of cholesterol from HDL to other lipoproteins. Yet it found that the additional protective effect of CETP in intermediate and high drinkers (on top of just the alcohol intake) was not explained by HDL levels. This could be due to a variety of factors; perhaps a simple measurement of HDL cholesterol is less important than the number and type of HDL particles. As was referenced before, HDL can be anti-inflammatory or pro-inflammatory depending on physiological context, so simply sticking HDL into a regression may not fully describe the role of HDL in the relationship between CETP genotype and heart disease odds.
The study results didn't change when sensitivity analyses were performed with different alcohol intake cutoffs and different confounders. However, the cases and controls differed in a variety of characteristics, and it's possible that important potential confounders weren't controlled for.

The big picture

Having the B2B2 genotype didn't have a strong protective effect on its own, and neither did drinking intermediate or high amounts of alcohol on its own. But combining these two factors was associated with a substantial reduction in the odds of heart disease. The authors focused mostly on intermediate intakes in their discussion, but high intakes also came with a substantial reduction in odds, at 52% (compared to 79% in intermediate drinkers). This may be because high intakes come with much higher risks.

Earlier studies didn't take into account CETP genotype, and likely showed a less substantial but still protective effect of alcohol intake due to a dilutional effect, meaning that the substantial odds reduction in people with B2B2 may have been diluted by the lack of CHD odds reduction in people with B1B1 or B1B2 genotypes.
These results confirm a previous study, which showed that men who were B2B2 homozygotes with an alcohol intake of 50 grams a day or more had a lower myocardial infarction risk, and the risk reduction was strongest when the participants drank 75 grams a day or more. In the current study, however, the greatest risk reduction was seen at an alcohol intake of 6.5-13.1 grams a day, a significantly lower daily intake than seen previously.

It is surprisingly easy to derive different conclusions based on something as simple as cutoff points: the same data can be sliced into two parts with high versus low intakes, or into several different intake levels. And the reference group can also differ between studies. In this study, the reference group was made up of low alcohol drinkers rather than those who totally abstain, as abstainers can be quite a diverse group that includes anybody from former alcoholics to those who don't drink for religious reasons. Some large and well-known previous studies, such as the Harvard-run and U.S.-based Nurses' Health Study and Health Professionals Follow-Up Study, suggest a protective effect of the B2 allele. The reference group in those analyses, however, was abstainers rather than those with a low alcohol intake. Women in those studies were found to derive a stronger benefit from the B2 allele than men did, which was not found in this Swedish study. Because study designs and populations differ, it's hard to directly compare different CETP studies.
This study also had some important methodological limitations. Subjects were queried on frequency of alcohol intake, but were not asked about portion size; standard portion sizes were used to calculate daily alcohol intake. This could lead to inaccuracies in daily intake data. In addition, under-reporting of alcohol intake is common during self-reporting, which could skew the intermediate and high tertiles of intake. Furthermore, CHD cases could also have reduced their alcohol intake in response to the diagnosis, or under-reported intake if they thought they were supposed to limit intake, but this effect is likely to be the same regardless of CETP genotype. This is a weakness of the case-control design, as a prospective study that collects data before CHD develops may be less subject to this kind of under-reporting. It's also possible that intermediate alcohol users could have generally healthier eating and lifestyle habits that were not captured in the logistic regression model.

This is also just one study among several on the topic, some of which show conflicting results. This paper was done on a geographically limited sample in Sweden, so the results may not apply to those in another region, like East Asia or Central America. The small sample size also limits the conclusions that can be made from this paper. Headlines reading "Heart benefits unlikely from alcohol" likely won't mention that this study only included 13 cases who had the B2 allele and were intermediate alcohol drinkers. With comparison groups this small, this study is just one more step in the progression of studies on the topic, rather than being the final word on alcohol and heart disease.

It's important to remember that a variety of factors other than genetics could influence the effect of alcohol on heart disease, such as age, sex, and insulin resistance. Observational studies cannot attribute causation or lack of causation to HDL or LDL, no matter how strong the associations appear. While "HDL = good, CETP = bad" is a simplistic and inaccurate way of thinking, it is surprisingly pervasive. CETP may promote heart disease in some situations, and have no effect in others.

Thus, meta-analyses of CETP's overall effect on lipids and heart disease risk may inadvertently gloss over interaction effects from factors like alcohol intake levels or other variables that may moderate CETP's effects. The topic of heart disease, alcohol, and HDL is a great example of how focusing on a single article abstract without context, even if that abstract describes a well-conducted meta-analysis, can be quite misleading. A meta-analysis is only as good as the studies it contains, and the more complex the interactions get and the more heterogeneous the study designs are, the higher the risk of a meta-analysis coming to erroneous conclusions. A meta-analysis of seven studies found that alcohol did not interact with the B2B2 genotype, but it compared current drinkers versus nondrinkers, which is likely too crude a comparison to uncover the more complex relationship found in this study.

This study confirms some previous evidence while conflicting with other evidence, likely due to dividing alcohol intakes into different levels while using low drinkers as the reference group rather than abstainers. The study is another part of the CETP and heart disease puzzle, which is yet to be fully solved.

Frequently asked questions

Does frequency of alcohol consumption matter? Would 49 grams of alcohol once weekly (an average of seven grams per day) be just as beneficial for CHD risk in a B2B2 homozygote as a daily alcohol intake of seven grams?
It's unclear from these study results how frequency of alcohol intake affects CHD risk reduction. Since binge drinking is not advised, the smaller daily amount would be more consistent with current health guidelines. Heavy drinking increases the risk of some types of stroke and of atrial fibrillation, which highlights the variety of other cardiovascular outcomes that are related to alcohol consumption.

Is B2B2 protective for CHD when combined with intermediate alcohol intake in both men and women?
It's unclear at this point whether the B2B2 genotype combined with intermediate alcohol intake is protective against CHD in women. The study under review and the Nurses' Health Study may not have had a large enough number of heart disease cases to detect these effects. For women who are non-drinkers or low drinkers, increasing alcohol intake to reduce CHD risk wouldn't necessarily be advised, given other data suggesting a higher risk of other chronic diseases, including breast cancer, linked with alcohol intake.

Context is also very important: the additional effect of alcohol on heart disease won't be nearly as important for a young person without many risk factors as it is for someone who has already had heart disease. The combined risks of alcohol side effects, plus the potential risk of alcoholism, may very well outweigh alcohol's benefits for heart health even if one is a B2B2 carrier.

Why are studies on the cardiovascular effects of alcohol and CETP so conflicting?
It's not really possible to do a randomized trial of different alcohol intakes and see what the cardiovascular effects are. Without RCTs, observational studies in different populations couple with mechanistic and animal studies to form the evidence base. Analyses in observational studies can use a variety of statistical methods and control for different possible confounders, which could lead to different conclusions even when using the same data. So even though the largest meta-analysis on CETP to date shows that the B2 allele has a statistically significant but weak protective effect, the result is heavily dependent on the methods used by the studies it included.

Additionally, the mechanisms by which CETP may help prevent or promote heart disease are also not clear. In other words, this is a research area that is still progressing, and disagreements exist within the academic community. We will keep our collective eyes out for new studies on this topic.

Does my CETP allele mean that I have a higher risk of heart disease?
This is the million-dollar question, for which there is only a five-cent answer: we don't know. Although this particular study had compelling results due to studying a variety of alcohol intake levels and adjusting for a variety of variables, CETP study results in general are really all over the place. For example, one review found that the effect of B2B2 differed depending on the population that was looked at. In participants with a high risk of heart disease it was protective, while in general populations it promoted heart disease! The frequency of B2B2 also differed, being much less common among those with high risk. B2B2 sometimes could predict whether a lipid-lowering drug would prevent heart disease, and sometimes couldn't.


Can I take a drug to modify my CETP activity and prevent heart disease?
Because increased CETP activity decreases HDL levels, CETP became a research target for new medications in the 2000s. One promising drug, torcetrapib, reliably raised HDL levels by inhibiting CETP activity, as well as lowering LDL. However, its trial was terminated early due to torcetrapib causing a 25% increase in cardiovascular deaths alongside a 60% increase in deaths from any cause.

So to repeat: we don't know quite how CETP affects heart disease. The effect of your genotype may be modified by your diet, habits, medications taken (especially statins), and even other genes. HDL and LDL by themselves don't mean that much in isolation, and neither does your CETP genotype. Some people are able to get a portion of their genomes sequenced through services such as 23andme, and that may help inform the effect of alcohol on a particular individual's heart health. That being said, the evidence is nowhere near concrete, and the uncertainty about alcohol's benefits on heart health is one of the major takeaways on this topic.
What should I know?



In short, moderate alcohol consumption may not protect everyone equally from heart attacks. Protective effects likely depend on genetics. The results of this study raise the question of whether the recommendations regarding alcohol intake for the prevention of CHD are too overarching. Substantial CHD odds reduction was only seen in people who were B2B2 homozygotes with intermediate to high alcohol intake. For someone giving advice about how to prevent heart disease (like a physician, or someone advising an older parent), keep in mind that the evidence is still quite mixed on this topic.

In the context of public policy, the authors estimated that 6% of heart disease was prevented by the combination of B2B2 and intermediate/high alcohol intake. This is not a huge amount for something that can have several important detriments, like drinking alcohol does.

It's important to note that only 19% of the entire cohort in this study had the B2B2 genotype. While the frequency of this genotype in the general population is unknown, the beneficial effect of alcohol intake on CHD odds would only apply to the small segment of the population who are B2B2 homozygotes with intermediate to high alcohol intake. Perhaps in the future, genetic testing will help us determine our behaviors around alcohol. But for now, it seems there's only a small portion of the population for whom alcohol intake is protective against CHD, and most of them are unaware that they have a potentially protective gene.

We'll discuss the potentially complex relationship between alcohol and heart disease in the private ERD readers' Facebook group. Join us!

Type 2 diabetes:
a preventable disease
By Stephan Guyenet, Ph.D.
Three thousand five hundred years ago, ancient Egyptian physicians reported excessive urination in some of their patients, a key diagnostic sign of diabetes. The mummy of Queen Hatshepsut, a powerful pharaoh who ruled ancient Egypt during this time period, suggests that she was obese and likely suffered from type 2 diabetes. Throughout history, other royals have been posthumously diagnosed with probable type 2 diabetes, including the portly King Henry VIII of England. Diabetes has been a scourge of the affluent for thousands of years.
Diabetes is defined as a failure of blood glucose control, leading to excessively elevated blood glucose. This failure of blood glucose control results from insufficient action of the pancreatic hormone insulin, which normally constrains blood glucose concentrations, both in the fasting state and after meals. In type 1 diabetes (formerly called juvenile-onset diabetes), the body's immune system attacks and destroys insulin-secreting beta cells in the pancreas, leading to a near-total disappearance of circulating insulin. In type 2 diabetes (formerly called adult-onset diabetes), the body's tissues lose their sensitivity to the insulin signal. The pancreas compensates by secreting more insulin, but eventually the beta cells are unable to maintain this excessive level of insulin secretion, insulin levels decline, and blood glucose levels rise.

This failure of blood glucose control, and the accompanying metabolic disturbances, leads to the familiar signs and symptoms of diabetes: excessive thirst and urination, glucose in the urine, excessive hunger, weight loss, fatigue, slow healing, and eventually vascular disease, kidney failure, and nerve and retinal damage.

The reason type 2 diabetes is no longer called adult-onset diabetes is that it now occurs in children as well as adults. This trend is part of an increase in global diabetes risk that affects people of nearly all age groups in all affluent nations. Diabetes is extremely rare in cultures that maintain a lifestyle similar to our (non-royal) distant ancestors, yet more than a third of modern Americans are projected to develop diabetes at some point in life. Nearly all of these cases will be type 2 diabetes. Fortunately, the causes of diabetes are well known, so much so that we know how to prevent the large majority of cases. Let's have a look.
Obesity
Over the last century, but particularly over the last three decades, Americans have bought progressively longer belts. In 1971, 15 percent of Americans were obese, yet by 2009, that number had more than doubled to 36 percent. The rest of the affluent world is following closely behind. Excess body fat is likely the single largest contributor to the modern epidemic of diabetes.

The following graph illustrates the relationship between body mass index (BMI; a measure of body fatness) and diabetes incidence over a five-year period in American men:

Figure: Diabetes Risk According to BMI

A BMI between 18.5 and 25 is considered lean, 25 to 30 is considered overweight, and 30 or greater is considered obese. As you can see, the risk of developing diabetes increases rapidly with increasing BMI, and the relationship is extremely strong. A man with a BMI greater than 35 (obese) has a 42-fold greater risk of developing diabetes than a man with a BMI below 23 (lean). If we zoom in on the lower end of the graph, we can see that diabetes risk increases by 50 percent even before we leave the lean BMI range, and more than doubles for people who are only slightly overweight!

Figure: Diabetes Risk According to BMI (zoomed in on the lower BMI range)
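Since these categories do a lot of work in this argument, here is the arithmetic behind them: BMI is simply weight in kilograms divided by height in meters squared. A minimal sketch, with the cutoffs from the paragraph above and an invented example person:

    def bmi(weight_kg, height_m):
        return weight_kg / height_m ** 2

    def bmi_category(value):
        if value < 18.5:
            return "underweight"
        if value < 25:
            return "lean"
        if value < 30:
            return "overweight"
        return "obese"

    example = bmi(80.0, 1.75)               # ~26.1 for an invented 80 kg, 1.75 m person
    print(example, bmi_category(example))   # -> overweight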

Countless experiments show that this is more than just an association: excess body fat contributes to the metabolic disturbances that lead to type 2 diabetes. This appears particularly true of the visceral fat that surrounds the organs underneath the abdominal wall.

Age
Nearly all lifestyle-related disorders are strongly linked to age, and type 2 diabetes is no exception. Among the elderly, the yearly likelihood of being diagnosed with diabetes is more than 30 times greater than among young adults. Part of this excess risk isn't linked to age directly, but to the fact that most people gain fat, lose muscle, and become more sedentary with age.

Physical activity
Muscle tissue is the single largest user of glucose in the body, and when its fuel needs are high, it increases its sensitivity to insulin to accelerate glucose uptake. Because of this, physical activity causes a rapid and profound increase in muscle insulin sensitivity, leading to an increase in whole-body insulin sensitivity. This increase in insulin sensitivity only lasts a few days, so regular physical activity is essential to maintain it.

Not surprisingly, people who are more physically active have a lower risk of developing diabetes, and the association is substantial. People who engage in regular vigorous exercise, or even walk regularly, have just over half the diabetes risk of people who are the most sedentary.

Genetics
One of the most effective ways to avoid type 2 diabetes is to choose your parents wisely. All of the most common forms of diabetes, including type 2 diabetes, have a strong genetic component. Like most lifestyle-related disorders, diabetes is not usually caused by a single gene variant. Rather, it's caused by complex interactions between many different gene variants and the environment in which a person lives.

Possibly for genetic reasons, certain racial groups are at higher risk of diabetes than others. For example, Asians, including people of Indian descent, are at higher risk of developing type 2 diabetes at any given BMI. In other words, a modestly overweight Indian or Chinese person may have the same diabetes risk as an obese Caucasian person.

The genes that influence type 2 diabetes risk tend to be involved in the development and function of the insulin-secreting pancreas and, to a lesser extent, body fatness. Some of these genes may determine how well beta cells are able to cope with the metabolic battering that accompanies obesity and insulin resistance.
Preventing type 2 diabetes
Some risk factors aren't modifiable: we simply have to live with them. We can't change the genetic cards we've been dealt, nor can we roll back the years of our lives that have elapsed. Still, the risk factors we can control are so powerful that addressing them can eliminate the large majority of type 2 diabetes risk. Several randomized controlled trials have clearly demonstrated this, including the massive Diabetes Prevention Program (DPP) trial. This trial reported that a combination of dietary weight loss and regular exercise reduced the risk of developing diabetes by an astounding 58 percent over a 2.8-year period in pre-diabetic volunteers. Several similar trials conducted in other countries and other racial/ethnic groups reported almost identical results. This is one of the greatest triumphs of modern biomedical science.

Keep in mind that these trials started with people who were already nearly diabetic, and who didn't lose much weight or adhere particularly closely to the intervention. Imagine what a lifetime of healthy living could do.

Stephan is an obesity researcher, neurobiologist, and author. In addition to his research, he enjoys synthesizing and communicating science for a general audience. He has a B.S. in biochemistry (University of Virginia) and a Ph.D. in neurobiology (University of Washington). His blog, Whole Health Source, is a free resource for anyone who loves the science of health.

Investigating a progression of carb and saturated fat intakes

Effects of step-wise increases in dietary carbohydrate on circulating saturated fatty acids and palmitoleic acid in adults with metabolic syndrome
Introduction

Saturated fat reduction has long been a major target of dietary guidelines, although recent meta-analyses have failed to show an association with heart disease. Current recommendations in the U.S. include limiting saturated fat intake to less than 10% of total energy intake. However, a reduction in fat intake typically leads to an increase in carbohydrate intake. A consequence of overconsumption of carbohydrates is increased de novo lipogenesis (DNL). DNL is a process that involves the synthesis of fatty acids from non-lipid sources, such as carbohydrates or amino acids. Interestingly, even energy-balanced diets and single-meal consumption of carbohydrates above the normal oxidative capacity of the body have been shown to increase DNL. The percentage of ingested carbohydrate contributing to DNL is, however, quite minor in people who aren't insulin resistant and aren't overfeeding on refined carbohydrate.
The major end-product of DNL is the saturated fat palmitic acid (denoted 16:0, referring to 16 carbons and zero double bonds), which can be desaturated within the body to form the monounsaturated fat palmitoleic acid (16:1). Higher blood levels of palmitoleic acid have been associated with an increased risk of metabolic syndrome and greater amounts of inflammatory markers. The evidence on palmitoleic acid is mixed, however, as it has also been associated with some positive biomarkers, such as higher HDL and greater insulin sensitivity. Divergent impacts could be due to the effects of different lifestyle factors and different physiological conditions (such as how much of DNL is from adipose tissue versus from the liver).

This study sought to assess how incremental changes in dietary carbohydrate intake and decreases in saturated fat intake affect plasma saturated fatty acid and palmitoleic acid levels. The study was conducted in adults with metabolic syndrome under hypocaloric conditions.

Saturated fat is commonly targeted for reduction by dietary guidelines. This typically leads to an increase in carbohydrate intake, which at high levels may cause the body to create fats through de novo lipogenesis. This study investigated several levels of saturated fat and carb intake to see how they affected plasma saturated fats and palmitoleic acid.

Who and what was studied?

The study was an 18-week controlled dietary intervention in which the participants were initially fed a low-carbohydrate diet that gradually shifted to a high-carbohydrate diet over six consecutive phases (from lowest to highest carb: C1, C2, C3, C4, C5, C6).

Prior to beginning the six feeding phases, the participants were instructed to follow a low-carbohydrate run-in diet for three weeks that mimicked the first low-carbohydrate phase, in order to initiate metabolic adaptations to carbohydrate restriction. Baseline and run-in nutrient intakes were determined with the help of three-day food logs.

All food was provided for the subjects during the 18-week intervention. Participants picked up their meals three to four times per week, and if the subjects could not travel to pick up their food, the researchers arranged for delivery in order to ensure that every subject received their food as planned. Blood testing was done at baseline, after the run-in diet, and after each phase (before transition to the next diet) to determine fatty acid composition and other blood markers.

Over the entire 21-week period (intervention and run-in), the subjects' diets were designed to produce a 300 kcal deficit per day. Resting energy expenditure (REE) was estimated at baseline with indirect calorimetry and multiplied by an activity factor to estimate the total daily energy expenditure (TDEE) of the subjects. Protein was held constant at 1.8 grams per kilogram of ideal bodyweight. As carbohydrates were increased every three weeks over the six feeding phases, total fat was decreased to maintain energy intake. Thus, across the entire study, protein and calorie intake was similar. Saturated fat was also maintained at 40% of total fat intake. In comparison, Americans derive only around 34% of their calories from any kind of fat, with around 13% coming from saturated fat.

Indirect calorimetry
Indirect calorimetry measures the production of
carbon dioxide and consumption of oxygen to estimate heat production. This is then entered into an
equation to estimate resting energy expenditure.
Although not without error, indirect calorimetry
remains the gold standard for measuring energy
expenditure in laboratory settings.
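As an illustration, gas-exchange measurements are commonly converted to calories with the abbreviated Weir equation, and the study's design can then be layered on top. In this sketch the gas values and the 1.5 activity factor are invented assumptions, while the 300 kcal deficit comes from the study:

    # Abbreviated Weir equation: resting energy expenditure (kcal/day) from
    # oxygen consumption and CO2 production, both in liters per minute.
    def ree_weir(vo2_l_min, vco2_l_min):
        return (3.941 * vo2_l_min + 1.106 * vco2_l_min) * 1440  # minutes per day

    ree = ree_weir(0.25, 0.20)   # invented gas-exchange values
    tdee = ree * 1.5             # assumed activity factor
    target = tdee - 300          # the study's designed daily deficit

    print(round(ree), round(tdee), round(target))  # 1737 2606 2306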

Only very-low and non-caloric products, such as coffee, tea, water, and diet soda, were allowed in addition to the provided foods. Beef, eggs, and dairy were the primary protein and fat sources, with higher- and lower-fat versions used depending on the study phase. Low-glycemic carbohydrates were emphasized throughout.

The subjects were 12 overweight and obese men and four women with metabolic syndrome, between 30 and 66 years old (average 44.9), with BMIs ranging from 27-50 kg/m2 (average 37.9). Exclusion criteria included diabetes and liver, kidney, or other metabolic or endocrine dysfunction. Participants who were physically active were asked to maintain their activity levels, while sedentary people were asked not to begin an exercise program.

This study investigated the effects of varying carbohydrate intakes on a group of overweight and obese participants. Study participants initially ate a low-carbohydrate diet that shifted to a high-carbohydrate diet over 18 weeks, in six phases.

What were the findings?

Energy intake (EI) across the feeding interventions averaged about 2,500 kcal per day, and protein intake averaged about 125 grams per day (20% of EI). As designed, protein and energy intake remained constant over the 18-week intervention. As seen in Figure 1, carbohydrate intake started at an average of 47 grams per day (7% of EI) and rose to an average of 346 grams per day (55% of EI). Total fat intake started at an average of 209 grams per day (73% of EI) and dropped to an average of 80 grams per day (28% of EI).

Figure 1: Carb and saturated fat intake by study period

The authors claim that compliance was high, based on verbal communication and inspection of returned food containers. There were no dropouts.

Compared to baseline, fasting glucose and insulin, HOMA-IR (a measure of insulin resistance), and systolic and diastolic blood pressure significantly decreased at C1, but were not significantly altered throughout the six feeding phases.
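For reference, HOMA-IR is derived from fasting glucose and insulin. A minimal sketch using the common formula; the example values are invented:

    # HOMA-IR = fasting glucose (mg/dL) x fasting insulin (uU/mL) / 405
    # (equivalently, glucose in mmol/L x insulin / 22.5).
    def homa_ir(glucose_mg_dl, insulin_uu_ml):
        return glucose_mg_dl * insulin_uu_ml / 405.0

    print(homa_ir(100, 10))  # ~2.47; values around this level are often read as insulin resistance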

Both body weight and fat mass (measured by DXA) showed a significant decline from baseline to C1 (about seven kilograms and four kilograms, respectively), and continued to decline throughout the entire intervention, ultimately resulting in an average loss of about 10 kilograms of bodyweight and eight kilograms of body fat. Neither weight loss nor fat mass was significantly different between C4 and C6, suggesting that most of the change occurred in the first 12 weeks (run-in, C1, C2, and C3).

Despite saturated fat intake starting at 84 grams per day and decreasing to 32 grams per day, the proportion of total saturated fatty acids in blood lipids was not significantly affected. Palmitic acid (16:0), the predominant saturated fatty acid within blood lipids, significantly increased from baseline to C1 and significantly decreased from C1 to C2, C3, C4, and C5. C6 was not significantly different from C1.

Total, LDL, and HDL cholesterol values were not significantly altered across any of the feeding phases. Triglyceride levels dropped about 22% from baseline to C1, stayed constant through C5, and then significantly returned toward baseline values at C6.

Stearic acid (18:0), which is commonly found in animal fats and cocoa, was not significantly changed in cholesterol esters. But from baseline to C1 it was significantly reduced in phospholipids, and it also decreased in triglycerides through the intervention, ending with a significant reduction at C6 relative to C1. Phospholipid concentrations showed an opposite pattern, increasing throughout the intervention and ending with a significant increase at C6 relative to C1.

There was a significant reduction in total monounsaturated fatty acid concentrations from baseline to C1 only. Similar to 18:0, as carbohydrate increased, plasma oleic acid (18:1) decreased in triglycerides but increased in phospholipids.

Palmitoleic acid (16:1) was significantly reduced from baseline to C1 in triglycerides and cholesterol esters, and trended toward an increase in phospholipid concentrations. All these markers showed increasing concentrations with increasing carbohydrate intake, and the intervention ended with significantly greater concentrations of palmitoleic acid at C6 relative to C1.

Lipoproteins and lipid fractions
This study looked at how much palmitoleic acid was contained in three different locations in blood plasma: triglycerides, phospholipids, and cholesterol esters. Lipoproteins shuttle lipids (such as fatty acids and cholesterol) around the body. Phospholipids form the outer shell of lipoproteins, while cholesterol esters and triglycerides make up the majority of the core. So the phospholipid fraction refers to the fats that are contained in the phospholipids, with the same reasoning for the triglyceride fraction and cholesterol ester fraction. Sometimes these different fractions respond the same way to diet, and sometimes they don't. Hence it's important to measure all of them.

There was great individual variation in palmitoleic acid concentrations during each diet phase, with notable outliers. For instance, one subject had triglyceride concentrations of palmitoleic acid rise nearly three-fold from C1 to C4 (2% to about 5.8%) and rise further from C4 to C6 (about 5.8% to 7%). However, another subject showed no changes across the entire intervention, and another showed reductions as carbohydrate intake increased.

Study participants lost body weight and fat over the 18-week intervention, with most of the changes occurring in the first 12 weeks. The blood samples researchers analyzed suggested that carbohydrate intake can influence blood levels of compounds like palmitoleic, stearic, and palmitic acid.

What does the study really tell us?

There are numerous studies showing associations between higher proportions of palmitoleic acid in blood and tissue and adverse health outcomes, such as metabolic syndrome in adults and adolescents, hypertriglyceridemia, type 2 diabetes, coronary heart disease, and prostate cancer. However, since none of these studies establish causality, it is possible that these conditions lead to higher proportions of palmitoleic acid (for example, palmitoleic acid may be the body's attempt at a protective response to what is being eaten) rather than vice-versa. With the mixed associations shown in studies, it is hard to know for sure what the exact health effects of palmitoleic acid are.
It is also difficult to quantify the amount of palmitoleic acid needed to increase the risk of these endpoints, as few studies have done so. In the Physicians' Health Study, a one standard deviation increase in plasma phospholipid palmitoleic acid concentration was associated with a significant 17% higher risk of heart failure, even after adjustment for BMI, alcohol consumption, smoking, exercise, and plasma omega-3 levels.

In the study under review, baseline daily intake of carbohydrate and fat averaged 333 grams and 130 grams, respectively. During the first phase of the intervention, carbohydrate intake dropped to an average of 47 grams, while fat intake rose to an average of 209 grams. It was during this time that the most significant changes in blood lipid fatty acid concentrations occurred, including a major reduction in palmitoleic acid levels. Additionally, this was when significant improvements in insulin sensitivity, blood pressure, and plasma triglyceride levels were observed. However, this was also when the most significant reductions in weight and fat mass were observed, making the causative factor difficult to isolate. And there was no weight-loss-matched control group to account for weight loss effects. Between the lower palmitoleic acid concentrations, the weight and fat loss, and the reduction in carbohydrate intake, we cannot say which came first and which led to which.

On the other hand, by the end of the intervention, when carbohydrate intake was similar to baseline intake (346 grams vs. 333 grams), plasma palmitoleic acid levels returned to levels similar to those observed at baseline despite significantly lower weight and fat mass, strongly suggesting that it was carbohydrate intake that influenced plasma palmitoleic acid levels.

The authors also repeated the entire experiment backwards in five additional subjects (from high to low carbohydrate intake) and found that plasma palmitoleic acid responded in the exact opposite pattern as in the main study group, which supports the idea that carbohydrate intake influences palmitoleic acid concentrations. Even so, the overall diets were hypocaloric, and we cannot conclude how carbohydrate intake would influence palmitoleic acid levels under eucaloric or hypercaloric conditions.

This study provides evidence to suggest that carbohydrate intake influences palmitoleic acid levels. Although the evidence is mixed, high levels of palmitoleic acid in the blood are associated with metabolic syndrome, type 2 diabetes, coronary heart disease, and other health problems. In this study, participants experienced a drop in palmitoleic acid levels when they were eating low-carb meals in the first phase of the study.

The big picture

With 18 full weeks' worth of food provided for the participants, this study provided a well-controlled environment in which to study the effects of diet on palmitoleic acid. Yet despite the findings from this study, the relative risk from various palmitoleic acid concentrations in the blood remains to be determined. In the previously mentioned Physicians' Health Study, the highest quartile had an average palmitoleic acid level of only 0.50%, whereas in the current study, even when phospholipid palmitoleic acid concentrations were at their lowest during the low-carbohydrate phase, absolute concentrations averaged 0.61%, putting these participants above the vast majority of the Physicians' Health Study subjects.
Other blood lipid changes add further complexity to the implications of this study. For instance, increasing carbohydrate intake led to greater phospholipid oleic acid concentrations, which, in contrast to palmitoleic acid, have been shown to attenuate the pro-inflammatory and cytotoxic effects of excessive saturated fatty acid incorporation. Myristic acid, which showed a reduction with carbohydrate restriction, plays a physiologically critical role in de novo ceramide synthesis (necessary for regulating cell differentiation, proliferation, and apoptosis) and has been shown to increase delta-6 desaturase activity (the first step in creating long-chain polyunsaturated fatty acids, such as EPA, DHA, and arachidonic acid, from their short-chain precursors).
The applicability of this study to real-life situations is uncertain. There were only 16 participants, with widely varying BMIs, each using a particular dietary composition for a limited period of time. The effect of carbs on blood lipids was confounded by the weight loss that was designed into the study, without a weight loss control group that would help to isolate the effects of carbs. Also, a variety of different outcomes were measured. So while palmitoleic acid was emphasized in the title and study discussion, other important outcomes had different results. For example, outside of C1, cholesterol and blood pressure didn't change regardless of diet.

The subjects in this study already had metabolic syndrome, so changes in things like blood pressure and triglycerides may be more important than changes in bound plasma fatty acids, since some of these fatty acids are linked to metabolic syndrome (which they already have) while blood pressure may have a more direct impact on their health. Also, circulating free fatty acids, which are linked to metabolic and heart health, were not assessed.

While the total proportion of plasma saturated fats didn't differ in any of the diet phases, different individual plasma fatty acids can have different effects. Palmitic acid, the predominant saturated fatty acid, which was noted in the paper to be a predictor of metabolic syndrome and heart disease, was actually lower in phospholipids (but not in the other two lipid fractions) from C2-C5 than it was during the low-carb C1 or high-carb C6 periods. This finding was not explained, nor were changes in stearic acid and oleic acid. So while a variety of fatty acids were measured and reported, palmitoleic acid was the only one focused on in the discussion. Unfortunately, it was also the only one focused on in many news stories, with inaccurate headlines such as "Heart disease and diabetes risk linked to carbs, not fat, study finds".

It must be noted that this study was funded by the Dairy Research Institute, The Beef Checkoff program, the Egg Nutrition Center, and the Robert C. and Veronica Atkins Foundation. The funding sources did not have a say in designing the study or writing the manuscript. However, these organizations are quite clearly interested in the research on saturated fatty acids, thus the variety of studies funded by them. The primary investigators are also noted low-carb researchers. This doesn't mean the study is biased, but it is one thing to keep in mind when interpreting the study findings. A given topic (here, the effect of carbohydrate intake on plasma saturated fatty acids) can be explored in a variety of different ways, and the results can be interpreted by the study authors in different ways as well. It's important to look at the broader context of the literature and the nitty-gritty study details rather than just take the authors' word for it.

Other plasma fatty acids, such as palmitic,


myristic, and oleic acid, may be important for
evaluating the health effects of different carbohydrate and fat intakes. Although measured, these
were not a focus of the study. Nor were more
direct predictors of heart and metabolic health,
such as blood pressure. The study was funded by
dairy, beef, and low-carbohydrate organizations.

28

Frequently Asked Questions

What else influences plasma palmitoleic acid levels?


The current study lends support to the idea that palmitoleic acid concentration in the plasma is more reliant on carbohydrate intake than fat intake. However, the study was conducted under hypocaloric conditions, and previous research has suggested that dietary intake of palmitoleic acid (which is abundant in a few select foods, such as macadamia nuts) does significantly influence plasma concentrations during weight maintenance. Alcohol has also been suggested to reduce palmitoleic acid concentrations, with one study reporting significantly lower levels in people consuming more than 100 mL of ethanol per week (about seven regular 12-ounce beers) compared to people consuming less. This study also found palmitoleic acid concentrations to be independent of smoking status.
How do various biomarkers of fatty acids in the body differ?
Biomarkers of fatty acid composition differ from dietary intake in that biomarkers reflect both the intake and the utilization of the fatty acids. Because people differ in how they absorb, transport, and metabolize nutrients, biomarkers allow us to look beyond simple dietary intake and focus on the physiological consequences of consuming certain substances. Moreover, biomarkers can provide a longer-term picture of dietary intake.
Due to the essential nature of fatty acids in cell structure, assessment can involve numerous body tissues in addition to blood and urine (e.g. hair, nails, skin, breath, saliva, feces). However, measuring blood plasma is the most common method. Serum triglycerides reflect dietary intakes over the past hours to days, whereas cholesterol esters and phospholipids reflect daily intakes. Only body fat stores (adipose tissue) tend to reflect long-term dietary fat consumption (e.g. years), and even this measure can be inaccurate in people who have experienced cycles of fat loss and gain.
How strongly is palmitoleic acid associated with heart disease, when compared to other biomarkers?
Although statistically significant, the strength of the relationships between palmitoleic acid and health parameters is low to moderate. For instance, in one study of over 3,200 Chinese adults, palmitoleic acid concentrations could only explain about 37% of the variance in triglyceride levels and 14% of the variance in HDL-cholesterol levels.
It should also be kept in mind that fatty acid levels in any biomarker represent a proportion, not an absolute measure. Thus, greater incorporation of certain fatty acids into the biomarker can reduce the percentage of other fatty acids without their absolute amounts changing. All of the aforementioned studies demonstrating associations between fatty acids and health outcomes were based on percentages, making it difficult to draw conclusions, as these are relative rather than absolute values. One person could have double the amount of palmitoleic acid in serum as another person and still have a similar percentage if they also have double the amount of total blood lipid.
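To make the proportion problem concrete, here's a toy calculation (our own illustration, with invented numbers):

```python
# Invented numbers: person B has double person A's blood lipid across the
# board, so absolute palmitoleic acid doubles while its percentage of
# total fatty acids stays identical.
person_a = {"palmitoleic": 0.5, "other_fatty_acids": 99.5}  # arbitrary units
person_b = {name: 2 * amount for name, amount in person_a.items()}

for person in (person_a, person_b):
    total = sum(person.values())
    print(round(person["palmitoleic"] / total * 100, 2), "%")  # 0.5% both times
```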
There is also evidence of seasonal variation in fatty acid profiles. One early study showed greater proportions of saturated fatty acids in the adipose tissue of the legs and arms during summer compared to winter. This difference was attributable to a reduction in palmitoleic and oleic acid levels, with a simultaneous increase in palmitic, myristic, and stearic acid levels. Although these changes were in adipose tissue and not serum biomarkers, this raises the question of whether the current study could have been influenced by seasonal changes, as its six-month duration, by necessity, spanned more than one season. Since subtle changes in plasma fatty acid levels were tracked over increments of time, it would be difficult to differentiate which changes were at least partly a result of the season.
What dietary sources have a lot of palmitoleic acid in them?
According to the USDA nutrient database, roasted chicken skin from the leg and thigh contains the greatest amount of palmitoleic acid, with 2.8 grams per 100 grams of food. Beef fat follows with about 1.9 grams, then turkey skin with 1.34-1.5 grams, and finally butter at 0.96 grams. In short, poultry skins contain the most palmitoleic acid on average, followed by beef fat and butter. Macadamia oil is also a rich source, containing 19% palmitoleic acid.
Keep in mind that palmitoleic acid is different from trans-palmitoleic acid. The latter comes from very limited sources, mostly red meat and dairy from grass-fed cows, and is not synthesized by the body.
Trans-palmitoleic acid in plasma lipids and adipose tissue has been repeatedly associated with better metabolic outcomes, as shown in this paper by ERD reviewer Stephan Guyenet, Ph.D.
Are there benefits to palmitoleic acid from diet? In plasma? Elsewhere?
A very recent study, published in December of 2014, found that feeding mice 300 milligrams of pure palmitoleic acid per kilogram of bodyweight daily, in addition to their normal diets for ten days, significantly increased glucose uptake in fat tissue through increased expression of glucose transporter type 4 (GLUT4, which is necessary for insulin-stimulated glucose uptake into tissues). This was despite no changes in plasma fatty acid levels.
Earlier studies have also found palmitoleic acid to enhance glucose uptake and insulin sensitivity of skeletal muscle, and to reduce liver fat buildup. The authors of this study suggest that palmitoleic acid may act as a major signaling lipid produced by fat tissue for communication with distant organs. In obese sheep, infusion of palmitoleic acid twice daily for 28 days before beginning an obesogenic diet preserved insulin sensitivity, possibly through a reduction of intramuscular fat.
It appears that the benefits of palmitoleic acid revolve around insulin-mediated glucose disposal into both muscle and fat tissue. This raises an interesting contradiction with the studies demonstrating associations between palmitoleic acid levels in the blood and some adverse health outcomes, such as diabetes. Like certain cholesterol markers, palmitoleic acid may be more of an indicator that something might be physiologically wrong rather than a cause. De novo lipogenesis (DNL) is one possible cause of increased palmitoleic acid levels, and very high levels may be a marker that something is increasing DNL to dangerous amounts (such as prolonged overeating of carbohydrate, or worsening glucose tolerance from uncontrolled diabetes, both of which can disrupt carbohydrate metabolism). Suggesting that palmitoleic acid is 100% detrimental does not seem accurate given the complexity of evidence on the topic.

What should I know?

This study suggests that the presence of certain fatty acids in blood lipids appears to depend more on carbohydrate than fat intake under hypocaloric conditions in overweight and obese people with metabolic syndrome. There were minor but uniform changes in a few select fatty acids, such as myristic acid, oleic acid, and palmitoleic acid, but no significant changes in total saturated and monounsaturated fatty acid concentrations.
There was also inter-individual variance in the palmitoleic concentration response to carbohydrate
intake, which is important given the small sample
size. While most subjects showed a positive association, others stayed relatively unchanged and
some showed an inverse association. Moreover,
there was greater variance as carbohydrate intake
increased. The absolute palmitoleic concentration
varied between about 2-4% in plasma triglycerides
when carbohydrate intake was lowest during C1, but
varied between about 2-7% during the high-carbohydrate C6 phase.
Still, the implications of changes in plasma palmitoleic acid levels have yet to be determined. Many studies
demonstrate associations between adverse health
outcomes and increased palmitoleic acid levels, but
reverse causality cannot be ruled out, nor differing
impacts of palmitoleic acid in different contexts. We
also do not know what influence many other dietary,
lifestyle, and environmental factors have.
Rather than having obvious health implications for differing carb levels, this study serves as additional evidence for those eating low-carb, higher-saturated-fat diets (and losing weight) who are apprehensive about impacts on their plasma fatty acids. As is the case with cholesterol, what you eat does not translate directly to what is floating around in your
blood. However, the lack of correlation between
dietary saturated fat and plasma saturated fat was
already shown by a previous paper from the same
research group (albeit only the triglyceride fraction
was studied).

It's also important to know what this study does not show: it doesn't show that DNL happens at major or dangerous rates when eating moderate carb levels, it doesn't show that increasing levels of carb intake increased overall plasma saturated fat, and it doesn't prove that low-carb diets are superior to moderate-carb diets for heart or metabolic health. While weight loss decreased as carbs were added, that may very well be due to increased water weight or changes in compliance.
The authors conclude that the increased proportions of palmitoleic acid may signal impaired carbohydrate metabolism, yet in vitro and animal studies have suggested that palmitoleic acid is insulin-sensitizing. It seems prudent not to draw health-based conclusions from this study. Rather, the conclusion appears to be that consumption of carbohydrates can have an impact on plasma fatty acid proportions in overweight and obese individuals under hypocaloric conditions. Whatever health implications this may have will require further testing to illuminate.
The health implications of this study are unclear.
The lack of impact of dietary saturated fat on plasma saturated fatty acids was already shown in
previous studies. This study did show an effect of
carbohydrate on palmitoleic acid levels, but the
relative importance of that is unknown.

Low-carb diets are nothing if not controversial. For some evidence-based discussion of their potential health effects, check out the ERD private Facebook group.

Whence the hype?

The association between exaggeration in health-related science news and academic press releases: retrospective observational study
Introduction

When it comes to health news, even though we know not to believe the hype, hype still happens, and it has an impact. Not only is the public's use of health care services influenced by the media, but even professionals aren't immune. Press coverage of medical research findings is associated with those findings being cited more by other scientists. Even doctors in the ER test more for certain infections that have been getting heavy press coverage.
Since the press is so influential, it's important that the media report medical findings accurately. But it doesn't seem like that's happening: past research has shown media coverage of medical and nutritional research is often distorted.
But all of this doesn't imply that the blame lies with the science journalists. They are often under immense pressure to write more, fast, which encourages reliance on press releases and summaries from news agencies, universities, and other public relations outlets. It's quite possible that journalists are reporting the information they receive (fairly) accurately, and instead it is the information sources they rely upon that lead to media hype.
Indeed, a previous study of press releases from medical centers found that many provided exaggerated claims, while few provided caveats and precautions about their claims. Similar results were found in cancer genetics research, where press releases often exaggerated causal claims, which were then repeated by the media. But the origin of the hype may go back even further than press releases. One study found that exaggerated claims often could be traced back to the abstract of the original journal article.


The purpose of the study under review was to expand upon the research above and trace the source of the hype in health science news.
Hype is ubiquitous in health news reporting, but it may come from places other than journalists exaggerating findings. Health news impacts not only the general public, but also physicians and other researchers.

Who and what was studied?

The researchers began by searching for publicly accessible press releases from Russell Group universities (the top 20 research universities in the UK) that covered research related to human health and were based on peer-reviewed research published in scientific journals in 2011. For each press release based on published scientific research, associated print and online news stories were then located. Broadcast news wasn't examined in this study.


With all of this information in hand, the researchers rigorously defined how hype was created at each of the three stages: original journal article, press release, and news report.
This was accomplished by creating a detailed coding system, which reviewers could use to grade each source for the kinds of claims being made, in order to compare hype levels and notice any differences between the research, press releases, and news reports. To do this rigorously, the researchers focused on three specific areas (a simplified coding sketch follows the list):
- Advice-giving (e.g. "Eating chocolate may be beneficial for..." or "Doctors should advise patients to..."). This was coded at four levels, depending on how implicit or explicit the advice was.
- Causal statements from correlational research (e.g. "drinking wine might increase cancer risk..." from a study that only observed correlations between these two things). This was coded at seven levels, on a continuum from statements that explicitly mentioned correlation, to those that were ambiguous (e.g. "wine linked to cancer risk"), to those explicitly implying causality (e.g. "drinking wine increases cancer risk").
- Conclusions phrased in human terms when research was done on animals, cells, or simulations (e.g. "a pregnant woman's stress levels..." concerning studies that were only done in rats). These were also coded at different levels, depending on how implicitly or explicitly the conclusions were stated.
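To illustrate how an ordinal coding scheme like this can flag exaggeration, here is a simplified sketch. It is ours, not the authors' actual rubric: their causal scale had seven levels, and the level labels below are invented for illustration.

```python
# Simplified ordinal scale for causal language; the study's real rubric
# had seven levels, plus separate scales for advice and human inference.
CAUSAL_LEVELS = [
    "explicitly correlational",  # "wine intake was associated with cancer risk"
    "ambiguous",                 # "wine linked to cancer risk"
    "conditional causal",        # "wine might increase cancer risk"
    "explicitly causal",         # "wine increases cancer risk"
]

def is_exaggerated(journal_claim: str, downstream_claim: str) -> bool:
    """A press release or news claim counts as exaggerated when it sits
    higher on the causal scale than the journal article's own claim."""
    return CAUSAL_LEVELS.index(downstream_claim) > CAUSAL_LEVELS.index(journal_claim)

print(is_exaggerated("ambiguous", "explicitly causal"))  # True
```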


When coding for advice, the entire journal article, press release, or news story was examined; a total of 213 press releases (116 of which had related news reports) and 360 news stories were included. For causal statements, only the title and first two sentences of press releases and news stories were coded, since news writing is formulaic and often follows an "inverted pyramid" structure, where the main claims are stated first; a sample of 182 press releases (95 with news) and 261 news stories was used here, and only the abstract and discussion were coded for the original journal articles. Finally, when examining human conclusions from non-human studies, the main statements of 105 press releases (48 with news) and 115 news articles were coded, while again only the abstract and discussion sections of journal articles were coded.
Two other areas were also examined to gauge how well-justified the claims made in press releases and news articles actually were. This was done by noting which press releases and news articles had explicit caveats to their causal claims, advice, and inferences to humans (e.g. "The scientists who carried out the study emphasized that they could not say for certain...") and which had explicit justification for any of these three types of claims (e.g. "even after taking into account the effect of extra body weight on blood pressure, there was still a significant link with sweetened drinks"). In addition, some other facts about the studies being reported were collected as well, such as duration, sample size, and sources of quotes.
The researchers explicitly took the peer-reviewed journal article as the baseline for the claims made in press releases and news stories concerning the research. The original journal articles themselves were not fact-checked or examined to see if they were over-hyping anything, which is not surprising, given that the authors of this study aren't likely to be experts in dozens of biomedical and health research areas. So hype was measured by whether press releases and news articles were exaggerated compared to the original journal article.
If the original journal article itself contained hype, this study would not be able to detect it. But if hype does exist in the original peer-reviewed research (and the authors of this study think it's likely), then any hype found in this study is likely an underestimate of overall hype, since hype originating in the peer-reviewed scientific literature is not being taken into account.
The researchers were also careful to make sure that their coding scheme was reproducible. They did this by double-coding 27% of press releases and journal articles, and 21% of news stories, finding a 91% concordance rate. The researchers then ran simulations to make sure that a 10% discrepancy in coding wouldn't affect their main conclusions, and it didn't.
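The paper doesn't publish its simulation code, but a robustness check in that spirit might look something like this sketch (all numbers are illustrative, keyed only to the ~40% exaggeration rate and ~10% coder disagreement reported here):

```python
import random

random.seed(42)
# 40 of 100 press releases coded as exaggerated; flip ~10% of codes at
# random, mimicking coder disagreement, and see how far the estimate drifts.
codes = [1] * 40 + [0] * 60

def perturbed_rate(codes, flip_prob=0.10):
    flipped = [1 - c if random.random() < flip_prob else c for c in codes]
    return sum(flipped) / len(flipped)

rates = sorted(perturbed_rate(codes) for _ in range(10_000))
print(rates[250], rates[-250])  # middle 95% stays roughly in the 0.36-0.48
                                # range, so a "~40%" headline figure is stable
```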
Researchers examined press releases from the top
20 research universities in the UK to determine
the origin of hype, or exaggeration, in media
reports on new scientific findings. Exaggeration
was determined by the presence of advice unsupported by scientific evidence and inappropriate
extrapolation of evidence.

What were the findings?

The researchers found that 40% of press releases contained more direct or explicit advice than the journal articles upon which they were based. Similarly, 33% of press releases contained more strongly worded claims of causation than the associated journal article warranted, and 36% of press releases inflated relevance to humans from non-human studies. So it seems that press releases tend to add quite a bit of hype in all three areas studied.
It was also found that 36% of news reports contained more direct or explicit advice than the corresponding journal article. However, this does not necessarily imply that the journalists were the ones inflating the advice: the odds of exaggerated advice in news were 6.5 times higher when the press release contained exaggerated advice than when it didn't.
A similar pattern held for the two other areas of hype examined. While 39% of news articles were more strongly deterministic than warranted by the associated journal article, the odds that the news had distorted causal statements were 19.7 times higher if the press release also contained distortions. Similarly, 47% of news articles reporting on non-human studies contained exaggerations, with the odds of these exaggerations being 56 times higher if the press releases contained similar distortions. As seen in Figure 1, hype occurs in both press releases and news articles, and it's much more likely to be present in news articles if the press releases also contain hype.

Figure 1: How press release hype correlates with news hype
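For readers unfamiliar with odds ratios, here is how a figure like "6.5 times higher odds" is computed from a 2x2 table; the counts below are invented for illustration, not the study's raw data:

```python
def odds_ratio(exposed_yes: int, exposed_no: int,
               unexposed_yes: int, unexposed_no: int) -> float:
    """Odds ratio from a 2x2 table of counts: (a/b) / (c/d)."""
    return (exposed_yes / exposed_no) / (unexposed_yes / unexposed_no)

# Invented counts: news stories with vs. without exaggerated advice,
# split by whether the underlying press release exaggerated.
print(odds_ratio(39, 30, 10, 50))  # -> 6.5, the kind of ratio reported above
```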
So there is hype, but why? The authors hypothesized that one possible motivation for exaggerating claims in press releases could be to increase the chance that the press release will be picked up and reported by the news. But when the researchers looked at the data, they found no statistically significant association between the percentage of press releases that had at least one news story published on their topic and whether or not the press release was hyped in any of the three ways this study examined. Also, the average number of news stories per press release did not vary between any of the three types of hype. So, whatever the motivation, hype in press releases is not actually correlated with more press coverage.
Finally, the researchers found that caveats about, and justifications for, the claims being made were quite rare in both press releases and news stories, with at most 17% of these claims having some sort of caveat or justification (depending on the type of claim and source). There was no association observed between caveats and justifications in press releases and news uptake. But there was a strong association between press releases having caveats and justifications about their claims and news sources having them as well.
The results of this study show that about 40% of
press releases generated by the scientists contain
the seeds of hype: exaggeration. Moreover, news
reports based on hyped-up press releases tended
to contain more hype and exaggeration than news
reports based on press releases with cautionary
statements.

What does the study really tell us?

The study tells us that both biomedical press releases and news reports contain exaggerations that go beyond the peer-reviewed journal articles upon which they are based. Specifically, this study looked at exaggerations of three kinds, and found that 33% to 40% of press releases and 36% to 47% of news reports contained stronger inferences than were warranted by the peer-reviewed journal articles, depending on the type of exaggeration.
This study also tells us that news reports were much more likely to contain these types of exaggerations if the associated press releases also had them: the odds that news reports would contain exaggerations were 6.5 to 56 times higher if the associated press release also had such exaggerations.
While the population under scrutiny was press releases from Russell Group universities in the UK, the authors explicitly state that they have no reason to suspect that this group of universities differs from other sources of press releases in any significant way, although this claim was not supported or argued for in the paper. If the authors are correct, these results should be generalizable to press releases outside of Russell Group universities and to the news based on those releases.
Overall, these results are at least consistent with the hypothesis that a lot of the hype found in medical reporting originates not with the journalists reporting the news, but with the press releases written by universities.

Why correlation doesn't necessarily equal causation
Correlation just means that when you see one thing occur a lot, another thing occurs a lot along with it. For instance, in this study, there is a strong correlation between news reports that have hype and press releases that have hype. Assuming the observed correlation is actually true, there are generally three explanations for why it could occur:
- A causes B: it could be that press releases containing exaggerations are indeed picked up by the media and repeated.
- B causes A: exaggerated news stories about a piece of research lead to exaggerated press releases. Assuming that press releases are written before the news stories, though, this possibility is unlikely here, since causes don't work backward in time.
- Some third factor causes both A and B: perhaps journalists are ignoring the press releases and working directly from the journal articles and interviews with the researchers, and perhaps the press releases draw on those same sources. The source of the hype in this case would be the original researchers.
There is no way to differentiate between these three possibilities from a correlation alone. However, one can narrow down the possibilities through independent reasoning, as we did with temporal reasoning above. If possible, though, the best way to establish causation is not through observational studies like this one, but through carefully controlled experiments where researchers actively intervene by changing only one variable and then comparing what happens against a control group. This is part of the reason why randomized, double-blinded, placebo-controlled trials are the gold standard in the biomedical sciences.

But before jumping to conclusions about causality ourselves, an important caveat must be mentioned, one of which the authors of this study were also well aware: this study was observational in nature, which means that although it can provide information on correlations, causality cannot be directly inferred.

However, there are several lines of reasoning to suggest that press releases are indeed a major reference for news articles. First, other retrospective and prospective studies have found that press releases influence news. Second, the researchers of this study took a look at the dates, quotes, and areas of focus in press releases and news reports, and all three point to reliance on press releases by the media. Specifically, news stories were only selected if they were published within 30 days of the press release date, and the authors found that 87% of the news articles selected were released within one day of the publication of the associated press release, leaving very little time for the journalists to do any additional independent research. Also, 72% of quotes found in news stories were also found in the press release, which again points to the media's reliance on press releases. Finally, study details such as sample size and study duration were very rarely reported in the news if the press release did not include similar details, but were reported in the majority of news articles when the associated press release included them.

So, while causation cannot be definitively established in this kind of observational study, there is additional evidence that at least points in the direction of exaggeration in press releases leading to exaggeration in associated news articles.

Finally, the authors found no statistically significant correlation between whether press releases had any of the three types of hype examined and whether and how much they were picked up by the media. They also found that press releases including caveats and justifications didn't seem to affect news coverage. So cautious, carefully crafted press releases do not seem to be correlated with lower press coverage, and over-hyped press releases don't seem to get more press, either.


Though the study was observational in nature and did not attempt to determine if the original journal articles contained hype, it provides evidence to suggest press releases can significantly influence the way news is presented to the public. The research also suggests that hyped-up press releases get the same amount of coverage as press releases with cautionary statements, since the news media rely on press releases either way.

The big picture

Media is often blamed for hyping medical findings, but this study adds to a growing body of research suggesting that the fault does not lie solely with journalists. Many of the exaggerations found in the news were also found in the press releases on the same topic, which preceded the news reports. Since press releases are often crafted in collaboration with scientists, both non-scientist writers at universities and scientists themselves can take responsibility for more accurate biomedical reporting by crafting more careful press releases.
Journalists could in theory take more time to
independently check facts and read the pertinent
background literature, but the current journalistic culture has put a lot of pressure on journalists
to produce more material in less time than ever,
and so journalists may be forced to rely on easier
and quicker sources of information, such as press
releases and information from news agencies. And
an entire journalistic culture can be very hard to
change, particularly when it's encouraged by a
changing industry.
It may seem that there are a lot of troubling findings in this study. But because the authors found no incentive to hype up press releases (since more hype doesn't lead to more press), they end with a hopeful message: a relatively small handful of people in universities can help create better health information for everyone by crafting more accurate press releases, at little cost to themselves.

Frequently asked questions

Do scientists hype up their results in peer-reviewed journal articles?
This study didn't examine hype in the original journal articles, instead using the peer-reviewed articles as a baseline. However, the authors were clear that they thought it quite possible that hype occurs in the peer-reviewed literature, too. Since assessing spin and hype in the scientific literature takes expertise in specialized fields, this is a much harder question to assess. Notably, one study did find that spin of a certain sort could be traced back to journal article abstracts, which is a good reason to read more than just the abstracts!
Just how pervasive is the hype?
It's important to emphasize that just because hype was found, that doesn't mean everything is exaggerated; that, itself, would be an exaggeration! In this study, a large minority of news reports had exaggerations of some sort, but it was still the minority.
One of the exaggerations these authors looked at was extrapolating results from non-human studies to humans. Why is this a bad thing?
Because less than 10% of animal findings end up being used clinically in humans. There are lots of reasons for this, from physiological differences, and differences between how an induced disease model behaves in an animal compared to naturally occurring human diseases, to systematic biases and methodological flaws in animal studies. Overall, there are many unknowns.
Animal experiments are very important for pointing out promising leads for scientists to test clinically down the road, and they also help our understanding of basic biomedical science. However, it's pretty poor reasoning to think that if something worked once or twice in a petri dish or a rat, it'd definitely work in a human. A therapy may be wildly successful in rats, but cause terrible headaches and suicidal ideation; rats don't report those side effects as effectively as humans do.
So, if the press gave me accurate information, I'd be able to make accurate decisions, right?
Not necessarily. The idea that
good information is all people need to draw good conclusions is called the "deficit model" of the public understanding of science: if people didn't have a deficit of knowledge, they'd make good choices and have a greater respect for science; the cold, hard facts are all that's needed. But this model suffers from some serious flaws.
Knowing the facts doesn't mean you'll act on them. Good reasoning skills, an understanding of extra-scientific culture and methods, and much more are often needed as well. One of the motivations behind the Examine Research Digest is to give you, the reader, at least one more piece of the puzzle of supplement science. We're not just trying to spoonfeed you facts; we also hope to help you learn how to reason through research a little better, a little bit at a time.
Also, science is an iterative process, and the popular press is not a great tool for reflecting that. If you only read about studies in the media, science seems to contradict itself constantly, but that's largely because individual studies are conducted differently and some may contain errors. The overall weight of the evidence is much less affected by any individual study, and that often isn't reflected in the evening news.

What should I know?

Hype in news coverage of biomedical research is correlated with hype found in press releases from universities. This strongly implies (but, since this was an observational study, does not definitively establish) that hype mostly starts not with journalists, but with the university press releases that summarize the biomedical research; journalists often simply report the hype that already exists in the press releases. Furthermore, hyped-up press releases don't seem to draw more news coverage, so there's little real incentive for universities to hype up biomedical research.
Keep in mind that every link in the research and reporting chain can have an incentive to exaggerate. While researchers do have a degree of accountability due to peer review, the system is imperfect. Funders can also indirectly influence research by selectively funding certain studies, which researchers are well aware of when applying for grants. And this study shows that exaggerations or inaccuracies can be amplified further at the reporting level. So truly understanding a research topic often requires not just knowledge of the specific topic at hand, but a deep and broad knowledge of how research works.
To discuss recent examples of exaggeration in the media and press releases, join us at the private ERD readers Facebook group.

Running on empty: can we chase the fat away?

Body composition changes associated with fasted versus non-fasted aerobic exercise
Introduction

The idea of fasted cardio to accelerate fat loss has been, for the most part, based on a key assumption: with no food in our system, our fat stores are the go-to energy source during low- to moderate-intensity cardio training. The use of fat is facilitated by low levels of liver glycogen and insulin, and short-term studies suggest that fasted cardio does increase fat oxidation over 24 hours. It stands to reason that, done on a sustained basis, this might produce a greater amount of fat loss than the same training done after eating breakfast or in the afternoon. But is this assumption correct?


Who and what was studied?

This is the first study to investigate the chronic effects of fasted versus fed cardio training on body composition during a diet, which is likely a common situation for dieters; previous research was done on isocaloric or hypercaloric diets. Twenty university-aged females (average age of 22.4 years) were recruited to participate in one hour of treadmill running, three days per week, while following a hypocaloric diet for four weeks.
All the women reported habitual aerobic exercise several days per week (some were off-season collegiate track and field athletes), but none were involved in any resistance training programs. The exclusion criteria included injuries and medical complications, in an attempt to ensure the women were otherwise healthy.
As seen in Figure 1, the treadmill running consisted of a five-minute warm-up and cool-down at 50% of the age-determined maximal heart rate (MHR), separated by a 50-minute bout at 70% MHR. Heart rate monitors were used to ensure exercise was at the appropriate intensity. The hypocaloric diet consisted of customized dietary plans that induced a 500 kcal daily deficit. Food was not provided by the investigators.
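The paper says the MHR was "age-determined" without giving the formula; assuming the common 220 - age estimate, the training targets would look roughly like this:

```python
# Assumes the common "220 - age" rule; the study only says "age-determined".
age = 22.4                       # average participant age
mhr = 220 - age                  # estimated maximal heart rate
print(round(0.50 * mhr), "bpm")  # warm-up / cool-down target (~99 bpm)
print(round(0.70 * mhr), "bpm")  # 50-minute main bout target (~138 bpm)
```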

The Mifflin-St. Jeor equation, multiplied by the "moderately active" activity factor, was used to estimate total daily energy expenditure (TDEE), and 500 kcal was cut from this value. Protein was set at 1.8 grams per kilogram of bodyweight and fat at 25-30% of total kcal. Adherence to the diet was monitored through the participants' self-reporting, using MyFitnessPal.

Figure 1: Outline of cardio training protocol and diet

The women were pair-matched based on initial body weight and divided into two groups: FED and FASTED. Athletes and non-athletes were evenly distributed between the groups. All the women completed the exact same four-week diet and exercise program, with the only difference being the timing of a meal replacement shake (250 kcal; 20 grams protein; 0.5 grams fat; 40 grams carb). The FASTED group consumed the shake immediately after the training session, while the FED group consumed it immediately before. The protein was from whey and the carbohydrates from maltodextrin.

Mifflin-St. Jeor Equation
Men: TEE = (10W + 6.25H - 5A + 5) × AF
Women: TEE = (10W + 6.25H - 5A - 161) × AF
W = weight (kg); H = height (cm); A = age (years); AF = activity factor
- Sedentary (little or no exercise): × 1.2
- Lightly active (light exercise/sports 1-3 days/week): × 1.375
- Moderately active (moderate exercise/sports 3-5 days/week): × 1.55
- Very active (hard exercise/sports 6-7 days/week): × 1.725
- Extra active (very hard exercise/sports and a physical job): × 1.9
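Putting the sidebar's equation to work, here's a quick sketch; the participant numbers are hypothetical, chosen only to resemble the study's reported averages:

```python
def mifflin_st_jeor_tdee(weight_kg, height_cm, age_y, activity_factor, female=True):
    """Resting energy expenditure (Mifflin-St. Jeor), scaled by activity."""
    ree = 10 * weight_kg + 6.25 * height_cm - 5 * age_y + (-161 if female else 5)
    return ree * activity_factor

# Hypothetical participant resembling the study averages (not actual data):
tdee = mifflin_st_jeor_tdee(63, 165, 22.4, activity_factor=1.55)
print(round(tdee), round(tdee - 500))  # ~2152 kcal TDEE, ~1652 kcal diet target
```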
Body composition was assessed with the BodPod, which has been shown to be reliable when used with young athletic women. At baseline, the only significant difference between the groups was age, with the FED group being, on average, almost three years younger (21 vs. 23.8). The average baseline BMI was 23.3 kg/m2, and the average baseline body fat percentage was 26.3% in the FASTED group and 24.8% in the FED group.
This study investigated the effects of fasted cardiovascular exercise in young women following a hypocaloric diet. Study participants reported their food intake through MyFitnessPal and performed one-hour treadmill runs three times a week for the duration of the study.

What were the results?

After the four-week intervention, there was no significant difference between the groups in any measure of body composition. Both groups had significant reductions in weight, BMI, and fat mass, with trends toward reductions in body fat percentage and waist circumference, all while preserving their lean body mass; between the groups, however, there were no significant differences.
Specifically, the average weight lost in the FASTED and FED groups was 1.6 vs. 1.0 kilograms, respectively, while the average fat lost was 1.1 vs. 0.7 kilograms. On the surface this may suggest a small advantage for the FASTED group, with the relatively small sample size or short study duration limiting the statistical power to detect significant differences. However, these differences were not even trending toward significance. In fact, the p-values averaged 0.8-0.9 for the various body composition measurements (these p-values were obtained through correspondence with a study author), meaning that group differences of this size or larger would be very likely to arise from chance alone. Finally, the FASTED group started with a slightly greater body fat percentage and fat mass, providing greater opportunity for fat loss from the beginning. Still, it's possible that a larger sample size or longer duration may have changed the results.
The study also suggests that there was a dietary disconnect in the young women during the study.

How does the BodPod work?
The BodPod is actually the name of one of two commercially available models for body composition testing (the other being the PeaPod, which is used with infants). The BodPod works through a method called air displacement plethysmography: the volume of the body is measured indirectly by determining the volume of air it displaces within an enclosed chamber (the BodPod). Once volume is known, the density of the body can be calculated from this value and the person's weight. The density is then entered into one of several population-specific conversion formulas to estimate percent body fat. It works much like the body-fat-estimating dunk tank (hydrostatic weighing, formerly the gold standard before DXA), except that it uses air instead of water.
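As a concrete illustration of the density step, here is a sketch using Siri's two-compartment equation, one common density-to-body-fat conversion (the BodPod itself offers several population-specific formulas, and the reading below is hypothetical):

```python
def percent_body_fat(mass_kg: float, volume_liters: float) -> float:
    """Body density from mass and BodPod-measured volume, converted to %BF
    via Siri's two-compartment equation (one of several options)."""
    density = mass_kg / volume_liters  # kg/L is numerically equal to g/mL
    return 495 / density - 450

# Hypothetical reading: a 63 kg person displacing 60.3 L of air.
print(round(percent_body_fat(63, 60.3), 1), "% body fat")  # ~23.8
```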

TDEE was estimated at around 2,150 kcal, which put the dietary plans at about 1,650 kcal a day. At this 500 kcal per day deficit, fat loss should happen at around one pound per week, if we assume one pound of fat is 3,500 kcal (a rule of thumb that is not always accurate). However, average fat loss was only 40-60% of this amount, and total weight loss was only 50-90%, which suggests either that the women consumed more kcal than they were told, or that the TDEE estimate overshot actual requirements.
Either way, the women reported consuming an average of around 1,240-1,280 kcal/day, which is around 400 kcal less than they were told to eat. Similarly, they only consumed about 1.2 grams of protein per kilogram of bodyweight per day, compared to the planned 1.8 grams per kilogram. Whether the women were under-reporting, under-eating, or a combination of both remains unknown. However, under-reporting is somewhat likely, given its prevalence in previous weight loss studies. You can plug weight, calorie intake, and calorie expenditure into Dr. Kevin Hall's NIH body weight simulator to estimate weight loss for different situations, whether from research or real life.
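A back-of-the-envelope check of expected versus observed fat loss, using the imperfect 3,500 kcal-per-pound rule discussed above:

```python
deficit_kcal_per_day = 500
days = 28                                          # four weeks
expected_lb = deficit_kcal_per_day * days / 3500   # 4.0 lb of fat
expected_kg = expected_lb * 0.4536                 # ~1.8 kg

for group, observed_kg in {"FASTED": 1.1, "FED": 0.7}.items():
    print(group, round(observed_kg / expected_kg * 100), "% of expected")
# Prints roughly 61% and 39%, matching the 40-60% range noted above.
```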
Both the fed and fasted groups had lost weight by
the end of the study, with the fasted group having
lost slightly more. However, these results were not
significantly different.

What does the study really tell us?

This study clearly shows that a caloric deficit coupled with moderate-intensity aerobic exercise results in weight loss. It is novel in that it also shows no difference in weight or fat loss whether the cardio is performed in a fed or fasted state, at least over the four weeks tested. This may seem counterintuitive based on simple biological reasoning, but makes more sense when we take into account how adaptive and complex the human body really is. For instance, it has been previously demonstrated

that consumption of a light Mediterranean breakfast before 36 minutes of moderate-intensity treadmill running results in significantly greater utilization of fat 12 and 24 hours after the training session, compared to the same exercise session done fasted. It may be prudent to view body composition goals and fat loss over the course of days rather than hours, since the body uses readily available fuel but then stores any leftover fuel over time.
It is also worth considering the effect of the pre-cardio meal. The meal replacement shake used in the current study contained 40 grams of carbohydrate in combination with 20 grams of protein. It has been shown that carbohydrate ingestion before and during moderate-intensity cardio reduces the expression of genes involved in fat metabolism. However, it has also been shown that although carbohydrate ingestion suppresses fat breakdown, the rate of fat breakdown can still exceed the amount of fat needed for energy production, and thus the carbohydrates may not limit fat oxidation.

The big picture

Is it fair to say that this study has put a nail in the coffin of fasted cardio? Well, not really. For young women at a healthy weight who are truly eating under maintenance, it is very applicable. But the duration was fairly short at four weeks, and the sample size was small. Even with the highly insignificant p-values, we cannot entirely rule out the possibility that subtle differences between the fasted and fed groups would have taken more time or more people to become apparent. Another confounding variable is the uncontrolled dietary intake: even though food logs were collected daily, inaccurate measurements and misreporting could have influenced outcomes.

We cannot necessarily extrapolate these results to other populations. Light-intensity fasted cardio is a common tactic in physique sports such as bodybuilding, when dieting for extreme leanness, and it would be a long shot to generalize these results to those individuals without similar longer-term studies comparing fasted and fed exercise conditions. That said, there is some evidence to suggest that fat oxidation during exercise is independent of body fat percentage and relies more on cardiorespiratory fitness.


As for gender, evidence suggests that premenopausal women derive a greater proportion of energy from fat during exercise when compared to men, but that males have a greater basal fat oxidation rate. This may be due to differences in sex hormones and sympathetic nervous system responsiveness, but we would need another study like this one conducted in men to say for sure whether fasted cardio would be superior to fed cardio for them.
Finally, older men demonstrate a higher basal respiratory quotient than younger men, suggesting less basal fat oxidation, but they also show less change in response to food intake, suggesting that over the course of the day, fat oxidation may be somewhat similar between the age groups. How age ultimately influences outcomes would require yet another study.
These results are part of a growing picture of how fasted cardio impacts weight loss. Studies have shown that different populations, diets, and types of cardio can impact results. As seen in Figure 2, previous research has been done on young men eating hypercaloric and isocaloric diets, and on overweight/obese women eating their normal diets, with varying results depending on the study.

Figure 2: Other research on fasted vs. fed cardio


Though this study provides some evidence to suggest that fasted cardiovascular exercise is no more effective at improving the rate of weight loss than fed cardio in young women, the limited nature of the study means more research is needed before these results can be applied to other populations.

Frequently Asked Questions

What about High-Intensity Interval Training (HIIT) for fat loss?
HIIT is an entirely different beast than moderate-intensity cardio. It requires more careful programming to ensure adequate recovery, and isn't typically done on a daily basis. In terms of actual fat loss, the breakdown and utilization of fat for energy is blunted at higher intensities in favor of glucose, as higher intensities rely more heavily on the anaerobic energy system. Thus, between the reduced frequency, shorter duration, and greater reliance on glucose for energy, HIIT may not be superior to steady-state cardio for fat loss. But we won't know for certain until a long-term study compares the modalities (HIIT vs. moderate steady-state) in diverse populations. Research shows that while the often-cited post-exercise calorie burn from HIIT isn't that large, HIIT may still have hormonal and appetite benefits that impact fat loss.

Are there other reasons to perform fasted cardio?
Circumstances will mediate the answer to this question. For instance, fasted cardio has been shown to attenuate weight gain, enhance glucose tolerance and insulin sensitivity, and increase gene expression of enzymes involved in fat oxidation in healthy males fed a fat-rich hypercaloric diet, whereas the same exercise protocol performed after breakfast showed weight gain with no detectable improvements in glucose metabolism. Aside from potential health benefits, some people don't enjoy exercising with food in their stomach, and others have more energy in the morning, when fasted training is commonly performed.

What should I know?

There are two main takeaways from this study. This



is the first study to address the effects of fasted versus fed cardio under hypocaloric conditions, and it found no significant differences between fasted and fed cardio in any body composition measurement. The greatest limitation is likely the study population of young, healthy women, which makes generalizing the results to men and to people of different fitness levels difficult. To make the results of such trials more certain for any given population, longer trial lengths, larger sample sizes, and better-standardized diets would help. All of these factors can make trials much more expensive, however.
Second, this study lends further support to the idea that we should remain skeptical of drawing long-term conclusions from short-term interventions. For example, last February it was shown that measures of muscle protein synthesis did not correlate with actual muscle growth following a resistance training routine in untrained males. Again, it would be difficult to generalize those results to experienced lifters, but taken together with the current study, it seems prudent to be critical of claims based only on acute responses. Not every study applies well to real-life health and fitness situations.
Have you done fasted cardio and lost a bunch of weight? Burned out? Somewhere in between? Let us know your n=1 experience, and what you think of this study, in the Facebook ERD private forum.


Fitting into your genes: do genetic testing-based dietary recommendations work?
Disclosure of genetic information and change
in dietary intake: A randomized controlled trial
Introduction

Science fiction is full of stories of genetic testing and its potential to revolutionize medicine and human performance. However, it's not clear that futurist hopes match scientific reality. Now that consumer genetic testing is both cheap and accessible, researchers have begun to study whether these services can actually help assess and manage health risks.

Because it's such a new field, most of the research on the role of genetic testing in health management has focused on diseases with known genetic risk factors, such as BRCA mutations, which greatly increase breast cancer risk. As research progresses, more and more genes and gene variants are being identified as risk factors for disease, and as consumer genetic tests become more common, they've been used for a variety of lesser-known exposure-disease associations based on more common gene variations.

Genetic testing will likely become more prevalent as it becomes cheaper, and consumers without much knowledge of genetics or disease will have access to information that they may not know how to handle. Genes can affect a variety of nutrition-related areas: everything from how we metabolize different fuel sources to how we absorb different nutrients.
But does it actually help people to have access to this information? Do people who receive advice based on genetic tests change their habits? The researchers in this study assessed whether genetic testing and subsequent dietary recommendations had an actual effect on diet, not just in the first days or weeks after testing, but up to a year afterward.
The most established associations in genetics are for mutations that increase susceptibility to major diseases, such as BRCA for breast cancer. With the advent of direct-to-consumer genetic testing, a variety of lesser-known genes have been tested, some of which affect how the body handles nutrients.

Who and what was studied?

This study was a follow-up to a previous study assessing whether people thought genetic testing, and nutrition advice based on that testing, was useful. Because the participants of that previous study thought that personalized nutritional advice based on genetics was better and more understandable than general nutrition advice, the researchers performed this study to assess whether the participants actually used the advice they were given.
Both studies used the same large cohort of Canadian participants, who represented the typical users of
consumer genetic testing, which meant they were
mostly young, female, Caucasian or Asian, and
had at least an undergraduate degree. This is obviously not a typical cohort or representative of the
Canadian population as a whole.
Because this study specifically assessed intake of four substances (caffeine, vitamin C, added sugar, and sodium), the inclusion criteria required that people consume at least 100 mg of caffeine a day, at least 10% of calories from added sugars, and at least 1,500 mg of sodium per day, and take no vitamin C-containing supplements. These measures were assessed using a food frequency questionnaire emailed to all participants at the start of the study.
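Expressed as a filter, the screening rules look like the sketch below; the thresholds come from the text, while the volunteer record and field names are hypothetical:

```python
def eligible(p: dict) -> bool:
    """Inclusion criteria from the trial, expressed as a predicate."""
    return (p["caffeine_mg_day"] >= 100
            and p["added_sugar_pct_kcal"] >= 10
            and p["sodium_mg_day"] >= 1500
            and not p["takes_vitamin_c_supplement"])

volunteer = {"caffeine_mg_day": 220, "added_sugar_pct_kcal": 12,
             "sodium_mg_day": 2600, "takes_vitamin_c_supplement": False}
print(eligible(volunteer))  # True
```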
Because of these requirements, only 157 out of 1639
participants in the cohort were eligible for this trial. Eligible participants were then randomized to
receive monthly dietary information that was either
based on their genetic risk factors (the intervention
group) or general recommendations (the controls).

Dietary recommendations for the intervention group were based on whether or not the participants had known variations of four genes, as seen in Figure 1:
- CYP1A2: increased risk of heart attack and high blood pressure when consuming 200 mg or more of caffeine per day. This gene encodes a protein in the cytochrome P450 family, which includes enzymes that metabolize nutrients and drugs. One variant makes you a slow caffeine metabolizer (and hence more stimulated by caffeine), and another makes you a fast metabolizer. Many different medications can affect this enzyme, potentially further slowing the breakdown of caffeine.
- GSTM1 and GSTT1: increased risk of vitamin C deficiency when consuming lower than recommended amounts. These genes code for glutathione S-transferases, which detoxify environmental chemicals. Glutathione and vitamin C can protect each other from oxidation, and serum vitamin C levels differ depending on GST genotype.
- TAS1R2: increased risk of consuming excess sugars. This gene codes for a taste receptor subunit that can influence your sweet tooth.
- ACE: increased risk of high blood pressure when consuming excess sodium. Some people's blood pressure is more sensitive to salt than others', and the ACE gene plays a major role in determining salt sensitivity.

Figure 1: Frequency of risk alleles in intervention group
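In spirit, the intervention arm's logic is a genotype-keyed lookup. The sketch below uses the study's gene names, but the key names and advice wording are our own illustration, not the study's actual materials:

```python
# Gene names from the study; key names and advice wording are illustrative.
RECOMMENDATIONS = {
    "CYP1A2_slow_metabolizer": "Limit caffeine to less than 200 mg/day.",
    "GSTM1_GSTT1_null":        "Meet the recommended daily intake of vitamin C.",
    "TAS1R2_sweet_variant":    "Keep added sugars below 10% of daily calories.",
    "ACE_salt_sensitive":      "Keep sodium within the recommended limit.",
}

def advice_for(genotypes):
    return [RECOMMENDATIONS[g] for g in genotypes if g in RECOMMENDATIONS]

print(advice_for(["CYP1A2_slow_metabolizer", "ACE_salt_sensitive"]))
```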


To assess the effects of the emailed dietary recommendations, the participants were sent additional
food frequency questionnaires at three and 12
months after the initial enrollment in the study.
Food frequency questionnaires are notoriously inaccurate, but they are regarded as the most cost- and
time-effective way of generally assessing the dietary
habits of a population.
The subjects were also sent monthly email reminders
of their dietary recommendations, which are likely not the most effective way to modify a persons
behavior unless that person gets very few emails and
has lots of free time. However, it does mirror the
real-life situation of ordering a genetic test from a
testing service, and getting email as the main form of
communication, rather than the more hands-on personal communication typical of many clinical trials.
This study used email to remind study participants of their nutritional recommendations,
which were based on genetic testing. Email was
also used to track what the study participants
ingested during the study, specifically caffeine,
vitamin C, added sugar, and sodium.

What were the findings?

For most measures, the monthly recommendations based on participant genotypes did not significantly
affect dietary intakes at three or 12 months, compared to the control group. The only exception to
this was sodium intake, and in that case, the difference was only seen at 12 months.

However, despite the difference between the intervention and control groups for sodium intake at 12
months, the intervention group with the ACE gene
variation still failed to meet the recommendations
they were provided, with only 34% meeting the lower recommended intake at 12 months versus 24% in
the control group.
That said, the roughly 300 mg/day reduction in sodium by the intervention group is still likely clinically relevant, as a recent report by the Institute of Medicine pointed to sharp reductions in sodium being less beneficial than previously thought. A reduction of sodium intake by 400 mg/day has been estimated to prevent up to 28,000 deaths and to be more effective than common medications at managing high blood pressure, which shows that changes smaller than sometimes deemed optimal can have major impacts at a population-wide level.

These findings are likely related to the fact that most of the participants had daily consumption values
that were within recommendations at baseline (91%
for caffeine, 86% for vitamin C, 76% for added sugars, and 61% for sodium). The fact that sodium
intake significantly changed may also be related to
the fact that 80% of participants with high-risk ACE
variants consumed beyond the recommended sodium level at baseline. For comparison, only 38% of
participants with the CYP1A2 variation consumed
more than the recommended amounts of caffeine
at baseline. Blood pressure may also have seemed
a more critical health issue for some of the participants than something like vitamin C deficiency,
since, for better or for worse, the former is typically
associated with heart disease, while the latter brings
to mind scurvy, pirates, and colds.
Furthermore, as one might expect from a study conducted via emailed food frequency questionnaires to a population with highly varied baseline data, the estimated intakes varied greatly among participants, which often resulted in standard errors larger than the mean values themselves (for example, at 12 months, the change in the control group's caffeine consumption was -0.3 ± 17.8 mg/day).

This high variance results in what statisticians refer to as "noisy data": measurements in which the varied
initial values make it difficult to make strong statistical conclusions. For instance, if one person drinks
no coffee and another person regularly drinks four
cups a day, that's a relatively large spread. If that first
person starts drinking one cup a day because they
took a stressful job, and the second cuts down to
two cups a day because of the advice they received,
those are both still within the initial range of zero to
four cups per day, so it's hard to determine if
those changes in consumption are normal or if they
were caused by the dietary advice.
The large amounts of variability at baseline make it
hard to generate meaningful conclusions later.
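To make the noisy-data problem concrete, here is a minimal simulation sketch in Python (all numbers are hypothetical and for illustration only; they are not taken from the study):

import random
import statistics

random.seed(42)

# Hypothetical baseline caffeine intakes (mg/day) for 20 participants,
# spread widely between 0 and 400 mg/day.
baseline = [random.uniform(0, 400) for _ in range(20)]

# Suppose the advice truly lowers intake by about 20 mg/day on average,
# but individual circumstances add noise with a 100 mg/day spread.
followup = [b - 20 + random.gauss(0, 100) for b in baseline]

changes = [f - b for f, b in zip(followup, baseline)]
mean_change = statistics.mean(changes)
se_change = statistics.stdev(changes) / len(changes) ** 0.5

# When the noise dwarfs the true effect, the standard error can rival
# or exceed the mean change, making the real -20 mg/day effect hard
# to distinguish from zero.
print(f"mean change: {mean_change:.1f} +/- {se_change:.1f} mg/day")

Re-running a sketch like this with different seeds shows how easily the measured change can shrink, grow, or even flip sign when baseline variability dwarfs the true effect.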
The majority of study participants did not significantly change their dietary habits over the
course of the study. Sodium intake was affected
most, though only 34% of the group met their
recommended sodium intake by the end of the
12-month study. The mean reduction of the group,
while not as large as intended, would still likely be
enough to make an important impact on a population-wide level.


What does the study really tell us?

This study intended to clarify the real-life impact of genetic testing-based dietary recommendations,
and ended up showing that it had little to no effect
when it comes to behavioral modification, compared
to traditional advice on its own. Like many studies,
this is one in which the applicability of the results
is difficult to interpret,
and it could be that the
significant change noted
in sodium intake at 12
months is not applicable
to most other nutrients
(if sodium and blood
pressure is deemed a
more important issue to
act on than other nutrients), or the results may
be due to the vagaries
of statistical variation
(the control group used
for comparison had
increased sodium intake
at 12 months, making
the difference between the groups larger). This is
especially true because there was no effect on sodium intake at three months, and the decrease in
sodium intake at 12 months still didn't lead to the
participants meeting the recommended intakes.

On the other hand, one-on-one consultations with genetic counselors or other experts could have a
greater effect than monthly email reminders, but
the difference between these two methods has not
been compared.
Based on the findings of this trial, the answer to the question "Does nutrition-related genetic testing affect lifestyle behaviors?" is "Maybe, for some select nutrients, but likely not in a consistent and reliable fashion."


However, it's also just generally difficult to extrapolate the findings of trials like this, which use samples
that are not representative of the general public. If
someone is highly motivated and dedicated to minimizing their lifestyle-associated risk factors, the
results of a genetic test may be more useful.

The Big Picture

The findings of this study are generally in line with
other studies and reviews
that have found informing people of their genetic
risk factors does little to
actually change behavior.
It should also be noted
that this is still a very new
field, and there is a lack
of data on how genetic
information can best be used for behavioral modification. It's not necessarily clear how best to deliver
genetic information to people, and this is a major
confounder for any study at the moment. It could be
that genetic information can be an effective agent of
behavioral change, but we simply don't know how to effectively deliver it or pair it with existing interventional strategies. We don't even know who will be responsible for providing personalized recommendations: will physicians work with genetic testing companies? Will consumers mostly be interpreting results on their own?

These difficulties apply to many biometric-based intervention trials (meaning those that use a physiological measurement, such as blood pressure or
blood glucose), which are actually scientifically far
more complicated than they seem. It seems easy to
ask whether or not biometric test results affect behavior in a meaningful way, but there are many other
aspects to that question that may introduce scientific
uncertainty into those types of studies. For example, there has to be motivation for a person to pay attention to advice, and then there has to be even more motivation to follow that advice in the face of life's daily stress.
Because this study didn't specifically ask the participants if they wanted to change their habits based on their genetic test results, that factor wasn't controlled
for at all. The results might have been very different
had the surveyed population been more (or less)
concerned with optimizing their health. Furthermore,
researchers have to assess the reliability of their tests,
assess the effectiveness of how they report the results
of those tests to participants, decide how to measure
that effectiveness, and then determine whether the
effects they see are real or somehow related to the
nature of the study cohort or statistical anomalies.
Each of these points could be enough to write a thesis, so the field needs to grow substantially before
anything can be said with much scientific certainty.
In this study, these uncertainties were further compounded by the way the researchers interacted with
the study participants, which was almost entirely
through email. Most people's inboxes are constantly spammed by a variety of newsletters and other information, and it's very easy for a monthly email
to become a monthly auto-delete or spam folder
denizen. There usually has to be some sort of major motivation for someone to act on emailed advice, so a similar study on highly motivated populations, like athletes or people recently hospitalized for health issues, might have very different results.
Despite these issues, consumer genetic testing is still
a promising field because it offers a way to actually
act on all of the genetic information that has accumulated over the years. Without a cost-effective
way of sequencing individuals, all of the genetic
variations that have previously been associated with
disease are relatively useless. For example, even if
we know that CYP1A2 variation is related to caffeine-associated hypertension, it does little good
unless we have a cost-effective way of testing what
variant a person has. Consumer genetic testing may
provide this outlet to make gene association studies more useful by informing large populations of
their genetic variants.

However, it's not clear how actionable this disclosure is. And if it is actionable, it's not clear if people actually care enough to change. The impact of research on genetics (or epigenetics, or microbiomics, or any other "-omics") is a difficult topic to assess, and despite the modern advances in sequencing and genetics, human behavior may be the limiting step in applying findings. The more biometric data we're able to find out about any given person, the more an age-old question applies: how would you live if you knew how you were going to die?
The study doesn't address possible negative aspects of genetic testing. Nutrient-related tests may be less susceptible to major negative aspects, but it's quite possible that consumers could misinterpret a test and focus on a result when the true source of their health issues lies elsewhere (in other words, a red herring). It's even possible that someone might pin their hopes on a nutrition-related intervention and stop taking a medication without clearing it with their doctor. This is a case of knowing just enough to hurt yourself. Just because you know what the MTHFR gene does (a gene involved in regulating homocysteine and B vitamin metabolism, and a topic of much contention) doesn't mean that it's the source of all your health problems.
This study was limited by the population it investigated. Even the best advice is ignored if there is no
internal motivation for change. Additional studies on multiple populations, such as people who are interested in optimizing their health or a sample that is more reflective of the general population, can
shed light on the best way to deliver the results of
genetic testing and how to best structure lifestyle
changes based on those results.

Frequently Asked Questions

Is genetic testing useful for general lifestyle recommendations?


Genetic testing may help guide lifestyle choices,
but many of the tested genes (such as the ones in
this study) only show effects with intakes beyond
recommendations. So if you adhere to general
recommendations, it may be less useful. It seems
obvious that adjusting your lifestyle to address certain genetic risk factors would help reduce risk, but
that has yet to be definitively proven.

Who benefits most from genetic testing?
People with family histories of diseases may find
benefit from genetic testing, but it's also a double-edged sword. There aren't always preventative strategies available for all of the diseases with high-risk mutations, so it may just fuel a sense of fatalism.
Similarly, researchers have yet to develop reliable
risk assessment models based on genetic screening.
Genetic counselors are specifically trained to help
people interpret and address the results of genetic
testing and familial risk factors.

Although we know that genetics has a profound
impact on chronic disease risk (especially from twin
studies), we don't know much about which specific
genes are involved. Not to mention that genes can
have complex interactions with other genes, diet, and
environment. In the case of most chronic diseases,
we dont have the ability to look for specific polymorphisms and give meaningful advice on that basis.
If a client brings me a genetic test and wants to train
or eat a certain way, what should I tell him or her?
This is a balancing act between your professional opinion and your client's opinions. Different states vary with regard to what credentials are needed to give different types of advice, so make sure to look into what you are and aren't allowed to do. The personalization offered by a plan that caters to a client's test results may enhance adherence, and it's unlikely to be harmful if it encourages intake of healthy foods, but you should always thoroughly research a given topic before offering advice to any
clients. Most trainers lack the background in genetics to fully understand a test result, and it's easy to jump to conclusions that aren't truly evidence-based.
The real-life implications of different genetic tests are
still uncertain, which is part of the reason popular
testing company 23andme was reprimanded in 2013.
The FDA forced 23andme to stop marketing their
direct-to-consumer genetic testing service, as the
health reports provided by the company were seen
as being too close to disease diagnosis, and 23andme
was preparing to market the tests quite heavily to the
public. Thus 23andme now mostly provides raw data
without as much interpretation as they did previously.
This crackdown illustrated the many uncertainties
associated with genetic testing. Someone without
much knowledge of genetic epidemiology (which
is most everybody) might have a hard time
interpreting test results. It may not be optimal for
consumers to mostly receive raw information rather than health reports, but it's also important not to
lead on consumers with test result interpretation

that may not be accurate. So it's always a good idea to get an expert opinion, such as from a genetic
counselor, for important health issues that may be
impacted by genetic tests.
How do the findings of this study compare with other biometric testing services, such as microbiome analysis?
Personalized biometrics is a rapidly growing field, but it's not necessarily clear how all that extra data can most effectively influence behavior or risk factors. Information can change everything, and even save lives. But for some people, collecting tons of data and tracking everything they do distracts from bigger issues that impact health. People somehow managed to stay healthy long before "Quantified Self" became a buzzword.
Genetic test results from this study are quite different from microbiome analysis. Metrics like microbiome composition can change relatively rapidly in response to behavioral changes. For example, dietary changes, or even moving to a new home, can change gut microbiome composition. But since not much is known about optimal microbiome composition, microbiome analysis may serve a more informational role at this point, rather than spurring direct and specific action (outside of a generally pro-gut-health lifestyle). That being said, much of this is speculation, as it's a young field with a constantly evolving research base.

What should I know?

This study tested how dietary recommendations based on genetic testing results affected dietary intakes. The effects were relatively minor and were only seen in one of four measures, sodium intake. There was little to no change in intake for the other items studied (caffeine, vitamin C, and added sugar).
While genetic testing results may enhance adherence
to diet and supplementation plans, this study only
provides some evidence that it might be possible to a
small degree. It's also unclear if this is actually specific to genetic information, or applies to any type of
personalization.
Furthermore, effects may depend on the specific
compound studied. For example, based on the information found in this study, it may be much easier
for most people to reduce sodium intake than to reduce caffeine consumption. Future
research testing other personalized recommendations for other dietary components, perhaps using
different controls, will help in developing this very
new research area.
Have you made changes based on genetic tests or
microbiome testing? Discuss genes and nutrition
over at the ERD private forum on Facebook.


Combating obesity
through intermittent
fasting
Time-Restricted Feeding Is a
Preventative and Therapeutic
Intervention against Diverse
Nutritional Challenges

Introduction

Short-term fasting due to religious beliefs has been practiced for thousands of years.
More recently, intermittent fasting (IF) has become increasingly popular. There are different kinds of IF, including randomly skipping meals, alternate-day fasting, and using time-restricted feeding (TRF) windows.


A TRF protocol has participants consuming all of their daily energy intake within a set window of time (four hours, six hours, etc.), inducing a 12-22
hour daily fasted window. While human trials are
limited, an increasing number of animal studies
are showing that TRF appears to be beneficial for
improving many chronic disease risk factors, even
while consuming a diet that should otherwise make
the animal obese and diabetic.
Prior to this new study, however, it was not known
if the benefits of TRF extended beyond protection
against high-fat diets, or if TRF could be protective
against excessive sugar or fructose intake. Questions
also remained about TRF's effect on pre-existing
obesity, as well as its lasting effects.
Time-restricted feeding (TRF) has been a part
of religious practices for thousands of years.
Recently it has captured the attention of biomedical researchers due to promising findings for disease
prevention, mostly in animal studies.

Who and what was studied?

This is a very thorough animal study that looked at the effectiveness of TRF against a variety of nutritional challenges. The researchers studied high-fat, high-fructose, and high-sucrose diets consumed within nine-hour, 12-hour, and 15-hour feeding windows. In rodent studies, the term "high-fat diet" doesn't just mean a diet high in fat. No avocados, no cheese, no macadamia nuts. It means a purified high-fat diet based on refined ingredients. It's calorie-dense and not very healthy.
The study also had groups alternating between five
days of TRF (simulating weekdays) and two days
of free access to food (weekends). In addition, they looked at both the immediate effects and the legacy effects when the TRF routines were changed
to allow long periods of unrestricted food access.
While many of our human readers may follow a high-fat diet with no ill effects, it should be noted that unrestricted (ad libitum) access to a high-fat diet in mice causes obesity and insulin resistance, as well as associated problems like dyslipidemia, hepatic steatosis (fatty liver), and elevated cholesterol.


Intervention
A total of 392 12-week-old male mice were subjected to a variety of feeding regimens and divided into six main cohorts, all maintained on a 12-hour:12-hour light:dark cycle and fed during the dark phase when time-restricted. Feeding during the dark phase is optimal for mice, who are nocturnal. This is the opposite of humans, who (should) consume the majority of their daily energy intake during the light phase.
As seen in Figure 1, there were A LOT of variables manipulated in this trial, producing many different diets. Refer to the figure, the list, or the short code mapping after the list to match up the alphabet soup of different interventions to specific diet descriptions. The different individual diets all fell into one of these six groups:
1. High-fructose: fed a high-fructose diet for 11
weeks either ad lib (FrA) or TRF (FrT).
2. High-fat high-sucrose: fed a high-fat high-sucrose diet for 12 weeks either ad lib (FSA) or
TRF (FST).
3. High-fat TRF and 5T2A: fed a high-fat diet
for 12 weeks. With respect to feeding windows,
there were four different groups: either ad libitum (FA), in a nine hour TRF window (9hFT),
12 hour TRF window (12hFT), or alternated
between five days of nine-hour TRF (weekdays) and two days of ad lib (weekends) for 12 weeks (5T2A).
4. High-fat and normal chow: fed a high-fat
diet ad lib (FA) or in a 15 hour TRF window
(15hFT), or a normal chow diet either ad lib
(NA) or in a 15 hour TRF window (15hNT) for
nine weeks.
5. Short-term crossover (13:12): fed a high-fat
diet for 25 weeks with the feeding regimen
switched for some mice (to or from TRF) midway through the experiment (the FAA, FTT,
FTA, and FAT groups).
6. Long-term crossover (26:12): fed a high-fat
diet or a normal chow diet for 38 weeks with
the feeding regimen switched for some mice
after 26 weeks and then maintained another 12
weeks (FAA, FTT, FTA, FAT feeding groups
on a high fat diet and NAA, NTT, NTA, NAT
feeding groups on normal chow).

Figure 1: The many, many different diet variables tested in the study
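For readers who want to keep the abbreviations straight while reading the findings below, here is a small Python mapping of the group codes to plain-language descriptions (paraphrased from the group definitions above; the mapping itself is just a reading aid, not part of the study):

# Feeding-group abbreviations used in the study, paraphrased from
# the six group definitions listed above.
FEEDING_GROUPS = {
    "FrA": "high-fructose diet, ad libitum",
    "FrT": "high-fructose diet, TRF",
    "FSA": "high-fat high-sucrose diet, ad libitum",
    "FST": "high-fat high-sucrose diet, TRF",
    "FA": "high-fat diet, ad libitum",
    "9hFT": "high-fat diet, nine-hour TRF window",
    "12hFT": "high-fat diet, 12-hour TRF window",
    "15hFT": "high-fat diet, 15-hour TRF window",
    "5T2A": "high-fat diet, five days TRF + two days ad libitum",
    "NA": "normal chow, ad libitum",
    "15hNT": "normal chow, 15-hour TRF window",
    # Crossover groups: the letters after the diet code give the
    # feeding regimen before and after the switch (A = ad lib, T = TRF).
    "FAA": "high-fat, ad lib then ad lib (never switched)",
    "FTT": "high-fat, TRF then TRF (never switched)",
    "FTA": "high-fat, TRF then ad lib",
    "FAT": "high-fat, ad lib then TRF",
    "NAA": "normal chow, ad lib then ad lib",
    "NTT": "normal chow, TRF then TRF",
    "NTA": "normal chow, TRF then ad lib",
    "NAT": "normal chow, ad lib then TRF",
}

print(FEEDING_GROUPS["5T2A"])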

What were the findings?

The study authors provided a succinct summary of the results:
"TRF protects against excessive body weight gain without affecting caloric intake irrespective of diet, time schedule, or initial body weight."
What exactly does that mean?
Body weight:
The results of this study provide some additional
evidence that a calorie is not always a calorie, at least
in mice. Mice fed a high-fat, high-sugar diet within
a nine-hour window consumed equivalent calories as mice given unlimited access, but gained half as much weight. Interestingly, weight gain was similar
when mice were given a high-fructose or normal
diet either ad lib or TRF, suggesting that fructose isn't especially fattening in rodents.

The results
of this study
provides some
additional
evidence that
a calorie is
not always a
calorie, at least
in mice.
When comparing a high-fat diet using nine-, 12-, and 15-hour TRF windows, food consumption was equivalent, but longer feeding times resulted in greater increases in body weight. The nine-hour group had a 26% weight gain, while the 15-hour group gained 43%, and
the group with unlimited access gained 65%. It is
important to remember that all four groups were
consuming the same number of calories per day.
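To see the pattern at a glance, here are those reported figures collected into a small Python snippet (the numbers are the ones quoted above; the 12-hour group's figure is not given in the text, so it is omitted):

# Body weight gain on a high-fat diet at equal calorie intake,
# as reported above for the nine-hour, 15-hour, and ad lib groups.
weight_gain = {
    "9-hour TRF": 0.26,
    "15-hour TRF": 0.43,
    "ad libitum": 0.65,
}

for group, gain in weight_gain.items():
    print(f"{group}: {gain:.0%} body weight gain")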
The authors didn't give much detail about how calorie intake was measured, as far as specific methods
used. Rodent calorie intake can be difficult to measure, depending on experimental conditions, and

measurement technique is important for a trial such
as this.
To further test the effects of TRF, the researchers
set up three crossover experiments. When mice
were alternated between five days of nine hour TRF
(weekdays) and two days of ad lib (weekends) for 12
weeks, they only had a 29% gain in body weight, as
opposed to a 61% weight gain for the FA (ad lib fed)
mice. As with the previous cohorts, food consumption was the same between groups.
Another portion of the study was to determine the
longer term and lasting effects of TRF. Mice were
fed a high-fat diet for 25 weeks, with the feeding
regimen switched for some mice midway through
the experiment. The mice who were started on TRF
displayed rapid weight gain upon switching to ad
lib feeding, and in the end weighed as much as mice
who were always consuming ad lib (111% bodyweight gain). In contrast, the group who stayed on
TRF for the entire 25 weeks only had a 51% increase
in body weight.

In the longer-term crossover study, mice were kept on TRF for 26 weeks and then switched to unrestricted access. As expected, mice gained weight upon switching to ad lib feeding, though their weights
stabilized at a much lower increase in body weight
(106%) than mice never on TRF (157% increase in
body weight). Again, it needs to be noted that equivalent calories were consumed among all groups.
To determine if TRF could have benefits for mice
with pre-existing obesity, both the short- and long-term crossover studies included groups that were switched from ad lib to TRF feeding. During
the 25-week study, these mice had a small drop in
body weight and maintained this weight, which
was not different from the mice that were always
on TRF. Switching mice from ad lib high-fat diet to
TRF led to a 5% loss in body weight from the time
they changed, which is impressive compared to a
25% weight gain in mice who were always allowed
ad lib access to food. In the longer (38-week) study,
switching mice from ad lib high-fat diet to TRF led
to a 12% loss in body weight from the time they
changed, compared to an 11% weight gain in mice
who were always allowed ad lib access to food.


TRF appears to be very effective in protecting against weight gain during a range of challenges,
including high-fat and high-sucrose diets, as well as
promoting weight loss and stabilization in preexisting diet-induced obesity.
Body fat and inflammation:
While each experimental group had comparable
lean mass, it was the differences in fat mass that
made up the differences seen in total body weight.
Compared to ad lib feeding, mice on TRF had reductions in body fat of 62% (high-fat, high-sucrose)
and 26% (high-fructose). Increasing the length of
the TRF windows (from nine hour to 12 hour to
15 hour) led to an increase in percentage of body
fat, but even the 15 hour window was protective
compared to ad lib consumption. Mice in the
5:2 group were also protected from excessive fat
accumulation (48% less body fat than ad lib feeding). Mice on a normal diet that were fed ad lib
but transferred to TRF had 55% less fat than mice maintained on ad lib diets.
Reduced inflammation was also seen in mice on TRF (assessed by looking at mRNA levels of the pro-inflammatory cytokines TNF-α and IL-1β, and the pro-inflammatory chemokine Ccl8/Mcp2).
Blood glucose regulation:
When mice were fed a normal diet, TRF did not
offer any extra advantage over ad lib when looking
at fasting glucose levels. However, on a high-fat or
high-sugar diet, TRF reduced fasting glucose levels compared with ad lib feeding. Fasting insulin levels were
reduced in all TRF groups fed a high-fat diet. In
the crossover studies, insulin levels were nearly five
times lower in mice maintained on TRF compared
with ad lib, while groups who had some exposure

to ad lib and TRF had fasting insulin in between


those two groups.
A glucose tolerance test was also performed and
all of the mice, except the mice eating normal food,
showed improved glucose tolerance compared to
their ad lib counterparts. The crossover studies also
revealed that TRF can reverse prior glucose intolerance resulting from diet-induced obesity.
Lipids:
This study suggests TRF is protective for a lot of
things, and blood lipids were no exception. Liver
triglyceride levels were reduced in all mice on a TRF
high-fat, high-sugar diet compared with their ad-lib
fed counterparts. In addition, switching mice to TRF
prevented further hepatic triglyceride accumulation
in ad lib fed mice, suggesting TRF as a possible clinical tool against fatty liver disease. Likewise, serum
triglycerides were also normalized when mice were
switched from ad lib high-fat to TRF. On a lower-fat diet, however, serum triglycerides were unchanged
between TRF and ad lib.
Cholesterol levels, both absolute levels as well as the
daily rhythmic variation, can also improve on TRF.
Mice fed either high-fat or high-sugar diets on TRF
had significantly lower serum cholesterol levels than
those on ad lib feeding. It should be noted that there
are substantial differences in cholesterol metabolism between humans and mice. For example,
rodents have very low LDL compared to humans
due to more rapid clearance by LDL receptors. Most
serum cholesterol in rodents is carried by HDL, and rodents are extremely resistant to atherosclerosis because of that.
In experiments knocking out their LDL receptors,
their lipids become more human-like and then they
become more likely to develop atherosclerosis. This

would suggest that changes in serum cholesterol in mice may be caused by different mechanisms than
could occur in humans.
Additional benefits: Mice on TRF showed better
coordination skills and improvements in physical
endurance tests (nearly double the endurance performance of the ad lib group), which were not the
result of greater muscle strength, fiber type, or glycogen storage, but likely of improved metabolic responses to mobilizing energy stores. Enzymes that regulate glycogen synthesis and gluconeogenesis (creation of glucose from non-carbohydrate sources) were affected by TRF, as were the anabolic insulin/Akt and catabolic AMPK pathways, and a variety of cycling amino acid metabolites, resulting in more favorable daily patterns.
Mice that were subjected to a restricted feeding
window gained less fat and had better blood lipid
profiles than mice that were allowed to eat as much as
they wanted, even though all the mice consumed
the same amount of calories.

What does the study really tell us?

Although we are not mice, these models can be extremely valuable for understanding the mechanisms behind metabolic health and disease states.
This study offers a great deal more information than previous TRF studies, because the researchers used not only a high-fat diet but also high-sucrose and high-fructose diets, various TRF windows (nine, 12, and 15 hours), a five-day TRF/two-day ad lib schedule, and longer-term (25- and 38-week) crossover designs to determine the lasting effects of TRF.

This study confirms that, in animals, TRF can be an effective treatment for a variety of disease states such as obesity, diabetes, high cholesterol, fatty liver, and circadian dysfunction, in the absence of a calorie deficit. While the "bad" diets showed the more dramatic effects of TRF, mice fed normal chow still showed better body composition.

Keep in mind that these mice were always fed during
the dark phase. It has been previously shown in
rodents that the food timing relative to the light:dark
cycle is very important (even in TRF). Some mice
have experienced 18-19% increases in body weight when eating the same number of calories during the "wrong" (light) phase, compared to the dark phase (normal eating times). To extrapolate this to humans, we need to think of the opposite and pick our nine- to 12-hour windows during the daytime.

The big picture

TRF could promote wider adherence than conventional dieting methods, because the emphasis is on
the timing of food intake and not on calorie counting. There will certainly be future studies that can investigate the mechanisms of action, as well as a large-scale randomized controlled trial (RCT) in
humans.

Until future studies are done, we can only guess at how much of these results will translate to humans. There are a few existing human studies that use TRF, but nothing on the scale that is needed. There is a great deal of research on Ramadan, which features a month-long TRF window. However, these meals aren't aligned to circadian rhythms, occurring at night instead of during the day. There are a few other recent studies that show reduced daily energy intake, and either improved or no change in insulin action.
Trying a TRF window for yourself could offer benefits with very little downside. However, if you are
prone to hypoglycemia, consult your doctor before
trying this. Please see the FAQ for additional precautions. Time will tell if an RCT can show similar
results to what this study has shown, but it is indeed
very promising from a number of angles and for a
diverse population.

FAQs

This is great! Could a similar study be done in humans?
Not the whole study with all the measurements,
but some parts could be done and have been done.
To have complete control of the amount and type
of food eaten for 38 weeks while controlling the light:dark cycle would be basically impossible.


However, while rodent studies can control more
factors between groups, randomization in humans
could help to minimize variation between groups.
Costs of testing so many interventions over a long
study may be prohibitive. In addition, all of the
animals were sacrificed in order to be fully studied, which doesn't go over very well with human participants or the institutional review boards that
approve the study.
There are a few more aspects of the study that would
differ in humans. Humans normally consume food
in a time-restricted manner. If you have breakfast at
7:00 a.m. and dinner at 6:00 p.m., that is an 11- or 12-hour feeding window. Most people don't eat in the middle of the night, but when mice are fed high-fat diets, they often do (much more often than when they're fed healthy chow). Also, in this study, the major effects of TRF only manifested when animals were fed unhealthy, purified diets. TRF didn't have as much of an effect when mice were eating a healthy diet. Humans typically aren't fed purified oils and refined foods in
high amounts over the course of months in studies.
Lastly, in humans, as far as we currently know, a
calorie really is a calorie (as long as it's absorbed into circulation). There is no other food property or diet characteristic known to substantially impact adiposity in humans. Mice are not like that: many studies have shown calorie-independent effects of
diet characteristics on adiposity. Mice are able to
modify energy expenditure more readily and to a
larger extent than humans.
Does it matter when my window is?
It's hard to say with certainty, but from the existing

literature the best answer would be to keep most of your food intake in the light phase, since animal data seem to suggest that eating in the "wrong" (sleep) phase leads to greater weight gain. This could change with the seasons: during the summer the times can be more flexible, but during winter it may be best to keep the window earlier in the day. Remember, our body's clock is set by the light:dark cycle, not by the time displayed on your watch.
Do I have to skip breakfast?
No! A growing body of research suggests a high-protein breakfast may have favorable effects on appetite
control. Additionally, glucose tolerance is better in
the morning compared with later in the day, due to
circadian variation. This impaired evening glucose
tolerance is likely due to decreases in both insulin
secretion and insulin sensitivity. When considering the circadian variation in glucose tolerance, a
roughly 9:00 a.m. - 6:00 p.m. window may work well,
although a variety of individual factors play into
exact timing.
Can my window change from day to day?
This study showed benefits from five days of TRF,
followed by two days of ad lib, suggesting that there
is some flexibility for the eating phase, and you
do not necessarily need to follow a rigid daily time
window. Keeping most of your food intake to the
light phase, but moving it up or back by a few hours
depending on the day could probably still be okay.
Is there anyone who should NOT try this?
Yes, extended fasting windows can be a stress on the
body. Often a good stress, but someone who is dealing with a lot of other stressors in their life should
approach this diet conservatively. Also, athletes
should probably not get too ambitious, particularly

those who want to bulk up or are still growing (high school and college athletes), or people in very high-volume and high-intensity training phases, such as
cyclists or triathletes. People with advanced liver
disease should speak to their doctor before practicing TRF. While TRF may be protective against fatty
liver, a bedtime snack is typically recommended for
people with advanced liver disease.

What should I know?

This animal study suggests that keeping food intake within a nine- to 12-hour daily feeding window can
be beneficial in a number of different ways. These
results become more apparent when consuming a
poor diet that would otherwise lead to obesity and
metabolic dysfunction, but benefits also extend to
animals eating an otherwise normal diet.
The natural question that arises is: should I try time-restricted feeding? We don't know how well the benefits shown in this study apply to humans, given
the physiological and environmental differences
from rodents, but restricting food to moderate
daily feeding windows is unlikely to do harm for
most people. Access to food at all times of the day,
during all times of the year is not necessary for most
humans, and trying a different eating pattern may
produce quite beneficial results without having to
micromanage different parts of your diet.
To discuss all the different possible types of intermittent fasting protocols, and their impacts on humans,
check out our private Facebook group for ERD
readers.


How does a lifetime of marijuana use affect the brain?
Long-term effects of
marijuana on the brain

Introduction

Marijuana use is popular due to the psychoactive effects of delta-9-tetrahydrocannabinol (THC). It's known that marijuana has a multitude of effects on the
brain, as seen in Figure 1, but understanding the exact effects can be a complicated scientific process.
Within the brain, there are two major types of cells: neurons and glial cells,
pictured in Figure 2. Neurons are the cells that respond to and carry electrical signals, while glial cells provide support and protection to the neurons.
Networks of cells form either gray matter or white matter tissue. Gray matter and white matter are both made up of neurons (and glia), but gray matter consists of the cell bodies that contain the nucleus and most of the cellular machinery, while white matter consists of the thin "telephone lines" between neurons, wrapped in a myelin sheath (a structure formed by a specialized class of glial cells).


Figure 1: Brain areas affected by marijuana

Figure 2: Brain terminology

Gray matter is involved in everything the brain does, such as processing and cognition activities, including decision making and self-control. It's gray due to the lack of myelin, the insulating sheath around the outside of some brain cells. White matter physically connects and coordinates communication between different regions of the brain by carrying electrical impulses from neuron to neuron. The myelin is white in color, which distinguishes it visually from the gray matter.
This particular study investigated
several specific regions of the brain,
in addition to types of brain tissue.
The forceps major and forceps minor
are two regions of white matter in
the brain. The forceps major connects the occipital lobes within the
cerebral cortex and the forceps minor
connects the frontal lobes of the
cerebral cortex. Within the frontal
lobes, there is a region of the brain
called the orbitofrontal network. This
network is made up for four lobes:
the left and right orbitofrontal cortex
(OFC) and the left and right temporal lobes. The primary function
of this region of the brain is decision-making, specifically the analysis
of the possible rewards of a decision.
This region of the brain displays high
levels of activation during addiction-related behaviors like heavy
drug use. Specifically for marijuana

use, the OFC also has a high concentration of cannabinoid 1 (CB1) receptors, the receptor that binds THC.
Previous studies looking at the effects of marijuana have had conflicting results. Some studies showed increases in tissue volumes in certain regions of the brain, others showed decreases in the same areas, and still others showed no effects. This could be due to differences in the study populations, either in regard to participant characteristics or in the level of marijuana use.
Other potential confounding variables include only
investigating a particular age range or duration of
marijuana use, the enrollment of subjects who used
other substances along with marijuana, or designing
the study to only look at a single region of the brain.

Who and what was studied?

This was an observational (non-interventional) study that compared 48 regular cannabis users
with 62 non-users of similar age and sex. A regular user was defined as someone who self-reported
using marijuana at least four times a week and took
a drug test (via a urine sample) that was positive
for THC at enrollment. A non-user was defined as
someone who self-reported no marijuana use and had a negative drug test at enrollment.


This study attempted to overcome those limitations by looking at a broad range of participant ages,
evaluating a subset of participants who exclusively
used marijuana, and using several different types
of MRI scans to evaluate a number of factors in the
brain globally.
Marijuana affects a variety of brain regions,
including a region called the orbitofrontal cortex
(OFC) involved in decision-making. Research on
marijuana impacts on brain function has had conflicting results.

All study participants took an IQ test at the beginning of the study.


Marijuana users were also assessed for behavioral
issues related to possible marijuana dependency
through the Marijuana Problem Survey (MPS). The
MPS asks participants to identify and rate problem
areas such as missing work, conflicts with family
and significant others, or legal issues as a result of
their marijuana use.
Once enrolled, study participants underwent three
different MRI scans to assess different structural and
functional aspects of the brain:
a high-resolution visual MRI scan to quantify the amount of gray matter in the participants' brains
a functional MRI (fMRI) scan to determine
functional connectivity, or how much blood
flow occurred in different brain regions
a diffusion tensor imaging (DTI) scan to determine structural connectivity, or how much white matter exists between different regions of the brain, and how organized the white matter is
Because marijuana is often used along with other substances, the
researchers separated out a subset of exclusive cannabis users, who had no self-reported use of alcohol or tobacco. This
allowed the researchers to determine if any of the structural and functional changes seen in the MRI scans were due to cannabis use alone.
Three different MRI scans were used to assess gray matter, white
matter, and connectivity between brain regions. Users of marijuana
only, rather than users of marijuana and other substances, were also
tested separately.

What were the findings?

The researchers noted a statistically significant difference in the IQ scores of the cannabis users, compared to the non-users.
Mean IQ scores were approximately five points lower among
cannabis users, even though the educational levels between the
two groups were similar. However, further statistical analysis
did not indicate a direct causal link from marijuana use, to
neural abnormalities that may arise from its use, to lowered
IQ. A number of alternative factors, such as genetics and environment, could be involved in this causal pathway, or even
explain the difference between the groups themselves. Though
untested, it's also possible that a lower IQ could increase the
likelihood that someone will become a regular marijuana user,
rather than lowered IQ being an effect of heavy marijuana use.
The average IQ score in the cannabis users group was
approximately five points lower than in the non-users group,
though statistical analysis could not confirm if this was a
cause or an effect of marijuana use, or due to other factors.


The subgroup of participants who were exclusively cannabis users had similar MRI results to the cannabis users group as a whole. This indicated that
any changes in brain structure and function were
correlated with cannabis use and not the use of
other substances.
The study compared the high-resolution MRI scans
of the cannabis users with the non-users. A significantly lower volume of gray matter was seen in the
right middle orbitofrontal and left superior orbitofrontal regions of the brains of cannabis users. This
structural difference, however, cannot be determined to be a result of cannabis use, since this is a
correlational finding; it is also possible that subjects with lower volumes of gray matter
are more likely to become chronic users.
When the researchers looked at fMRI scans to
detect brain function in the gray matter, they
found that the cannabis users group had more
functional connectivity in the four nodes of the
OFC regions of the brain, as measured by blood
flow in the gray matter. Even though there was less
gray matter in the cannabis users' brains, the tissue that was there showed increased functional connectivity at rest when compared to the non-users' brains. The researchers believe this to be a compensatory mechanism to maintain brain function even
as brain volume decreased.
Next, the researchers looked at the structural connectivity of the same regions of the brain, measuring
white matter in the forceps minor region that interconnects the different areas of the OFC. The DTI
scan uses magnetic resonance to measure the diffusion, or passive movement of water through regions
of the brain, providing information about the micro-

structure and level of organization of the axons in
the brain tissue.
One indicator in the DTI scan is fractional anisotropy (FA). FA reflects the density and myelination
of the axons, and is measured on a scale from 0 to 1.
A higher FA means that water diffusion is restricted
to a single direction, implying that the local water
is inside long thin fibers (axons) as opposed to little lumps (cell bodies). This is indicative of a more
fibrous and organized region. A lower FA means that
water diffusion is less restricted and indicates a less
organized and axonally-dense region of the brain.
Another indicator in the DTI scan is radial diffusivity (RD). This is a measurement of diffusion along

two axes, which is decreased in more mature white matter brain tissue
and increased when cells in white matter become demyelinated.
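For the mathematically curious, both indicators come from the three eigenvalues of the diffusion tensor measured at each voxel. Here is a minimal sketch in Python using the standard FA and RD formulas (the example eigenvalues are made up for illustration and are not data from the study):

import math

def fa_and_rd(l1, l2, l3):
    # Fractional anisotropy: 0 means perfectly isotropic diffusion,
    # 1 means diffusion along a single axis (long, thin, organized
    # fibers). l1 is taken as the principal (axial) eigenvalue.
    numerator = math.sqrt((l1 - l2) ** 2 + (l2 - l3) ** 2 + (l3 - l1) ** 2)
    denominator = math.sqrt(2 * (l1 ** 2 + l2 ** 2 + l3 ** 2))
    fa = numerator / denominator
    # Radial diffusivity: average diffusion along the two axes
    # perpendicular to the fiber; it rises when axons demyelinate.
    rd = (l2 + l3) / 2
    return fa, rd

# Hypothetical eigenvalues (in 10^-3 mm^2/s): diffusion much faster
# along the fiber than across it, as in organized white matter.
fa, rd = fa_and_rd(1.7, 0.3, 0.3)
print(f"FA = {fa:.2f}, RD = {rd:.2f}")  # high FA, low RD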
The DTI scan showed that cannabis users as a group had higher FA measurements in the forceps minor, but not in the forceps major region within the occipital lobes, where no statistically significant effects were seen. In summary, the forceps minor looked more organized and
more myelinated in cannabis users. A more organized neuronal network with more myelination can result in more efficient transmission
of electrical signals in the existing brain tissue. Since this increased
organization was seen in the forceps minor, which connects the frontal
lobes, this could possibly translate to compensatory improvements in
short-term memory, attention, and motivation.
The MRI scans were correlated with the intake data, and some interesting
patterns emerged. While the cannabis-using group as a whole had higher FA and lower RD indicators in the DTI scans, when the researchers
looked at how long each individual participant had been a cannabis user,
they found that there were highly significant correlations between the
DTI scan indicators and lifetime duration of cannabis use. This makes
the causal, rather than just correlational, explanation a bit more likely.
There was an initial improvement in these scores (increased FA and
decreased RD) over the first several years of cannabis use, followed by
an overall decline in these indicators as usage became more long term.
The participants who had been using cannabis the longest had indications that their white matter was less organized and more demyelinated
than participants who had only been using cannabis for a few years.
Additionally, the researchers found that the functional connectivity
measured in the fMRI scans showed strong correlations with a participant's score on the MPS. The less functional connectivity seen on the
fMRI scan, particularly in the left temporal cortex region of the brain,
the more likely a participant was to have behavioral and social problems
related to their cannabis use, as indicated by higher MPS scores. Within
the exclusive cannabis use group, there was also a statistically significant correlation between gray matter volume in the OFC and scores on
the MPS: as the amount of gray matter in the brain
decreased, the MPS scores increased.
The fMRI also showed that activity in the OFC correlated with the age that the participant began using
marijuana: the earlier the participant had become a
regular user, the greater functional connectivity was
measured.
Participants who had only been using cannabis
for several years showed higher indicators of axonal organization and brain tissue maturation, but
these measurements declined as cannabis usage
became more long-term.

The big picture

The biggest challenge when interpreting this study is attempting to determine cause and effect out of all
the correlations in the data. Are people who have
higher IQs or more gray matter less likely to become
a chronic marijuana user, or is the marijuana use
causing that physical change? At least one study has
suggested that children who had smaller OFC volumes were more likely to become marijuana users in
their teens.

Since this study (and many others) only looks at a single point in time using MRI scans, it's not possible to determine which variable is the cause and
which is the effect. The researchers who conducted
this study noted that longitudinal studies would be
needed to fully understand this. However, since people who had used marijuana for a longer period of
time had stronger associations with brain structure
and function, that does boost the likelihood of the
causal explanation. Some mechanistic plausibility
also exists, as both animal and human studies have
found potentially neurotoxic effects of marijuana.
Whether the reductions in gray matter are a cause
or effect of cannabis use, the brain is a complicated
organ, and appears to attempt to compensate for
reduced tissue volume by increasing the functional
connectivity of the present tissue. This may be why
newer marijuana users had more organized white
matter and higher resting activity on the fMRI. Over
time, however, these indicators declined as additional structural changes took place in response to
cannabis use.
This is really the most interesting, and perhaps
slightly unexpected, part of the study. There used to

be misconceptions that the brain was a relatively unplastic organ after adulthood, but more and more research is finding that both positive and negative changes can take place due to a number of different external effects. The initial effects of marijuana on the brain seem to be the brain's way of attempting to maintain regular function in the face of tissue loss. The earlier a subject began using marijuana, the more pronounced these initial compensatory effects were, since the brain is still more neuroplastic (building and wiring connections) through adolescence and into the early 20s. Starting to use marijuana later in life would not take advantage of this increased neuroplasticity as efficiently.


The correlation between gray matter volume and scores on the MPS is unsurprising, given that the functions of gray matter include decision making and
self-control. A person with a lowered capacity to
make decisions and exercise self-control is more likely
to have issues with social and psychological activities.

Frequently asked questions

Could different strains of marijuana have different effects on the brain?
It's definitely possible. It's not known if THC specifically is the cause of any structural and functional changes in the brain, but the OFC is a region of the brain that has a high level of cannabinoid 1 receptors, which bind THC. Strains that have higher levels of THC might have greater long-term effects on this region of the brain. Other constituents of marijuana have effects on the brain as well (although they are much less psychoactive, or not psychoactive at all), and strains vary in the ratio of THC to these other constituents.

How significant are the differences in IQ seen between the two groups?
The five-point difference between the marijuana users
and non-users is within one standard deviation. Both
groups actually scored above average (106 and 111,
respectively). This is comparable to the difference
seen between adults with some college education but
no degree, and adults with a college degree. Keep in mind, though, that the actual education levels were basically the same between the two groups.
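For a rough sense of scale, IQ tests are conventionally normed to a mean of 100 and a standard deviation of 15. A quick back-of-the-envelope calculation in Python (the group means are the ones reported in this study; the 15-point standard deviation is the conventional test norm, not a figure reported by the authors):

IQ_SD = 15  # conventional norm, assumed here
group_means = {"cannabis users": 106, "non-users": 111}

difference = group_means["non-users"] - group_means["cannabis users"]
print(f"{difference} points = {difference / IQ_SD:.2f} standard deviations")
# -> 5 points = 0.33 standard deviations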
Do randomized trials on marijuana show impacts
on cognition?
There have been a variety of cognition-related randomized trials done on chronic marijuana users,
with results typically showing some impairment.
For example, one trial found that marijuana acutely
decreased blood flow in attention-related areas of
the brain. Prospective observational studies have
also found potential brain-related harm from marijuana use. One found that persistent marijuana
use over the course of years was associated with
increased cognitive problems and general decline in
neuropsychological functioning.


On the flip side, there has been an increasing amount of research on potential pain-related and other benefits of marijuana use, which involve
other parts of the nervous system and brain. Systematic reviews of randomized trials have found benefit for neuropathic pain, and potential
for helping with other kinds of pain such as that from fibromyalgia and
rheumatoid arthritis. Given that few if any randomized trials test chronic
effects over a period of years, the balance of positive and negative impacts
from marijuana is hard to evaluate.

What should I know?

Several different types of MRI scans found differences between the brains
of long-term marijuana users and non-users. Regular marijuana users had
lower volumes of gray matter, but also had indicators of increased structural and functional connectivity in several regions of the brain. People who had only used marijuana for several years had more connectivity in their brains' white matter tracts, but these factors declined as use became
more long term. Marijuana users also had slightly lower IQs, but it was
not clear if this was due to marijuana use or other factors.
Examine.com has compiled a wealth of scientific knowledge on the effects
of cannabis supplementation on their Marijuana page.
To discuss the impact of marijuana on brain function, but without the
generalities and annoyances that often come with debating such a topic on
the web, visit the ERD private forum on Facebook.


A mouse's microbiome may cause its brain to leak

The gut microbiota influences blood-brain barrier permeability in mice
Introduction

Your gut has much more to do with your brain than just the influence it has when you're passing by the donut shop. Since our guts
take in all the fuel we need from the outside world, and our brains
are necessary for navigating the outside world, the two need some
way to communicate with each other. This method of communication between the gut and brain is called the gut-brain axis.
In the 1970s, the molecular mechanism through which the gut and
brain communicated was beginning to be understood. Several proteins and peptides (which are made out of the same building blocks
79

Figure 1: What is the blood-brain barrier?

as protein, but are smaller) were discovered, that


were both produced by and affected the gut and
brain. But a problem arose: communication between
the brain and gut largely involved big molecules like
proteins and peptides. How could such large molecules get across the blood-brain barrier (BBB)?
In order to answer this question, we first need to understand what the BBB is and what its function is. The BBB exists to make sure that compounds in the blood don't necessarily enter the brain, and that your brain keeps whatever nutrients it needs. You can check out the details in Figure 1. There, you can see that the BBB keeps out large molecules, while being a little more loose about certain types of smaller molecules, and selectively letting other molecules in.

The BBB is mainly made up of endothelial cells (the kind of cells that line the inside of blood vessels) that are tightly knit together by tight junctions, which are composed of several types of proteins. The purpose of tight junctions is to make sure substances don't accidentally slip in between cells. Two of the proteins that make up tight junctions are claudin and occludin, which are discussed later in the review.


Since the BBB is made up of cells that are tightly woven together, it is very hard for larger molecules to pass from the bloodstream into the brain. If they do get through, they do so either selectively through transporters or because the BBB is damaged, leaky, or otherwise compromised in some way.

Why does the BBB exist? Well, the brain is a pretty important organ, and so it's wise to be selective about what gets into and out of the brain. For instance, if you get an infection, the BBB will hopefully stop the infection from reaching the brain. The BBB also plays a major role in the developing brain. One way it does so is by helping to regulate the environment of the growing brain to create an optimum environment for development. It also helps protect the growing brain from toxic outside influences, such as bacteria colonizing the gut of newborns, during the so-called critical period of brain development. In summary, if your brain is a club, the BBB is like the bouncer: it does its best to stick to the list of approved molecules, and doesn't hesitate to bar entry to less desirable clubgoers.
Like bouncers, the BBB also has to be trained. The authors of this study hypothesized that the gut microbiota (the bacteria that normally live in our gut, usually don't cause any problems, and in fact can help us) may play a role in this training. The gut microbiome is known to contribute to other areas of mammalian development, such as gut development (including aspects of how the gut functions as a barrier) and even other aspects of brain development. So it's not a far leap to suspect that the gut microbiome may influence the BBB as well. The authors of this paper set out to test exactly this hypothesis in mice.
Gut bacteria have been observed to influence brain
development, as well as influencing gut integrity.
This inspired researchers to examine whether or
not gut bacteria can also influence the integrity of
the blood-brain barrier (BBB), which selectively
allows molecules in and out of the brain.

Who and what was studied?

Two types of mice were looked at in this study: pathogen-free and germ-free. Pathogen-free mice had normal gut microbiota and lacked any kind of bacteria that normally causes disease in mice. Germ-free mice, however, had no bacteria in the gut at all.

Several questions were addressed in this study:

- Does the gut microbiota of the mother affect the BBB of the developing mouse fetus? This was addressed by comparing how well an antibody could cross the BBB of fetuses in mothers who were either pathogen- or germ-free.

- Do gut microbiota affect the permeability of the BBB in adult mice? To address this question, both germ- and pathogen-free mice were compared and the permeability of the BBB was tested.

- What do the tight junctions of these mice look like? The composition and appearance of tight junctions were examined to see if there were differences between germ- and pathogen-free mice.

- Can changing the microbiota change the permeability of the BBB? Germ-free mice were colonized by normal gut flora to become pathogen-free mice. The permeability of the BBB as well as the tight junctions were examined before and after colonization to see if changing the gut microbiome could actually change the BBB.

This study compared the permeability of the BBB in germ-free mice (mice who had no gut bacteria at all) to pathogen-free mice (who had normal gut bacteria, but none that normally cause disease in mice).

What were the findings?

The researchers found that the gut microbiota of the mother mouse can affect the BBB permeability of the developing mouse fetus. Specifically, the BBB permeability of fetuses with germ-free mothers (who had no significant levels of bacteria in their gut) was greater than the BBB permeability of mice whose mothers were merely pathogen-free (and so had normal gut flora). This difference in permeability was observed toward the later part of gestation. The increase in permeability was associated with decreased expression of occludin, one of the main proteins that make up tight junctions.

How could microbes living in the mother's gut affect the permeability of the fetus' BBB? The authors did not determine a specific mechanism, but they speculate that lower BBB permeability would be beneficial for fetuses whose mothers have normal gut flora. The reason is that maternal gut microbes may impose higher nutritional demands in late pregnancy, which would favor a tighter (less permeable) BBB so that these metabolic demands don't impose a cost on the growing brain of the fetus.
But what about in adult mice? Do the gut microbiota affect the permeability of the BBB in them? Again, the answer is yes. The permeability of the BBB in adult mice was greater in germ-free mice than in pathogen-free mice in three separate tests of BBB permeability. The researchers made sure that this difference was not due to increased blood vessel penetration of the brain, since a higher density of blood vessels would mean there is a higher chance that something would penetrate randomly, just like how buying several lottery tickets increases your chance of winning.
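To make the lottery analogy concrete, here is a minimal sketch, assuming each vessel independently offers some small chance p of a random slip-through; the numbers are purely illustrative and are not measurements from the study:

```python
# Probability of at least one random "slip-through" among n independent
# chances, each with small probability p (the lottery-ticket logic).
def p_at_least_one(p: float, n: int) -> float:
    return 1.0 - (1.0 - p) ** n

p = 0.001  # illustrative per-vessel chance, not a measured value
for n in (1, 10, 100):
    print(f"n = {n:>3}: {p_at_least_one(p, n):.4f}")
```

With more "tickets" (denser vasculature), random crossings become more likely even if nothing about each individual vessel changes, which is why the researchers had to rule this out.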
The differences in permeability could be accounted for by the differences in tight junctions found in adult germ-free versus pathogen-free mice. The tight junctions of germ-free mice had lower levels of two major tight junction proteins, occludin and claudin-5, as compared to pathogen-free mice. Also, the more leaky tight junctions of germ-free mice looked more diffuse and disorganized under the microscope when compared to those of pathogen-free mice.


Finally, would taking germ-free mice with no gut flora and colonizing them with normal flora change their leaky BBB? It turns out that, again, the answer is yes. Colonizing the gut of germ-free mice led to a less permeable BBB, along with increased expression of occludin and claudin-5.

But how do the gut bacteria talk to the BBB and affect its permeability? One possible mechanism is via short-chain fatty acids (SCFAs), which are synthesized specifically by bacteria. SCFAs are known to affect the permeability of the gut, so perhaps they affect the permeability of the BBB too. The researchers tested this hypothesis in two ways: by colonizing germ-free mice with single strains of bacteria that produce SCFAs (Clostridium tyrobutyricum, which produces butyrate, and Bacteroides thetaiotaomicron, which produces acetate and propionate), and also by simply feeding an SCFA (butyrate) to germ-free mice, and then measuring the effects on the BBB. They found decreased BBB permeability in mice colonized by either kind of bacterium, as well as in mice who were fed butyrate. In fact, the BBB became just as impermeable in these mice as in mice who were pathogen-free.
The researchers found that the BBB was less leaky when mice had normal gut flora. Completely germ-free mice had a more permeable BBB. Also, germ-free mice subsequently colonized with normal flora experienced decreased BBB permeability. Interestingly, the BBB of mouse fetuses was also leakier if their mother was germ-free than if she was just pathogen-free. The permeability of the BBB may be affected in part by short-chain fatty acids produced by normal gut bacteria, which travel through the bloodstream and ultimately help make the BBB less permeable.

What does the study really tell us?

This study tells us that having healthy gut microbes in mice makes their BBB less permeable. Specifically, having normal gut flora in mother mice helps their fetuses develop a more impermeable BBB. Also, adult mice with normal flora have a less permeable BBB; in germ-free mice, the more permeable BBB is associated with lower expression of certain tight junction proteins and with tight junctions that look more disorganized and diffuse.

Finally, these effects of normal gut microbes on the BBB seem to be causal, since colonizing germ-free mice with normal flora seems to decrease BBB permeability. The mechanisms for these changes are not known, but may have something to do with SCFA production by normal gut flora, since both mono-colonizing germ-free mice with SCFA-producing bacterial strains and simply feeding germ-free mice an SCFA decreased BBB permeability.
Let's also briefly mention what this study does not tell us. It does not tell us anything about humans taking probiotics. It does not say much about the consequences of BBB permeability and whether it's good or bad. It also doesn't spell out the mechanism through which gut microbiota influence the BBB's permeability, although it does contribute evidence for a possible answer. Any inferences beyond the main point of this paper (that healthy gut microbes in mice make their BBB less permeable compared to having no gut flora at all) would be unreliable guesswork.

This study tells us that germ-free mice have more permeable BBBs than pathogen-free mice. Any additional extrapolation is speculation.

The big picture

The science of our gut microbiota is still young, and we can say little for certain, but it's starting to look like the microbes in our gut could play many roles in maintaining a healthy body. The gut microbiota may affect cognition, and could possibly play a role in obesity. The impact of probiotics on athletic performance has also been examined. Fecal transplants to re-establish healthy gut flora are also starting to be tested as treatments for disease, as a recent study on ulcerative colitis in children demonstrates.

This study adds one more piece to this puzzle by showing that gut microbiota play a role in creating a less leaky blood-brain barrier in mice, and, perhaps more surprisingly, that the gut microbiome of a mother can, at least in mice, influence the BBB of the fetus. Gut microbiota influence goes far!

However, the results of animal studies don't necessarily hold for humans. It is worth noting that independent research has been done on the microbiome of prenatal and neonatal humans. For instance, the gut microbiome is markedly different in babies that were born vaginally, as compared to babies born by C-section: the gut microbiome of C-section infants was much less diverse, for up to six months after birth.
This is the typical time period when the human diet begins to vary, naturally contributing to microbiome diversity. The immune system goes through a lot of development during infancy, and research shows that microbe changes via birth method may influence the immune system in the long term. Thus, there is emerging evidence that the gut microbiome of infants can vary, and that this may have long-term effects.

This work also demonstrates the importance of animal research. Animal studies can be quite important because work that is impractical, cost-prohibitive, or unethical in humans may not be deemed so for animals. For instance, this study could not have been done on humans. The work that would go into maintaining a sterile or monocolonized human GI tract is neither practical nor ethical.
The science of how gut microbes affect health is still young.
This study adds one more piece to the puzzle by suggesting the
possibility that gut microbes can affect distant organs, like the
brain, in unintuitive ways.

Frequently asked questions

What about the brain itself? Does a sterile gut environment for an extended period of time impair neurogenesis?

This study did not reveal any differences in the neurons themselves between germ- and pathogen-free mice, and the long-term effects of gut sterility on the brain itself were not studied. However, there is emerging research showing that the gut may influence local neuronal development, so an impact on neurogenesis is at least theoretically possible. Also, one study in mice found that impaired neurogenesis due to stress could be improved by administering a probiotic.


What else influences BBB permeability?

While a lack of gut microbes is related to increased permeability, there are many other factors that can dictate BBB permeability. To list just a couple of examples, increased ammonia content in the blood during liver failure and high levels of C-reactive protein are associated with BBB permeability. Also, some rodent data shows downregulation of tight junction protein expression in mice that consume ethanol.

What is a good way to keep my gut microbiome diverse?

Check out ERD #2's article "Of Mice and Guts" for more information on microbiome diversity.

What should I know?

This journal article was published in Science Translational Medicine, a well-regarded interdisciplinary journal whose purpose is to help connect basic science research with eventual clinical applications. The study results may not be directly applicable in terms of directing human interventions, but it connects two massively important areas: the microbiome and the gut-brain axis.

The blood-brain barrier can both be damaged by disease and be a cause of disease, and getting certain potentially important medications through the blood-brain barrier is an active area of research. From this study we've learned that the microbiome may impact the blood-brain barrier, and human studies on the topic are likely to follow. In short, this is a research area that could pay dividends for human health in the near future.

Specifically, this article showed that mice with normal, healthy gut microbiomes had a less leaky blood-brain barrier (BBB) than mice with completely sterile guts, containing no bacteria of any kind. Interestingly, the gut microbiome of pregnant mice affected the BBB of their unborn fetuses. Mice with healthy gut flora had fetuses whose BBB was less permeable than the fetuses of germ-free mothers.

Finally, colonizing a germ-free mouse with a healthy microbiome induces changes in its BBB, making it less permeable. One mechanism by which gut microbiota may influence the BBB is through the production of short-chain fatty acids, which enter the bloodstream and eventually impact the BBB, making it less permeable.

What's your gut instinct about all this? Let us know your thoughts on the ERD private forum on Facebook.

Ask the Researcher

Stuart M. Phillips, Ph.D., FACN, FACSM

Stuart graduated with a Ph.D. in Human Physiology from the University of Waterloo in 1995. He joined McMaster in 1999 and is now a Professor in the Department of Kinesiology and an Adjunct Professor in the School of Medicine at McMaster University. Stuart is a fellow of the American College of Sports Medicine (ACSM) and the American College of Nutrition (ACN). His research is focused on the impact of nutrition and exercise on human skeletal muscle protein turnover. He is also keenly interested in diet- and exercise-induced changes in body composition, and in the influence of all of the aforementioned in aging persons. An enthusiastic and energetic group of graduate students and research fellows are the true heart of Dr. Phillips' more than 180 publications, 120 public scientific presentations, and continuing enthusiasm for science and research.
You just published a review of protein for weight loss. The somewhat new and mildly controversial protein leverage hypothesis is mentioned. What's your take on that?

The Simpson and Raubenheimer leverage hypothesis is an interesting one, and one that may have some applicability in humans; see this study for a good review and meta-regression of sorts. In short, I think the whole protein-seeking behaviour espoused by these two researchers and their teams is viable, but it seems that most people's natural setpoint for protein intake is around 15-17% of total energy intake. You have to consciously move toward higher intakes, and it does appear that protein leverages energy from the other macronutrients (fat and carbs) and can control energy intake.
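As a quick back-of-the-envelope illustration of that setpoint (the 2,500 kcal/day figure below is a hypothetical intake, not something from the interview, and protein is counted at roughly 4 kcal per gram):

```python
# What a 15-17% protein setpoint means in grams per day, at an assumed
# (hypothetical) total intake of 2,500 kcal; protein has ~4 kcal per gram.
kcal_per_day = 2500
for fraction in (0.15, 0.17):
    grams = kcal_per_day * fraction / 4
    print(f"{fraction:.0%} of {kcal_per_day} kcal -> ~{grams:.0f} g protein/day")
```

A 15-17% setpoint lands around 95-105 grams of protein per day at that intake, which helps show why substantially higher intakes take conscious effort.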
That was a pretty complex topic for an opening question. Taking a step back, what originally brought you into protein research?

Ha ha, yup, a deep-end question to begin! I did an undergraduate degree at McMaster in Biochemistry and had an epiphany of sorts when I took a nutritional biochemistry course in my fourth year. It changed the way I thought about things! I signed up for a masters and studied protein and endurance athletes (my first paper), and from then on I was hooked! I loved learning, I loved school, I loved nutrition, and I liked (yes, only liked) research. It wasn't until the last stages of my Ph.D. that I truly grew to love research. It just evolved from there. My passion for protein was honed while I was a postdoc in Bob Wolfe's lab from 1995 to 1998. I learned so much there and had a ton of fun at the same time! When you find something you're passionate about and enjoy what you're doing, it's rarer to actually call it work. So I work at what I love doing, and so far it's been a lot of fun!

If you could go back in time, is there something else you might have focused on?

I don't think I'd have done another macronutrient, fat or carbohydrate, because that has so much to do with obesity and diabetes, which, for whatever reason, didn't interest me. With protein, there are fewer people who study it and we're a smaller group. I'm happy to have chosen protein; it still intrigues me, and I think it's been a good career decision (I'll pretend I chose to study protein; I think it chose me). But change anything? No, not likely. I am focused more now on some paradigm-changing work, which I always think is important. I've still got a few more years left and my focus will likely gradually change in the next few years.

Although you've been extremely successful in research, the struggle of climbing the ivory tower is tough. What's your view of the publish or perish environment of academia?
There are many types of ivory towers, just as there are many types of occupations that people with a Ph.D. pursue. In fact, most people (i.e., more than 80%) with a Ph.D. don't go into academia. Within academia there are ivory towers that value teaching more than research, but at McMaster the climb is based on research first and teaching second. So it does force a publish or don't do well atmosphere, though you can still publish and perish, in my view! With that pressure it does create a stress, since so much of your work is evaluated by nameless and faceless people who figuratively hold the cards and can turn you one way or another. It has also led to folks doing some pretty weird things, and even twisting or making up data, which really casts a shadow on everyone in science.

But, like most jobs, perseverance and patience tend to win out. Having good mentorship meant I learned early how to write and craft decent grants, which really helped early in my career. I think (hope) I've passed some of that on to others who have trained with me! The trainees from my lab are now installed in a few institutions around the world and I hope they've taken the good (and not so much of the bad, ha ha) with them.

Has the publish or perish atmosphere had impacts on your life and how you do research?

Sure, early in my career I was very absorbed in my work and was not, at times, the best partner to my wife and even perhaps the best father to my kids (I've been trying to make amends before my oldest boy, now 15, begins to see me as nobody other than the jerk who holds the car keys!!). It pays to have an understanding and supportive spouse. I do, however, think it's true that those who tend to rise in academia have to spend a disproportionate amount of time in their work environment to succeed. I doubt whether that's untrue in other professions, however, and like most things that are a passion, it never really feels like work. I do have the greatest wife on the planet, however, who I might mention is also an academic in exercise physiology, and who definitely keeps our lives and the household show on the road!

From a research perspective, the pressure to publish has also meant doing research with industry and commodities, which some automatically assume means you're an industry shill and have no morals and are bound to speak the industry/commodity party line. It's hard to live within that shadow, but we've managed to blend basic science with, I think, good science from industry money too. I don't feel like I've sold my soul, and I sleep well at night. I've been asked by some people why I just don't do more basic work and get more government money.

Honestly, it's not like I haven't tried, but funding is tight, very tight, in Canada and everywhere. And while we've done well, relatively speaking (I'd rate my grant success rate at just above 20%), the sums of money and budget restrictions are a big hurdle for Canadian researchers to be competitive internationally. Still, blessings counted, fingers crossed, we've done and continue to do better than average. Of course, all of this is due to the students, who are the true lifeblood of our success as a group.
Some lifters and athletes go well beyond recommended protein intakes, and approach two grams
per pound of bodyweight a day. Would you expect
nitrogenous waste from this approach to have side
effects over time?
I think the point everybody has forgotten, or perhaps was never taught, is that nitrogen (the essential element of amino acids) is metabolically toxic in mammals. In fact, most species have evolved a mechanism to get rid of nitrogen (ammonia in fish, uric acid in birds) because there's no place to store extra amino acids. Amino acids are used for protein-requiring processes or they are not. You can't store them, you can't magically make them into something to be used later, so they are deaminated (the nitrogen is taken off and transferred to another compound) and urea gets made.
So when people recommend two grams of protein per pound (i.e., 4.4 grams per kilogram), they lack a basic understanding of how and why higher protein might even possibly be used by the body at that kind of level! Now, people can twist studies and show whole-body protein turnover measures that support this estimate, but that's not muscle! It's time for a serious reality check for anybody who spouts those numbers or recommends huge doses of supplements like BCAAs (which have next to zero evidence for their effectiveness for building muscle, but that's another story).
Back to the question: does this kind of intake cause harm? The biggest bugaboos for the higher-protein diet are clearly bone loss and kidney disease. The teaching on both, old-school teaching, is that higher protein lowers blood pH, which causes bone resorption. Calcium is leached from your bones. This results in your bones getting brittle as you progress toward osteopenia and osteoporosis. This theory is known as the acid ash hypothesis: acid from protein in the blood, and ash (or calcium) from your bones. The take-home on this theory can be neatly summarized in a series of nice meta-analyses, thus an evidence-based answer: "Evidence suggests a linear association between changes in calcium excretion in response to experimental changes in net acid excretion. However, this finding is not evidence that the source of the excreted calcium is bone or that this calciuria contributes to the development of osteoporosis."
Furthermore, "There is no evidence from superior quality balance studies that increasing the diet acid load promotes skeletal bone mineral loss or osteoporosis ... Promotion of the alkaline diet to prevent calcium loss is not justified." And finally, "All of the findings from this meta-analysis were contrary to the acid ash hypothesis ... This meta-analysis did not find evidence that phosphate intake contributes to demineralization of bone or to bone calcium excretion in the urine. Dietary advice that dairy products, meats, and grains are detrimental to bone health due to acidic phosphate content needs reassessment. There is no evidence that higher phosphate intakes are detrimental to bone health." I think those analyses close the book on some of the claptrap that some folks spout about higher protein and bone.

Renal disease is a little more granular and harder to nail down, but I think I'll go with the quote from the WHO report on Protein and Amino Acid Requirements in Human Nutrition, which states that "the suggestion that the decline of glomerular filtration rate that occurs with advancing age in healthy subjects can be attenuated by reducing the protein in the diet appears to have no foundation."
In addition, the section on protein requirements in the most recent revision of the DRI by the Institute of Medicine also states that there is no relationship between increasing protein intakes and decline in renal function in people with normal renal function. Now, if you have a diseased kidney, then it's perhaps not a good idea to be eating lots of protein; there is pretty clear evidence that a low(er) protein diet (exact level not known) does extend lifespan in that case. My take: it's hard to find evidence that intakes higher than 1.6-1.8 grams of protein per kilogram of bodyweight are able to substantially augment gains in muscle mass, as reviewed here, here, here, and here.
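To put those intakes side by side, here is a quick sketch of the arithmetic. The 80 kg body weight is a hypothetical example for illustration, not a figure from the interview:

```python
# Comparing the "2 g/lb" gym rule of thumb with the 1.6-1.8 g/kg range
# discussed above. The 80 kg lifter is hypothetical.
LB_PER_KG = 2.2046  # pounds per kilogram

gym_rule_g_per_kg = 2.0 * LB_PER_KG  # 2 g/lb is roughly 4.4 g/kg
weight_kg = 80

print(f"2 g/lb = {gym_rule_g_per_kg:.1f} g/kg -> {gym_rule_g_per_kg * weight_kg:.0f} g/day")
print(f"1.6 g/kg -> {1.6 * weight_kg:.0f} g/day")
print(f"1.8 g/kg -> {1.8 * weight_kg:.0f} g/day")
```

For that hypothetical 80 kg lifter, the 2 g/lb rule works out to roughly 350 grams per day, more than double the upper end of the range Dr. Phillips cites.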
We need to get you back on here sometime, Stu. Always something new to learn. Thanks so much for taking the time to talk with us!

INTERVIEW:
Ramsey Nijem

First, I would like to thank the Examine.com team for the interview. I am honored, and will do my best to not bore the highly educated audience.

Second, I would like to thank the other members of our sport science staff, who seamlessly combine over 60 years of experience in the NBA with an evidence-based approach.

Lastly, I am obliged to say that the responses below are on my behalf and do not represent the Sacramento Kings or the NBA.
What do you do for the Sacramento Kings? How did you get into strength and conditioning, and when did sports nutrition come into the picture?

I am the assistant strength and conditioning coach for the Sacramento Kings. We do not have a head strength and conditioning coach. Our Director of Sport Science, Robert "Chip" Schaefer, was consulted for this interview, as he oversees everything, including nutrition for our team, and has seen it all in his over-20-year NBA career. As an aside, Chip is one of only a handful of people who can claim to have worked with both Michael Jordan and Kobe Bryant for all of their championship rings (11 combined).

My role is split, and shared with Chip, between traditional strength and conditioning responsibilities and sport science. We continually collect data and use the objective numbers to influence our treatment and training decisions. Everything from movement screening and joint ranges of motion to power characteristics and on-court player loads is collected regularly. We don't claim to be the first to do this, nor do we pretend to have all the answers; rather, we pride ourselves on our interdisciplinary approach, as our ultimate goal as a sport science staff is to keep our guys healthy.

My desire to become an NBA strength and conditioning coach came the day I realized I wasn't going to make it as a player. I figured if I couldn't play in the NBA, then I'd do everything I could to train the guys that do. So I went on to earn a Master's degree in sport performance, and I am now working toward my Doctorate in human and sport performance. Sports nutrition is obviously relevant when studying and applying sports science to maximize an athlete's potential.

When considering the demands placed on a high-level athlete's body, one is remiss if nutrition is not considered every bit as important as training and recovery. Indeed, what an athlete puts into their body will influence their ability to perform over the course of the season. In a world where even the slightest advantage counts, nutrition and supplementation offer an opportunity to train harder, recover quicker, and ultimately perform better than the competition.

in their nutrition and are able to stay healthy and


perform well, its my opinion that they are not optimizing their potential to train, recover, and perform
every night. I have observed that it often takes
something to trigger a player to proactively change
his eating. Whether its a slump in performance, a
cold, an injury, the grind of the season, or relative
old age, most these guys need an experience to wake
them up a bit. Perhaps thats just human nature.
The NBA season is looooooooong. How do players
cope with the grind of training, competition, stress,
and injury?
Looong indeed.
This is the essence of
what we are trying
to figure out with
all of our data. How
are guys adapting
to the stress that the
NBA season brings
and how can we
help them combat
the stress that undoubtedly wears them down? The
short answer is they keep up with treatment, training, and get as much rest as they can. But how we
go about managing loads on the court and in the
weight room is the complexity that brings us sport
science nerds to the drawing board. As a sport science staff we are able to watch the loads accumulate
over time and see how their body is reacting. We
can use that data, in an ideal world, to structure
training and treatment to allow for recovery, yet
provide enough of a stimulus to keep them strong,
explosive, and injury free. I say ideal world because
the most influential stressor to the NBA player is the
volume of games, and that is unchangeable. An NBA
team can have four games in five nights, and in an

[...] if I couldnt play


in the NBA, then Id do
everything I could to
train the guys that do.

Do you find any correlation between what a given


player eats and how well he performs?
Guys who eat a nutrient dense diet, especially diets
high in vegetable consumption, seem to resist the
viral infections that invariably run through teams
each year. These types of infections can affect a
player for weeks, so avoiding them can have a tremendous impact. Although I am not nave to the
fact that some players may not take much stock

93

[...] if I could force these guys to take


my advice, then they would take the
principles we have set in place and fully
commit to them year-round. Thats not to
say they dont do a good job already, but
there is always room for improvement.
average month a team will play 15 games. Its a war
of attrition.
The Western Conference has eight teams that could
potentially compete for the championship. Crazy!
Oh yeah, this is supposed to be about nutrition ...
what are some interesting things about how other
teams eat and train? Does it differ much from team
to team?
The nature of the game-day schedule doesn't allow for time to sit down and talk shop at a philosophical level, and most interactions are mid-court banter during pre-game warm-up. Thus it is hard to comment on the degree of difference between teams, although I'm sure training and nutrition practices can vary greatly. I've seen videos of fellow NBA strength coaches having players hex-bar rack pull over 400 pounds from a mid-shin height (quite impressive when considering limb lengths and the amount of work being done), while I've heard stories of other coaches preferring to have players perform banded glute and core work all session long. On the nutrition side, I know some teams don't provide any fried foods or sodas and some may prohibit junk food on the plane, while other teams have a more hands-off approach and allow the players to make their own decisions.
I'm not here to say what the best training and nutrition practices are, but I'm confident enough to know a few things to be true. A training approach that emphasizes the SAID and progressive overload principles, while appreciating the value of injury risk reduction and movement quality, is going to provide a great return on investment. In a similar even-keeled fashion, a nutritional approach that emphasizes lean protein options, fruits and veggies, and complex carbohydrates will provide a great nutrient return on caloric investment. These approaches don't sell DVDs and t-shirts, but they produce favorable results and are consistent with the evidence base.

It's hard to tell until you stand next to them, but NBA players can be quite massive. How much does a big forward or center, like DeMarcus Cousins, eat?

I'd imagine they need quite a bit of fuel to run up and down the court, plus practice and gym work.

I couldn't tell you exactly how much our guys are eating, but suffice it to say it is a lot. A typical NBA big is likely burning between 4,000 and 5,000 calories a day (numbers approximated using the Harris-Benedict formula for DeMarcus Cousins). Toss in an overtime or two for a big-minute guy and you can imagine that number can climb pretty high. We take weights and skinfolds regularly (two to three times per month, depending on our schedule) and most guys are able to remain calorically neutral (neither in surplus nor deficit) over time on their own, with little weight change between measurements.
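As a rough illustration of where an estimate like that comes from, here is a sketch using the original Harris-Benedict equation for men. The height (211 cm), weight (122 kg), age (24), and activity multipliers below are our approximations for a big NBA center, not figures provided in the interview:

```python
# Harris-Benedict basal metabolic rate (original 1919 equation, men),
# scaled by standard activity multipliers. All inputs are approximations.
def harris_benedict_bmr_men(weight_kg: float, height_cm: float, age_yr: float) -> float:
    return 66.5 + 13.75 * weight_kg + 5.003 * height_cm - 6.755 * age_yr

bmr = harris_benedict_bmr_men(weight_kg=122, height_cm=211, age_yr=24)  # ~2,640 kcal
for activity in (1.55, 1.725, 1.9):  # moderately active to extremely active
    print(f"x{activity}: ~{bmr * activity:,.0f} kcal/day")
```

Multipliers in the 1.55-1.9 range put the estimate at roughly 4,100 to 5,000 kcal per day, in line with the figure above.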

This lifestyle offers no shortage of calories for the guys to fuel up on. In addition to catered meals at the practice facility, hotels, plane rides, and arenas, we carry around a nutrition trunk stocked with snack options for the guys. Not eating can be more of an issue with our schedule. When you get into a city at 2:00 a.m. and have shoot-around at 10:00 a.m., guys will sacrifice breakfast for an extra 30 minutes of sleep. Although they are encouraged to make the right choices, like calling for room service before going to practice on an empty stomach, they do not always listen. In these instances, we'll provide them with something from the trunk.

If you could force players to take your advice on strength, conditioning, and nutrition, what are some important gems that they should keep in mind?

If I could force these guys to take my advice on strength, conditioning, and nutrition, it would be to view these things as ways to optimize their potential. When a guy is making millions playing basketball, it is hard to get him to see the value in some of these things. They figure, "Hey, I made it here doing what I'm doing, so why do I need to change?" This attitude is short-sighted, in my opinion. If they can begin to appreciate the value of training (strength and conditioning) and recovery, and how nutrition is involved in all of it, they may see the potential to play at a higher level for longer. Undoubtedly a stronger, better-conditioned, better-recovered athlete is a better-performing athlete. Not to mention, all of these things can prolong careers (which of course means more money for them). If I could change their perspective on this stuff, my job would be much easier. Time often changes their perspective, as our veterans are willing to dedicate more time to these things. But if young guys bought in sooner, the results would speak for themselves.

When it comes to eating, I would force players to take a more conscious approach to their nutrition habits: things like increasing lean protein consumption and limiting sugar consumption (this should not be read as an anti-sugar suggestion) to make room for more nutrient-dense options like vegetables, whole grains, and healthy fats. As far as training goes, I would force players to dedicate off-season time to the weight room. The NBA schedule makes it nearly impossible to make any meaningful training adaptations during the season, which makes the off-season an important time for improving strength and power, adding muscle, improving movement quality, and grooving movement patterns to reduce injury risk.

To summarize, if I could force these guys to take my advice, then they would take the principles we have set in place and fully commit to them year-round. That's not to say they don't do a good job already, but there is always room for improvement.
Are supplements common in the NBA? If so, which ones?

Supplements are absolutely common in the NBA, although, similar to nutrition practices, supplementation varies from team to team. The most common supplements are whey protein, creatine, a multivitamin, fish oil, and vitamin D. Other common supplements are high-glycemic energy chews or gels, and caffeine. For guys with sleep issues, we provide ZMA, and for chronic pain problems we provide glucosamine chondroitin. Although we acknowledge that some supplements are largely anecdotally supported (e.g. ZMA), while others may be limited to specific conditions (e.g. glucosamine and osteoarthritis), the value of anecdotal evidence and the placebo effect cannot be discounted at this level. If a glucosamine chondroitin supplement rids a guy of knee pain even one game sooner than if he had not taken it, then its use is justified in my opinion. A potential for benefit with little to no risk is a win for us.

Being surrounded by world-class athletes, have you noticed any particular habits that readers might want to know about? Even though Charles Barkley says he's not a role model, I know that many people look up to professional athletes.

The amount of time these guys put in should be applauded. Most guys are at practice early for treatment, weight room work, and shooting, yet stay after for the same things. Surely they get compensated well for what they do, but the dedication that they have is a habit that could be adopted by anyone in any line of work and would be beneficial. With that said, they also have habits that should be avoided. Not prioritizing their nutrition, strength, and conditioning can have a huge impact on their health and longevity. The offseason is not only a time for them to refine their game, but an opportunity to become a better, less injury-prone athlete. Most guys are only focused on playing and not the off-court work. This is unfortunate and can backfire, as the NBA is only becoming more and more athletic.

Thank you to the Examine.com team for their high-quality work. Your website and products are influential in my nutrition and supplementation approaches and I can't thank you enough for doing my homework.

Thanks so much for taking some time out for us, Ramsey! This is really, really cool inside information to learn. Sacramento is an intriguing team in an extremely tough conference, so it's good they have a smart nutrition and conditioning team supporting them. We look forward to watching the rest of the season, and best of luck to you.

Ramsey Nijem is the Assistant Strength and Conditioning Coach for the Sacramento Kings. He has an M.S. in Kinesiology and is currently in a doctorate program.

ERD

Until Next Issue...

The next issue of ERD will come out the first week of February. In the meantime:

Comments? Questions? Concerns?
Send your thoughts to erdeditor@examine.com

Join the Discussion Online
Join the conversation on Issue 3 in our exclusive ERD Private Forum on Facebook

Spread the Word

Credits
Copy Editor: Dmitri Barvinok
iStock.com/wetcake
iStock.com/ponkrit
iStock.com/macrovector
iStock.com/Matt_Brown
iStock.com/miflippo
iStock.com/razihusin
iStock.com/Graffizone
iStock.com/tacojim
iStock.com/Johny87
iStock.com/Bouganville
iStock.com/MmeEmil
iStock.com/CaroTroconis
iStock.com/pjohnson1
iStock.com/erierika
iStock.com/MariuszBlach
iStock.com/Acerebel
iStock.com/hocus-focus

iStock.com/SharonFoelz
iStock.com/Jag_cz
iStock.com/aluxum
iStock.com/Trifonov_Evgeniy
iStock.com/4kodiak
iStock.com/YinYang
iStock.com/amphotora
iStock.com/julichka
iStock.com/loooby
iStock.com/snokid
iStock.com/RichVintage
iStock.com/lzf
iStock.com/zmeel
iStock.com/EdnaM
iStock.com/f4f
iStock.com/bopav
iStock.com/sshepard

