There’s no ‘U’ in diet. But there is an ‘I’

Dear Doctor Ninja,

I’m a 33-year-old female. I started a new diet this year, and I’ve lost a bit of weight already. It feels like I lose the same 15 pounds every year no matter which one I try. I’m starting to wonder which diet would be best for me to lose the weight permanently. There are so many articles on so many diets that I don’t know how to choose! Help!

F. Ifteen

There are a couple of things that almost all diet-type science folks agree upon:

  1. If you’re not eating enough food, and don’t have a medical condition, weight gain is nearly impossible. And by nearly impossible, they mean that if it were to happen, if someone gained a substantial amount of weight while not eating enough food and the gain could ONLY be explained by their food, it would be a publishable case report in a very prestigious medical journal, and you wouldn’t be hearing about it on a blog first because the news would be THAT big.

  2. A diet that cannot be followed by an individual can’t produce results.

After that, it turns into a shitshow.

But if we consider these two things, a diet that will cause weight loss really has just one purpose:

To give you a set of eating “rules” that you can stick to, that allows you to not eat enough food.

Not eating enough food is not the same as being hungry. Hunger is a state of mind driven not only by what’s in your tummy, but also by when you’re used to eating, how much you’re used to eating, who else is eating with you (or at the same time as you), how certain foods make you feel, what memories certain foods evoke, what food means to you culturally, and what you believe the function of food is, apart from just calories. And more.

Diets aren’t made for everyone (one might even argue that they’re not made for any one). They make assumptions about what people will and won’t tolerate. For some people, eating three meals per day, no matter what is in those meals, is mandatory. For other people, feeling full is more important than eating three meals a day. The range of priorities that we each MUST have in order to feel satisfied (or at least not angry) about our food is as wide as the number of diets available to us.

Thinking about what’s important to you when it comes to food and then picking the diet that respects those values is more important than picking a diet that is “scientifically best” (as though there was such a thing) and changing yourself to fit it.

Picking a diet is like picking who to date. You have a certain set of core values that you won’t give up for anyone. A compatible partner respects or even shares those values. Like any relationship, there will be some degree of compromise, but that compromise shouldn’t involve changing who you are fundamentally. Your relationship with your diet shouldn’t be an abusive one.

And just like we outgrow some relationships, we can outgrow our diets. The diet you partner with to lose those 15 pounds might not be the diet that enables you to stay at your new weight, because at your new weight, some values can change. The good news is that breaking up with your old diet doesn’t have to be full of angst. You don’t even have to text it.

Arthritis and the price of wine

Dear Doctor Ninja,

I’m a man in my 60’s. I read a news story about how having things called Heberden’s nodes on your fingers means you probably have bad arthritis in your knees. I have these nodes on my fingers and now I’m worried about my knees. Should I be seeing my doctor about this?

B. Umpy

Heberden’s nodes are usually linked with finger arthritis. They look like small bumps on the back of your finger at the finger knuckle closest to your finger nail.

But what we are dealing with, in both the case of Heberden’s nodes and the case of the study cited in the news article, is the definition of arthritis, or rather the ways in which the word “arthritis” is used.

Cartilage is the extra layer of stuff that caps many of the bone ends in your body to allow them to move smoothly against each other. Strictly speaking, arthritis is the term used to describe any damage to or wearing away of the cartilage in a joint. We can tell there has been loss of cartilage either by looking at it directly with a camera in the joint (called arthroscopy) or with open surgery (not that anyone would open your knee up just to look inside it); but also indirectly from tests like x-rays or MRIs, which are the most common ways to tell if there’s loss of cartilage.

However, that’s not the way most people think of arthritis. Most people think of arthritis as joint pain caused by loss of cartilage in the joint. Most people think of things like knee and hip replacements when they think of arthritis.

When a study is published saying that bumps on the back of your fingers are potentially a sign of arthritis in your knees that will get worse over the next 2 years, this can understandably cause some distress.

Here’s the thing: There are lots of people out there who have loss of cartilage on their x-ray or MRI who don’t have any pain or loss of function. And this has also been studied.

If you have arthritis (i.e. loss of cartilage) but no pain or loss of function, it is generally considered a non-issue. It’s only when the arthritis (i.e. loss of cartilage) comes with pain or loss of function that anything needs to be addressed.

So what about this study about bumps on the fingers and knee arthritis?

The study in question looked at the association between the bumps on the fingers (Heberden’s nodes) and the kinds of findings we associate with loss of cartilage in the knee. However, it did not look at whether these findings were related to pain or loss of function.

However, since we know that a lot of people (50% in some studies) who have “arthritis” on x-ray or MRI don’t have any pain or loss of function, what does it mean when someone says that having nodes on your fingers is linked to “arthritis” on MRI? How does knowing that your MRI will be more likely to look worse in 2 years help you change anything about the way you live, given that you might still only have a 50% chance of having pain with the worse-looking MRI?

A more expensive wine might make for a better story for whoever buys it, but the price on a bottle of wine doesn’t always make the wine taste better or worse. Price might be linked to taste ratings when you can see the price while you taste the wine, but if the wine tastes fine (or even good, or amazing) and the price is low, does the price matter?

Fewer Flying F’s for Fighting Flu’s

Dear Doctor Ninja,

Are flu shots part of the anti-vaccination movement? After my social media post yesterday about my 5-year-old and me getting our flu shots, I’ve heard from pro-vaxxers, anti-vaxxers, and people who give flu shots separate consideration from vaccines. What’s the story?

P. Rick

To answer your first question: Yes. Flu shots are vaccines and therefore would not be supported by people taking an anti-vaccination stand.

But what is the story?

Throughout modern history, there have been outbreaks of viruses. The symptoms we think of as the flu are caused by many different viruses. In the outbreak of 1918, at a time before we even knew influenza was caused by a virus, it’s estimated that 500 million people caught the spreading strain (thought to be an H1N1 strain) and that 50 million people died from it.

Since the flu is caused by viruses, there are very few treatments to cure it once you have it. Anti-viral medications help, but usually they only shorten how long you have symptoms and sometimes make the symptoms milder.

Since so many different viruses cause the flu, a flu vaccine can’t protect you from all of them. Every year, public health scientists and doctors (for instance, at the CDC in the USA) look at the information about which flu viruses have been circulating the most. They then make an educated prediction about which viruses will be the most common in the upcoming flu season, which usually runs fall to winter, and vaccines against those predicted viruses are produced. Since it takes 6 months to mass-produce a vaccine, they have to commit to the prediction by February or March before the fall (September/October) in the Northern hemisphere, and by September before the fall (February/March) in the Southern hemisphere.

As the saying goes, “It’s always twelve o’clock somewhere,” which means it’s also flu season twice a year globally. Since each flu season runs for almost 6 months, that means it’s almost always flu season somewhere too.

The most common question posed about the flu vaccine is usually, “Why did I still get the flu even though I got a vaccine?”

First off, you can’t tell you’ve been protected against things you can’t see. So while you might have gotten the flu last year, it’s difficult to tell which virus you got, and whether you were _also_ exposed to one of the viruses that were in last year’s vaccine but never developed symptoms because the vaccine did its job.

Secondly, since you can’t be vaccinated against _all_ flu viruses, you can still get “a flu”, just not from the viruses you had a vaccination for.

Thirdly, there are some viruses that are difficult to make vaccines for. When these types of viruses are the predicted viruses, even if the prediction is right, it can be difficult to make an effective vaccine against them because of limitations on vaccine creation and production technology.

Lastly, sometimes, the prediction is off. You’re vaccinated and protected, but not against the viruses that are the most common in your area of the world.

So the reason why some people give flu vaccines a different consideration from other vaccines is this uncertainty. There aren’t a lot of different kinds of germs that cause tetanus. There’s not much guesswork in creating a tetanus vaccine. Therefore, if you were properly vaccinated against tetanus, it would be nearly impossible for you to get lockjaw if you were exposed to the tetanus bacteria.

So why get a flu vaccine at all?

When the weatherperson tells you it’s going to rain, you bring an umbrella because you don’t want to get wet. Even if it doesn’t rain, you’re ready. And sometimes, even though you have an umbrella, it rains sideways, so you still get wet. Or, it hails golfball-sized hail, and the umbrella isn’t really that useful.

One reason to get the flu vaccine is to protect yourself against the predicted viruses that cause the flu. And if you’re in a category of health where getting the flu is potentially a disaster (like young children, or the elderly), then an umbrella even on a day where it might not rain is a pretty good idea.

But another reason to get the flu vaccine is if you’re around other people for whom getting the flu is potentially a disaster. The thing about the flu is that you can pass on a flu virus before you feel sick. If you’re immune to the dominant virus in your area, you’re much less likely to pass it on. You’re a virus blocker. You break the chain for that virus. Which means you protect others who are either unable to protect themselves, or aren’t as good at it as you. You’re like the person who doesn’t forward email from Nigerian Princes to your friends and tells your mother to stop doing that.

To pot or not?

Dear Doctor Ninja,

I’m an active dude in my early 40’s in Canada, where cannabis was just legalized. I’ve never used pot before. I’ve been hearing how CBD oils are great for better sleep and less inflammation. I’m pretty interested in both of these benefits, especially as I get older. But there’s a science reported on a website that says my brain could become less connected to itself if I use it (website included). What should I do?

G. Anja

The science you’re worried about is this: Zimmermann K, et al. Emotional regulation deficits in regular marijuana users. Human Brain Mapping 38(8): 4270-4279, 2017. doi: 10.1002/hbm.23671.

There are two answers (and one bonus answer) to this question, but they all lead to the same place.

This study picked 23 men who used marijuana and 20 men who didn’t.

Answer 1:

CBD oil is usually derived from hemp, and the main active ingredient is cannabidiol (hence CBD). The main active ingredient in marijuana is THC (tetrahydrocannabinol).

So before we move any further, this is a problem.

Suppose you have a friend who says, “We should go to this party. Because ballroom dancers love it and there are rave reviews from ballroom dancers online.”

The first question you should ask your friend is, “Are we close enough to ballroom dancers that we should consider ourselves essentially ballroom dancers and go to this party?”

If you’re a ballroom dancer, you should go. That sounds like a lot of fun.

If you’re not a ballroom dancer but you want to be a ballroom dancer, you should go. That sounds like a lot of fun.

If you’re not a ballroom dancer and you don’t want to be a ballroom dancer, then you have to decide whether you care enough about what ballroom dancers think about this party to commit your night to it. Chances are, if you’re not a ballroom dancer and don’t have aspirations to become one, the opinions of ballroom dancers are not terribly useful in making this decision.

The first question you should be asking about this science is, “Is CBD close enough to THC that I could consider them the same thing?” The answer, at this point in time, seems to be “No.”

You can stop reading here if you want.

Answer 2:

Let’s pretend that you asked a question about THC oil, since that is going to be legal in Canada (it’s not legal to sell yet), so that we can get on an apples-to-apples page.

In order to be considered a marijuana user in this study, you had to have used marijuana at least 3 days a week for the past year, and you had to have used it at least 200 times in your lifetime. You could not have used any other illicit substance more than 50 times in your life or in the 28 days before testing. You also had to have not used marijuana for the 48 hours before testing.

The authors did not define what was meant by “marijuana use”.

If you ask someone about their use of cookies, where use is eating and not smearing cookies on their skin, I think we would all agree that eating one cookie is a use of cookies. But we all know that there are people who eat just one cookie (it’s real, they DO exist) and then there are the rest of us who think of “one cookie” as a sleeve of cookies. You would never trust a diet study to help you decide about eating cookies that asked about cookie usage without quantifying what cookie usage meant.

So why would you trust a brain study to help you make a decision about THC use that asks about marijuana use without quantifying what marijuana use means?

Bonus Answer:

So far, we STILL can’t use this as a trusted reason on which a decision could be based.

So, let’s pretend that the authors DID quantify marijuana use, and that it magically fell in the range you’re thinking of using.

The authors showed subjects either neutral or negative (i.e. disturbing) images while in an MRI machine. They asked subjects to rate how negative they felt when shown either neutral images or disturbing images. For some disturbing images, subjects were asked to try to distance themselves emotionally from the image so that they felt less disturbed by it.

The authors noted that globally, when they considered all the ways you could compare the different groups and image conditions, there was evidence to suggest a difference in the whole group’s (marijuana users and non-users together) negativity ratings between neutral images and disturbing ones. They also found evidence to suggest a difference in negativity ratings between disturbing images without the request to distance from the image and disturbing images WITH the request to distance, again in the whole group.

There was no evidence to suggest that marijuana users were different from non-marijuana users.

The main conclusion rests on a test performed after the global test had already concluded there was no effect of group (marijuana or no marijuana); that follow-up test showed a statistically significant effect. Why the authors insisted on performing a test that was not statistically warranted is unknown, but it would be considered inappropriate.

This is the linchpin of this science. In order to state that differences in brain connectivity are related to a decreased ability to distance yourself from negative images, you have to show that marijuana users are worse at distancing themselves than non-marijuana users. If you can’t show this, then no one cares what the differences in brain activity are. The authors depend on an inappropriate statistical finding to tie the two things together. That makes this knowledge unreliable at best. But in case the Bonus Answer doesn’t stack up, the TL;DR is the same because of Answers 1 and 2.
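If you’re curious why statisticians get twitchy about that kind of unwarranted follow-up test, here’s a minimal sketch in Python. It is not a re-analysis of the actual study; the group sizes match, but the ratings are made-up numbers drawn from identical distributions on purpose, so there is no real group difference to find.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Made-up negativity ratings: both groups drawn from the SAME distribution,
# i.e. there is no real group difference. Group sizes match the study (23 vs 20).
users = rng.normal(loc=5.0, scale=1.5, size=23)
non_users = rng.normal(loc=5.0, scale=1.5, size=20)

# Global (omnibus) test for a group effect; with no true difference,
# this will usually come back non-significant (p well above 0.05).
_, p_global = stats.f_oneway(users, non_users)
print(f"global test for a group effect: p = {p_global:.3f}")

# If you go digging anyway, repeated follow-up comparisons on data with no
# real group difference will often produce a "significant" p-value by chance.
followup_p = [
    stats.ttest_ind(rng.normal(5, 1.5, 23), rng.normal(5, 1.5, 20)).pvalue
    for _ in range(50)
]
print(f"smallest of 50 chance-only follow-up p-values: {min(followup_p):.3f}")
```

The point of the sketch is simply that a “significant” result from a test you weren’t entitled to run doesn’t carry the weight the headline claims it does.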

TL;DR:

I’m not saying that you should or shouldn’t use CBD oils. That’s a choice that goes beyond whether your brain gets less connected to itself or not. But if the science you’re worried about doesn’t study the thing you’re using, doesn’t tell you how much counts as using the thing, and has a dubious statistical result, then it’s not a science that can help you make this decision, regardless of what its story says.

Vimms, Vigor, and Foursomes

Dear Doctor Ninja,

I’m very fair and burn easily, so I cover up and wear a lot of sunscreen in the summer. That got me wondering about Vitamin D and vitamins. I’m not sure whether or not I should be taking Vitamin D supplements, or any other supplements for that matter.

On the one hand, we’re told that we can get all of the essential vitamins and nutrients that we need from our food. On the other hand, many experts say that it’s just not possible to eat a diverse enough diet to get everything we need from food. But then, just to confuse things more, every few years a new study comes out claiming that vitamin supplements are actually bad for you! Argh! How can I decide whether or not to take vitamins?

Signed,

D. Ficient

The latest study to tell us taking vitamins can be bad, D. Ficient, is the 2018 Jenkins study, which looked at all the randomized controlled trials on vitamin supplementation and risk of death. An increased chance of death was reported for niacin (when combined with a statin) and for antioxidants. This is often called “all-cause mortality”, not because they studied death from all causes, but because all of the studies that were combined measured death from a variety of causes, and “all-cause mortality” is shorter than writing “death from heart attack, stroke, breast cancer, prostate cancer, lung cancer, and so on and so on”.

What are vitamins for?

First let’s put vitamins in context. Vitamins were discovered only in 1912. That’s barely over 100 years ago. That means we had a cure for scurvy (citrus fruits) before we knew there was such a thing as Vitamin C. It also means that we’ve had comparatively little time to really investigate them. But this isn’t a question about vitamins so much as it is a question about vitamin pills or vitamins as supplements.

Vitamins as supplements weren’t available until 1920, and were later marketed with brand names like Vimms and Stams. After a slightly tumultuous history of who was going to control their quality, vitamins went from being approved by the American Medical Association to being classified as food, thereby bypassing some of the stricter regulations of the pharmaceutical industry. They captured the imagination of the public, fuelled by advertisers, coinciding with futuristic ideas of miraculous “food pills”; and since true nutritional deficiency was a real problem in the Depression and World War II, the effects of vitamins used to reverse deficiency diseases made for a very powerful story.

What is enough?

Recommended Daily Allowances date back to 1941, based largely on prevention of newly-discovered diseases linked with vitamin deficiencies. But today, Recommended Daily Allowances are based on different things for different vitamins:

The RDA for niacin is 16mg/day for men and 14mg/day for women, which is based on preventing pellagra (the disease that occurs with niacin deficiency). Centrum contains about 16-20mg per tablet, depending on which source you use. The niacin dose in the largest of the 3 trials cited in the Jenkins study to support the idea that niacin taken with a statin might be harmful was 2g per day for the majority of the trial (roughly 4 years). “More than enough” in this case is roughly 100 times (or more) what you would get in a multivitamin, and more than you could reasonably eat in food.
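If you want to check that comparison yourself, the arithmetic is simple. The sketch below just uses the numbers quoted above (2g per day in the trial, 16-20mg of niacin per Centrum tablet); the exact tablet content varies by source, so treat the output as a ballpark.

```python
# Ballpark: how the trial's 2g/day niacin dose compares to a multivitamin,
# using the figures quoted above (16-20mg of niacin per Centrum tablet).
trial_dose_mg = 2000               # 2g per day in the largest trial
for per_tablet_mg in (16, 20):     # approximate niacin content per tablet
    tablets = trial_dose_mg / per_tablet_mg
    print(f"at {per_tablet_mg}mg per tablet: about {tablets:.0f} tablets' worth per day")
# -> roughly 100 to 125 tablets' worth, i.e. on the order of 100x a multivitamin
```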

The daily intake recommended to prevent rickets is 400IU, but Vitamin D’s RDA is actually 600IU per day, because its rationale is based largely on fracture prevention (in particular, in women around the onset of menopause). It assumes all vitamin D is being taken by mouth and that people have minimal sunlight exposure. Six minutes of sun exposure (UV index 3 or higher) can produce 10 000 IU’s. That’s harder to achieve in places where winter is cold or where sunscreen prevents enough UV exposure for your body to make enough vitamin D.

So understanding what is considered “more than enough” requires an understanding of where the idea of “enough” comes from.

I think that we can all agree that there is a point where too much vitamin is too much vitamin. It doesn’t matter where you get it. It can be easier to take them by pill because getting the same high dose through food might involve a lot more food than you might be able to eat in a day.

The idea of being “super-healthy”

There’s also an unspoken conversation going on when it comes to vitamins, and that’s the idea that there is a state beyond healthy. Some experts call this “optimal”, but never provide a point of reference from which you could tell “optimal” from “not-optimal”. This “science” claims that preventing deficiency disease isn’t enough; that there is a level of vitamins in the body at which we can become “super-healthy”, as opposed to just “not-deficient”. There’s a lot of overlap between this camp and the camp of people who say that it’s impossible to get enough vitamins through food alone (to reach this state of “super-health”) because it would require too much food.

An unwanted foursome

So we are left with one science telling us that we are potentially getting more than enough of a good thing (leading to a bad thing); another science telling us that we aren’t getting enough of a good thing (even if we are meeting the so-called minimum), and that we can’t get enough, so take this pill; and yet another science telling us that we can get enough of a good thing without taking a pill. It’s a foursome from hell, you’re in the middle, and the other three don’t look like they’re interested in each other. Being the centre of attention can be fun at first, but it gets exhausting fast when everyone wants you for themselves.

As in all encounters of this kind, whether you join in the fun depends not so much on what the party thinks of you, but rather on what you think of the party. In this case, every potential partner has a different definition of enough. The thing is: they’re not really trying to convince you to take the vitamins or not; they’re trying to entice you to think differently about what “enough” means.

Depending on who you’re most attracted to, “enough” ranges from “enough to prevent a deficiency disease” to “enough to make yourself effectively immortal.” The partner you choose (because it’s not really a party of four, if the other three aren’t into each other), and the course of action you take has to do with which story you want to believe most.

Most research in this area has been classified as low-to-moderate quality, but it does seem to agree that vitamin supplementation above and beyond deficiency prevention doesn’t help you live longer.

The mythology of vitamins is a story so well-crafted and so strong that millions of dollars are spent to study it moderately well. What we want to believe has to be tempered by what we can believe.

Who decides what is enough is you. And who decides whether you are healthy enough today to warrant taking supplemental vitamins is you.

Sources:

Jenkins DJA, Spence JD, Giovannucci EL, et al. Supplemental Vitamins and Minerals for CVD Prevention and Treatment. Journal of the American College of Cardiology 71(22): 2570-84, 2018.

Screen time

Dear Doctor Ninja,

I’ve just turned 40, and one of the first things one of my girlfriends asked me was, “Are you going to start getting annual mammograms now?” I told her that I’m sure I would get a mammogram just as soon as my doctor recommended it. However, I’ve recently read an article about the science of mammograms and now I’m feeling conflicted. There has been some science that says mammograms can cause more harm than good. There hasn’t been any cancer among my direct relatives, but my husband and I went through a stressful experience when he was diagnosed with stage III cancer five years ago. The thought of missing a cancer diagnosis stresses me out! What should I do?

Signed,

B. Oob

The main delivery of the message that mammograms can cause more harm than good, B. Oob, comes from the Cochrane Review. It is a systematic review that takes all of the available studies comparing getting vs. not getting mammograms and combines them to come up with an umbrella conclusion. Keep in mind that the latest version is from 2013, and that there should be an update in the next couple of years.

It should be noted that this message only applies to women at average risk for breast cancer, and not to women who are classified as high-risk for breast cancer.

What does a “harm” mean anyways?

The harm that is defined by the authors of the review is essentially receiving a diagnosis of breast cancer when you do not, in fact, have breast cancer, which is also known as a false-positive.

The harms of a false-positive include:

  • The stress of knowing that you might have cancer if your mammogram test is either inconclusive or positive, while you undergo more testing, which may require surgery.

  • Getting surgery for breast cancer that turns out to be not-needed if you didn’t actually have cancer.

  • Getting radiation or chemotherapy for breast cancer that turns out to be not-needed if you didn’t actually have cancer.

Ok, so what does a “good” mean then?

The good that is defined by the authors of the review includes:

  • Not dying from breast cancer

  • Not dying from another cancer

  • Not dying from anything else

It’s less important to talk about the last two points, namely because mammograms don’t seem to prevent people from dying from cancers that aren’t breast cancer or from things like car accidents or heart attacks.

Mammogram screening is estimated to reduce the chance of dying from breast cancer by about 15% (a relative reduction), so out of 2000 women over 10 years of screening, 1 woman would be expected to be saved from dying from breast cancer. That’s very underwhelming for “good”.
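The jump from “15%” to “1 woman in 2000” is the difference between relative and absolute risk. Here’s a rough back-of-the-envelope sketch in Python; the roughly 0.33% baseline chance of dying of breast cancer over 10 years is my assumption (it’s the figure that makes the review’s numbers line up), not something stated in the review.

```python
# Back-of-the-envelope: turning a 15% RELATIVE risk reduction into an absolute number.
# The ~0.33% baseline 10-year risk is an assumed figure for illustration only.
women_screened = 2000
baseline_risk = 0.0033        # assumed chance of dying of breast cancer over 10 years
relative_reduction = 0.15     # the review's estimated 15% relative risk reduction

deaths_without_screening = women_screened * baseline_risk
deaths_avoided = deaths_without_screening * relative_reduction

print(f"expected breast cancer deaths without screening: {deaths_without_screening:.1f}")  # ~6.6
print(f"expected deaths avoided by screening: {deaths_avoided:.1f}")                       # ~1.0
```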

There’s an awful lot of death there. However, note that “good” does not include:

  • Not needing a full mastectomy

  • Not needing chemotherapy or radiation therapy

  • Not having a higher stage of cancer (which indirectly affects treatment decisions)

  • Other bad things where you don’t just die.

Screening in most cancers has to do with detection, but also early detection. The idea is that detecting a cancer when it is “small” and before it has spread anywhere, leads to better results, not only in terms of not dying, but also in terms of how complicated or risky treatments need to be. For instance, women who don’t need chemotherapy or radiation for their breast cancers have fewer complications after reconstruction. And while chemotherapy or radiation are not entirely dictated by how early a breast cancer is detected, there is _a_ part of that decision that is affected by stage.

Does early detection matter?

The authors of the review argue that early detection means less as treatments for higher stage (more severe) breast cancers improve. From this perspective, if we reached a point where all stages of breast cancer could be treated, including stage IV, metastatic breast cancer, then the need to screen would be much less important. But, this is also from the lens of preventing death, not necessarily other bad things that don’t involve just dropping dead.

No, no and no.

But let’s take this Cochrane Review on mammograms at face value. If mammograms DO result in more harm than good, then what?

There are two other Cochrane Reviews on breast cancer screening: one about doing breast self-inspections as well as doctors doing breast exams, and the other on using ultrasound along with mammograms. Both conclude that neither is an acceptable screening method either, because they also result in “too many false positives”; and the only large-scale trial on breast self-examination failed because too many people just stopped doing it. All three reviews on breast cancer screening, therefore, fall back on the “more research should be done in the future” cliché. The issue is that people like you, young grasshopper B. Oob, are living now and need guidance now, not in the future.

So if the answer that science seems to be giving us today is, “Not mammograms,” it’s not very helpful because there’s no, “Instead of mammograms, do this.” It’s like that friend who you’ve agreed to go to dinner with, who says, “No, not that one,” to every restaurant you pick without offering a restaurant that you could go to. What do you do with a friend like this?

You pick a restaurant and don’t ask them what they think.

And how do you pick a restaurant? You do some deep soul searching for what’s important to you.

Your choices today are:

  1. A mammogram every 2-3 years starting when you’re 40

  2. A mammogram every 2-3 years starting when you’re 50

  3. A mammogram every 2-3 years starting when you’re 60

  4. A periodic mammogram with longer than 2-3 years in between, starting at any of the ages above

  5. No mammograms unless something is “wrong”

The implications of a false-positive, as we talked about above, run from the stress of having to go through more tests, all the while being uncertain as to whether you have cancer or not, to potentially having surgery or treatment (radiation and/or chemotherapy) that turns out to be not-needed after the fact.

After 10 mammograms, the false-positive rate is somewhere between 20% and 60%, depending on which study you choose. The authors of the review, after considering all of these studies, estimated the false-positive rate to be closer to 30%. And for 2000 women screened over 10 years, about 10 of them will undergo cancer treatment that is not needed.
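To see how a few percent per test turns into “closer to 30%”, here’s a small sketch. The 3.5% false-positive rate per mammogram is an assumed, illustrative figure, and it pretends each mammogram is independent of the last (which isn’t strictly true), but it shows how the odds stack up over 10 rounds of screening.

```python
# Chance of at least one false positive over repeated mammograms, assuming
# (for illustration only) a fixed, independent false-positive rate per screen.
per_screen_fp = 0.035    # assumed ~3.5% false-positive chance per mammogram
for n_screens in (1, 5, 10):
    cumulative = 1 - (1 - per_screen_fp) ** n_screens
    print(f"after {n_screens:>2} mammograms: ~{cumulative:.0%} chance of at least one false positive")
# after 10 screens this lands around 30%, the same ballpark as the review's estimate
```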

A mammogram every 3 years means it will take 30 years to reach the 10 mammograms where the false-positive rate starts to get awkward.

There are also the “four mammogram” figures, which vary according to age group, where women who have had four mammograms have a 3.6% chance of a false positive in their 40’s, 2.8% chance in their 50’s and 2.1% chance in their 60’s. If taken every 3 years, it takes 12 years to get to four mammograms in total.

No one has studied whether mammograms are useful in women older than 70 yet, which is why most guidelines don’t have strong recommendations for women older than 70.

At some point in your life, you might stop caring about breast cancer; or at least, you might stop being interested in pursuing aggressive treatment for breast cancer. Coming to terms with our own mortality is an interesting and potentially freeing exercise to take on, but beyond where this question probably goes. When that might happen depends at least partially on how you end up seeing the quality of your life as you get older, but it definitely frames the decision to continue screening in a totally different light.

I need a decision!

Cancer touches everyone, even if it’s not in your immediate family. As with all things cancer and screening, choices depend on your own values. For some women, the risk of stress, and of possibly undergoing treatment that turns out to be not-needed, is a small price to pay for ongoing peace of mind and/or detection, even if the risk of not-needed treatment climbs as your “screening time” marches on. For other women, 10-20 out of 2000 women screened for 10 years being treated for a breast cancer they didn’t have is an unacceptable risk. It’s one thing to say, “Well, that’s less than 1 percent,” but when it’s you, it’s 100% you.

For others, it’s a calculated risk of “not yet”, particularly if they are in their 40’s, and at least one set of guidelines (the Canadian ones) agrees that mammograms before the age of 50 are not recommended. And for others still, it’s “not anymore” as they pass into their 70’s and beyond where having to go through treatment for breast cancer might be far less appealing.

Sometimes science doesn’t give us hard, absolute rules, and even when there seem to be hard, absolute rules, personal values always affect decisions on health. There is no single right answer, but there’s an answer that is right for you. Science isn’t a dictator, it’s more like a friend. In this case, though, it’s a pretty indecisive, kinda douchey kind of friend who can’t pick a restaurant.

This opinion does depend on whether you are considered at average or high-risk for breast cancer and the research presented here only applies to women of average risk. If you’re not sure where you fall on the risk scale, your doctor is a great place to start.

Sources:

1. Gøtzsche PC, Jørgensen KJ. Screening for breast cancer with mammography. Cochrane Database of Systematic Reviews 2013, Issue 6. Art. No.: CD001877. DOI: 10.1002/14651858.CD001877.pub5.

2. Kösters JP, Gøtzsche PC. Regular self‐examination or clinical examination for early detection of breast cancer. Cochrane Database of Systematic Reviews 2003, Issue 2. Art. No.: CD003373. DOI: 10.1002/14651858.CD003373.

3. Gartlehner G, Thaler K, Chapman A, Kaminski‐Hartenthaler A, Berzaczy D, Van Noord MG, Helbich TH. Mammography in combination with breast ultrasonography versus mammography for breast cancer screening in women at average risk. Cochrane Database of Systematic Reviews 2013, Issue 4. Art. No.: CD009632. DOI: 10.1002/14651858.CD009632.pub2.

4. Fitzpatrick-Lewis D, Hodgson N, Ciliska D, Peirson P, Gauld M, Liu YY. Breast Cancer Screening. Canadian Task Force on Preventive Health Care, 2011.

Take a look at yourself and make the change?

Dear Doctor Ninja,

Science has so many great things to offer, but how do I decide when science should change the way I coach? I see p-values, effect sizes, and statistical significance everywhere. Sometimes I see effects that are “significant”, but don’t really make much of a difference at all. And they’d be very difficult to implement!

Please advise,
Insignificant Ina

Everyone hopefully brings many great things to a relationship. Maybe they make a great chicken carbonara. Or are level-headed in a crisis. Some people are really smart and witty. Others bring warmth and affection (not that smart, witty, warm, and affectionate are in any way mutually exclusive). How do you decide when someone should change the way you see the world? The relationship you have with science and change is not very different.

I got an email from my gym this evening. It said, “When is the best time to work out?” It linked me to an infographic saying I could work out in the early morning, in the early evening, at night, or at mid-day. Each option had a little list of benefits; for example, night workouts had the listed benefit of “Higher testosterone levels, which will boost your muscle gain” whereas morning workouts had the listed benefit of an “early morning endorphin boost to start your day off on a high.”

Hey, I want muscle gains! Being high sounds great! How do I choose which one I want more? Muscle…being high…muscle…being high… I was afraid to look at the other choices; I mean, what if working out at lunch makes me live forever? “Fuck,” I said to myself. I worked out in the mid-morning today.

“Question the start” is the first item in the Critical Mass Manifesto. It’s the most important step. Always question whether you COULD change. If the change requires you to buy expensive new equipment, you have to decide whether the possible benefits make that price worth it. If the change is something that your athletes would never be able to do for time reasons, for physical-limitation reasons, or because they will just openly revolt and burn down your gym, you can’t make the change.

I don’t know about you, but if I work out at night, I can’t sleep for at least 2 hours afterwards. I want more muscle, but not if it screws up every single morning because I didn’t get enough sleep. No matter how hot the science looks, we’re not going home together.

Every change you consider is made within a context. That’s the heart of evidence-based practice and the core of your relationship with the science you are thinking of letting into your life.

 

Inevitable?

Dear Doctor Ninja,

I’m a 29-year-old man, and my relationship with my science was going just fine until I started trying to lose weight. My science told me, “Look, it sucks, but because you grew up heavy, and your parents are obese, you’re always going to have to do triple the effort for half the results that most people get. It sucks, but that’s what it is.” At my heaviest I was 280lbs, and I’m 198lbs now, but at my lightest I was 175lbs. I don’t want a six-pack, but I feel like my relationship with science has taken a downturn and I’m not sure how to deal with this.

Signed,

Damned If You Do In DC

Sometimes, it’s okay to agree to disagree, Damned If You Do. Science isn’t the be-all-end-all truth-teller of fate. So why is science being such a douche? A lot of what science is telling you comes from science seeing a lot of heavy people and noticing that their parents are mostly obese as well. But that’s taking the easy way out. I don’t think science has looked at a lot of obese parents to see how many have children who turn out to have obesity issues in adulthood, or whether the same number of parents are obese between heavy and not-heavy adults. Also, what science doesn’t see very well is why the parents are obese, or even why heavy people are heavy regardless of whether their parents are or not. Is it because parents model eating and exercise patterns for their kids (which is totally changeable), or is there actually a gene that makes obesity an unchangeable certainty? Or is this just a coincidence from not looking at all the right people? I don’t think even science knows for sure.

However, in your case, you have successfully lost over 100lbs and kept most of it off! So whether science thinks there’s a gene or not, science is definitely wrong when it comes to you! If losing 100lbs was HALF the results of most people trying to lose weight, we wouldn’t think obesity was a problem at all. And if it’s supposed to take you triple the effort to get half the results, then reverse-mathing it, most people should be losing something like 200lbs with a third of your effort, according to what you heard from science. That’s just not happening. So, I think it’s safe to tell science to suck it on this one.