
Forum Posts

Ryan Eckert, MS, CSCS
Mar 15, 2025
In The VO2 Max Forum
Background

Polarized training, or “80/20 training” as it is also known, has become a popular training periodization approach in recent years. Initially, observations that elite Norwegian cross-country skiers spend most of their annual training volume at low intensities (i.e., >80% of training volume), very little at moderate intensities (i.e., 0-5% of training volume), and the remainder at very high intensities (i.e., 15-20% of training volume) led to a surge in interest in this style of training intensity distribution among researchers, coaches, and some amateur/recreational athletes (3). This interest has grown into somewhat of a craze among endurance athletes. Nowadays, many endurance athletes follow this sort of training approach or have, at the very least, heard of it from one of the many voices preaching its superiority as an approach to training intensity distribution for endurance performance success (e.g., Matt Fitzgerald and one of his many “80/20” books).

The rise in popularity of polarized training has brought more attention and research to the topic of training intensity distribution as a whole, which is essentially the way in which an athlete organizes and divides up the time spent at various training intensities across their total training volume. For example, two athletes who each train 10 hours per week may complete those 10 hours at vastly different training intensities. This ultimately impacts the physiological response(s) that each athlete will experience from their training.

When it comes to training prescription, volume (frequency x duration), type (i.e., running, strength training, rowing, cycling, etc.), and intensity are some of the most important variables to consider, as they all impact the specific adaptations that an athlete will experience from their training program. Total training volume can be easily manipulated by increasing or decreasing the frequency of training and the duration of each training session. Training volume may be increased or decreased for many reasons, such as building volume gradually over the course of a preparation for a long-distance endurance event (e.g., a marathon or full-distance triathlon) or as an athlete progresses in their fitness and experience level. Training volume may decrease, on the other hand, during times of recovery, illness, injury, or tapering into an important event.

Training type is usually selected by matching the exercise selection to the specific demands of the event that an athlete is preparing for. For example, an athlete aiming to take on a half-marathon should primarily focus on including running sessions in their training plan, while an athlete planning to participate in a multi-day Gran Fondo cycling event will complete primarily cycling sessions. There are, of course, times when introducing “cross-training” activities, or non-specific endurance activities, is important. If someone is injured and cannot run, they may swim or use a rowing machine to maintain some endurance-specific adaptations while they recover. Another endurance athlete may include some form of non-specific cross-training as a way of building fitness without adding to the repetitiveness of their primary activity, such as a cyclist doing some work on a rowing machine on an easier day or a runner doing some elliptical work on a recovery day to promote recovery without the impact of running.
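To make the volume arithmetic above concrete before we move on to intensity, here is a minimal sketch of weekly volume as frequency x duration; the session counts, durations, and week labels are made-up illustrative numbers, not a training prescription:

```python
# Minimal sketch: weekly training volume as frequency x duration.
# All numbers below are illustrative only, not a prescription.

def weekly_volume_hours(sessions_per_week: int, avg_session_hours: float) -> float:
    """Total weekly volume (hours) = training frequency x average session duration."""
    return sessions_per_week * avg_session_hours

# A hypothetical build: volume is raised by adding a session, then by
# lengthening sessions, and is cut back during a taper week.
plan = [
    ("base week", 4, 1.0),    # 4 sessions x 1.0 h = 4.0 h
    ("build week", 5, 1.0),   # add a session        -> 5.0 h
    ("peak week", 5, 1.3),    # lengthen sessions    -> 6.5 h
    ("taper week", 3, 0.8),   # reduce both          -> 2.4 h
]

for label, freq, dur in plan:
    print(f"{label:>10}: {weekly_volume_hours(freq, dur):.1f} h/week")
```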
Training intensity, which is just as important a training variable as volume and type, is often one of the most misunderstood. This brings us back to polarized training, which, as I mentioned previously, is simply one way of distributing training intensity across one’s overall volume of training. There are other ways of organizing one’s training intensity distribution, some of which we will delve into below. The question I wanted to explore here, however, relates to the idea that training intensity distribution is static, something that one “sets and forgets”. In other words, is polarized training always the best approach all of the time?

Is Polarized Training the “Optimal” Training Intensity Distribution for Endurance Athletes?

Personally, I have written about and discussed on podcast episodes the superiority of polarized training as a general approach to training intensity distribution when compared to its opposite, training primarily around one’s threshold (also known as a “threshold intensity distribution”, “zone 3 training”, etc.). I can summarize the proposed benefits of a polarized training approach as follows: performing the majority of one’s training volume at lower, more comfortable intensities allows the athlete to accumulate a greater overall training volume with a lower risk of injury, illness, and burnout than training mostly at moderate intensities. Consistently accumulating more training volume over time, without interruption from injury, illness, or burnout, is well known to be related to greater success in endurance sport. Therefore, polarized training could be viewed as superior to a “threshold intensity distribution” approach.

The above summary still holds true based on my experience and what some of the current research literature would suggest. For example, a review paper published in 2015 (1) comparing a polarized training intensity distribution to a threshold training intensity distribution among endurance athletes concluded that “Effect sizes for increasing aerobic endurance performance for the polarized training model are consistently superior to that of the threshold training model. Performing a polarized training program may be best accomplished by going easy on long slow distance workouts, avoiding ‘race pace’, and getting after it during interval workouts.” Someone reading just this paper alone, or in addition to others that paint a similar picture, might conclude that polarized training is the “optimal” approach for endurance athletes.

However, a problem arises if we forget that training intensity, just like training volume and training type, is a fluid training variable that should not necessarily remain constant. Let’s take an athlete training for their first marathon and do a thought experiment. Most people would agree that this athlete should primarily be running in training, as this is most specific to developing the fitness needed to complete a marathon. Most people would also agree that a sensible approach is to gradually increase training volume from something reasonable and manageable initially to something increasingly challenging in order to build the specific fitness capabilities that will help this athlete complete a marathon.
The specific frequency and duration of each run session will be very personal to that athlete, but increasing running frequency and/or duration over the course of the training plan is, in general, a good approach. When it comes to training intensity, however, is it best to simply prescribe ~80% of their training as easy and the remaining 15-20% as high intensity (i.e., polarized training) and continue this approach throughout the entirety of the marathon training plan? The answer depends on many factors, and while there isn’t necessarily one right answer, I would argue the wrong answer in most scenarios would be to leave training intensity distribution static and unchanging.

A good example of the point I am trying to highlight comes from a recently published research paper in the journal Sports Medicine by Muniz-Pumares and colleagues (2). Researchers in this study took the last 16 weeks of training data from the Strava accounts of nearly 120,000 runners of varying performance levels and reported training characteristics by marathon finish time. The training intensity distribution of runners was quantified using a 3-zone approach, where Zone 1 consists of easy/aerobic intensities below one’s aerobic threshold/first lactate turn-point (LT1), Zone 2 consists of moderate intensities between one’s aerobic threshold and anaerobic/lactate threshold/second lactate turn-point (LT2), and Zone 3 consists of hard intensities above one’s anaerobic/lactate threshold/LT2 (see Figure 1 for a visual representation of this typical 3-zone model).

Figure 1. 3-Zone Intensity Model (Zone 1 [Z1] = green; Zone 2 [Z2] = yellow; Zone 3 [Z3] = red)

Importantly, researchers took the training intensity distribution (abbreviated “TID” moving forward) of each athlete and categorized it into one of the following distributions:

1. Pyramidal Training Intensity Distribution (Pyr-TID): ~75-80% Z1; ~15-20% Z2; ~0-5% Z3
2. Polarized Training Intensity Distribution (Pol-TID): ~75-80% Z1; ~0-5% Z2; ~15-20% Z3
3. Threshold Training Intensity Distribution (Thr-TID): <75-80% Z1; >20% Z2; ~0-5% Z3
4. High-Intensity Training Intensity Distribution (HIT-TID): <75-80% Z1; ~0-5% Z2; >20% Z3

These different TIDs are the most common ways of distributing one’s training intensity, and each is unique in where training volume is concentrated across the three zones. With Pyr-TID, the majority of one’s training takes place in Z1 and most of the remainder in Z2. With Pol-TID, the majority is again in Z1, with little in Z2 and the remainder in Z3. With Thr-TID, much more volume is accumulated in Z2, with comparatively less in Z1 and little in Z3. Finally, with HIT-TID, one spends comparatively more volume in Z3, with little in Z2 and the remainder in Z1. As mentioned previously, the distribution of training intensity plays a large role in determining the physiological adaptations that one experiences, so each TID listed above will yield specific adaptations.

Finally, marathon finish times were broken down into 30-minute ranges: the fastest runners finished in 120-150 minutes, followed by those finishing in 150-180 minutes, 180-210 minutes, and 210-240 minutes, with the slowest finishers taking >240 minutes.
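To make the zone definitions and TID categories above concrete, here is a minimal sketch; the function names, threshold heart rates, and logged minutes are hypothetical, and the percentage cut-offs simply mirror the approximate ranges listed above rather than the exact criteria used in the paper:

```python
# Minimal sketch (not the paper's code): assign an intensity to the 3-zone model
# and label a week's training intensity distribution (TID).
# The LT1/LT2 heart rates and logged minutes below are hypothetical examples.

def zone_for_heart_rate(hr: float, lt1_hr: float, lt2_hr: float) -> str:
    """Z1 below LT1, Z2 between LT1 and LT2, Z3 above LT2."""
    if hr < lt1_hr:
        return "Z1"
    elif hr <= lt2_hr:
        return "Z2"
    return "Z3"

def classify_tid(z1_pct: float, z2_pct: float, z3_pct: float) -> str:
    """Rough TID label mirroring the approximate cut-offs listed above."""
    if z1_pct >= 75:
        # Mostly easy volume: pyramidal vs. polarized depends on whether the
        # remaining volume sits mainly in Z2 or in Z3.
        return "Pyramidal" if z2_pct > z3_pct else "Polarized"
    if z2_pct > 20:
        return "Threshold"
    if z3_pct > 20:
        return "High-Intensity"
    return "Unclassified"

print(zone_for_heart_rate(142, lt1_hr=150, lt2_hr=170))    # -> "Z1"

# A week of logged time per zone (minutes), made-up numbers.
minutes = {"Z1": 300, "Z2": 60, "Z3": 15}
total = sum(minutes.values())
pcts = {zone: 100 * m / total for zone, m in minutes.items()}
print(pcts)                                                 # Z1 = 80%, Z2 = 16%, Z3 = 4%
print(classify_tid(pcts["Z1"], pcts["Z2"], pcts["Z3"]))     # -> "Pyramidal"
```

In a real analysis, the zones would be anchored to measured (or estimated) thresholds and time or distance per zone would be summed across all sessions before computing the percentages.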
Keeping the above in mind, some of the key findings from this paper include:

• Training volume across all runners averaged 45.1 ± 26.4 km/week.
• Training volume was >3x greater (~107 km/week) in runners with the fastest marathon finish times (120-150 min) compared to those with the slowest marathon finish times (>240 min).
• There was a strong negative correlation between total training volume in Z1 (i.e., easy/aerobic training intensity) and marathon finish time, with a higher volume of Z1 training being associated with faster marathon finish times.
• Higher training volume among the fastest runners was achieved by accumulating more volume in Z1.
• The most prevalent TID approach was Pyr-TID, in which the majority of athletes’ volume was in Z1, progressively less volume in Z2, and little to no training volume in Z3.
• The proportion of runners adopting a Pyr-TID increased linearly with faster marathon finish times, ultimately reaching ~80% of runners in the fastest group (120-150 min).

What can we take away from the above? First of all, it is quite striking that runners with the fastest finish times completed nearly 3x more weekly running volume than those with the slowest finish times. This confirms previous research indicating that increases in running volume are, generally, associated with improved running performance. How was greater running volume achieved by the fastest runners? Those with the fastest marathon times completed more running in Z1 in order to accumulate more weekly running volume. This also makes sense: in order to run more frequently and at higher volumes, an athlete needs to remain injury-, illness-, and burnout-free, and running in Z1 is far less demanding on the body than running in Z2 or Z3. Therefore, to accumulate more running volume, it generally makes the most sense to run more at a Z1 intensity level.

Another really interesting finding was that Pyr-TID was the most common way of distributing one’s training intensity in this study, and that this TID approach was increasingly common among faster runners. To me, this makes complete sense given that these runners were all training for a marathon and the data came from the runners’ last 16 weeks of training leading into race day. Marathon race pace, for a lot of athletes, sits somewhere in Z2. So, many athletes likely emphasized lots of Z1 training for their everyday runs and, in their harder sessions, focused on marathon race-pace efforts. If researchers conducted this same study in, say, runners preparing for a 10K or 5K event, many runners may have spent more training volume in Z3 and less in Z2 (something that more closely resembles a Pol-TID), since 10K and 5K race pace for a lot of runners falls somewhere in Z3!

Conclusions

Altogether, the results from this study demonstrate that polarized training isn’t necessarily the best approach all of the time. Training intensity, and the way an athlete distributes it over the course of weeks and months, is highly dependent on what the athlete is trying to accomplish. Yes, polarized training is certainly an effective way of distributing one’s training intensity, but for most endurance athletes it should not be the only way in which training intensity is distributed throughout the whole season. Training intensity is a fluid variable, and it should be adjusted depending on the athlete’s goals, race targets, time of year, etc.
A good rule of thumb is indeed to complete most of one’s training at a low/comfortable intensity (i.e., 70-80% of volume in Z1). However, what an athlete does with the remainder of their volume (i.e., the other 20-30%) should constantly be evaluated and altered to align with their needs and goals.

References:
1. Hydren, J. R., & Cohen, B. S. (2015). Current scientific evidence for a polarized cardiovascular endurance training model. The Journal of Strength & Conditioning Research, 29(12), 3523-3530.
2. Muniz-Pumares, D., Hunter, B., Meyler, S., Maunder, E., & Smyth, B. (2024). The training intensity distribution of marathon runners across performance levels. Sports Medicine, 1-13.
3. Seiler, K. S., & Kjerland, G. Ø. (2006). Quantifying training intensity distribution in elite endurance athletes: Is there evidence for an “optimal” distribution? Scandinavian Journal of Medicine & Science in Sports, 16(1), 49-56.

Happy training and racing!

-Ryan Eckert, MS, CSCS

Do you enjoy our monthly educational content that we create? Not only do we create written content like what you just read, but we have a podcast too, where the goal is also to share science-driven, evidence-based information highly relevant to endurance athletes and coaches. We do all of this for free, and we rely on the generous help and support of others to cover some of our basic operating costs for putting out this content. If you would like to help or support, the best way to do so is by becoming a Patreon supporter.
Polarized Training in Endurance Sport: An Important, but Often Misunderstood Topic
Ryan Eckert, MS, CSCS
Jun 30, 2023
In The VO2 Max Forum
Background

In my continued effort to take a deeper look at the research surrounding the potential long-term health implications of chronic endurance exercise, I decided to focus this month on ultra-endurance running. Before diving into this topic, I want to be clear that exercise of all types confers incredible health benefits. When I discuss my interest in the potential negative long-term implications of endurance exercise, this is limited in scope to chronic, high volumes of endurance exercise, which have gained attention for their relationship to small, elevated health risks. In particular, high levels of endurance exercise and sport participation have, in some research, been associated with a very small increased risk of cardiovascular health concerns.

Naturally, as both an endurance athlete and coach, I figured I should be more well-versed in this emerging area of research so that I am aware of any risks that might be present among endurance athletes participating in high loads of training and/or racing for long periods of time. This is, importantly, an emerging area of research, which means that most of the evidence we have fuels more questions than answers. Nonetheless, I have found it interesting to look at, and hopefully you will find it interesting to learn more about as well.

With all of this said, let’s look at a recent paper that focused on potential long-term health problems associated with ultra-endurance running.

Are there risks associated with ultra-endurance running?

In 2022, Scheer and colleagues (1) published a narrative review in which they aimed to discuss potential long-term health risks associated specifically with ultra-endurance running (i.e., running longer than 26.2 miles or ~6 hours). The authors looked at relevant research literature, some of it in ultra-endurance runners and some in other endurance athlete and non-athlete populations, and categorized possible long-term health risks by key organ system, including the cardiovascular, respiratory, musculoskeletal, renal, immunological, gastrointestinal, neurological, and integumentary (i.e., skin/dermatological) systems.

I will focus primarily on the highest-risk categories, where there is some evidence indicating a concern worth considering; Table 1 of the paper provides a summary of key findings across each organ system.

Of all the possible concerns discussed, many could reasonably be mitigated or outright eliminated with proper training load management and adequate rest and recovery habits. For example, overtraining combined with improper recovery is known to have deleterious effects on the immune system. This is true for any athletic population, not just ultra-endurance runners. However, avoiding overtraining and ensuring proper rest and recovery practices around training and racing can practically eliminate the risk of possible immunological concerns across all athletic populations. I want to focus the rest of my discussion on specific concerns that might be unique to ultra-endurance trail runners or risks that might be present despite proper training load management and recovery practices.

The number one category that stood out to me was the possibility of elevated cardiovascular health impacts from ultra-endurance running.
This is where most of this emerging area of research focuses across chronic endurance participation more broadly, and there is some mounting evidence showing there may be an increased risk of cardiovascular health complications with high levels of endurance sports participation, a category into which ultra-endurance runners certainly fall. These risks include potential damage and scarring to the heart walls and chambers, inflammation of the heart, higher coronary artery calcification scores, and more. However, the risk of experiencing a deadly or serious cardiovascular incident in endurance athletes is still very small. There may also be some genetic predispositions that make some athletes more likely to experience a cardiovascular health problem with chronic endurance exercise. More research is needed, but this is my biggest interest area personally, as there does seem to be consistent evidence demonstrating a small risk among those who engage in high levels of endurance exercise for many years.

The next category that stood out to me, and that is possibly unique to ultra-endurance running, is that of long-term respiratory concerns. Recurring exercise-induced bronchoconstriction from exercising for long durations or at high intensities in cold, dry air or in poor air quality may have some adverse effects on overall respiratory health. More research is needed on this topic, as it is mostly speculative at this point. Ultra-distance runners, however, are more likely than many endurance athletes to be exercising in drier, colder conditions, as well as in poor air quality, depending on the geographical location in which they train and race. Therefore, it makes some sense to consider the possible respiratory effects of repeated exposure to training and racing conditions that place stress and strain on this organ system. For example, the authors of the paper outlined above noted that exercise-induced bronchoconstriction (EIB; i.e., narrowing of the airways of the lungs) can take place after prolonged endurance exercise in cold, dry, or poor air quality environments, and the repeated occurrence of EIB may have some long-term consequences for respiratory health that need further exploration.

What to do about the risks that might be present?

In sum, there might be some small risks for endurance athletes more globally when it comes to cardiovascular outcomes, and possibly respiratory outcomes depending on the environment one chooses to train and race in. This risk, however, is small, and there is still much more research needed before any firm conclusions can be made.

Personally, I consider the cardiovascular-related and pulmonary-related risks to be something to take seriously. The best way to mitigate these risks, in my opinion and with the limited research we have available, could come down to simply being smart in the way that you engage with training and racing over time. Of course, there is the possibility of genetic components that cannot be altered and that might present an elevated risk of certain long-term health implications. The remaining risk might be best managed through smart approaches to training and racing, but there really isn’t any firm research here to back up this assertion.
Managing training load and intensity well over time and being strategic about exercising in certain environmental conditions would be my top recommendations if I were to provide any. By managing training load intelligently and putting a constant emphasis on proper recovery, athletes can minimize the deleterious impact that chronic overloading and/or under-recovering may have on cardiovascular health. Furthermore, by avoiding exercise in poor air quality conditions (e.g., in areas subjected to wildfire smoke, or in more populated areas when air quality is poor) and by not overdoing exercise in cold, dry conditions on a regular basis, one might be able to avoid some deleterious consequences for long-term pulmonary health as well.

Conclusions

Endurance exercise, even at high levels, confers incredible health benefits. Even elite-level endurance athletes have much better long-term health outcomes and a much lower risk of all-cause mortality than the general population. So, don’t take any of this discussion as a recommendation to stop endurance training and racing. Rather, take the information I have laid out as something to be cognizant of and to keep up to date with as newer research emerges. Be aware of any potential risks and seek to mitigate them as more answers, and hopefully solutions, become available.

References:
1. Scheer V, Tiller NB, Doutreleau S, Khodaee M, Knechtle B, Pasternak A, Rojas-Valverde D. Potential long-term health problems associated with ultra-endurance running: a narrative review. Sports Medicine. 2022 Apr;52(4):725-40.

Happy training and racing!

-Ryan Eckert, MS, CSCS

Do you enjoy our monthly educational content that we create? Not only do we create written content like what you just read, but we have a podcast too, where the goal is also to share science-driven, evidence-based information highly relevant to endurance athletes and coaches. We do all of this for free, and we rely on the generous help and support of others to cover some of our basic operating costs for putting out this content. If you would like to help or support, the best way to do so is by becoming a Patreon supporter.
Potential Long-Term Health Problems Associated with Ultra-Endurance Running
Ryan Eckert, MS, CSCS
May 31, 2023
In The VO2 Max Forum
Background

This month’s post will be a bit different from the norm. I will of course be sharing some thoughts related to endurance sport science, but I will be using a recently published study as an opportunity to share some learning points about properly interpreting and contextualizing single research studies, as this is an incredibly important skill for anyone using research to inform the larger context of coaching athletes, including self-coaching.

I wanted to discuss the topic of long-term endurance exercise and its impact on cardiovascular health, or cardiovascular health risk to be more precise. I have come across more and more articles, podcasts, and social media pages delving into the hotly debated topic of chronic over-exercising and the risk it poses to cardiovascular health. “Over-exercising” in this context doesn’t even mean extremely excessive, over-the-top, out-of-control compulsive exercise. It can simply refer to exercise that is above and beyond what is recommended for health and longevity and that is done chronically (i.e., for decades to a lifetime).

For reference, current evidence-based physical activity guidelines recommend ~150-300 min/week of moderate-intensity cardiovascular activity or ~75-150 min/week of vigorous-intensity cardiovascular activity, in addition to two days/week of full-body strength training, for maximum health benefits (5). The return on investment, so to speak, for one’s health starts to decline dramatically beyond this level of exercise. In other words, exercising longer and/or more vigorously does not necessarily confer any significant benefit to one’s health above what is garnered at the recommended levels.

Now, this does not mean that exercising more than the ~2.5-5 hours/week recommended for health is harmful. However, there has been some interest among researchers and sports practitioners as to whether extremely high levels of endurance exercise sustained for decades can lead to a small increase in cardiovascular health risk. There have been more and more cases of elite-level athletes experiencing sudden cardiac and coronary events, or at least more that have been documented, and this has sparked research that aims to explore this area to a greater degree.

The research to date, as I will discuss a small sliver of next, is mixed and inconclusive. But this has not kept some on one extreme end of the fence or the other from duking it out in online spaces to convince the masses of their opinion. Some are hardcore believers that there is a substantial risk to one’s health from engaging in high levels of endurance exercise over the course of a lifetime (this is somewhat subjective and arbitrary at this point, as there is no real hard cut-off or definition of what “high levels” of exercise are), while others believe that there is zero risk whatsoever.

Much like everything else in life, the answer probably lies somewhere in the middle. From what I have surmised in reading just some of this research, there does seem to be the possibility of some very small risk of highly specific cardiovascular incidents occurring among endurance athletes who engage in large volumes of endurance-specific exercise for decades upon decades of their life. However, this minuscule risk is likely far outweighed by the massive protective benefit that one receives from exercise.
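As a rough illustration of the aerobic portion of those guideline numbers, here is a minimal sketch; the function names and the example week are hypothetical, and the two-for-one vigorous-to-moderate equivalence is the convention commonly used alongside these guidelines:

```python
# Minimal sketch (not medical advice): check a week's activity against the
# ~150 min moderate / ~75 min vigorous aerobic floor and the two strength days
# cited in the post. Vigorous minutes are counted double, per the common
# moderate-equivalent convention.

def meets_aerobic_minimum(moderate_min: float, vigorous_min: float) -> bool:
    """True if the week reaches ~150 moderate-equivalent minutes."""
    moderate_equivalent = moderate_min + 2 * vigorous_min
    return moderate_equivalent >= 150

def meets_strength_minimum(strength_days: int) -> bool:
    """True if at least two full-body strength days were completed."""
    return strength_days >= 2

# Hypothetical week: 90 min easy cycling + 40 min hard running + 2 lifts.
week = {"moderate_min": 90, "vigorous_min": 40, "strength_days": 2}
print(meets_aerobic_minimum(week["moderate_min"], week["vigorous_min"]))  # True (90 + 80 = 170)
print(meets_strength_minimum(week["strength_days"]))                      # True
```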
Nonetheless, I find this area of research compelling, as it is so new and something I want to be better informed on as an athlete myself and a coach for other athletes. If there are risks present, I want to know how to mitigate them. So, I thought it would be appropriate to dive into a single research study that piqued my interest. I’ll then follow up the discussion of this study with some learning opportunities related to contextualizing and interpreting research.

Lifelong endurance exercise and coronary atherosclerosis risk

De Bosscher and colleagues (1) published a study in 2023 that aimed to investigate the relationship between lifelong endurance exercise and coronary atherosclerosis risk (i.e., plaque development). For the purposes of this study, they enrolled 191 lifelong masters endurance athletes, 191 late-onset endurance athletes (endurance sports initiation after 30 years of age), and 176 healthy non-athletes. Lifelong endurance athletes had the highest coronary plaque scores across multiple different measurements of plaque (including calcified and non-calcified plaques).

However, the larger body of research literature still demonstrates that lifelong endurance athletes have an overall lower risk of plaque-related cardiovascular events (typically referred to as ischemic events) when compared against non-athletes (2-4,6). In some studies, the lowest risk of these types of events was in athletes with the highest coronary plaque scores (3,6). In other words, a higher presence of coronary plaque in lifelong endurance athletes does not seem to be related to a greater likelihood of ischemic events; in fact, quite the opposite. One would expect to see research demonstrating that lifelong endurance athletes suffer a greater likelihood of ischemic events as plaque scores increase, but the research does not show this.

Contextualizing research and interpreting single research studies

I found the study discussed herein rather interesting, not only because of the findings, but because one’s contextualization of this study is so dependent on having a broader understanding of the larger body of research surrounding the topic. If all I did was share this single research study (which has many limitations that influence how its results should be interpreted) with another coach or fellow athlete, they might draw the seemingly logical conclusion that lifelong endurance exercise poses a risk to cardiovascular health. Yet, when one takes the next step and asks whether these elevated plaque scores are really related to a greater likelihood of cardiovascular events, the larger body of research shows this not to be true.

This is such a common problem in today’s media, where attention-grabbing and click-bait headlines are based on single studies that are not put into the appropriate context. This happens over and over on multiple different platforms, people start to see it, and misinformation or poorly contextualized information starts to spread like wildfire.

It’s easy to have a biased message to spread, go to Google Scholar, type in the keywords you want, and find a study that could be spun in a way to support a biased opinion. And so, alas, people do it all the time.
This is why I always, always, always recommend getting information from trusted resources or individuals who are transparent in how they formulate their recommendations and advice (i.e., do a bit of homework to see what someone’s background, education, and credentials are, as well as how they share information).

This is not to say that individuals who are well-versed in reading and interpreting research, or who are researchers themselves, are know-it-alls or have the answer to every question. Those who seriously understand research and how it works are often the ones who will never say phrases like “science has proven”, “new proof that X works”, “do X for guaranteed results”, or “stop doing X and do Y”, or make other aggressive and inflammatory claims. These types of claims and phrases highlight that someone has a serious misunderstanding of how science works and how to interpret it, and/or that they are just pushing an opinion rather than an evidence-based recommendation.

Research is a constantly evolving series of objective investigations that aims to better understand how things work. A research study does not necessarily seek to “prove” anything by its nature. A researcher simply aims to ask a question, formulate an objective method of isolating that which they want to better understand, and then conduct the study to draw a conclusion that helps us understand the answer to that question a bit better. Most often, science has proven very, very little. Science is a tool that can be used to vastly improve our understanding of best practices and best approaches. And when attempting to better understand big questions or problems, single research studies do not tell the whole story; they just contribute to it, much like a single puzzle piece does not depict the whole picture.

Conclusions

Hopefully you found this somewhat out-of-the-norm post useful in a variety of ways. At the very least, I hope you learned a bit more about the relationship between lifelong endurance exercise and cardiovascular health, as well as something about interpreting and contextualizing research. If you’re an athlete, keep training for your sport of choice without fear that you will be worse off in the future because of it. And if you are a coach or practitioner, keep trying to better understand and learn, study by study, bit by bit, and always be sure to put things into a greater context when formulating your advice and recommendations.

References:
1. De Bosscher R, Dausin C, Claus P, Bogaert J, Dymarkowski S, Goetschalckx K, Ghekiere O, Van De Heyning CM, Van Herck P, Paelinck B, El Addouli H. Lifelong endurance exercise and its relation with coronary atherosclerosis. European Heart Journal. 2023.
2. DeFina LF, Radford NB, Barlow CE, Willis BL, Leonard D, Haskell WL, Farrell SW, Pavlovic A, Abel K, Berry JD, Khera A. Association of all-cause and cardiovascular mortality with high levels of physical activity and concurrent coronary artery calcification. JAMA Cardiology. 2019 Feb 1;4(2):174-81.
3. Gao JW, Hao QY, Lu LY, Han JJ, Huang FF, Vuitton DA, Wang JF, Zhang SL, Liu PM. Associations of long-term physical activity trajectories with coronary artery calcium progression and cardiovascular disease events: results from the CARDIA study. British Journal of Sports Medicine. 2022 Aug 1;56(15):854-61.
4. German CA, Fanning J, Singleton MJ, Shapiro MD, Brubaker PH, Bertoni AG, Yeboah J. Physical Activity, Coronary Artery Calcium, and Cardiovascular Outcomes in the Multi-Ethnic Study of Atherosclerosis (MESA). Medicine and Science in Sports and Exercise. 2021 Dec 29.
5. Piercy KL, Troiano RP, Ballard RM, Carlson SA, Fulton JE, Galuska DA, George SM, Olson RD. The physical activity guidelines for Americans. JAMA. 2018 Nov 20;320(19):2020-8.
6. Radford NB, DeFina LF, Leonard D, Barlow CE, Willis BL, Gibbons LW, Gilchrist SC, Khera A, Levine BD. Cardiorespiratory fitness, coronary artery calcium, and cardiovascular disease events in a cohort of generally healthy middle-age men: results from the Cooper Center Longitudinal Study. Circulation. 2018 May 1;137(18):1888-95.

Happy training and racing!

-Ryan Eckert, MS, CSCS

Do you enjoy our monthly educational content that we create? Not only do we create written content like what you just read, but we have a podcast too, where the goal is also to share science-driven, evidence-based information highly relevant to endurance athletes and coaches. We do all of this for free, and we rely on the generous help and support of others to cover some of our basic operating costs for putting out this content. If you would like to help or support, the best way to do so is by becoming a Patreon supporter.
Chronic Endurance Exercise and Cardiovascular Health: Is More Always Better?
Ryan Eckert, MS, CSCS
May 01, 2023
In The VO2 Max Forum
Background

To answer the question posed in the title of this post: it depends. Truthfully, the answer to most questions posed in the world of sports science is “it depends”. In brief, it truly doesn’t matter what time of day you choose to train, or when your athletes train if you are a coach, as whatever time works best for you to fit in your training as easily and efficiently as possible will always be best. However, if you are an athlete who has some flexibility in the time of day you fit in your training session(s) and you want to get the most out of your sessions, there are some important differences in your physiology that distinguish morning from evening times of day. Let’s discuss these differences next.

Time of Day: Morning versus Evening

Firstly, let’s define “morning” and “evening”. Most studies that investigate morning versus evening exercise use a time of day that falls within 7:00-10:00 am, soon after waking, as “morning” and a time of day that falls between 4:00-8:00 pm as “evening”. Some studies will use an “afternoon” time that is a little earlier than 4:00 pm. For the purposes of this discussion, let’s define “morning” as any time between 7:00-10:00 am and soon after waking, and afternoon/evening as any time after 12:00 pm and further away from waking.

It has long been documented in the strength and conditioning research literature that muscular strength and power performance, as well as short-term anaerobic performance, show a time-of-day effect, with better performance in the late afternoon or evening than in the morning soon after waking (4-18). One of the primary contributors to this change in performance capacity across a 24-hour day is thought to be body temperature. Early in the morning, soon after waking, body temperature is near its lowest; it reaches its natural peak in the afternoon (1,3). Body temperature matters quite a bit for athletic performance for a variety of reasons, most of which relate to improved nervous system and muscular function when body temperature is higher. If you have ever gone on a run first thing in the morning on a cold winter day and compared it to a run on a mild spring afternoon, you might be acutely aware of what I am talking about here. More recently, researchers have been looking at whether these time-of-day effects on performance are relevant to endurance athletes.

Effects of Time-of-Day on Endurance Performance

Kang and colleagues (2) just recently published a 2023 systematic review and meta-analysis (one of the highest forms of scientific evidence) in which they aimed to answer the question of whether there is a time-of-day effect on cardiovascular exercise responses and endurance performance. To answer their research question, they identified 31 published studies that, collectively, included 393 subjects (ages 18-38 years; 73% men; mostly healthy adults and some athletes with prior endurance experience). They then performed a meta-analysis to identify any time-of-day effects across various physiologic and performance outcomes (compared between morning [AM] and afternoon/evening [PM]).
The meta-analysis ultimately revealed the following key takeaways:

• Higher resting oxygen uptake (VO2) and heart rate (HR) in the PM compared to the AM
• Higher HR at both submaximal and maximal endurance intensities in the PM compared to the AM
• Greater endurance performance (as measured by time-to-exhaustion or total workload performed) in the PM compared to the AM
• No significant metabolic differences (i.e., VO2, substrate utilization) during submaximal or maximal endurance exercise in the PM compared to the AM

These findings are important for a variety of reasons. First, the differences in HR and overall performance are of note for researchers, coaches, and clinicians who perform exercise testing with research subjects or athletes. Understanding that testing results might differ between AM and PM performances is valuable when deciding how and when to incorporate standardized testing in research or real-world settings. Second, this is important for athletes to understand, as it might influence the timing of certain key training sessions. Let’s discuss this second point in more detail.

Choosing the “Best” Time to Train

From an athlete’s perspective, and even a coach’s perspective, knowing that endurance performance is potentially greater in the PM could influence the timing of key training sessions that include higher-intensity efforts. For example, if an athlete has the flexibility to choose between AM or PM sessions, performing intense training sessions later in the day could be advantageous, as the total workload or output during the intense efforts might be greater, thereby yielding a greater adaptive stimulus. This might not seem important when considering a single stand-alone training session, but stack together many weeks and months of PM training sessions in which the adaptive stimulus is slightly greater, and this could translate into greater long-term fitness improvements and race performances. At this point, that last statement is hypothetical, as this sort of long-term research has not been done yet.

Despite the above, this is in no way an indication that performing an intense or key training session in the AM is somehow disadvantageous. It is simply a matter of suggesting that, if one has the flexibility to do so and it works well for them, choosing to perform high-quality, intense training sessions at times of the day when performance might be marginally better could provide a greater return on investment. As I mentioned at the beginning of this write-up, performing training sessions when it is most convenient and efficient for an athlete to do so will always be the optimal choice, regardless of the time of day. Furthermore, research provides broad insights into what tends to work best for the average majority. There are always outliers, and some athletes might perform better in the AM than in the PM based on their individual psychology and physiology.

Conclusions

The evidence discussed herein suggests that there might be improved endurance performance in the PM compared to the AM. This can be very useful for athletes and coaches looking to optimize the timing of key training sessions. Flexibility in choosing the time of day one trains is, in my opinion, a prerequisite to taking advantage of this potential performance benefit. Training that gets completed as smoothly and efficiently as possible will always be best, regardless of the time of day it was performed.
If you are fortunate enough to have flexibility in deciding when to train and you enjoy training in the PM, then scheduling high-quality, intense training sessions later in the day might be a smart choice!

References:
Atkinson G, Reilly T. Circadian variation in sports performance. Sports Medicine. 1996;21:292-312.
Kang J, Ratamess NA, Faigenbaum AD, Bush JA, Finnerty C, DiFiore M, Garcia A, Beller N. Time-of-Day Effects of Exercise on Cardiorespiratory Responses and Endurance Performance—A Systematic Review and Meta-Analysis. The Journal of Strength & Conditioning Research. 2022 May 9.
Waterhouse J, Drust B, Weinert D, et al. The circadian rhythm of core temperature: Origin and some implications for exercise performance. Chronobiology International. 2005;22:207-25.
Guette M, Gondin J, Martin A, Pérot C, Van Hoecke J. Plantar flexion torque as a function of time of day. International Journal of Sports Medicine. 2005 Jun 2:171-7.
Sedliak M, Finni T, Cheng S, Haikarainen T, Häkkinen K. Diurnal variation in maximal and submaximal strength, power and neural activation of leg extensors in men: multiple sampling across two consecutive days. International Journal of Sports Medicine. 2008 Mar;29(03):217-24.
Zbidi S, Zinoubi B, Vandewalle H, Driss T. Diurnal rhythm of muscular strength depends on temporal specificity of self-resistance training. The Journal of Strength & Conditioning Research. 2016 Mar 1;30(3):717-24.
Chtourou H, Zarrouk N, Chaouachi A, Dogui M, Behm DG, Chamari K, Hug F, Souissi N. Diurnal variation in Wingate-test performance and associated electromyographic parameters. Chronobiology International. 2011 Oct 1;28(8):706-13.
Souissi N, Bessot N, Chamari K, Gauthier A, Sesboüé B, Davenne D. Effect of time of day on aerobic contribution to the 30-s Wingate test performance. Chronobiology International. 2007 Jan 1;24(4):739-48.
Souissi N, Gauthier A, Sesboüé B, Larue J, Davenne D. Circadian rhythms in two types of anaerobic cycle leg exercise: force-velocity and 30-s Wingate tests. International Journal of Sports Medicine. 2004 Jan;25(01):14-9.
Giacomoni M, Billaut F, Falgairette G. Effects of the time of day on repeated all-out cycle performance and short-term recovery patterns. International Journal of Sports Medicine. 2006 Jun;27(06):468-74.
Hammouda O, Chtourou H, Chahed H, Ferchichi S, Kallel C, Miled A, Chamari K, Souissi N. Diurnal variations of plasma homocysteine, total antioxidant status, and biological markers of muscle injury during repeated sprint: effect on performance and muscle fatigue—a pilot study. Chronobiology International. 2011 Dec 1;28(10):958-67.
Racinais S. Different effects of heat exposure upon exercise performance in the morning and afternoon. Scandinavian Journal of Medicine & Science in Sports. 2010 Oct;20:80-9.
Racinais S, Hue O, Blonc S. Time-of-day effects on anaerobic muscular power in a moderately warm environment. Chronobiology International. 2004 Jan 1;21(3):485-95.
Zarrouk N, Chtourou H, Rebai H, Hammouda O, Souissi N, Dogui M, Hug F. Time of day effects on repeated sprint ability. International Journal of Sports Medicine. 2012 Dec;33(12):975-80.
Chtourou H, Chaouachi A, Driss T, Dogui M, Behm DG, Chamari K, Souissi N. The effect of training at the same time of day and tapering period on the diurnal variation of short exercise performances. The Journal of Strength & Conditioning Research. 2012 Mar 1;26(3):697-708.
Chtourou H, Driss T, Souissi S, Gam A, Chaouachi A, Souissi N. The effect of strength training at the same time of the day on the diurnal fluctuations of muscular anaerobic performances. The Journal of Strength & Conditioning Research. 2012 Jan 1;26(1):217-25.
Gauthier A, Davenne D, Martin A, Cometti G, Hoecke JV. Diurnal rhythm of the muscular performance of elbow flexors during isometric contractions. Chronobiology International. 1996 Jan 1;13(2):135-46.
Martin A, Carpentier A, Guissard N, Van Hoecke J, Duchateau J. Effect of time of day on force variation in a human muscle. Muscle & Nerve. 1999 Oct;22(10):1380-7.
Sedliak M, Finni T, Peltonen J, Häkkinen K. Effect of time-of-day-specific strength training on maximum strength and EMG activity of the leg extensors in men. Journal of Sports Sciences. 2008 Aug 1;26(10):1005-14.

Happy training and racing!

-Ryan Eckert, MS, CSCS

Do you enjoy our monthly educational content that we create? Not only do we create written content like what you just read, but we have a podcast too, where the goal is also to share science-driven, evidence-based information highly relevant to endurance athletes and coaches. We do all of this for free, and we rely on the generous help and support of others to cover some of our basic operating costs for putting out this content. If you would like to help or support, the best way to do so is by becoming a Patreon supporter.
Does It Matter What Time of Day You Train?
Ryan Eckert, MS, CSCS
Feb 27, 2023
In The VO2 Max Forum
Background

Barefoot and minimalist footwear running has garnered a lot of controversy among athletes, coaches, and researchers since the turn of the century. Those in favor of more minimalist running footwear claim that they benefit from a lower likelihood of injury and improved performance. However, those in favor of more traditional or maximalist footwear make similar claims. The research literature exploring this topic has at times added to the confusion, as some research favors minimalist footwear for certain outcomes while other research favors more traditional or maximalist footwear. The purpose of this post is to help you, as the reader, make some sense out of the confusion on this topic and better understand the pros and cons of running in different types of footwear. Barefoot running is a separate topic of its own; so, for the purposes of this post, the conversation will be limited to the three most common categories of footwear on today’s market: 1) minimalist footwear, 2) traditional footwear, and 3) maximalist footwear.

Minimalist vs. Traditional vs. Maximalist Footwear

Before we start discussing the benefits and drawbacks of different types of footwear, we first need to understand the differences between them. In today’s running footwear market, there are generally three categories of shoes to choose from. A table from a review paper by Andreyo and colleagues (1), which I will be referring to throughout this post, describes the key differences between minimalist, traditional, and maximalist footwear.

Running Mechanics by Footwear Type

A very common misconception among runners and running coaches is that certain types of footwear, or a lack of footwear (i.e., barefoot running), are “better” or “worse” than others. Running mechanics change depending on the type of footwear an athlete chooses. This is not inherently “good” or “bad”, as some mistakenly believe. It is for this reason that running footwear selection is often very personal to each individual athlete; what works best for one athlete might be entirely different from another. Footwear selection can be influenced by a variety of factors, including:

· Personal preference for what “feels good”
· Injury history
· Running style
· Running surface and environment
· Age
· Weight
· Training history
· And more…

I think athletes and coaches, and sometimes researchers, get too caught up in trying to label certain types of footwear as “bad” in comparison to others. The real challenge is to understand how running mechanics change with footwear selection and how these changes might be used to determine which footwear choice is best for an individual athlete. For example, it is well-documented that when running in minimalist footwear, athletes tend to land with a forefoot or midfoot strike (2). When athletes run in traditional or maximalist footwear, they tend to exhibit more of a rearfoot strike pattern (2). Additionally, footwear choice has an impact on the loading forces experienced during running. When running in minimalist footwear, there tends to be an increased demand placed on the metatarsals (i.e., the long bones of the forefoot), the ankle joint itself, and the ankle plantarflexors (i.e., the calf musculature) (3). When running in traditional or maximalist footwear, a greater demand tends to be placed on the hip joint, knee joint, and shin bone. Again, these changes in load demands and mechanics are not inherently good or bad.
They just are what they are. However, given this information, an athlete can use it to help in their selection of footwear. For example, knowing that minimalist footwear tends to promote a forefoot or midfoot striking pattern and an increased loading demand on the Achilles tendon and plantarflexor musculature, an athlete with a recurring history of Achilles tendon injury might want to avoid choosing minimalist footwear. This is just one example of how an understanding of mechanics and footwear can be used to an athlete’s advantage.

Is Minimalist Footwear More Efficient?

Some athletes and coaches claim that running in minimalist footwear is more efficient or economical than running in more cushioned footwear. There is some research to back up these claims, as studies have shown a lower utilization of oxygen when running in minimalist footwear compared to running in traditional footwear (4). There are multiple proposed explanations for this, including the reduced weight of minimalist shoes compared to traditional shoes and a greater reliance on the elastic qualities of the plantarflexor musculature when running in minimalist footwear, to name a few. However, this documented improvement in running economy has yet to be shown to improve real-world running performance (4). This is important because, while improvements in running economy measured in the lab are interesting, these changes don’t really matter to the athlete if there is no effect on real-world running performance out on the road or the trail.

So, What Type of Footwear is Best?

The short answer to this question is: it depends. There is no such thing as the “best” type of running shoe. The best shoe for you as an individual comes down to a variety of factors, as outlined in a prior section. A minimalist footwear choice may work best for some athletes, while a maximalist footwear choice may work best for others. The key to finding which footwear choice is best for you is simply getting out there and trying different shoes. You can certainly look at factors such as injury history, weight, age, and training history and make an educated guess as to which type of shoe will be best. But at the end of the day, there is no perfect way of determining what type of shoe is best for you without simply lacing up a pair and trying it out. From personal experience, you will usually know when you have found the footwear choice, and sometimes even the shoe brand, that just feels best.

It is also worth mentioning that the human body is incredibly good at adapting to different footwear choices. For example, if you have habitually run in maximalist footwear for years, changing to any other type of footwear may initially be a shock to the system. However, if you transition from one type of footwear to another slowly and gradually over time, the body can become used to it. I make this point so that you understand that what feels best to run in now can change in the future if you desire, or if you are forced, to make a change. The key to changing is just doing so in a smart manner.

Conclusions

Footwear selection is an incredibly personal thing for a runner. There is no such thing as one type of shoe or footwear that is “best” or “worst”. Different athletes will find different types of shoes, with different levels of cushioning, more comfortable. It is important to understand the key mechanical and physiological changes that occur when running in different types of footwear, but these changes are not inherently good or bad.
These differences, however, can be used to an athlete’s advantage when deciding which category of footwear might be best for them, whether minimalist, traditional, or maximalist. Keep this in mind the next time you are deciding which shoe is best for yourself or, if you are a coach, for the athletes you work with!

References:
1. Andreyo E, Unverzagt C, Schoenfeld BJ. Influence of Minimalist Footwear on Running Performance and Injury. Strength and Conditioning Journal. 2022 Jun 26;44(3):107-16.
2. Lieberman DE, Venkadesan M, Werbel WA, Daoud AI, D’andrea S, Davis IS, Mang’Eni RO, Pitsiladis Y. Foot strike patterns and collision forces in habitually barefoot versus shod runners. Nature. 2010 Jan 28;463(7280):531-5.
3. Rooney BD, Derrick TR. Joint contact loading in forefoot and rearfoot strike patterns during running. Journal of Biomechanics. 2013 Sep 3;46(13):2201-6.
4. Cheung RT, Ngai SP. Effects of footwear on running economy in distance runners: A meta-analytical review. Journal of Science and Medicine in Sport. 2016 Mar 1;19(3):260-6.

Happy training and racing!

-Ryan Eckert, MS, CSCS

Do you enjoy our monthly educational content that we create? Not only do we create written content like what you just read, but we have a podcast too, where the goal is also to share science-driven, evidence-based information highly relevant to endurance athletes and coaches. We do all of this for free, and we rely on the generous help and support of others to cover some of our basic operating costs for putting out this content. If you would like to help or support, the best way to do so is by becoming a Patreon supporter.
Are There Benefits to Running in Minimalist Footwear?
Ryan Eckert, MS, CSCS
Jan 31, 2023
In The VO2 Max Forum
Background Over the years, the scientific community has come to recognize various factors that are strongly related to endurance performance. These factors are primarily physiological characteristics and include VO2max, lactate threshold, efficiency/movement economy, and most recently, durability (sometimes referred to as aerobic endurance) (1,2). The determination that these factors relate positively to endurance performance has come about from many decades of lab-based experimental research and field-based observational research. However, any sensible exercise scientist or coach working with athletes will know that the overall endurance performance of an athlete is a product of much more than just these four physiological qualities described above. Physiological development of an endurance athlete is, of course, critical to success. Yet, there are other factors that potentially influence performance as well. Nutritional factors, psychological factors, and health-related factors are just some of the other variables influencing performance. It is difficult to establish some of these as determinants of endurance performance as they have not been focused on in the research literature to quite the same extent as the four physiological variables described above have been. However, a recently published paper dove into this exact topic and aimed to identify the top factors associated with endurance performance via an expert panel. Let's discuss this paper next!

An Expert Panel to Identify Top Factors Associated with Endurance Performance A study published in 2022 by Konopka and colleagues in Europe (3) took a list of 120 potential factors that could influence endurance performance and narrowed it down to 26 factors that 18 international experts agreed are associated with high-level endurance performance. These 26 factors are grouped into 5 different cluster groups and are summarized in the image below, which is Table 2 taken directly from this manuscript. The level of agreement across the expert panel had to be at least 70% for an item to make it onto the final list of factors. As you can see, the four physiological factors that are known to be associated with endurance performance made the list (i.e., VO2max, lactate threshold, efficiency/movement economy [economy of movement], and durability [endurance capacity]). Importantly, however, there are 22 other factors that the 18 international experts agreed were associated with endurance performance. The expert panel organized their factors into the following five categories: physiology, nutrition, injuries, psychology, and fatigue. While all the factors listed in the table above are important, in my opinion, let's consider the ones that had nearly or exactly 100% agreement across the experts in the panel.

What is the Takeaway From All This? There is quite a lot that can be taken away from this recently published research. Firstly, I would agree that just about every factor this expert panel decided on is important to endurance performance. Honestly, there are likely dozens of other factors that are important as well. However, I do understand that this study was trying to get an expert panel consensus on some of the most important factors related to high-level endurance performance. Of the factors they agreed upon, I highlighted a few in the prior section that stood out to me the most, as they were modifiable outcomes that athletes can influence through their behaviors.
For example, as mentioned early on, various physiological variables have been established as being critically determinant of endurance performance (i.e., VO2max, lactate threshold, efficiency/movement economy [economy of movement], and durability [endurance capacity]), and these factors can all be improved through training. However, the research published herein describes other factors that research should consider studying deeper to determine their power of predicting endurance performance. Things such as sleep, motivation, red blood cell volume, carbohydrate metabolism, and glycolysis capacity are all factors that can be trained and improved upon with the right behavioral approaches. Having more research to suggest which of these factors are most predictive of endurance performance alongside well-established physiological variables would appropriately broaden the list of factors that should be considered important in determining endurance performance capacity. To illustrate the importance of this, in a scenario in which two marathon runners with the exact same physiological profile are competing against each other, which one will end up winning? The critical difference might be in their sleep habits or patterns or their psychological make-up. Knowing and understanding this will help researchers and professionals better define the group of factors that contribute to better endurance performance. Secondly, as a coach or athlete reading this research, it gives a glimpse into what experts in the endurance sport space think you should be focusing your time and energy on developing. Outside of the basic physiological qualities that we all train for through our physical training, which factors should you also emphasize as a coach working with athletes? Which factors should you focus on bettering if you are an athlete coaching yourself? Maybe dietary intake of iron and red blood cell volume is a consideration for vegan athletes or athletes that don’t consume diverse sources of iron? Maybe improved sleep is a huge area of improvement for an athlete that trains extremely well but that only sleeps 6 hours per night? Or maybe an athlete has an incredible VO2 max, lactate threshold, movement economy, and durability profile, but fails to get the best out of themselves on race day due to lack of motivation? The research discussed herein and the factors that have been proposed as most crucial in determining high-level endurance performance provides coaches and athletes with a larger list of modifiable factors to target through training, diet, and health behavior changes that will ultimately improve overall endurance performance. Conclusions Personally, I hope studies like this pique researcher’s attention so that more factors other than the big four physiological variables are investigated more rigorously to determine their association with endurance performance. If you are a coach or athlete reading this, I hope you can take away some ideas regarding modifiable factors (like sleep, nutrition, psychology) that can be improved to achieve greater endurance performance. Success as an endurance athlete, after all, is about much more than just training to be physically fitter! References: Joyner MJ, Coyle EF. Endurance exercise performance: the physiology of champions. The Journal of physiology. 2008 Jan 1;586(1):35-44. Maunder E, Seiler S, Mildenhall MJ, Kilding AE, Plews DJ. The importance of ‘durability ‘in the physiological profiling of endurance athletes. Sports Medicine. 2021 Aug;51:1619-28. 
Konopka MJ, Zeegers MP, Solberg PA, Delhaije L, Meeusen R, Ruigrok G, Rietjens G, Sperlich B. Factors associated with high-level endurance performance: An expert consensus derived via the Delphi technique. Plos one. 2022 Dec 27;17(12):e0279492. Thomas DT, Erdman KA, Burke LM. Nutrition and athletic performance. Med Sci Sports Exerc. 2016 Mar;48(3):543-68. Kuwabara AM, Tenforde AS, Finnoff JT, Fredericson M. Iron deficiency in athletes: A narrative review. PM&R. 2022 May;14(5):620-42. McCormick A, Meijen C, Marcora S. Psychological determinants of whole-body endurance performance. Sports medicine. 2015 Jul;45:997-1015. Vitale KC, Owens R, Hopkins SR, Malhotra A. Sleep hygiene for optimizing recovery in athletes: review and recommendations. International journal of sports medicine. 2019 Aug;40(08):535-43. Huang K, Ihm J. Sleep and injury risk. Current sports medicine reports. 2021 Jun 1;20(6):286-90. Happy training and racing! -Ryan Eckert, MS, CSCS Do you enjoy our monthly educational content that we create? Not only do we create written content like what you just read, but we have a podcast too where the goal is also to share science-driven, evidence-based information highly relevant to endurance athletes and coaches. We do all of this for free, and we rely on the generous help and support of others to cover some of our basic operating costs for putting out this content. If you would like to help or support, the best way to do so is by becoming a Patreon supporter.
Which Factors Are Associated with High-Level Endurance Performance
Ryan Eckert, MS, CSCS
Dec 30, 2022
In The VO2 Max Forum
Background on Training Variables: Frequency, Intensity, Time/Duration, and Type/Mode Endurance athletes have a few key training variables that can be manipulated to optimize training adaptation and maximize race-day performance. These variables include the following: frequency (how often you train), intensity (how hard each session is), time/duration (how long each session lasts), and type/mode (the kind of activity performed, such as running, cycling, or rowing). These variables are the key characteristics of training prescription and are the key drivers of training adaptation. The planned manipulation of frequency, intensity, and duration over the course of days (microcycles), weeks (mesocycles), and months/years (macrocycles) is defined as periodization. However, there are a variety of ways that these training variables can be manipulated. Typically, training intensity and training volume (frequency x duration) are manipulated throughout a macrocycle to achieve an important performance goal. For example, training volume might be high and intensity low when an endurance athlete is months away from their main competition. As the athlete gets closer to their competition, training volume might decrease or remain the same while intensity increases so that they are physiologically prepared to perform at their best on the day of their competition.

When it comes to total training volume, typical training volumes among world-class endurance athletes are fairly uniform, with some variation depending on the individual athlete. It is well-established that, in general, a greater training volume will yield greater endurance adaptations if the athlete can handle it and recover well from it. This is not to say that there is a specific number of hours or miles that all endurance athletes should strive for, but rather that endurance athletes should strive to generally increase the amount of training volume they can perform and tolerate, as this will likely lead to better overall performance. For example, an amateur runner in their first year of dedicated training might be able to handle running 30 miles/week. As this runner improves, the volume of training required to continually maximize training adaptations might be greater. At the same time, this runner's capacity to handle and recover from training stress will increase. During the third year of this amateur runner's dedicated training, they may be able to perform 50 miles/week of running. Now, this runner will surely reach a threshold of running volume at which their performance no longer benefits, and their risk of injury and illness increases. It would be wise for the athlete to not cross that threshold of training volume as the risk is not worth the reward. Doing more volume, hence, is not always better. But there is a clear relationship between training volume and endurance performance broadly speaking. It is for this reason that world-class endurance athletes all perform high training volumes (1,3).

When it comes to training intensity, however, there is some debate amongst coaches and researchers as to which type of training intensity distribution is best for endurance athletes. This is not to say that there is a debate as to whether doing more intense training sessions is beneficial or not, but rather the debate relates to how much time spent at various intensities yields the best results for endurance athletes. Before we can dive deeper into this debate, we first need to discuss what training intensity is and how it is measured.

A Background on Training Intensity Training intensity is, in simple terms, how hard or easy an effort is.
Training intensity can be monitored subjectively and objectively: subjectively with a tool like the rating of perceived exertion (RPE), and objectively with metrics such as heart rate (HR), pace, power, and blood lactate. There are upsides and downsides to each intensity monitoring tool, and there really is no such thing as one intensity monitoring tool that is "best". Each mode of monitoring intensity tells us something that another doesn't. For example, RPE is highly subjective and a higher RPE value can sometimes be assigned to objectively low exercise intensities when an athlete is under heavy fatigue. Combine RPE with HR and you start to see a more complete picture. That same athlete might indicate a high RPE during an objectively moderate exercise intensity with an HR of 120 bpm, indicating that they are fatigued. Add in other more objective metrics like pace, power, and/or lactate, and the picture becomes even more complete. Often, different metrics will be used under different circumstances as a way of prescribing specific training sessions and/or reviewing training-related data from a specific session. For an easy run, a coach might have an athlete rely solely on RPE. Yet, for a threshold run session wherein the athlete has access to a power meter, the coach might prescribe very precise power outputs for certain durations of time and have the athlete disregard RPE almost completely during the session. This is only the briefest of introductions to training intensity monitoring. There are entire books written on the topic of training intensity, and it can be quite a complex topic as one dives deeper and deeper into certain intensity metrics. For this discussion, we simply need to have a basic understanding of training intensity and to acknowledge that there are benefits to each method of training intensity monitoring.

Once we understand training intensity and how it is monitored, we can then begin to create different training "zones". A training zone is simply a range of values that correlates with a different exercise intensity. Training zones are most often set with highly objective metrics, such as pace, power, and/or blood lactate. Then, HR values and RPE values are assigned to correlate with the different physiological zones. Keep in mind, though, that training zones can be effectively set up and utilized based on just RPE and HR if that is all an athlete has access to. The most basic training zone model is one comprised of three different training zones as seen in the graph directly below. To keep this piece of the discussion very brief, this is a classic 3-Zone Training Intensity Model depicted with lactate and HR. Lactate is perhaps one of the most precise ways of determining different training intensity cutoffs in an athlete. The only cutoffs that really matter for endurance athletes are the first and second lactate turn-points (LT1 and LT2). LT1 indicates the cutoff for an athlete's low-intensity training zone, whereby anything below LT1 is considered a low-intensity effort. LT1 through to LT2 indicates an athlete's moderate-intensity zone. And LT2 is the cutoff for an athlete's high-intensity training zone, whereby anything above LT2 is considered a hard effort. LT1 corresponds to the exercise intensity at which lactate begins to rise noticeably in the blood above resting values. LT2 corresponds to the exercise intensity at which lactate begins to accumulate in the blood exponentially. LT2 is also known as one's "lactate threshold" or "functional threshold power/pace". These 3 training zones can be set up based on lactate, power, or pace.
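To make the 3-zone model above a bit more concrete, here is a minimal sketch of how an athlete's LT1 and LT2 could be used to classify any given effort into Zone 1, 2, or 3. This is my own illustration rather than anything taken from the cited papers, and the threshold and speed values in it are hypothetical placeholders; in practice, the cutoffs would come from lactate, power, or pace testing as described above.

```python
# Minimal sketch (illustration only): classifying an effort into the 3-zone model
# using an athlete's LT1 and LT2, here expressed as hypothetical running speeds.

def classify_3_zone(speed_kmh: float, lt1_kmh: float, lt2_kmh: float) -> int:
    """Return 1, 2, or 3 for a low-, moderate-, or high-intensity effort."""
    if speed_kmh < lt1_kmh:
        return 1  # Zone 1: below LT1 (easy/low intensity)
    elif speed_kmh < lt2_kmh:
        return 2  # Zone 2: between LT1 and LT2 (moderate intensity)
    else:
        return 3  # Zone 3: at or above LT2/threshold (high intensity)

# Hypothetical runner: LT1 at 12.0 km/h, LT2 at 14.5 km/h
lt1, lt2 = 12.0, 14.5
for speed in (10.5, 13.0, 15.5):
    print(f"{speed} km/h -> Zone {classify_3_zone(speed, lt1, lt2)}")
```

The same cutoffs could just as easily be expressed in watts or in blood lactate values; only the units change, not the zone logic.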
Lactate is the most precise way of monitoring intensity, but power and pace can be very precise as well if proper testing is done. What matters here is that we understand that training intensity zones, at their most basic level, are comprised of low-intensity (Zone 1), moderate-intensity (Zone 2), and high-intensity (Zone 3). For those used to a more common 5-zone model, Zones 1-2 fall into Zone 1, Zone 3 falls into Zone 2, and Zones 4-5 fall into Zone 3 when translating them over to the 3-zone model. Now, the debate amongst researchers and coaches relates to how much time endurance athletes should be spending at various exercise intensities to maximize performance. This is the research we will take a closer look at next!

What's Better: Polarized or Pyramidal? Two of the more common training intensity distribution (TID) approaches that endurance athletes follow are the polarized and pyramidal models. Polarized training consists of spending most of one's time in Zone 1, a very small amount of time in Zone 2, and a bit more time in Zone 3. Pyramidal training consists of spending most of one's time in Zone 1, a bit of time in Zone 2, and a very small amount of time in Zone 3. See the figure below for a comparison of typical intensity distribution breakdowns for each of these models. There is a consensus that most of an endurance athlete's time should be spent at low intensities in Zone 1 (75-80% of total training volume) as this has been documented to be associated with superior endurance performance (1,3). The debate comes into play regarding how much time should be spent in Zones 2 and 3. There is published research to suggest that both TID models are effective (2,4,5,6). So more recently, researchers have tried comparing the two different TID models head-to-head. For example, Filipas and colleagues (2) published a paper comparing four groups of well-trained runners completing 16 weeks of training, each group following a different TID plan. The groups were as follows: one group followed a pyramidal TID for all 16 weeks (PYR), one followed a polarized TID for all 16 weeks (POL), one followed a pyramidal TID for the first 8 weeks and then a polarized TID for the final 8 weeks (PYR > POL), and one followed a polarized TID for the first 8 weeks and then a pyramidal TID for the final 8 weeks (POL > PYR). The total training volume across all groups was equal so that training volume was not a confounding factor. Runners were tested at baseline, mid-point, and post-intervention for a range of outcomes, including VO2peak, running velocity at a blood lactate of 2 mmol/L, running velocity at a blood lactate of 4 mmol/L, and 5km time trial performance. All groups saw improvements across all outcomes mentioned above, but the group that performed PYR > POL saw the greatest improvements in the outcomes tested. So, does this mean that a periodization plan that follows a PYR > POL progression is the most effective design for all endurance athletes? Not necessarily. It is important to consider the athlete, their goal event, and their fitness when choosing a TID approach. The study mentioned previously had testing outcomes that likely favored the physiological adaptations that would occur from switching to a polarized TID for the last 8 weeks leading into the post-intervention tests. A 5km time trial is performed well above threshold, and a polarized TID has athletes training more often above threshold in Zone 3. Now, the velocity that runners could achieve at LT1 and LT2 improved, indicating improved running economy. We know that training at very high intensities can improve running economy at higher and lower intensities, which is exactly the kind of training a polarized approach includes. It is interesting that the 16-week polarized TID group did not show any superior outcomes when compared to the 16-week pyramidal TID group.
So, there seems to have been something more effective about transitioning between TID approaches in this study. A question I have is this: would the group outcomes be different if the time trial distance that was assessed were a marathon? Or another running distance that is performed well below threshold or LT2? I would hypothesize that the POL > PYR group might do better, as the final 8 weeks would have the athletes performing more work around the intensity that they would be testing at in the time trial. Unfortunately, most research utilizes shorter time trials as opposed to longer-distance ones. There is still much more research that is needed in this topic area.

This all leads me to my primary takeaway from the research that has been published on various TID approaches to date. Polarized and pyramidal TID approaches are both likely effective at improving endurance performance. The TID approach that is most effective depends highly on the individual athlete and the goal event they are targeting. As a coach, I generally progress an athlete from an off-season TID that targets energy systems that do not necessarily need to be maximized for race day to an in-season TID approach that does target the specific energy systems that need to be maximized for race day. In other words, if an athlete's target race/event is less than about 60-70 minutes, I would progress that athlete from a pyramidal approach to a polarized approach. This is because as the athlete closes in on race day, the athlete would want to be training above threshold (Zone 3) on their hard days as that is most like the exercise intensity that they will be racing at on race day. The opposite would be true for an athlete targeting an event longer than 70-90 minutes. I would progress that athlete from a polarized approach to a pyramidal approach. This is because as the athlete closes in on race day, the athlete would want to be training below threshold (Zone 2) on their hard days as that is most like the exercise intensity that they will be racing at on race day. My approach outlined above follows the SAID principle in strength and conditioning, which stands for Specific Adaptations to Imposed Demands. If an athlete is training for an Ironman triathlon, the farther out from race day they are, say during an off-season base period, spending time working in Zone 3 on hard days makes the most sense physiologically, as the adaptations that occur would likely yield the most improvement in their ability to perform at exercise intensities above their threshold. This is not to say that there are not some physiological benefits that will be seen at lower intensities in Zones 1 and 2, but the most benefit would likely be seen at the exercise intensity they train specifically at, which is above threshold in Zone 3. Then, as the athlete moves into their race-specific (in-season) preparation, a pyramidal approach makes sense as they will then be spending most of their time in hard sessions below threshold. An Ironman is typically raced well below LT2 (threshold) and a lot closer to LT1 (the top-end cutoff of Zone 1). Spending more time improving an athlete's mechanical efficiency at LT1 would be the most specific training for the intensity they will be performing at on race day. Keep in mind that this is my approach, one that considers the currently available evidence and my own experience working with athletes. At the end of the day, there is not enough research to suggest that a specific TID, or progression from one TID to another, is best.
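For coaches or self-coached athletes who want to check which TID they are actually following, here is a minimal sketch of one way to do it. It is my own illustration, not something taken from the studies above: it converts logged weekly minutes in the three zones into percentages and applies the rough rule of thumb described earlier (most volume in Zone 1 under either model; more Zone 2 than Zone 3 looks pyramidal, more Zone 3 than Zone 2 looks polarized). The weekly minutes used are hypothetical.

```python
# Minimal sketch (illustration only): estimating a training intensity distribution
# (TID) from logged time in the 3-zone model and giving it a rough label.

def tid_percentages(minutes_by_zone: dict[int, float]) -> dict[int, float]:
    """Convert minutes logged in Zones 1-3 into percentages of total volume."""
    total = sum(minutes_by_zone.values())
    return {zone: 100 * minutes / total for zone, minutes in minutes_by_zone.items()}

def label_tid(pct: dict[int, float]) -> str:
    """Rough classification: both models keep most volume in Zone 1;
    pyramidal has Zone 2 > Zone 3, polarized has Zone 3 > Zone 2."""
    if pct[1] < 70:
        return "neither (too little low-intensity volume)"
    return "pyramidal-leaning" if pct[2] > pct[3] else "polarized-leaning"

# Hypothetical week: 7.5 h easy (Zone 1), 45 min tempo (Zone 2), 25 min intervals (Zone 3)
week = {1: 450.0, 2: 45.0, 3: 25.0}
percentages = tid_percentages(week)
print({zone: round(p, 1) for zone, p in percentages.items()})  # {1: 86.5, 2: 8.7, 3: 4.8}
print(label_tid(percentages))  # pyramidal-leaning
```

Keep in mind that "time in zone" is only one way of quantifying TID; counting sessions by their goal intensity is another common approach, and the two can paint slightly different pictures of the same training week.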
Whether you are an athlete coaching yourself, or a coach working with athletes, it is important to consider this research we have on TID as well as consider what tends to work best in your experience. At the end of the day, there is no one best TID approach for all endurance athletes. The TID that works best is dependent on a variety of factors and will change throughout an athlete’s season. Conclusions Research to date supports both a polarized and a pyramidal TID for endurance athletes. Generally, performing most of one’s training volume at low intensities (~75-80%) is best for long-term adaptation and performance. However, time spent at moderate and high intensities of exercise can and should vary depending on the athlete, their fitness level, their unique needs, and their goal competition. Therefore, a polarized TID and a pyramidal TID can have a place in an athlete’s training program. References: Casado A, Hanley B, Santos-Concejero J, Ruiz-Pérez LM. World-class long-distance running performances are best predicted by volume of easy runs and deliberate practice of short-interval and tempo runs. The Journal of Strength & Conditioning Research. 2021 Sep 1;35(9):2525-31. Filipas L, Bonato M, Gallo G, Codella R. Effects of 16 weeks of pyramidal and polarized training intensity distributions in well‐trained endurance runners. Scandinavian Journal of Medicine & Science in Sports. 2022 Mar;32(3):498-511. Haugen T, Sandbakk Ø, Seiler S, Tønnessen E. The Training Characteristics of World-Class Distance Runners: An Integration of Scientific Literature and Results-Proven Practice. Sports Medicine-Open. 2022 Dec;8(1):1-8. Hydren JR, Cohen BS. Current scientific evidence for a polarized cardiovascular endurance training model. The Journal of Strength & Conditioning Research. 2015 Dec 1;29(12):3523-30. Neal CM, Hunter AM, Brennan L, O'Sullivan A, Hamilton DL, DeVito G, Galloway SD. Six weeks of a polarized training-intensity distribution leads to greater physiological and performance adaptations than a threshold model in trained cyclists. Journal of applied physiology. 2013 Feb 15. Stöggl T, Sperlich B. Polarized training has greater impact on key endurance variables than threshold, high intensity, or high volume training. Frontiers in physiology. 2014 Feb 4;5:33. Happy training and racing! -Ryan Eckert, MS, CSCS Do you enjoy our monthly educational content that we create? Not only do we create written content like what you just read, but we have a podcast too where the goal is also to share science-driven, evidence-based information highly relevant to endurance athletes and coaches. We do all of this for free, and we rely on the generous help and support of others to cover some of our basic operating costs for putting out this content. If you would like to help or support, the best way to do so is by becoming a Patreon supporter.
Which Type of Training Intensity Distribution is Best for Endurance Athletes: Polarized or Pyramidal?
Ryan Eckert, MS, CSCS
Nov 29, 2022
In The VO2 Max Forum
Introduction It has been almost three years now since I last looked at the scientific literature surrounding compression/recovery boots (I will refer to them from here on out as compression boots). A few years ago, there was still very limited published research on this topic, and that research was very mixed at best and primarily consisted of relatively small studies with few participants. Essentially, I concluded that compression boots simply didn’t have the evidence to back them up as a worthwhile investment into one’s recovery. Besides, adequate sleep quantity and quality as well as good overall nutrition are really the key drivers to 99.9% of one’s recovery from training. So, even though there really weren’t any drawbacks or negatives to wearing compression boots as a recovery tool, I figured time and energy was better spent on good sleep and good nutrition. This is still true today and will always be true. Good sleep and good nutrition should always be the focus before introducing any sort of additional recovery tool or modality. Nothing makes up for a lack of good quality and quantity sleep and proper nutrition practices when it comes to recovery. However, I am aware that there are many athletes out there, and coaches too, that are looking for that extra 0.01% to help accelerate their recovery and further improve their performance (assuming they are sleeping well and eating well). So, I will continue to keep up to date with various technologies and methods claiming to improve recovery in athletes as there is indeed an established market for them, and people do buy into that market. I figure, if at the very least I can help an athlete or coach decipher which evidence-based products to invest their limited time, energy, and money into, then I can feel better about the overall work I do as a Coach. The concept of an intermittent compression-based boot or sleeve is not novel. These sorts of devices have been around for decades, and a few devices have FDA approval as a medical device to help some patients with cardiovascular disease (9). There is emerging evidence that these sorts of medical grade devices can also be useful in other clinical populations as well. However, it was more recently that these types of intermittent compression devices were being investigated for their potential benefits on recovery in athletes. It was in the early 1990’s that researchers first published on the potential utility of compression boots for recovery, and then about two decades later, more and more researchers began investigating this area and consumer-based compression boots started to flood the market. Modern-day compression boots, however, are not cheap, with some of the top brands selling their boots anywhere between $500-$1000 depending on the company and the accessories you purchase with it. So, I think it is important to understand before making a purchase like that, do compression boots do what they claim to do, and do they help enhance recovery? Let’s look at the most current scientific literature on this topic and find out. What does the updated science tell us? As I mentioned previously, there has been a small body of research that has examined the effects of compression boots on recovery from strenuous exercise in both healthy adults and athletes. 
This research is very mixed, with some studies showing that there may be some small benefit to be had from wearing compression boots after strenuous exercise related to improved recovery of electrical activity of acutely fatigued muscles, quicker recovery of exercise capacity after a fatiguing exercise bout, accelerated recovery of hormonal indicators of stress, and accelerated reduction in inflammatory biomarkers post-exercise (3,7,8,10,11). However, on the other hand, there are a handful of studies as well demonstrating the opposite and that there is no net benefit of wearing compression boots post-exercise on indicators of recovery (2,4,5,6). This past year in 2022, Blumkaitis and colleagues (1) published a study in which they had thirty healthy male adults perform a fatiguing exercise bout and then immediately either undergo a 30-min compression boot session, wear a compression garment on their legs for 30-min, or receive no treatment. Researchers were looking to see if there was any effect of the single compression boot treatment or compression garment treatment on muscle soreness, fatigue, blood markers of stress/fatigue, and plyometric jumping performance at 24- and 48-hours post-exercise. All study participants experienced muscle soreness and fatigue after the fatiguing exercise bout as expected, however the compression boot treatment and compression garment treatment groups experienced a lesser drop on plyometric jump performance at 24 and 48 hours when compared to the control group, indicating that these treatments had an impact on healthy adults’ recovery of elastic and neuromuscular qualities post-exercise. There were no other benefits seen for the treatment group. As with most of these studies around recovery boots, this study had an extremely small sample size not powered to detect true effectiveness. Another recent study published in 2022 by Tally and colleagues (9) found somewhat similar results in that a compression treatment for three consecutive days following a fatiguing exercise bout has a positive impact on study participants’ ability to recover exercise capacity when compared to the control group. However, this study implemented a specific form of compression called ‘external counter pulsation’ in which three different compression cuffs are sequentially and rapidly inflated during diastole (i.e., relaxation of the heart muscle) and then rapidly deflated during systole (i.e., contraction of the heart muscle). This is a specific compression sequence that differs substantially from the typical compression pattern found in consumer-based recovery boots, so the findings of this study should not be translated directly over to more mainstream athletic recovery boots. Altogether, despite more research slowly emerging in this area, the research still seems to be mixed at best and is still limited by very small sample size studies, which limits our ability to draw more firm conclusions as to compression boots’ true effectiveness. The placebo effect is something that also confounds research around recovery modalities post-exercise. Very often, individuals that believe a treatment is helping them recover will experience some measurable recovery benefit even if the modality or device is not eliciting a physiological response that would benefit recovery. In other words, if someone believes compression boots are going to help them recover better, they may experience some recovery benefit, not from the device itself, but from the belief in the device. 
This is an important distinction to be made when it comes to any sort of intervention or treatment in the space of exercise training, recovery, and performance, as often the belief in something is itself more powerful than the device, modality, treatment, or intervention.

So what does this all mean? I am still of much the same mindset that I was about three years ago, the last time I looked at the existing research around compression boots: there is the potential for compression boots to have some physiological benefit on recovery from hard exercise or training; however, this is still very much theoretical at this point, as it is becoming increasingly difficult to tease out whether the benefits seen in some studies are due to the placebo effect or not. Small sample sizes in the studies done to date do not help in determining the true effectiveness of these devices either. It is important to note that wearing compression boots as a recovery modality does not seem to carry any sort of negative impact. So, those that decide to wear these devices can rest assured that there is no detriment to wearing them, but there may or may not be any real value in the device itself for recovery purposes.

Conclusions If you have 1) established good sleeping and nutrition habits, 2) are looking to incorporate some additional forms of recovery into your routine to seek out that extra 0.01%, and 3) have the extra cash and time to invest in utilizing compression boots, then these devices certainly cannot hurt to try. In fact, if having compression boots forces you to slow down, kick your feet up, and relax, then these might be a worthwhile investment for you. Some athletes have a hard time just sitting down and doing nothing. If having compression boots is the thing that sparks you to sit down and relax each day, then this sort of device can be a great thing to add into your routine. On a personal note, I have worn compression boots myself, and I can vouch for them feeling good once they are on. When it comes to recovery, there is something to be said about doing things that just feel good and help you relax. I remain unconvinced by the research literature available to date regarding the true effectiveness of compression boots, however, and so I do not think I will invest my money into purchasing a pair. I will continue to invest my time and money into optimizing my sleep and nutrition habits before I spend a big chunk of money on any sort of recovery device like compression boots. But if the research literature becomes more convincing that a device like compression boots can in fact have a true impact on recovery, then who knows, maybe I'll pull the trigger and buy a pair.

References: Blumkaitis JC, Moon JM, Ratliff KM, Stecker RA, Richmond SR, Sunderland KL, Kerksick CM, Martin JS, Mumford PW. Effects of an external pneumatic compression device vs static compression garment on peripheral circulation and markers of sports performance and recovery. European Journal of Applied Physiology. 2022 Apr 27:1-4. Cochrane DJ, Booker HR, Mundel T, Barnes MJ. Does intermittent pneumatic leg compression enhance muscle recovery after strenuous eccentric exercise? International journal of sports medicine. 2013 Nov;34(11):969-74. Collins R, McGrath D, Horner K, Eusebi S, Ditroilo M. Effect of external counterpulsation on exercise recovery in team sport athletes. International Journal of Sports Medicine. 2019 Aug;40(08):511-8. Northey JM, Rattray B, Argus CK, Etxebarria N, Driller MW.
Vascular occlusion and sequential compression for recovery after resistance exercise. J Strength Cond Res. 2016;30(2):533–539. O’Donnell S, Driller MW. The effect of intermittent sequential pneumatic compression on recovery between exercise bouts in well-trained triathletes. J Sci Cycl. 2015;4(3):19. Overmayer RG, Driller MW. Pneumatic compression fails to improve performance recovery in trained cyclists. International journal of sports physiology and performance. 2017;13(4):490-5. Roberts LA, Caia J, James LP, Scott TJ, Kelly VG. Effects of external counterpulsation on postexercise recovery in elite rugby league players. International journal of sports physiology and performance. 2019 Nov 1;14(10):1350-6. Russell S, Evans AG, Jenkins DG, Kelly VG. Effect of external counterpulsation on running performance and perceived recovery. International Journal of Sports Physiology and Performance. 2020 Feb 27;15(7):920-6. Tally S, Kado-Walton M, Hillery N, Wing D, Higgins M, Groessl E, Nichols J. Effects of External Counterpulsation on Performance and Recovery After Exertion. American Journal of Sports Science. 2022;10(4):84-91. Wiener A, Mizrahi J, Verbitsky O. Enhancement of tibialis anterior recovery by intermittent sequential pneumatic compression of the legs. Basic Appl Myol. 2001;11(2):87–90. Zelikovski A, Kaye C, Fink G, Spitzer S, Shapiro Y. The effects of the modified intermittent sequential pneumatic device (MISPD) on exercise performance following an exhaustive exercise bout. Br J Sports Med. 1993;27(4):255–259. Happy training and racing! -Ryan Eckert, MS, CSCS Do you enjoy our monthly educational content that we create? Not only do we create written content like what you just read, but we have a podcast too where the goal is also to share science-driven, evidence-based information highly relevant to endurance athletes and coaches. We do all of this for free, and we rely on the generous help and support of others to cover some of our basic operating costs for putting out this content. If you would like to help or support, the best way to do so is by becoming a Patreon supporter.
Compression/Recovery Boots: An Update on the Science
Ryan Eckert, MS, CSCS
Nov 01, 2022
In The VO2 Max Forum
Introduction to Running Mechanics Running mechanics is a fairly misunderstood topic among many coaches and athletes. There is a lot of misinformation out there on what is considered 'proper running form'. Some of the most common recommendations I often hear from others are to 'run with a cadence of >180 steps per minute', to 'kick your heel to your butt when running', or to 'run on the ball of your foot'. These recommendations, and more, are incorrect and not based on the most up-to-date scientific literature. There is an incredibly wide variance in running mechanics among amateurs and professionals, with differences in things like running cadence, arm carry, heel kick, etc. often being quite unique to each runner (6). Research in recent decades has shown that 'proper running mechanics' can best be summed up with the following three evidence-based recommendations (6): 1) run with a slight forward lean at the ankles (not the hips), 2) land with each foot strike under the hips, or the body's center of gravity (heel vs. midfoot vs. forefoot doesn't matter), and 3) run with a self-selected cadence that feels comfortable and natural (typically somewhere between 160-190 steps per minute). The rest of running form and mechanics is up to each individual to self-select and optimize over time. Running technique and efficiency usually optimize themselves over time as a runner runs more and gains more experience.

Running Shod vs. Barefoot Another hot topic of debate, and one also fraught with misinformation, is running footwear. Again, this is something that is highly individual, with each runner finding what is best for them over time. Some runners prefer more 'cushy' shoes whereas others prefer more 'minimalist' shoes. Others prefer something in between. However, one aspect of running that is not debatable is the difference in running mechanics that occurs when a runner runs shod (i.e., in shoes) vs. barefoot. With any runner, there will be a difference in foot strike pattern, cadence, and other mechanics when going from shoes to no shoes (1-2,4-5), and this is not necessarily a good or bad thing. It is simply a matter of what is observed. I want to spend a bit more time discussing this last bit of information, as it is important for endurance athletes and coaches to know what happens when someone goes from running shod to running barefoot. Some runners are adamant that running barefoot is optimal, whereas others argue running in shoes is optimal. I'm not here to get into human evolution and how the foot has changed over time as humans have evolved. However, humans did indeed start off running barefoot through untouched wilderness over rocks, roots, stones, and more. We then progressed over hundreds of thousands of years to where we are today, which is running in various types of shoes on pavement, trails, and treadmills. Like just about anything else we experience in our day-to-day life today, running in modern-day conditions is vastly different from the running conditions that humans experienced a long time ago. So I challenge folks to be careful when claiming what is 'optimal' when it comes to run footwear, or lack thereof, as humans have slowly adapted over time to do many things differently as the world and the environment we live in have changed. For example, barefoot running humans from the early days of our ancestors possibly had different foot structure and different thickness of padding on their feet.
They also never knew what running in footwear was like and therefore acclimated over time to optimize themselves for running barefoot on wilderness terrain. Humans today, however, are born from generations of humans that have worn footwear of some sort. We also most likely started wearing shoes as soon as we could walk. So we more modern humans acclimated over time to optimize ourselves for walking and running shod and not barefoot. The point is, just because early ancestors of ours did something a long time ago, when the world was vastly different than it is today, does not mean we should replicate it or that it is optimal for today's modernized conditions. After all, most people run on concrete and asphalt for goodness' sake! Try running on those surfaces barefoot and see how your body feels the next day…

With that discussion out of the way, let's discuss what we know happens when an athlete transitions from running shod to barefoot. Typically, we see in the scientific literature that when an athlete goes from running in shoes to running barefoot, a few key running characteristics will change (1-2,4-5): 1) a greater tendency for a midfoot or forefoot strike, 2) a higher running cadence, 3) a shorter stride length, 4) a shorter ground contact time, and 5) a shorter flight time. These changes when going from shod to barefoot are not a good or bad thing, like I mentioned previously. The loading patterns and shock absorption characteristics while running are different in shod vs. barefoot conditions, and running biomechanics will change to better suit the footwear conditions that a runner chooses. It is actually quite fascinating that the body is able to shift the way it runs without conscious thought when running shod vs. barefoot. A recently published study by Jaén-Carrillo and colleagues (3) aimed to examine differences between running shod vs. barefoot like many studies have in recent decades, but they had study participants wear a Stryd foot pod that measured power while running. The findings of this study confirmed what we already see in the literature in that runners running barefoot tended to have more of a midfoot or forefoot strike, a higher running cadence, a shorter stride length, a shorter ground contact time, and a shorter flight time. However, researchers also identified a greater running efficiency when running barefoot compared to running in shoes, as evidenced by lower form power (power output that does NOT generate forward propulsion), greater leg spring stiffness, and lower vertical oscillation in those running barefoot. These are all generally good changes that a runner tends to develop more slowly over time when running in normal footwear, and they are usually indicative of improvements in overall running efficiency. Now, again, these differences do NOT mean that running barefoot is superior to running in shoes just because there are efficiency changes. For example, in this same study, the average power and normalized power output required to run at the assigned pace were no different between the shod and barefoot conditions. This essentially means that all runners, regardless of shod or barefoot condition, required the same power output to run at the given pace they were assigned. These differences are, however, important to know as an athlete or coach. One reason it is important to know the differences in running shod vs. barefoot is so that clever marketing tactics can be assessed for validity.
For example, minimalistic footwear companies and even some proponents of running barefoot will make wild claims about the so-called 'superiority' or 'benefits' of running in minimalistic footwear or of running barefoot. Most of the time, the claims that these individuals and companies make are not grounded in scientific evidence and can be misleading for athletes and coaches. There have been minimalistic footwear companies in recent years that have actually gotten in trouble for making false claims about their shoes. Another reason it is important to know the differences in running shod vs. barefoot is that there may be some potential benefit to purposely running barefoot on soft surfaces in small doses. Authors of the study I mentioned previously (3) called for more research examining the long-term impact of adding small bouts of barefoot running to an athlete's training plan. The reasoning for running barefoot on soft surfaces (grass, turf, smooth trails) is that running barefoot forces running biomechanics that are more efficient in certain ways. If a runner can train barefoot occasionally, are there some long-term efficiency improvements to be had that can transfer over to running in the athlete's normal footwear? Researchers don't know as of yet, but hopefully future research will give us some more answers on this topic.

So what does this all mean? What can be taken away from this discussion? First, know that running biomechanics and running form are highly unique to each individual. Rarely should there be specific changes made to running form unless an athlete is running with a dramatic flaw or deficit in their form (landing way out in front of their center of gravity, running with an extremely low cadence <160 steps per minute, leaning really far backward or forward, bouncing up and down way too much with each run stride). Second, footwear is also highly unique to each individual. There is no one shoe or barefoot running style that is best for every runner. Shoe choice should be made by each individual athlete based on what best suits their running style and comfort needs. Barefoot running or minimalistic footwear is not some magical solution that every runner needs to adopt. Finally, it is important to understand the differences between running in shoes vs. running barefoot in order to decipher company marketing claims and to understand whether there is any utility to planning in small bouts of running barefoot. However, research has yet to indicate whether there is any real benefit to purposely running in minimalistic footwear, or in no shoes at all, for small bouts.

Conclusions I realize that the topic of running biomechanics and running footwear is a massive one to dissect. My goal in this brief write-up was not to dissect every aspect of this topic, but rather to give you as the reader a brief insight into some of the most recent research literature on some common and often misunderstood topic areas within the space of running biomechanics and footwear. I hope the information provided herein proves to be useful for you.

References: Cochrum RG, Connors RT, Coons JM, Fuller DK, Morgan DW, Caputo JL. Comparison of running economy values while wearing no shoes, minimal shoes, and normal running shoes. Journal of Strength and Conditioning Research. 2017 Mar 1;31(3):595-601. Divert C, Mornieux G, Freychat P, Baly L, Mayer F, Belli A. Barefoot-shod running differences: shoe or mass effect? International journal of sports medicine. 2008 Jun;29(06):512-8.
Jaén-Carrillo D, Roche-Seruendo LE, Molina-Molina A, Cardiel-Sánchez S, Cartón-Llorente A, García-Pinillos F. Influence of the Shod Condition on Running Power Output: An Analysis in Recreationally Active Endurance Runners. Sensors. 2022 Jun 26;22(13):4828. Lieberman DE, Venkadesan M, Werbel WA, Daoud AI, D’andrea S, Davis IS, Mang’Eni RO, Pitsiladis Y. Foot strike patterns and collision forces in habitually barefoot versus shod runners. Nature. 2010 Jan;463(7280):531-5. Lussiana T, Hébert-Losier K, Mourot L. Effect of minimal shoes and slope on vertical and leg stiffness during running. Journal of Sport and Health Science. 2015 Jun 1;4(2):195-202. van Oeveren, B. T., de Ruiter, C. J., Beek, P. J., & van Dieën, J. H. (2021). The biomechanics of running and running styles: a synthesis. Sports Biomechanics, 1-39. Happy training and racing! -Ryan Eckert, MS, CSCS Do you enjoy our monthly educational content that we create? Not only do we create written content like what you just read, but we have a podcast too where the goal is also to share science-driven, evidence-based information highly relevant to endurance athletes and coaches. We do all of this for free, and we rely on the generous help and support of others to cover some of our basic operating costs for putting out this content. If you would like to help or support, the best way to do so is by becoming a Patreon supporter.
Running Biomechanics and Footwear: Dispelling Common Beliefs and Misconceptions
Ryan Eckert, MS, CSCS
Sep 29, 2022
In The VO2 Max Forum
Introduction Muscle cramping is something that nearly every single endurance athlete has dealt with at some point in their sporting journey. Cramping can be mild and bearable for some, but debilitating for others. Cramping can also unfairly afflict some athletes more than others. Cramping during a race can literally cause an athlete to withdraw from an event they have trained for and invested in for months or even years of their life. Cramping also does not discriminate, as it affects everyone from complete beginners to the highest-performing professionals. Muscle cramping, however, is an often misunderstood topic and one that is fraught with misinformation and touted "miracle cures" and "formulas" that are supposed to help athletes avoid muscle cramping during training and racing. Yet, if there is anything that the research literature has taught us over the years, it is that muscle cramping is without a doubt NOT caused by one single factor. Rather, muscle cramping can have many, many different potential causes depending on the athlete. Therefore, any product or person claiming to harness a secret "cure" for cramping is one to be cautious of. Yes, dehydration, loss of electrolytes, fatigue, heat, and other things factor into the etiology of cramping in an athlete. But it is much more complex than just these factors in isolation. Some athletes may cramp in certain scenarios when others don't. Some athletes might think they cramp because of dehydration or a big loss of electrolytes when in fact that is not the cause at all. With this topic being shrouded in so much mystery, misinformation, and snake oils promising a cure, let's take a look at a recently published 2022 evidence-based review paper by Miller and colleagues (1) to dissect muscle cramping during exercise in a bit more detail and to help us understand what is really happening and then how to address it.

What are some of the proposed theories/mechanisms behind muscle cramping? Firstly, we need to discuss some of the most prominent theories underpinning exercise-associated muscle cramps (EAMCs for short, used herein). There are three primary theories that have emerged over the decades related to the pathophysiology of EAMCs. These three theories are as follows:

Dehydration and Electrolyte Imbalance Theory This is the oldest and perhaps best-known theory underpinning muscle cramping. This theory postulates that a loss of fluids and electrolytes during exercise causes an increase in chemicals in the muscle that excite nerve fibers as well as an increased pressure on motor neurons, both of which lead to an increased likelihood of a sustained involuntary muscle contraction (i.e., a cramp). Over the years, however, this theory has had some 'holes' poked into it, leaving it on shaky scientific ground as the sole explanation for EAMCs in athletes.

Altered Neuromuscular Control Theory This theory emerged in 1997 and was updated in 2009 to reflect the notion that muscle fatigue, combined with other risk factors, alters the level of excitation and inhibition that the motor neurons are receiving, thereby increasing the chances of a cramp. However, this theory also developed some flaws in its reasoning over the years and did not have the evidence to back it up as the sole cause of muscle cramping either.
Multifactorial Theory of EAMCs This is the most recent theory put forth and proposes that EAMCs are likely due to a multitude of factors, of which dehydration, electrolyte imbalances, and fatigue are just a few. Any factor that increases motor neuron activity and increases muscle cell membrane activity can, in theory, result in an increased risk of muscle cramping. However, there are many ways in which this 'altered neuromuscular control' could be reached, including things such as dehydration, electrolyte imbalances, muscular fatigue, heat stress, prior muscle injury/damage, and certain medications, supplements, or conditions that affect the central nervous system. To see how complex this theory is and to get a better sense of just what factors might play a role in EAMCs, see the figure below taken from the Miller and colleagues' paper (1). As an example to help illustrate this theory further, consider an athlete that cramps during the final 10k of a marathon. Sure, fatigue may play a role and so may hydration and electrolyte status. However, what if this athlete had prepared well for the event and had followed a hydration plan that worked well for them? There could be other factors at play unique to this athlete at that time, including any stress outside of training they are under, the effects of any medication they may be taking, the environment they are racing in, and more. This athlete, however, might have mistakenly been taught to believe that cramps are caused by dehydration and electrolyte imbalance. They may, consequently, go down a path of searching for further hydration planning and refinement that won't provide any useful answers to their problem. Essentially, when looking to find the underlying cause of EAMCs, the goal is to look globally at the athlete as a whole, identify any risk factors they present that could underpin altered neuromuscular control, and then address them (as seen in the figure/image above). In theory, this should help reduce the risk of cramping, but there will never be zero risk of EAMCs occurring during training or racing.

What does the evidence actually say is linked to muscle cramping? This third theory highlighted above seems to be the theory that holds the most evidence and support. But basically, researchers and clinicians still don't know the exact cause of muscle cramping. We only have a theoretical best guess. So, first and foremost, understand that anyone who claims to know the cause of EAMCs and precisely what to do or take to prevent them is not being truthful or is misinformed. This is the first step in knowing how to understand muscle cramping. There is a lot that we do know, but it is not enough to make a definitive conclusion pointing precisely to the cause of cramps. After you understand a bit more about what underlies EAMCs from a physiological perspective, you can then begin to make changes to prevent them or treat them.

What are the best strategies to prevent muscle cramping from taking place? This is what most endurance athletes are interested in, as the goal is always to avoid cramping up during an important training session or race, not only because cramping can drastically affect performance, but also because cramps are painful and unpleasant. From a prevention standpoint, however, there are many things to consider. Like I mentioned previously, if you do experience cramps as an athlete, the best way to begin addressing this issue is to take a step back, look at various factors, and begin addressing them one-by-one.
The last thing you want to do is get caught in a trap of believing cramping is caused by one thing and go down a rabbit hole of trying to prevent cramps by addressing only that one factor, when you might have multiple factors that need addressing. The possible factors to address are too numerous to list here, so I will include an image below from the Miller and colleagues’ paper (1) that poses questions to ask yourself in order to better identify risk factors for you as an individual that can be addressed:

As you can see, there are many questions to consider that go way beyond simply addressing hydration and electrolyte intake like many folks believe they need to do in order to solve cramping issues. Yes, these factors could be the ones causing or contributing to cramps for you, but they may not be, or they may be just two of many factors to consider when trying to resolve your cramping issues. My suggestion is to start with the questions above and work with a qualified professional in the areas that you might need to address. For example, if you think that muscular fatigue or prior muscle injuries might be placing you at risk for cramps, work with a qualified Physical Therapist or Strength and Conditioning Professional to begin addressing these areas. If nutrition and hydration are potential issues, work with your Coach or a Registered Dietitian to address these areas of concern. If a medical condition or a medication might be the cause, work with your Primary Care Physician to begin addressing these concerns. Maybe you are under a lot of stress outside of being an athlete and you feel that your mental health is an issue. Work with the appropriate mental healthcare provider to address this concern. I understand that this will likely be frustrating for athletes that do suffer from regular cramping that impacts their ability to perform, but the underlying causes are likely complex, and so the solutions that help reduce your risk of cramping will likely be somewhat complex as well as you try to navigate the various items that increase your risk of EAMCs. I really do wish it was as simple as “drink more fluids”, “take more magnesium during exercise”, or “do more strength training”, but the reality is that EAMCs are a complex topic with many things to consider. So, if you are an athlete that regularly suffers from cramps and you would like to try and lessen the burden of these cramps on your performance, start by trying to identify the factors you have that place you at risk of experiencing cramps, then work with the right professional(s) to begin coming up with solutions. Over time, you will hopefully see improvements and a lessening of the occurrence of cramps. Remember though, realistically nobody is ever immune to cramping, even those that have never experienced one before. There will always be a possibility of EAMCs, particularly during higher-intensity and longer-duration performances when you are pushing yourself to the limit as an athlete. Understand that the goal is to work on reducing the likelihood of cramps occurring in the first place, but also to be prepared to deal with them should they occur. This is what we will discuss in the next section.

What are the best strategies to treat muscle cramping after it has taken place?

The treatment of muscle cramping really falls under two categories: 1) things that can be done yourself to help alleviate cramping, and 2) things that should be done under the supervision of a medical professional should cramping worsen or present with other signs/symptoms.
The fastest way of stopping or relieving a muscle cramp once it has begun is through stretching, which is usually the natural instinct of any athlete when they feel a cramp come on suddenly. However, if the cramping persists and becomes debilitating to performance, more treatment may be needed. If the cramping is severe enough, it may cause you to pull out of a race. Typically, stopping exercise, stretching the muscles that cramped, and focusing on general rehydration are the best ways to get cramps to stop, and this is supported by the scientific literature (1). However, if you were to experience cramping that continues and worsens over time even after stopping exercise, along with the development of other signs/symptoms of a serious medical condition (e.g., dizziness, nausea/vomiting, collapsing, dark urine, altered state of consciousness), this is where medical intervention becomes necessary. There are many things a medical professional might need to do in order to address this type of scenario, and this is beyond the scope of this post. It is important, though, to know what to look for in case you do feel that medical intervention is warranted.

Conclusions

Muscle cramping, or EAMCs as they have been referred to herein, is a complex topic that is not fully understood. Unfortunately, there is not a clear cause of muscle cramps, nor a clear approach for preventing or treating them. However, by taking a global and multifactorial approach to understanding yourself as an athlete better, you may begin to identify various risk factors for muscle cramps that you can address under the guidance of the appropriate professional. Cramping is all too common in endurance sports, and so it is important for athletes and coaches to be well-informed of the current scientific literature surrounding EAMCs so that the proper recommendations are made to remedy muscle cramping in those that are trying to address it.

References: Miller KC, McDermott BP, Yeargin SW, Fiol A, Schwellnus MP. An Evidence-Based Review of the Pathophysiology, Treatment, and Prevention of Exercise-Associated Muscle Cramps. Journal of Athletic Training. 2022 Jan;57(1):5-15.

Happy training and racing! -Ryan Eckert, MS, CSCS

Do you enjoy our monthly educational content that we create? Not only do we create written content like what you just read, but we have a podcast too where the goal is also to share science-driven, evidence-based information highly relevant to endurance athletes and coaches. We do all of this for free, and we rely on the generous help and support of others to cover some of our basic operating costs for putting out this content. If you would like to help or support, the best way to do so is by becoming a Patreon supporter.
Muscle Cramping During Exercise: What Does the Latest Evidence Say Regarding the Cause and Treatment of Cramping?
Ryan Eckert, MS, CSCS
Aug 31, 2022
In The VO2 Max Forum
Introduction

Compression socks or calf sleeves (referred to throughout as ‘garments’) are popular amongst endurance athletes, particularly runners and triathletes. They are typically worn by athletes either during training/racing to improve performance or after exercise to augment recovery. Regardless of what the actual scientific evidence might say in support or opposition of these potential benefits, many endurance athletes wear them because doing so mentally makes the athlete feel better or just ‘feels good’ in general. But what does the scientific evidence say? Is there evidence to support the use of compression garments to improve performance and enhance recovery? Let’s find out.

What are the Theoretical Mechanisms Behind Compression Garments?

The potential mechanisms proposed as to why compression garments help athletes can be divided into two categories: during exercise and after exercise. Compression garments worn during exercise are thought to reduce microtrauma and damage as well as reduce the energy expenditure of the working muscles. This reduced overall energy demand and damage that normally occurs during exercise is thought to improve performance while wearing the compression garments. Compression garments worn after exercise are thought to improve venous return to the heart, accelerate the removal of metabolic by-products, limit swelling, and increase blood flow, and thereby oxygen delivery, to the musculature utilized during exercise. These proposed mechanisms of action are all theoretical at this point, as no research has yet directly tried to prove or disprove them. There has, however, been research that has looked at the benefits or impacts of compression garments worn during and after exercise. This is the research we will discuss next in order to see if compression worn during or after exercise provides any real-world benefit.

Does the Evidence Support Compression Garments as Helpful or Beneficial?

A 2015 systematic review published by Beliard and colleagues aimed to synthesize the available literature on compression garments at the time and answer the question as to whether wearing compression garments during and/or after exercise confers any benefits. From the 24 original research articles included in the review, the major findings were as follows:

There are conflicting and inconclusive results regarding the effects of wearing compression garments during exercise on performance. In other words, some studies showed compression garments worn during exercise had a benefit, but many others showed no benefit.
There was a general trend towards improved recovery metrics (reduced soreness, faster recovery of strength or power) when compression garments were worn after exercise.
There was no relationship between the various pressures applied by different compression garments and the benefits, or lack thereof, that were recorded.

Conclusions

I have looked at some select studies on compression garments and their potential benefits previously, and I will be honest, this paper changed my opinion on compression garments a bit. Previously, I would have suggested that there were no real benefits of wearing compression garments, but I have to change that statement to say the following instead. There may not be any performance benefit to wearing compression garments during exercise, but wearing them after exercise may have a benefit on recovery, namely reduced soreness and faster recovery of strength and power following exercise.
That’s the beauty of science: keeping up to date with it has a way of changing and altering one’s suggestions and beliefs over time as newer research emerges and is compiled. Hopefully this brief write-up helps inform your own view on compression garments as well.

References: Beliard S, Chauveau M, Moscatiello T, Cros F, Ecarnot F, Becker F. Compression garments and exercise: no influence of pressure applied. Journal of Sports Science & Medicine. 2015 Mar;14(1):75.

Happy training and racing! -Ryan Eckert, MS, CSCS

Do you enjoy our monthly educational content that we create? Not only do we create written content like what you just read, but we have a podcast too where the goal is also to share science-driven, evidence-based information highly relevant to endurance athletes and coaches. We do all of this for free, and we rely on the generous help and support of others to cover some of our basic operating costs for putting out this content. If you would like to help or support, the best way to do so is by becoming a Patreon supporter.
Does Wearing Lower Body Compression Garments Improve Performance During and/or Recovery After Exercise?
Ryan Eckert, MS, CSCS
Aug 01, 2022
In The VO2 Max Forum
Introduction

Climate change is something that we all have to face and navigate as a human species. We can only hope that we are able to make progress towards reducing our burden on the planet’s global temperature, and therefore climate patterns, by reducing our output of carbon emissions into the atmosphere. However, making the changes necessary to delay or slow the warming of the planet will take decades to realize, and in the meantime, we will all be dealing with some of the ramifications of hotter average global temperatures and the impact that this has on global climate patterns. Endurance athletes are a unique group of people that can be impacted by climate change in various ways, and it goes beyond the obvious issue of having to train and race in warmer weather more frequently. Herein, I discuss some areas of concern for endurance athletes moving forward, and how to address them, while navigating climate change.

Climate Change and Hotter Temperatures

This one is likely the most obvious. As global temperatures rise, endurance athletes will likely spend more of their time training and racing in warmer weather (1). However, it is the more extreme temperatures that hit during summer months that can be the most concerning to endurance athletes and race event organizers. Many races around the globe each year are experiencing ‘record heat’ on the day of their event. This is something endurance athletes will, therefore, need to be more diligently prepared for through proper training and heat acclimation. I have written on the topic of heat previously, and it is worth reading as it takes a deep dive into the physiology of exercising in the heat as well as ways in which to acclimate to the heat to improve performance or to prepare for competition in a hot environment. Acclimating to the heat is important for any athlete that is planning on racing in a hot environment so as to reduce the impact that the heat has on their performance and to reduce the likelihood of heat-related illness. The importance of this has never been higher, with many of the races that endurance athletes partake in occurring during summer months and with an increased possibility of extreme heat on race day.

Climate Change and Air Pollution

A changing climate makes wildfires more prevalent, especially during hotter and drier summer months (1). Wildfires can dramatically reduce overall air quality, which can be harmful to endurance athletes if they live and train, or race, in an area that is prone to experiencing these wildfires and reduced air quality. Endurance athletes can substantially increase their breathing frequency for prolonged periods of time during training, and this can increase the amount of smoke pollutants that get into and damage the lungs if the air quality is sufficiently poor due to a wildfire. It is, therefore, important for endurance athletes that live in these areas to be aware of ongoing wildfires nearby and the current air quality on days they plan to train outdoors. If the air quality index (AQI) is too high, it is worth taking training indoors, if possible, or wearing a well-fitting N95 mask to reduce the inhalation of smoke pollutants. Typically, race event organizers will cancel or postpone events if the AQI on race day is too high as the risk is too great.

Climate Change and Tick-Borne Illness

As the climate changes and certain areas become warmer and wetter, the population of ticks is spreading to larger swaths of geographical areas where they were previously not commonly found (1).
Typically, ticks are most commonly a concern for endurance athletes that exercise outdoors in heavily wooded or grassy areas, including hikers, mountain bikers, and trail runners (1). Tick bites can spread Lyme disease to humans, which can have a potentially devastating effect on overall health and performance. Therefore, endurance athletes that are exercising in areas where ticks might be present should be careful to protect themselves from possible tick exposure. For athletes that are exercising outdoors in wooded or grassy areas, the following precautions can be followed to reduce the chances of tick exposure (1):

Wear long-sleeved shirts, long pants, and tall socks while exercising, if possible
Treat clothing or gear with tick repellent
Spray skin-safe tick repellent on exposed skin while exercising
Check clothing and body for ticks after exercise; a tick must be embedded for 48-72 hours to transmit Lyme disease

Conclusions

Climate change has impacts on a global scale. However, different groups of people may be impacted differently depending on their geographical location and the activities they partake in. For endurance athletes, there are a few important areas of concern to be aware of moving forward, including increasingly hot temperatures, an increased possibility of wildfires and poor air quality, and an increased risk of exposure to tick-borne illness when exercising outdoors. Each of these areas, however, can be addressed in many ways to reduce the overall risk of injury or illness when training or racing as an endurance athlete in a changing climate.

References: Nowak AS, Kennelley GE, Krabak BJ, Roberts WO, Tenforde KM, Tenforde AS. Endurance Athletes and Climate Change. The Journal of Climate Change and Health. 2022 Feb 1:100118.

Happy training and racing! -Ryan Eckert, MS, CSCS

Do you enjoy our monthly educational content that we create? Not only do we create written content like what you just read, but we have a podcast too where the goal is also to share science-driven, evidence-based information highly relevant to endurance athletes and coaches. We do all of this for free, and we rely on the generous help and support of others to cover some of our basic operating costs for putting out this content. If you would like to help or support, the best way to do so is by becoming a Patreon supporter.
Climate Change and Endurance Athletes: What Do They Have to Do With Each Other?
Ryan Eckert, MS, CSCS
Jul 05, 2022
In The VO2 Max Forum
Brief Overview of Functional Overreaching, Non-Functional Overreaching, and Overtraining Syndrome

Before we begin a discussion on markers of functional overreaching, it is imperative that we discuss the continuum of fatigue so that we understand the different stages of fatigue. There is a wide range of normal fatigue that occurs from regular endurance training, then there is a state of functional overreaching (which is positive and somewhat essential to the training process), and then there are non-functional overreaching and overtraining syndrome (both of which are detrimental to training, performance, and potentially health). So let’s start with someone that is sedentary and does not exercise at all, or only very minimally. This person that leads a relatively physically inactive lifestyle may carry minimal to no residual fatigue and physiological stress. A dedicated endurance athlete who trains regularly may be carrying around a healthy amount of physiological fatigue and stress on a daily basis. An athlete who pushes their body very hard for a brief period of time, even with adequate recovery between training sessions and training weeks, may begin to carry too much fatigue, putting them in a state of functional overreaching with noticeable performance decrements. This state is considered potentially positive, should the right amount of rest and de-loading occur in the realm of 5-7 days once this state is achieved, as when the athlete recovers, they can usually achieve a state of higher overall performance and fitness (also called ‘super-compensation’). If the athlete, however, continues to push themselves, their fatigue and stress levels continue to build to the point where they are in a state of non-functional overreaching, performance starts to decline significantly, and health may be impacted. If the athlete continues to maintain their usual training habits while in a state of non-functional overreaching, they may eventually exceed their body’s capacity to handle stress, and the body then breaks down. This is known as overtraining syndrome, and it carries with it significantly reduced performance and serious health consequences. Full-blown overtraining syndrome can take months or years to recover from! The figure below depicts the continuum of fatigue to better put these terms into visual context.

It is important to note that there is a VERY distinct difference between functional overreaching on the one hand and non-functional overreaching and overtraining syndrome on the other. Most research and many coaches focus on identifying symptoms of non-functional overreaching and overtraining syndrome, but it is also important to identify when an athlete might be in a state of functional overreaching so that the appropriate rest can be planned. Functional overreaching is a physiologically beneficial state only if the appropriate recovery period is programmed into one’s training. Typically, an athlete might purposefully achieve a state of functional overreaching right before a taper into a key race as, after the taper, they will supercompensate and be in a greater state of fitness than before the functional overreaching period. An athlete might also go away on a training camp and purposefully push their body to the limit for a few weeks to gain a big early-season boost in fitness. However, this training camp is usually followed up with a week or two of recovery and much less training volume and intensity. So, identifying this physiologic state can be useful for coaches and athletes alike.
Let’s take a look at some research in the following section that sheds some light on potential markers that may be indicative of functional overreaching.

What Does the Research Support as Potential Markers of Functional Overreaching?

Roete and colleagues published a systematic review in 2021 that aimed to identify key indicators of functional overreaching across the available research on the topic. In total, they included 12 relevant research studies in order to determine markers of functional overreaching. They found that the following markers were associated with a state of functional overreaching in trained and professional endurance athletes (comprising cyclists, runners, and triathletes):

A reduction in peak power output when performing a maximal effort
A lower achievable maximum heart rate (HR)
A faster 60-second HR recovery after stopping an exercise effort
A reduction in average power output and average submaximal HR during a time trial effort
A higher rating of perceived exertion (RPE) at submaximal efforts

It is also important to mention the markers that were NOT associated with a state of functional overreaching, including:

Resting HR
Heart rate reserve (HRR)

What is quite interesting with the above findings is that some of these markers can also be a sign of improved physiological fitness (i.e., faster HR recovery, lower submaximal HR during a time trial effort). Therefore, it has been proposed that multiple markers, including fitness markers in conjunction with RPE and the overall mood and energy level of the athlete, be taken into account when aiming to determine the status of functional overreaching. For example, if an athlete achieves a lower HR during a submaximal training effort during a session, this alone is not enough to be indicative of functional overreaching, because the athlete might simply have seen a fitness improvement, indicated by a lower HR and a higher pace or power output during the effort. However, if an athlete has been consistently showing a depressed HR during training sessions alongside a higher RPE during those sessions and is feeling a lack of energy and motivation alongside their training, this may be indicative of a functional overreaching state, or at least a state of chronic fatigue that warrants a week of recovery. Keep in mind, it is very important to not confuse functional overreaching with non-functional overreaching or full-blown overtraining syndrome. The latter states usually present with some ill health effects in conjunction with poor physical performance, including sleep disturbances, depressed mood, changes in weight or appetite, chronic underlying muscular soreness or poor recovery, greater frequency of illness, etc. Functional overreaching is not associated with any ill health effects or symptoms and just presents as a mild state of fatigue alongside a static or very slightly depressed physical performance level.
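To make the “use multiple markers together” idea above a bit more concrete, below is a minimal, purely illustrative sketch of how a coach or self-coached athlete might combine a few of these signals into a simple flag for a planned recovery week. The specific thresholds and the scoring logic are my own assumptions for demonstration purposes only; they are not validated cutoffs from Roete and colleagues or any other study.

```python
# Illustrative only: combine several markers before flagging possible functional overreaching.
# The thresholds below are assumed for demonstration, not validated cutoffs from the research.

def flag_possible_functional_overreaching(
    submax_hr_change_bpm: float,   # change in HR at a fixed submaximal pace/power vs. recent baseline (negative = lower HR)
    submax_rpe_change: float,      # change in RPE at that same submaximal effort vs. recent baseline
    motivation_score: int,         # athlete-reported motivation/energy, 1 (very low) to 5 (normal/high)
) -> bool:
    """Return True only when multiple markers point in the same direction."""
    depressed_hr = submax_hr_change_bpm <= -5     # noticeably lower HR at the same output
    elevated_rpe = submax_rpe_change >= 1         # the same effort feels harder
    low_motivation = motivation_score <= 2        # flat, unmotivated, low energy

    # A lower HR alone could simply mean improved fitness, so require agreement
    # across the HR and RPE markers plus the subjective report.
    return depressed_hr and elevated_rpe and low_motivation


if __name__ == "__main__":
    # Example: HR down 7 bpm at the usual tempo power, RPE up ~1.5 points, motivation low.
    if flag_possible_functional_overreaching(-7, 1.5, 2):
        print("Multiple markers agree: consider scheduling a recovery week.")
    else:
        print("Markers do not agree: keep monitoring.")
```

Again, this is only a way of visualizing the logic; in practice, decisions like this should come from a coach's judgment across days or weeks of data, not from a single session.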
Conclusions

Being able to identify a state of functional overreaching as a coach or self-coached athlete can be very useful, as it can help aid in the programming of planned recovery. Typically, functional overreaching comes from a phase of elevated training volume and/or training intensity for the purpose of improving overall fitness in the long-term. It is a state that an athlete does not want to be in chronically, but rather a state that can be planned for multiple times in a season as long as appropriate recovery follows the training block. Knowing the difference between functional overreaching on the one hand and non-functional overreaching and overtraining syndrome on the other is important, and the markers identified in the recent research discussed herein can be useful in identifying a state of functional overreaching and differentiating it from full-blown overtraining syndrome.

References: Roete AJ, Elferink-Gemser MT, Otter RT, Stoter IK, Lamberts RP. A systematic review on markers of functional overreaching in endurance athletes. International Journal of Sports Physiology and Performance. 2021 Jun 8;16(8):1065-73.

Happy training and racing! -Ryan Eckert, MS, CSCS

Do you enjoy our monthly educational content that we create? Not only do we create written content like what you just read, but we have a podcast too where the goal is also to share science-driven, evidence-based information highly relevant to endurance athletes and coaches. We do all of this for free, and we rely on the generous help and support of others to cover some of our basic operating costs for putting out this content. If you would like to help or support, the best way to do so is by becoming a Patreon supporter.
Markers of a State of Functional Overreaching: What Can We Learn from the Science?
Ryan Eckert, MS, CSCS
Mar 31, 2022
In The VO2 Max Forum
Brief Overview of Low-Carbohydrate, High-Fat Diets for Endurance Athletes

Current sports nutrition guidelines from leading nutrition and exercise organizations around the globe (4,5) recommend a high-carbohydrate, low-fat (HCLF) diet for endurance athletes that broadly consists of:

~50-60% of energy intake from carbohydrates
~15-20% of energy intake from protein
The remaining (~20-35%) energy intake from fat

However, since the early 1980s, researchers have been interested in the potential benefits of a low-carbohydrate, high-fat (LCHF) diet for endurance athletes. Studies investigating the effects of LCHF diets have their origins in the management of type 2 diabetes, the management of epileptic seizures, and the potential management of obesity (2). However, in 1983, Phinney and colleagues (3) were the first to test the impacts of a LCHF diet on endurance athletes. Over the following decade or so, researchers found that LCHF diets did not really demonstrate any efficacy for improving endurance performance. However, there has been a renewed interest in LCHF diets for endurance athletes, particularly the more extreme version of the LCHF diet, a ketogenic LCHF (K-LCHF) diet. A typical LCHF diet might consist of an energy intake ratio of >60% of calories from fat and <25% of calories from carbohydrates. However, a K-LCHF diet would restrict an athlete to consuming <5% of their calorie intake from carbohydrate in order to promote a state of ketosis, wherein the body derives much of its energy from fat and ketones as opposed to glucose.
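To put these percentage-based definitions into more concrete terms, here is a minimal sketch that converts each dietary approach into rough daily gram targets for a hypothetical athlete eating 2,800 kcal/day. The calorie figure and the exact percentages chosen within each range are illustrative assumptions on my part, not prescriptions from the guidelines or studies discussed here.

```python
# Illustrative only: convert diet definitions (% of energy) into rough daily gram targets.
# The 2,800 kcal/day figure and the exact percentages within each range are assumptions.

KCAL_PER_GRAM = {"carbohydrate": 4, "protein": 4, "fat": 9}  # standard Atwater factors

def grams_per_day(total_kcal: float, energy_split: dict) -> dict:
    """Convert fractional energy intake per macronutrient into grams per day."""
    return {
        macro: round(total_kcal * fraction / KCAL_PER_GRAM[macro])
        for macro, fraction in energy_split.items()
    }

total_kcal = 2800  # hypothetical daily intake for a moderately training athlete

diets = {
    "HCLF (guideline-style)": {"carbohydrate": 0.55, "protein": 0.20, "fat": 0.25},
    "LCHF":                   {"carbohydrate": 0.25, "protein": 0.15, "fat": 0.60},
    "K-LCHF":                 {"carbohydrate": 0.05, "protein": 0.15, "fat": 0.80},
}

for name, split in diets.items():
    print(name, grams_per_day(total_kcal, split))
# The HCLF split above works out to roughly 385 g carbohydrate, 140 g protein, and 78 g fat per day,
# while the K-LCHF split allows only about 35 g of carbohydrate per day under these assumptions.
```

Seeing the K-LCHF definition expressed as roughly 35 g of carbohydrate per day (under these assumed numbers) helps explain why such a diet pushes the body toward ketosis in the first place.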
In a December 2019 Science Post, I discussed in great depth the proposed mechanisms behind the LCHF diet as well as provided an in-depth review of the research literature surrounding this topic up to that point. I will, therefore, spare the in-depth overview herein and refer you to that post for more background on the topic and origins of LCHF diets for endurance athletes. If you are not familiar with LCHF diets and the proposed mechanisms of action, I would strongly suggest you read this previous post before reading further. Some of the key takeaways from the research that was available prior to 2019 were as follows:

LCHF diets had minimal evidence to document any superiority to a traditional HCLF diet for endurance performance.
LCHF diets may impair an endurance athlete’s ability to do high-intensity work in training and in racing due to an impaired ability to derive energy from glucose or glycogen (i.e., glycolysis).
There is some possibility of a LCHF diet improving performance in ultra-endurance athletes that train and race at very, very low intensities for extremely prolonged periods of time and rarely ever do work at an intensity above 60-65% of their VO2 max (e.g., multi-day adventure racers), which is an extremely low intensity compared to what most endurance athletes race at for single-day events (e.g., trail runs, road runs, triathlons, etc.).

Since 2019, however, more research has emerged on the topic of LCHF or K-LCHF diets for endurance athletes. Therefore, it is worth revisiting this topic given its resurgent popularity in some realms of the endurance world, particularly in the long-distance triathlon space (e.g., half-distance and full-distance triathlons).

What Does the Latest Research Say?

I often turn to systematic reviews and meta-analyses for summaries of evidence on topics of interest to me, and this is exactly what I did here. There was a great systematic review and meta-analysis published in 2021 by Cao and colleagues (2) summarizing the effects of K-LCHF diets on aerobic capacity and exercise performance among endurance athletes. This article included 10 individual studies for analysis consisting of 139 endurance athletes, albeit these athletes were primarily male (a potential limitation). The primary outcomes assessed in this systematic review and meta-analysis were related to the impact of a K-LCHF diet on the following exercise-related variables:

Aerobic capacity (i.e., VO2 max) as assessed by a graded exercise test (GXT)
Time to exhaustion (TTE) on the GXT
Maximum heart rate (HR) achieved during the GXT
Respiratory exchange ratio (RER) on the GXT

In brief, the authors found no significant effect of a K-LCHF diet on aerobic capacity, TTE, or maximum HR; however, there was a large shift towards greater fat metabolism during the GXT, as evidenced by a large reduction in RER among those following a K-LCHF diet (a lower RER indicates a greater reliance on fat oxidation). These findings are very similar to the larger body of LCHF literature in that a LCHF diet shows no real benefit on markers of endurance performance or fitness. However, a LCHF diet does produce a dramatic increase in the athlete’s capacity for fat oxidation, though often at the expense of a reduced capacity for glycolysis, or the generation of energy from carbohydrates. This latter shift is not necessarily performance-enhancing, as many propose it to be, because the available research simply does not show a greater shift towards fat metabolism to be associated or linked with any sort of enhancement in endurance performance or fitness markers. This recent systematic review and meta-analysis lends more evidence to support this case.
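Since RER is the main readout of “fat adaptation” in these studies, it may help to see how gas-exchange data translate into substrate use. The sketch below uses the commonly cited Frayn (1983) stoichiometric approximations to estimate fat and carbohydrate oxidation from VO2 and VCO2; the gas-exchange values in the example are made up purely for illustration, and the equations ignore protein oxidation.

```python
# Illustrative only: estimate whole-body substrate oxidation from gas-exchange data.
# Equations are the widely used Frayn (1983) approximations (protein oxidation ignored);
# the VO2/VCO2 values below are made-up example numbers, not data from the studies discussed.

def substrate_oxidation(vo2_l_min: float, vco2_l_min: float) -> dict:
    """Return estimated fat and carbohydrate oxidation rates (g/min) plus RER."""
    fat_g_min = 1.67 * vo2_l_min - 1.67 * vco2_l_min
    cho_g_min = 4.55 * vco2_l_min - 3.21 * vo2_l_min
    return {
        "RER": round(vco2_l_min / vo2_l_min, 2),
        "fat_g_per_min": round(fat_g_min, 2),
        "cho_g_per_min": round(cho_g_min, 2),
    }

# Same oxygen uptake, different RER: a lower RER means more fat and less carbohydrate being burned.
print(substrate_oxidation(vo2_l_min=3.0, vco2_l_min=2.85))  # RER ~0.95, mostly carbohydrate
print(substrate_oxidation(vo2_l_min=3.0, vco2_l_min=2.40))  # RER ~0.80, much more fat
```

This is why the large drop in RER reported in K-LCHF groups is read as a large shift toward fat oxidation, even though, as noted above, that shift by itself has not translated into better performance in the research.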
A recently published study by Burke and colleagues (1) sheds some further light on LCHF diets and endurance performance at a more granular level, and hints at what I think the true potential of LCHF diets is for endurance athletes (discussed later on below). In this study, researchers enrolled 13 elite/professional race walkers, some of whom compete at the Olympic level. These 13 race walkers all went through 3 phases in this research study, and these phases were as follows:

Phase 1: All 13 athletes completed baseline fitness and performance tests (VO2 max testing, walking economy testing, and a 10,000-m race on a track), followed by a 5-day HCLF diet to establish a baseline dietary intake, and finally completed phase 1 with a 25-km “long” walk where fitness metrics were recorded (RER, HR, RPE, and more).
Phase 2: The 13 athletes were divided into a HCLF group (n=6) or a K-LCHF group (n=7) for 5 days, followed by a repeat of the baseline fitness and performance measures (VO2 max testing, walking economy testing, a 10,000-m race on a track, and a 25-km “long” walk).
Phase 3: Finally, all athletes then went back to a HCLF “restoration” diet for 5 days and finished this third phase with another 25-km “long” walk where fitness metrics were recorded (RER, HR, RPE, and more).

In brief, the researchers found the following after administering this dietary intervention among these elite-level race walkers:

5-6 days of adaptation to a K-LCHF diet was sufficient to increase exercise fat oxidation rates to levels previously seen with longer-term K-LCHF dietary interventions (>12 weeks); this indicates that full adaptation to a K-LCHF diet may take place quickly, and it potentially dispels the notion that adaptation to a LCHF diet takes a long time (>12 weeks), a notion that is often used to argue that this is why most research to date does not demonstrate benefits in favor of LCHF or K-LCHF diets.
Increases in fat oxidation among the K-LCHF group were also seen alongside a reduction in exercise efficiency, as evidenced by a 5-8% increase in oxygen cost during the race walkers’ 10,000-m race performances; this is rather significant, as long-distance endurance events are heavily reliant on being as efficient as possible, so a reduction in exercise efficiency may have a significant negative impact on performance.
Acute restoration of glycogen stores with a 1-day HCLF restoration period prior to a key performance was not enough to “outweigh” the negative impacts of a K-LCHF diet on 10,000-m race performance; this was evident in carbohydrate oxidation rates only reaching 61-78% of the baseline values established when all athletes were on the standard HCLF diet.
High-intensity exercise performance was impaired in those on the K-LCHF diet, and acute restoration of glycogen stores was not sufficient to return high-intensity exercise capacity to the baseline values seen when all athletes were on the HCLF diet.

The findings from this study have now been replicated over and over again, and the moral of the story remains pretty much the same as what I stated back in my 2019 post on this topic and in the introduction of this current post. I will state those takeaways again here to emphasize the important points:

LCHF diets had minimal evidence to document any superiority to a traditional HCLF diet for endurance performance.
LCHF diets may impair an endurance athlete’s ability to do high-intensity work in training and in racing due to an impaired ability to derive energy from glucose or glycogen (i.e., glycolysis).
There is some possibility of a LCHF diet improving performance in ultra-endurance athletes that train and race at very, very low intensities for extremely prolonged periods of time and rarely ever do work at an intensity above 60-65% of their VO2 max (e.g., multi-day adventure racers), which is an extremely low intensity compared to what most endurance athletes race at for single-day events (e.g., trail runs, road runs, triathlons, etc.).

I would also like to add one additional take-home message that seems to be emerging from the data as more studies are done on this topic: there seems to be a very small subset of endurance athletes that respond positively to a LCHF or K-LCHF diet, demonstrating improvements in performance despite reduced carbohydrate oxidation capacity. In the Burke and colleagues study discussed just above (1), when looking at some of the individual-level data amongst the 13 athletes, there was one single athlete that seemed to perform better in their 10,000-m race. This is also a typical finding in these sorts of studies. The majority of athletes do not improve their performance and usually perform worse when on a LCHF or K-LCHF diet, but there are usually one or two athletes that actually do see small improvements.
This is where I am starting to see the potential utility of a LCHF/K-LCHF diet for endurance athletes, and it starts to get at the genetic variability in the response to dietary interventions. It is also for this reason that dietary protocols are not one-size-fits-all. Yes, based on the research we currently have, a “standard” HCLF diet seems to be what most endurance athletes should be following, as this diet seems to produce optimal performance across a range of endurance distances and disciplines, but there is a very, very small minority (~5%) of endurance athletes that may, just may, see some performance improvement/optimization when on a LCHF or K-LCHF diet. However, I say this with caution, as there are other risks to adopting a high-fat diet related to overall health and well-being that are not always captured in the studies done in this area, namely the increased risk of illness (e.g., upper respiratory tract infection) and the negatively impacted mood that is sometimes reported in athletes undergoing a high-fat dietary protocol. Adequate carbohydrate availability and a reliance on glucose at rest seem to be related to optimal functioning of various bodily systems and functions, and it may be that, even if an athlete performs better in a lab or in a race, they may still experience negative health consequences as a side effect. More research indeed needs to be done in this area.

Conclusions

So, there you have it. I hope this provides a rather concise update on the state of the LCHF literature and allows you to walk away with a better understanding of this topic more broadly. As with almost anything in life, let alone sport, things are rarely black and white or cut and dried. Things tend to fall more in the middle, and dietary interventions are no different. Diet can elicit strong emotions and reactions from athletes, with some dietary principles being held in an almost dogmatic or religious light among many, but I strongly encourage you to be open-minded when reading this and when reading other nutrition-related research, as nutrition can be very, very individual. While a HCLF diet may work best for most, there could certainly be some that function best on a LCHF diet. The same can be said for many other dietary approaches, from veganism to vegetarianism, etc. Just because it works for you or worked for someone else does not mean it works for everyone. However, I would still strongly encourage the vast majority of endurance athletes to eat a HCLF diet for optimal performance and health... and I can cheers (over a pastry or two) to that!

References: Burke LM, Whitfield J, Heikura IA, Ross ML, Tee N, Forbes SF, Hall R, McKay AK, Wallett AM, Sharma AP. Adaptation to a low carbohydrate high fat diet is rapid but impairs endurance exercise metabolism and performance despite enhanced glycogen availability. The Journal of Physiology. 2021 Feb;599(3):771-90. Cao J, Lei S, Wang X, Cheng S. The Effect of a Ketogenic Low-Carbohydrate, High-Fat Diet on Aerobic Capacity and Exercise Performance in Endurance Athletes: A Systematic Review and Meta-Analysis. Nutrients. 2021 Aug;13(8):2896. Phinney SD, Bistrian BR, Evans WJ, Gervino E, Blackburn GL. The human metabolic response to chronic ketosis without caloric restriction: preservation of submaximal exercise capability with reduced carbohydrate oxidation. Metabolism. 1983 Aug 1;32(8):769-76. Thomas DT, Erdman KA, Burke LM.
Position of the Academy of Nutrition and Dietetics, Dietitians of Canada, and the American College of Sports Medicine: nutrition and athletic performance. Journal of the Academy of Nutrition and Dietetics. 2016 Mar 1;116(3):501-28. Vitale K, Getzin A. Nutrition and supplement update for the endurance athlete: review and recommendations. Nutrients. 2019 Jun;11(6):1289. Happy training and racing! -Ryan Eckert, MS, CSCS Do you enjoy our monthly educational content that we create? Not only do we create written content like what you just read, but we have a podcast too where the goal is also to share science-driven, evidence-based information highly relevant to endurance athletes and coaches. We do all of this for free, and we rely on the generous help and support of others to cover some of our basic operating costs for putting out this content. If you would like to help or support, the best way to do so is by becoming a Patreon supporter.
Low-Carbohydrate, High-Fat (LCHF) Diets and Endurance Performance: an Update on Emerging Research
Ryan Eckert, MS, CSCS
Mar 01, 2022
In The VO2 Max Forum
What is Ashwagandha?

Ashwagandha might be an herb you have never heard of, but it is time that you hear about it! I became interested in using Ashwagandha for its various medicinal properties, namely the evidence-based beneficial effects of this compound on stress and mood. To my surprise, however, as I started diving into a bit of the research surrounding the health benefits of Ashwagandha, I started to notice that it had also been studied as a potential performance-enhancing substance for athletes. Ashwagandha (Withania somnifera) is a popular herb used in Ayurvedic medicine due to its wide-ranging biological actions influencing health (Mishra et al., 2000). It is known as an “adaptogen”, a term for any nontoxic substance, especially a plant extract, thought to increase the body’s ability to resist the damaging effects of stress and to promote or restore bodily homeostasis. Many different plants are considered to have adaptogenic properties, including various strains of mushrooms and other fungi. Ashwagandha is one of the more popular adaptogenic herbs that is both used in practice and studied in the scientific literature. However, it has only recently been studied for its possible beneficial effects on athletic performance. The findings of this research might just surprise you as they surprised me, namely because I would not have expected an herb to have the effects that have been documented in recent scientific literature. So, without further ado, let’s dive into that research!

Ashwagandha and Endurance Performance

A recent 2020 systematic review and meta-analysis (Pérez-Gómez et al., 2020) aimed to summarize the effects of ashwagandha supplementation on VO2 Max in healthy adults and athletes. This review ultimately included five studies with 162 total participants ranging in age from 16 to 45 years. Study participants were a mix of healthy adults, various recreational athletes, and elite cyclists. So, this review and meta-analysis included quite a diverse population despite a relatively small total sample size. The studies included in this review utilized 500-1000 mg of Ashwagandha daily for 2-12 weeks. The meta-analysis found a significant beneficial effect of Ashwagandha compared to placebo control on the order of 3.0 ml/kg/min. This essentially means that taking Ashwagandha favored a 3.0 ml/kg/min improvement in VO2 Max at study treatment end. For anyone that knows anything about VO2 Max, this is a rather significant real-world difference. Athletes will spend months or years trying to increase their VO2 Max by a few points, so to have Ashwagandha consumption for a few weeks to a few months lead to a VO2 Max on the order of 3.0 points higher than a placebo control group is very exciting. This meta-analysis, however, had some limitations, including the small sample size and the relatively poor overall quality of evidence as calculated by the authors when conducting the meta-analysis. Therefore, these results should be interpreted with caution. Another more recent 2021 systematic review and meta-analysis (Bonilla et al., 2021) aimed to assess the effects of Ashwagandha on physical performance, including VO2 Max and blood hemoglobin concentration (Hb), in healthy individuals. This review/meta-analysis included 12 studies and over 600 healthy adults. A sub-group meta-analysis for VO2 Max demonstrated a significant and very large effect of Ashwagandha compared to placebo control (d = 1.929; p < 0.001).
A separate sub-group meta-analysis for Hb also revealed a significant and very large effect of Ashwagandha compared to placebo control (d = 1.697; p < 0.001). This particular meta-analysis had a much higher quality of evidence than the previously discussed 2020 review/meta-analysis, and so the findings of this paper can be interpreted with greater confidence. This is great, as the treatment effects of Ashwagandha on VO2 Max and Hb were significant and very large compared to the placebo control!

Conclusions

So, what do we make of all this? It seems, based on the two systematic reviews and meta-analyses discussed above, that Ashwagandha has a significant and large treatment effect on endurance performance markers, specifically VO2 Max and Hb. VO2 Max is an important marker/characteristic of endurance performance, and so a higher VO2 Max due to the addition of Ashwagandha is potentially significant in real-world competition. A greater blood hemoglobin concentration is also potentially significant in real-world competition for endurance athletes, as greater Hb is related to a greater oxygen-carrying capacity, which can improve endurance performance. It is important to point out that there have not been any studies that have really investigated the impact of Ashwagandha on real-world performance in a race or time trial. So, the above findings should not be interpreted to say that Ashwagandha improves endurance performance, per se, as that research has not been done. However, Ashwagandha does seem to have beneficial effects on markers of endurance performance, and so it is theoretically possible that these improved endurance characteristics could ultimately lead to better real-world performance in a race or competition. Ashwagandha also tends to be a relatively affordable supplement, with a high-quality Ashwagandha supplement from Gaia Herbs being only ~$24 on Amazon.com (at the time this was written and published in early 2022). Research tends to suggest a beneficial effect of Ashwagandha when taken regularly at doses of 500-1000 mg, and this Gaia Herbs supplement would last an athlete 60 days when taken at that dosage range. This is only ~$12/month for a potentially ergogenic, and completely legal for competition, substance. This makes it a potentially attractive option for athletes. An added benefit is that Ashwagandha has evidence supporting general health benefits, including beneficial effects on stress. Athletes are constantly under stress from training, and so managing stress, and recovery from said stress, through sleep, nutrition, and possibly supplementation can help an athlete perform at their best. Ashwagandha seems to help not only improve performance markers, but also counteract and manage stress at a relatively low cost. Ashwagandha is also safe when taken orally in doses of up to 1000 mg/day, with minimal risk of side effects (a mild sedative effect can be seen in some individuals; Examine, 2022). However, it is typically recommended to take Ashwagandha for no more than 3 months at a time, as the body can develop a tolerance to it and the beneficial effects can decrease long-term (Examine, 2022). Therefore, a sensible approach for supplementing with Ashwagandha could be to take 500-1000 mg/day for 2-3 months with a 1-2 month “wash-out” period before cycling back on the supplement again. As with any supplement, it is a good idea to talk to a healthcare professional to make sure there are no contraindications to you taking Ashwagandha.
Despite it being a very safe supplement with little risk of any adverse side effects, it can react with certain medications, and so if you are taking any medications for your health, it is worthwhile checking with your doctor before starting any supplement. And finally, as with any supplement, it is always more important to optimize your diet first before relying on supplements. Although Ashwagandha is not normally found in foods consumed within a nutritious diet, a good diet is always the place to start when looking to maximize adaptations to and recovery from endurance training. So, start with your diet and then consider supplementing with various evidence-based supplements if you have the budget and willingness to do so. References: Bonilla DA, Moreno Y, Gho C, Petro JL, Odriozola-Martínez A, Kreider RB. Effects of Ashwagandha (Withania somnifera) on Physical Performance: Systematic Review and Bayesian Meta-Analysis. Journal of Functional Morphology and Kinesiology. 2021 Mar;6(1):20. Examine. Ashwagandha. Updated Jan, 6, 2022. Retrieved from: https://examine.com/supplements/ashwagandha/ Mishra, L.C.; Singh, B.B.; Dagenais, S. Scientific basis for the therapeutic use of Withania somnifera (ashwagandha): A review. Altern. Med. Rev. 2000, 5, 334–346. Pérez-Gómez J, Villafaina S, Adsuar JC, Merellano-Navarro E, Collado-Mateo D. Effects of Ashwagandha (Withania somnifera) on VO2max: a systematic review and meta-analysis. Nutrients. 2020 Apr;12(4):1119. Happy training and racing! -Ryan Eckert, MS, CSCS Do you enjoy our monthly educational content that we create? Not only do we create written content like what you just read, but we have a podcast too where the goal is also to share science-driven, evidence-based information highly relevant to endurance athletes and coaches. We do all of this for free, and we rely on the generous help and support of others to cover some of our basic operating costs for putting out this content. If you would like to help or support, the best way to do so is by becoming a Patreon supporter.
Adaptogenic Herb, Ashwagandha, and its Effects on Endurance Performance
Ryan Eckert, MS, CSCS
Jan 31, 2022
In The VO2 Max Forum
Injuries Among Triathletes

Injuries are unfortunately relatively common among endurance athletes. However, the unique demands of triathlon place a potentially greater risk of injury on athletes competing in the sport. Prior studies have found a wide range of injury rates among triathletes, ranging from 37-91% of athletes (1). The wide range is likely due to a myriad of factors, including different lengths of studies, the training status of the individuals followed, and the race distance that the included athletes were preparing for. Nonetheless, the rate of injury among triathletes is relatively high, and so identifying the traits or characteristics associated with injury is an important first step in being able to mitigate the risk of injury among triathletes. A recent study published by Kienstra and colleagues aimed to do just this (1). Prior studies have focused a lot on identifying relationships between training-related factors (training intensity, training volume, training frequency) and injury risk. However, Kienstra and colleagues aimed to identify the relationship between training factors as well as lifestyle factors and injury risk. Let’s take a look at what they found in the next section.

Training, Injury, and Lifestyle Characteristics of Recreational Triathletes

The authors in this study administered a survey assessing training and other lifestyle characteristics (specific dietary or nutrition strategies, supplement use, medical history) to 34 recreational triathletes in the Miami, Florida area (mean age = 47.6 years; 33% female).

Reported Injuries
The authors found that 79% identified at least one current area of pain, with the lower extremity being the most common site of injury (72% of all pain and injuries reported). The leg accounted for 17% of all injuries, while the hip, knee, and foot accounted for 16% of reported injuries each. Finally, the back, neck, and shoulder accounted for 6%, 7%, and 15% of reported injuries, respectively.

Training Characteristics and Injury
Athletes who trained more than 12 hours/week had an average of 3.3 injury sites, while athletes training less than 12 hours/week had an average of 2.3 injury sites.

Other Training Characteristics
Only 56% of athletes reported engaging in strength training, and only 15% reported engaging in some form of yoga or Pilates. Most athletes (65%) trained under the guidance of a coach. The average training volume per week across all athletes was 11.8 hours/week (range = 4 to 35 hours/week). Athletes planning to race half-distance events or longer averaged 12.6 hours/week of training, whereas those planning on racing short-distance triathlons (sprint or Olympic) averaged 10.8 hours/week of training.

Nutrition and Supplement Characteristics
A total of 65% of athletes reported using some form of supplement or vitamin, with multivitamin use (47%) being the most common, followed by a specific vitamin supplement (30%), a protein supplement (26%), a calcium supplement (15%), a fish oil supplement (15%), and an iron supplement (9%). Most athletes reported no dietary restrictions; however, 15% followed a gluten-free diet, 15% reported a lactose-free diet, 9% a vegetarian diet, and 6% a vegan diet. Every single athlete following a gluten-free, vegan, or vegetarian diet reported at least one injury, whereas 80% of those following a lactose-free diet reported at least one injury.

Other Lifestyle Characteristics
Among the 34 athletes included in the study, 16 completed the survey questions regarding sleep.
A total of 63% of these athletes reported sleeping 6 hours or less per night, and nobody reported sleeping more than 9 hours per night.

What to Make of All This?

Despite the vast majority of triathletes reporting at least one injury, this number might be slightly biased, as 9/24 participants were recruited directly from a Miami-based sports medicine clinic, potentially inflating the likelihood of injuries among the total sample. Nonetheless, the rate of injury is still high among the triathletes included in this study. The three factors that really stood out to me when sorting through the data were the following:

Those training >12 hours/week were more likely to experience injury than those training <12 hours/week. This demonstrates the positive relationship between training volume and injury risk, whereby injury risk goes up as training volume goes up; this has been demonstrated among other endurance athletes as well, particularly among runners and long-distance triathletes (1).
Only 58% of all triathletes reported engaging in strength training. This is alarming and demonstrates the need for more triathletes to engage in regular strength training, as strength training has been well-documented to reduce the risk of overuse and sports-related injuries (2).
Among the 16 triathletes that responded regarding their sleeping habits, 63% reported 6 hours/night or less and nobody reported sleeping 9 hours/night or more. This is HUGELY alarming, as research demonstrates that nightly sleep durations of 7 hours/night or less for prolonged periods of time are associated with a 1.7x greater risk of musculoskeletal injury among athletes (3).

The important thing to note about these three characteristics above is that they are all modifiable. In other words, these characteristics and behaviors can be improved/changed. Based on the evidence we currently have, I would argue that increasing sleep quantity to >7 hours/night and increasing the proportion of athletes engaging in regular strength training would likely reduce the prevalence of injury among this small sample of recreational triathletes. There is some research linking increased training volume with increased injury risk. For the average recreational triathlete, however, who is usually unable to engage in the extreme amounts of training that professionals/elites often engage in, the bigger problem is likely what happens around training and not the training volume itself. Triathletes, and endurance athletes more broadly, can lead very busy lives outside of a very demanding sport in triathlon. They quite often have families, full-time jobs, or schooling in addition to part-time jobs. The busyness of life outside of an already very demanding sport can lead to what is seen in the sample included within this study: chronically poor sleep habits and a lack of athletes engaging in strength training. There are other lifestyle factors contributing to risk of injury as well, including nutrition; however, I quite often see athletes sleeping too little and not engaging in regular strength training individualized for them as an endurance athlete. I would argue that these two characteristics alone account for far too many injuries among recreational triathletes that may otherwise be preventable.

Conclusions

It is not necessarily shocking to hear of the relatively high prevalence of injury in this small sample of recreational triathletes.
I think the most important take-home message from this study for triathletes is to prioritize optimal recovery through good sleeping habits (aiming for at least 7-8 hours of sleep each night) and engaging in a safe, progressive, and individualized strength training program to improve overall strength and reduce the risk of injuries. These two lifestyle habits will go a long way in keeping athletes healthy and performing at their best. If you want to learn more about sleep, click here. If you want to learn more about strength training, click here and here.

References: Kienstra CM, Cade WH, Best TM. Training, injury, and lifestyle characteristics of recreational triathletes. Current Sports Medicine Reports. 2021 Feb 1;20(2):87-91. Lauersen JB, Bertelsen DM, Andersen LB. The effectiveness of exercise interventions to prevent sports injuries: a systematic review and meta-analysis of randomised controlled trials. British Journal of Sports Medicine. 2014 Jun 1;48(11):871-7. Huang K, Ihm J. Sleep and injury risk. Current Sports Medicine Reports. 2021 Jun 1;20(6):286-90.

Happy training and racing! -Ryan Eckert, MS, CSCS

Do you enjoy our monthly educational content that we create? Not only do we create written content like what you just read, but we have a podcast too where the goal is also to share science-driven, evidence-based information highly relevant to endurance athletes and coaches. We do all of this for free, and we rely on the generous help and support of others to cover some of our basic operating costs for putting out this content. If you would like to help or support, the best way to do so is by becoming a Patreon supporter.
Training and Lifestyle Characteristics and Their Relation to Injury Rates Among Recreational Triathletes
Ryan Eckert, MS, CSCS
Jan 03, 2022
In The VO2 Max Forum
What is the rationale behind cold-water immersion to enhance endurance adaptations?

I have written previously about the science surrounding the use of cold-water immersion (CWI) and other forms of cold therapy on recovery in athletes. However, another recently popularized use of CWI is to help enhance the long-term adaptive response to, and performance of, endurance exercise. In other words, CWI done after an endurance training session is reportedly being used to help improve the adaptive response to endurance training and long-term performance. But does this really work? What is the evidence to support it?

Before I dive into the evidence, let's briefly discuss the rationale behind this theory. A single endurance training session elicits a massive molecular and biochemical response from your body, known as a hormetic response. The short-term stress, and the associated molecular and biochemical responses, of a single endurance session result in a temporary reduction in performance, but the adaptive response to repeated short-term stresses ultimately yields a positive net improvement in function and performance. Figure 1 below represents how long-term adaptation to endurance training takes place:

It is thought that CWI after endurance exercise helps to augment this process, specifically at the molecular and biochemical level during the short-term stress response to a single exercise session. For example, research has shown that CWI following a single bout of endurance exercise increases the protein content of peroxisome proliferator-activated receptor gamma coactivator 1-alpha (PGC-1alpha for short) to a greater extent than endurance exercise alone (1,3,4). This is one of the molecular responses to endurance training (an increased PGC-1alpha concentration in the muscle) that leads to increased mitochondrial biogenesis (the formation of new mitochondria). This research has gotten some thinking that maybe CWI after endurance exercise can actually enhance the adaptive response to chronic endurance training.

However, it is important to know that there is a big difference between what happens initially after a single exercise session and what happens long-term after repeated exercise sessions. In other words, just because CWI after a single exercise session increases the PGC-1alpha content of skeletal muscle, one cannot then assume that this will certainly lead to greater mitochondrial content, and therefore greater endurance adaptations, in the long term. Why? Because the adaptive responses to endurance exercise are insanely complex, and there are literally thousands of molecular and biochemical responses that ensue from a single exercise session. The long-term adaptation process is even more complex, as it takes these thousands of molecular and biochemical responses to a single session and multiplies them over days, weeks, and months. To truly know whether or not CWI after endurance exercise helps augment long-term adaptation, we need to look at research that has studied the long-term effects of CWI on endurance adaptation and performance. Let's take a look at this research next.

What Does the Research Say Regarding Cold-Water Immersion on Endurance Adaptation and Performance?

Broatch and colleagues (2) published a systematic review of the literature in 2018 in which they aimed to examine the influence that CWI had on adaptive responses to exercise across both endurance exercise and strength exercise.
For the purposes of this write-up, I'm only focusing on their findings as they relate to endurance exercise, but I would recommend you check this review paper out if you want more detail as it pertains to strength exercise. Essentially, the authors found that while some studies do show short-term increases in markers of mitochondrial biogenesis (e.g., increased PGC-1alpha, among other markers), there were no long-term changes from regular post-exercise CWI over time. In other words, although CWI immediately after endurance training might elicit or augment increases in certain molecular pathways important for long-term training adaptations, this doesn't appear to play out as increased endurance training adaptation in the long run, including a lack of increase in mitochondrial proteins, which one would expect to see if CWI helped augment greater changes in endurance adaptation compared to exercise alone.

Malta and colleagues (5) published a slightly more recent systematic review and meta-analysis in 2021 that aimed to examine the influence of CWI on actual performance outcomes across both strength training and endurance exercise; however, I will again only focus on the endurance findings here. I do recommend you read this paper as well if you want to deepen your understanding of CWI and exercise performance. Interestingly, and in line with the prior 2018 systematic review that found no beneficial effects of CWI on adaptive responses to chronic endurance exercise, this 2021 review found that CWI had no significant negative or positive effect on markers of endurance performance (cycling time-trial mean power, maximal aerobic power, and cycling time-trial performance). Taken together, the findings from these two papers demonstrate no real effect of chronic post-exercise CWI on adaptive responses or endurance performance.

The research that has been done so far is not without its limitations, however, namely the length of cold exposure possibly being insufficient to elicit long-term adaptive effects or performance improvements. The CWI protocols across the studies included within these papers typically consist of immersing oneself in water at 40-50 degrees F for 10-20 minutes after exercise, for 3-7 weeks. Broatch and colleagues (2) noted that extremely long cold-air exposure, on the order of multiple months at a time, has elicited long-term increases in mitochondrial proteins in animals, which would mark improvements in endurance adaptations. However, the daily exposure time is likely far longer than what any normal athlete would expose themselves to, having been upwards of 24 hours/day of cold exposure in the animals studied. Malta and colleagues (5) also mentioned that it is largely unknown whether longer CWI exposure per session would result in more favorable findings on long-term performance outcomes. There might be an optimal window that is longer than current protocols (e.g., >20 minutes at a time) or colder than current protocols (e.g., <40 degrees F) that does indeed work. It is also possible that longer-term studies (i.e., longer than 7 weeks) are needed to see endurance adaptations or performance benefits take place from CWI. Finally, it is also possible that extremely cold air exposure has the potential to augment endurance adaptations and performance, as one can expose themselves to far colder temperatures with air than they otherwise could with cold water. However, these types of studies have not been done yet.
Further research is needed to answer these questions.

Conclusions

Broadly, the use of CWI is not recommended as a means of enhancing long-term endurance training adaptations and performance outcomes. Despite promising research showing short-term increases in markers of endurance adaptation, these short-term increases do not seem to materialize into long-term adaptations indicative of enhanced endurance adaptation or actual performance improvements. The use of CWI to enhance recovery after training is another popular use of this modality, and the evidence is very mixed on whether CWI works to enhance the recovery process between hard training sessions as well (2). However, there does not seem to be any harm in doing CWI post-endurance exercise, in terms of both its short-term and its long-term effects, so if you already use CWI after exercise and you enjoy it, there would be no reason to discontinue. Go on with it at your pleasure, or discomfort, depending on how you view CWI. Personally, I enjoy short bouts of cold showers each day for reasons other than recovery or the possibility of enhanced endurance adaptation. I also like to pair infrared sauna sessions with cold showers immediately afterwards for contrast therapy. I enjoy the invigorating sensation I get after a few minutes of cold exposure, so I'll continue doing this with the knowledge that it is likely not helping my performance nor hindering it.

References:

1. Allan, R., Sharples, A. P., Close, G. L., Drust, B., Shepherd, S. O., Dutton, J., ... & Gregson, W. (2017). Postexercise cold water immersion modulates skeletal muscle PGC-1α mRNA expression in immersed and nonimmersed limbs: evidence of systemic regulation. Journal of Applied Physiology, 123(2), 451-459.
2. Broatch, J. R., Petersen, A., & Bishop, D. J. (2018). The influence of post-exercise cold-water immersion on adaptive responses to exercise: a review of the literature. Sports Medicine, 48(6), 1369-1387.
3. Ihsan, M., Watson, G., Choo, H. C., Lewandowski, P., Papazzo, A., Cameron-Smith, D., & Abbiss, C. R. (2014). Postexercise muscle cooling enhances gene expression of PGC-1α. Medicine & Science in Sports & Exercise, 46(10), 1900-1907.
4. Joo, C. H., Allan, R., Drust, B., Close, G. L., Jeong, T. S., Bartlett, J. D., ... & Gregson, W. (2016). Passive and post-exercise cold-water immersion augments PGC-1α and VEGF expression in human skeletal muscle. European Journal of Applied Physiology, 116(11), 2315-2326.
5. Malta, E. S., Dutra, Y. M., Broatch, J. R., Bishop, D. J., & Zagatto, A. M. (2021). The effects of regular cold-water immersion use on training-induced changes in strength and endurance performance: a systematic review with meta-analysis. Sports Medicine, 51(1), 161-174.

Happy training and racing!

-Ryan Eckert, MS, CSCS

Do you enjoy our monthly educational content that we create? Not only do we create written content like what you just read, but we have a podcast too where the goal is also to share science-driven, evidence-based information highly relevant to endurance athletes and coaches. We do all of this for free, and we rely on the generous help and support of others to cover some of our basic operating costs for putting out this content. If you would like to help or support, the best way to do so is by becoming a Patreon supporter.
Can Cold-Water Immersion Enhance Endurance Training Adaptation and Performance?
Ryan Eckert, MS, CSCS
Dec 01, 2021
In The VO2 Max Forum
Isn’t Strength Training just for Weight-Lifting and Strength/Power-Based Athletes?

Nope! I have written a lot on strength training in previous posts of mine, but a recent study has me excited as it is one of the first studies to take a look at how a long-term strength training program impacts exercise economy specifically in triathletes. There have been numerous studies showing that exercise economy is improved in cyclists and runners who undertake a regular, heavy-load strength training program (1); however, this research has largely been done in single-sport athletes. So, the effects of strength training on exercise economy in triathletes are not precisely known, as this research has not been done.

It is no surprise that triathlon is different from its single-sport counterparts of stand-alone swimming, cycling, and running. One of the biggest differences is that each discipline of triathlon has a cumulative fatiguing effect on the subsequent discipline, reducing exercise economy in the process. For example, runners participating in a stand-alone marathon undertake this endeavor fresh, whereas an Ironman-distance triathlete undertakes the marathon portion of the event carrying significant fatigue from the swim and bike, and these previous two disciplines have a negative impact on running economy compared to a marathon done fresh, not preceded by a swim and bike.

Strength training, however, is known to improve exercise economy, particularly in cycling and running (1). Exercise economy is typically defined as the energetic cost (oxygen consumption) associated with a sustained power or pace output (2). Exercise economy is thought to be improved through strength training by the following mechanisms:

- Improved ability to store and release elastic energy from increased musculotendinous stiffness. This primarily impacts running, a sport that relies on storing and releasing elastic energy with every stride.
- Improved rate of force development (RFD) of the musculature. This is primarily a neural improvement in the musculature's ability to generate force quickly.
- Increased maximal strength. This allows for greater recruitment of less fatigable, type 1 slow-twitch muscle fibers during submaximal cycling and running.

These improvements ultimately improve exercise economy, which means a reduction in oxygen demand during submaximal exercise, thereby improving performance. Exercise economy has been well established as a valid marker and predictor of endurance performance, with better exercise economy predicting better endurance performance (1,2).
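Since exercise economy is simply the oxygen cost of holding a given submaximal pace or power, a quick worked example may help. The short Python sketch below uses made-up numbers (they are not data from the study discussed here) to show how running economy is commonly expressed, as millilitres of oxygen per kilogram of body mass per kilometre, and how a lower oxygen cost at the same pace means better economy.

```python
# Illustrative sketch: running economy expressed as the oxygen cost
# (ml O2 per kg per km) of holding a fixed submaximal pace.
# Numbers are made up for illustration, not taken from the study.

def running_economy(vo2_ml_kg_min: float, speed_kmh: float) -> float:
    """Oxygen cost of running, in ml O2 per kg per km."""
    speed_km_per_min = speed_kmh / 60.0
    return vo2_ml_kg_min / speed_km_per_min


if __name__ == "__main__":
    pace_kmh = 12.0  # the same submaximal pace before and after training
    before = running_economy(vo2_ml_kg_min=45.0, speed_kmh=pace_kmh)  # 225 ml/kg/km
    after = running_economy(vo2_ml_kg_min=43.0, speed_kmh=pace_kmh)   # 215 ml/kg/km
    print(f"Before: {before:.0f} ml/kg/km, after: {after:.0f} ml/kg/km "
          f"({(1 - after / before) * 100:.1f}% less oxygen at the same pace)")
```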
However, no research has been done to date examining strength training's impact on triathletes during successive swimming, cycling, and running. Luckin-Baldwin and colleagues (2) attempted to change this by conducting the first study of its kind, in which strength training and its impact on exercise economy were examined in triathletes during a simulated long-distance triathlon. Let's take a look at this study and its findings.

Strength Training for Triathletes

This study took 25 well-trained, long-distance triathletes and randomly assigned them to 26 weeks of concurrent strength training and endurance training (n=14) or endurance training only (n=11). The concurrent strength/endurance group performed 26 weeks of progressive strength training in addition to their usual endurance training, whereas the endurance-only group simply performed their usual endurance training. The strength training program consisted of two sessions per week, with weeks 0-12 consisting of moderate loads (8-12 repetitions at <75% of one-repetition maximum) and weeks 14-26 consisting of heavy loads (1-6 repetitions at >85% of one-repetition maximum). There was a two-week break in the middle of the strength training program to allow recovery. All study participants completed a simulated triathlon consisting of a 1500-meter swim, a 60-minute cycle, and a 20-minute run at weeks 0, 14, and 26 while researchers collected exercise economy data and other measures.

The concurrent strength/endurance group saw improvements in maximum strength over the entire 26 weeks, as well as improvements in cycling economy at week 14 and running economy at week 26, with no change in total body mass. The endurance-only group did not see any improvements in cycling or running economy at any time point. These findings are about what we would expect given the published literature documenting the beneficial effects of strength training in cyclists and runners, but this is the first study of its kind to demonstrate these improvements in triathletes.

Interestingly, cycling economy improved after only 12 weeks of moderate-load strength training, while running economy only improved after the heavy-load strength training phase at week 26. Typically, very heavy loads and/or explosive strength training elicit improvements in exercise economy, but there has been some research showing that moderate loads can improve exercise economy during cycling, and this might explain why improvements were seen in cycling economy at week 14 whereas running economy didn't improve until week 26 (2).

Another important finding was that these improvements in maximum strength and exercise economy occurred without an increase in body weight. This is important, as some endurance athletes worry that engaging in strength training will lead to an increase in muscle mass, and therefore an increase in total body mass, and lead to a reduction in endurance performance. However, this is typically not the case, as it is pretty well documented that concurrent endurance and strength exercise prioritizes endurance adaptation over muscle hypertrophy. Endurance exercise is known to inhibit intracellular signaling pathways important for muscle protein synthesis and growth, which likely explains why concurrent endurance and strength training typically does not yield significant increases in muscle size, particularly in endurance athletes who engage in large amounts of endurance training (2).

You might be wondering why swimming economy was not improved. Well, first of all, it was not measured, but strength training is also not typically known to improve swimming economy, even though strength training for swimmers is still recommended. Finally, this study did not measure objective performance outcomes, so it was not possible to extrapolate the improvements in exercise economy to any improvements in objective performance during the simulated triathlon. However, exercise economy is such a strong predictor of endurance performance that it is very likely these improvements in exercise economy would translate into some form of measurable performance improvement during a triathlon (1).
However, further work will of course be needed to demonstrate this.

Conclusions

While the general findings of this study are not groundbreaking in and of themselves, these findings are the first of their kind in triathletes specifically. It is exciting to see that a concurrent strength/endurance program can positively impact cycling and running economy while improving maximum strength in triathletes without an increase in body mass. The findings add further justification to the recommendation that triathletes engage in long-term strength training, not only for general injury prevention, but for improved exercise economy.

References:

1. Bazyler CD, Abbott HA, Bellon CR, Taber CB, Stone MH. Strength training for endurance athletes: theory to practice. Strength & Conditioning Journal. 2015 Apr 1;37(2):1-2.
2. Luckin-Baldwin KM, Badenhorst CE, Cripps AJ, Landers GJ, Merrells RJ, Bulsara MK, Hoyne GF. Strength training improves exercise economy in triathletes during a simulated triathlon. International Journal of Sports Physiology and Performance. 2021 Feb 11;16(5):663-73.

Happy training and racing!

-Ryan Eckert, MS, CSCS

Do you enjoy our monthly educational content that we create? Not only do we create written content like what you just read, but we have a podcast too where the goal is also to share science-driven, evidence-based information highly relevant to endurance athletes and coaches. We do all of this for free, and we rely on the generous help and support of others to cover some of our basic operating costs for putting out this content. If you would like to help or support, the best way to do so is by becoming a Patreon supporter.
Strength Training Improves Exercise Economy in Long-Distance Triathletes
Ryan Eckert, MS, CSCS
Nov 02, 2021
In The VO2 Max Forum
What is Iron?

Iron is a critical micronutrient that must come from the diet, since the body cannot produce iron on its own. Iron is required for numerous processes in the body, ranging from proper immune cell function to the formation of red blood cells. A few of the most important functions of iron for endurance athletes are likely its role in oxygen transport via hemoglobin and myoglobin, as well as its role in the oxidative production of adenosine triphosphate (ATP) within the electron transport chain (3). Therefore, compromised iron stores could potentially disrupt the formation of red blood cells and the ability to generate ATP via oxidative metabolism, both of which could impair endurance performance capacity.

Iron deficiency is one of the most common nutrient deficiencies in the world, so it rightfully tends to be a micronutrient that gets a lot of attention in general. However, iron deficiency is also one of the more common micronutrient deficiencies among athletic populations, particularly endurance athletes and team-sport athletes. It is estimated that ~15-35% of female athletes and ~3-11% of male athletes are deficient in iron; however, some smaller studies suggest even higher numbers than these (3). The rates of iron deficiency are also much higher in athletes than in non-athletes (3).

What makes athletes more likely to experience iron deficiency? It has been proposed that any one, or all, of the factors below put athletes at greater risk of iron deficiency compared to non-athletes (3):

- Hemolysis (breakdown of red blood cells) exacerbated by ground-contact forces during running
- Post-exercise inflammatory responses, which lead to greater post-exercise interleukin-6 (IL-6) concentrations; IL-6 increases hepcidin concentration, and hepcidin is the master regulating hormone of iron, with higher hepcidin leading to reduced iron absorption from the gut
- Greater potential for gastrointestinal bleeding
- Potential for iron loss through sweat
- Greater breakdown and production of red blood cells due to adaptations in response to exercise

Of note, women also have a higher need for iron, and therefore a greater likelihood of becoming deficient, due to the loss of blood and iron associated with the menstrual cycle. This is of course not unique to female athletes, however.

With regard to iron deficiency, it is important to note that, like other micronutrient deficiencies, there are differing stages, ranging from initial, less severe stages that may not exhibit any overt symptoms all the way to the most severe stage, in which there are significant negative consequences and symptoms. Iron is stored in the body as ferritin or hemosiderin, and these stores are usually what is depleted first, with hemoglobin and red blood cell production affected in later stages of deficiency when iron stores become severely depleted. For iron deficiency in athletes, the following three stages have been proposed (a simple sketch of this staging logic follows the list):

- Stage 1, iron deficiency (ID): iron stores in the bone marrow, liver, and spleen are depleted (ferritin < 35 μg/L, Hb > 115 g/L, transferrin saturation > 16%).
- Stage 2, iron-deficient non-anaemia (IDNA): erythropoiesis diminishes as the iron supply to the erythroid marrow is reduced (ferritin < 20 μg/L, Hb > 115 g/L, transferrin saturation < 16%).
- Stage 3, iron-deficient anaemia (IDA): Hb production falls, resulting in anaemia (ferritin < 12 μg/L, Hb < 115 g/L, transferrin saturation < 16%).
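To make the three-stage scheme above easier to follow, here is a minimal Python sketch that checks a set of lab values against those exact cut-offs (from the staging list quoted above, per reference 3). It is purely an illustration of the staging logic, with function and parameter names of my own choosing, and it is obviously not a diagnostic tool; interpreting blood work belongs with a physician.

```python
# Illustrative sketch of the three-stage iron deficiency scheme quoted above.
# NOT a diagnostic tool -- lab results should always be interpreted by a physician.

def iron_deficiency_stage(ferritin_ug_l: float,
                          hb_g_l: float,
                          tsat_percent: float) -> str:
    """Classify iron status against the cut-offs listed in the post."""
    # Check the most severe stage first.
    if ferritin_ug_l < 12 and hb_g_l < 115 and tsat_percent < 16:
        return "Stage 3: iron-deficient anaemia (IDA)"
    if ferritin_ug_l < 20 and hb_g_l > 115 and tsat_percent < 16:
        return "Stage 2: iron-deficient non-anaemia (IDNA)"
    if ferritin_ug_l < 35 and hb_g_l > 115 and tsat_percent > 16:
        return "Stage 1: iron deficiency (ID)"
    return "No stage criteria met (values may fall between cut-offs)"


if __name__ == "__main__":
    # Hypothetical example values
    print(iron_deficiency_stage(ferritin_ug_l=28, hb_g_l=140, tsat_percent=25))
    # -> Stage 1: iron deficiency (ID)
```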
Most people reading this might think that iron deficiency is really only a problem if anemia (low red blood cell volume and reduced hemoglobin concentration) is present. While it is true that low red blood cell volume and iron-deficiency anemia have well-documented negative consequences for athletic performance, some research also suggests a possible reduction in performance from iron deficiency without anemia, likely due to impaired oxidative metabolism even in the absence of reduced red blood cell volume (3). Therefore, catching iron deficiency at any stage is critical, not just for general health and well-being, but also for athletic performance. This is why it is usually recommended that endurance athletes have their iron status checked annually, or even biannually or quarterly if there has been evidence of impaired iron status previously (3).

What Happens if Iron Status is Compromised?

Firstly, it should be mentioned that an athlete should never guess when it comes to iron status. If an athlete wants to know what their iron status is, whether out of simple curiosity or because they experience potential symptoms of iron deficiency (e.g., lethargy, weakness, fatigue, reduced endurance performance, etc.), they should have their iron levels tested by their primary care physician. There are also micronutrient testing options available that don't necessarily have to be ordered through one's primary care physician. Regardless of how the testing is done, so long as it is an accurate test, it is imperative that an athlete gets tested before manipulating iron intake, as supplementing with iron when one is not deficient has no benefit. The benefits from increasing iron intake through food or supplementation typically only come when someone is deficient.

So, let's say an athlete gets their iron status tested and they do indeed have a deficiency. How is this normally handled? Typically, the first approach is to increase iron in the diet (fortified cereals, fish, meat, poultry, green leafy vegetables, etc.) (3). A change in diet to promote increased iron intake may also be done in conjunction with consuming foods that increase the absorbability of iron, such as vitamin C, or favoring heme iron sources (meat, fish, poultry) over non-heme sources (vegetables, beans/legumes, etc.). The second approach is to supplement with an oral supplement (3). Oral supplements can be elemental iron sources such as ferrous sulfate, or chelated iron sources such as ferrous bisglycinate. It is important to mention here that chelated forms of iron may be advantageous compared to elemental iron when it comes to supplementation, as chelated forms can typically be taken in lower doses due to enhanced absorbability (1,2). Chelated forms of iron are also typically better tolerated than elemental iron, which is commonly associated with gastrointestinal upset and nausea (1,2). Finally, the third approach is to administer iron via an intramuscular shot or intravenous drip (3). However, this approach is usually reserved for cases of severe iron-deficiency anemia in which rapid increases in iron stores are desired (3).

When an athlete does have compromised iron status, including milder deficiencies such as iron deficiency or iron deficiency without anemia, research has shown that endurance performance can possibly be improved by supplementing with iron (3).
In the most severe cases of iron-deficiency anemia, performance is likely severely compromised, and so supplementing with iron will almost surely improve performance as iron stores and hemoglobin status are restored (3).

Conclusions

To conclude, iron is an important and critical micronutrient for general health, but also for athletes, particularly endurance athletes. Iron deficiency is much more common among athletes than non-athletes due to various factors, and correction of iron deficiency is likely to benefit overall performance, particularly among those with severe iron-deficiency anemia. Iron status can be improved by increasing oral iron intake (diet, supplementation) or via intramuscular shots or intravenous fluids. However, testing should be done to confirm an actual iron deficiency before attempting to increase iron intake. It is, of course, good practice to regularly consume foods containing iron with or without testing, as iron is required in our diets, but supplementing with iron via oral supplementation, intramuscular shots, or intravenous fluids should not be done if iron deficiency is not present, due to the health risks of too much iron and the lack of performance benefit from increasing iron intake without a deficiency. If you are looking to learn more about iron and its considerations specifically for athletes, I highly recommend reading reference #3 in the reference list below, a 2019 review on the topic.

On a side note, if you are looking for an iron supplement to help correct an identified deficiency, I personally love the Athlete's Iron from MOXiLIFE as it is a highly absorbable and gut-friendly chelated form of iron. As an added benefit, it also contains some gut-friendly prebiotics.

References:

1. Ashmead HD, Guaiandro SFM, Same JJ. Increases in hemoglobin and ferritin resulting from consumption of food containing ferrous amino acid chelate (Ferrochel) versus ferrous sulfate. In: Fischer PWF, L'Abbe MR, Cockell KA, Gibson RS, editors. Trace Elements in Man and Animals - 9: Proceedings of the Ninth International Symposium on Trace Elements in Man and Animals. Ottawa, Canada: NRC Research Press; 1997. p. 284-285.
2. Ferrari P, Nicolini A, Manca ML, Rossi G, Anselmi L, Conte M, Carpi A, Bonino F. Treatment of mild non-chemotherapy-induced iron deficiency anemia in cancer patients: comparison between oral ferrous bisglycinate chelate and ferrous sulfate. Biomedicine & Pharmacotherapy. 2012 Sep 1;66(6):414-8.
3. Sim M, Garvican-Lewis LA, Cox GR, Govus A, McKay AK, Stellingwerff T, Peeling P. Iron considerations for the athlete: a narrative review. European Journal of Applied Physiology. 2019 Jul;119(7):1463-78.

Happy training and racing!

-Ryan Eckert, MS, CSCS

Do you enjoy our monthly educational content that we create? Not only do we create written content like what you just read, but we have a podcast too where the goal is also to share science-driven, evidence-based information highly relevant to endurance athletes and coaches. We do all of this for free, and we rely on the generous help and support of others to cover some of our basic operating costs for putting out this content. If you would like to help or support, the best way to do so is by becoming a Patreon supporter.
Iron Intake Among Endurance Athletes
Ryan Eckert, MS, CSCS
Sep 30, 2021
In The VO2 Max Forum
What are Omega-3 Fatty Acids?

Omega-3 fatty acids consist of eicosapentaenoic acid (EPA), docosahexaenoic acid (DHA), and alpha-linolenic acid (ALA). EPA and DHA are the fatty acids primarily found in fatty fish, whereas ALA is typically found in plant oils, nuts, and seeds (1). These three fatty acids are considered essential, as the body does not produce them in sufficient quantities; therefore, they must be obtained in the diet. ALA can be converted to EPA and DHA in the body, but the conversion rate is extremely low (3-10%) in humans, partly due to typical Western diets containing too many omega-6 fatty acids (1). The lack of EPA and DHA consumption in typical Western diets, along with higher omega-6 intakes, throws off the ratio of omega-3 to omega-6 fatty acids that would be considered ideal for optimal health (1). It is for this reason that omega-3 supplementation, with a particular emphasis on EPA and DHA, has become more and more popular in recent decades.

In fact, omega-3 supplementation has a large body of evidence to support its health benefits, particularly for improving cardiovascular health and reducing the risk of cardiovascular disease and cardiovascular-related mortality from stroke and heart attacks (2). Omega-3 fatty acids have a vast array of other health benefits too, including beneficial impacts on nervous system function and immune function. Athletes have become interested in omega-3 fatty acids due to the beneficial effects of these compounds on inflammation. Dietary omega-3 fatty acid consumption has the potential to reduce inflammation, which has been proposed as a way to enhance recovery from strenuous exercise (1).

I became interested in the potential benefits of omega-3 fatty acids for endurance athletes after I started working to improve my diet. I have recently transformed my diet to reduce my intake of added sugars and processed foods while also increasing my consumption of fruits, vegetables, nuts, seeds, and other nutrient-dense foods. I went as far as getting a comprehensive micronutrient panel done so that I could further optimize my diet to eliminate micronutrient deficiencies. I don't regularly consume fatty fish, so I knew I was likely not consuming enough omega-3 fatty acids in my diet. This was when I turned to a high-quality fish oil supplement, mainly for general health benefits. However, I then started looking at the potential performance benefits that omega-3 fatty acids have for endurance athletes. I will do my best to summarize what I found in the section below.

What Is the Impact of Omega-3 Fatty Acid Supplementation on Endurance Performance?

There was a really nice review article published in Research in Sports Medicine by Dr. Philpott and colleagues (1) discussing the applications of omega-3 fatty acid supplementation for sports performance. One of the subsections of this paper was specifically focused on the research surrounding omega-3 fatty acid supplementation for endurance sports performance. There is some research and limited evidence to suggest that omega-3 fatty acid supplementation may enhance mitochondrial biogenesis (the formation of more mitochondria), which would of course be particularly beneficial to endurance athletes, as more mitochondria would mean a greater capacity to utilize oxygen to break down glycolytic by-products and fat for fuel during exercise. However, this research was conducted in rodents and obese individuals, so the direct application to endurance athletes is lacking at this point.
There is a bit more evidence, however, demonstrating that omega-3 fatty acid supplementation can reduce the oxygen cost of submaximal endurance exercise as well as reduce submaximal exercise heart rate, due to the beneficial effects that EPA and DHA have on stroke volume and muscle cell insulin sensitivity. There is actually some research in cyclists demonstrating this; however, these reductions in oxygen cost and heart rate did not translate into objective cycling performance improvements in time trials. So, despite potential physiological changes from omega-3 fatty acid supplementation in endurance athletes, it has yet to be demonstrated that this translates into real-world performance improvement.

Finally, there is the potential for omega-3 fatty acid supplementation to reduce the risk of upper respiratory tract infection (URTI) among endurance athletes due to the beneficial immunomodulatory effects of omega-3 fatty acids. Endurance athletes can be at a greater risk of URTIs, especially during periods of heavy training load. Therefore, if omega-3 fatty acid supplementation can reduce this risk and keep endurance athletes healthy with fewer bouts of illness, then this would theoretically have a positive downstream effect on performance, as athletes would not miss training as often due to illness. However, again, there is limited data directly demonstrating fewer bouts of illness when endurance athletes consume omega-3 fatty acids either from food or via supplementation.

Conclusions

As you can tell from reading the above, most of the research involving omega-3 fatty acid supplementation in endurance athletes is still very limited and inconclusive due to the small number of studies that have been done. However, my personal opinion is as follows: if you do not regularly consume foods rich in omega-3 fatty acids, either starting to consume foods high in omega-3 fatty acids or taking a high-quality omega-3 fatty acid supplement is not likely to impair endurance performance and can only improve your overall health. There is, of course, the potential for it to improve your endurance performance via the mechanisms outlined above. However, even if omega-3 fatty acid intake does not translate into objective performance enhancement, it will still likely improve your health, as there are myriad health benefits to be had from regular intakes of omega-3 fatty acids in the range of 1-2 grams/day of EPA/DHA (1,2). This could certainly be something to consider if you do not consume enough omega-3 fatty acids in your diet on a regular basis.
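To put the 3-10% ALA conversion figure and the 1-2 grams/day EPA/DHA range into perspective, here is a small back-of-the-envelope sketch. It simply rearranges the numbers already cited above to show how much ALA would be needed to end up with that much EPA/DHA via conversion alone; it is an illustration, not a dosing recommendation.

```python
# Back-of-the-envelope sketch: how much ALA would be needed to yield a given
# amount of EPA+DHA if conversion is only 3-10%? Figures reuse the numbers
# cited above -- this is an illustration, not a dosing recommendation.

def ala_needed_grams(target_epa_dha_g: float, conversion_rate: float) -> float:
    """Grams of ALA required to yield the target EPA+DHA at a given conversion rate."""
    return target_epa_dha_g / conversion_rate


if __name__ == "__main__":
    target = 1.5  # middle of the 1-2 g/day EPA/DHA range mentioned above
    low, high = 0.03, 0.10  # 3-10% conversion of ALA to EPA/DHA
    print(f"{ala_needed_grams(target, high):.0f}-{ala_needed_grams(target, low):.0f} g "
          f"of ALA per day to reach ~{target} g of EPA+DHA")
    # -> roughly 15-50 g of ALA per day, which illustrates why fatty fish or a
    #    supplement is the more practical route for EPA/DHA specifically.
```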
References:

1. Philpott JD, Witard OC, Galloway SDR. Applications of omega-3 polyunsaturated fatty acid supplementation for sport performance. Research in Sports Medicine. 2019 Apr-Jun;27(2):219-237. doi: 10.1080/15438627.2018.1550401. Epub 2018 Nov 28. PMID: 30484702.
2. Bernasconi AA, Wiest MM, Lavie CJ, Milani RV, Laukkanen JA. Effect of omega-3 dosage on cardiovascular outcomes: an updated meta-analysis and meta-regression of interventional trials. Mayo Clinic Proceedings. 2021 Feb;96(2):304-313. doi: 10.1016/j.mayocp.2020.08.034. Epub 2020 Sep 17. PMID: 32951855.

Happy training and racing!

-Ryan Eckert, MS, CSCS

Do you enjoy our monthly educational content that we create? Not only do we create written content like what you just read, but we have a podcast too where the goal is also to share science-driven, evidence-based information highly relevant to endurance athletes and coaches. We do all of this for free, and we rely on the generous help and support of others to cover some of our basic operating costs for putting out this content. If you would like to help or support, the best way to do so is by becoming a Patreon supporter.
The Effects of Omega-3 Fatty Acid Supplementation on Endurance Performance
