The paradox of “invisible” monitoring: The less you do, the more you do!

Written by Jace Delaney

September 8, 2018

The sport science industry has come a long way from the days of manually coding matches with pen and paper, and the recent influx of technologies such as global positioning systems (GPS) and accelerometers has allowed us to be far more objective in our decision-making. However, as far as I know, there seems to be little to no correlation between the amount of money spent on technology and on-field performance. The teams with the best players typically win, regardless of which GPS provider the club signs with or which pre-season testing protocol was used in the first week of training. That's not to say that what we do as an industry isn't significant, but it's important to keep in mind where we fit within the context of everything that goes towards winning games.

We practitioners are faced with the challenge of sifting through all the latest methods and techniques, most of which are padded by questionable claims from manufacturers and stakeholders, in order to identify what can actually benefit performance. The constraints are extensive, whether it be a small budget, an old-school coach who is unfamiliar with sport science, or minimal player buy-in. Because of these limitations, I feel we waste too much time "testing" and "monitoring" athletes; time that could be better spent developing physical qualities, or technical and tactical abilities. That's not to say we can't gather important information about our players regularly. However, some methods are inherently better than others: they don't take time away from the factors that matter most for team-sport success. The resources available through HIIT Science will detail plenty of specific examples of how to develop athletes physically, from a wide range of well-respected professionals. My aim in this post is to share a few ways to free up some valuable time, so that it can be spent ensuring players are prepared to compete at the highest level.

The concept of "invisible monitoring" is a simple one: gather as much information as possible about the athlete, their performance and their current training status, without them even knowing you're doing it. If we take a closer look at the information we are already collecting, we can still answer coach-driven and performance-based questions without the need for an intervention or any additional burden on the athlete. The introduction of contemporary data analysis tools (e.g. R Studio) into our field has allowed us to be more flexible in our research and in our ability to answer performance-based questions. The problem/solution examples I will discuss are specific to GPS and accelerometers, but this concept can (and should) be applied wherever possible.

Problem 1: The head coach says our team is too slow, can't match it with the teams that play an "up-tempo" style and strangle us out of the game. How do we get better in this area without compromising our skills?

Solution 1: Game Speed

Without knowing it explicitly, the coaches are referring to "game speed", which was the focus of my Ph.D. research in team-sport athletes. This is not a complex concept. We are simply referring to the ability to generate and maintain a high physical output, without significant compromise of the technical, tactical or psychological components of competition.
To improve this capability, we need to work with coaches to design training drills [like game-based HIIT (GBHIIT) or small-sided games] that enable players to reach the peak intensities that occur during match play. This approach does not require any visible intervention with the playing group, nor does it remove any face-to-face coaching time from the staff. In this regard, the first step is to quantify the most intense periods of competition, which we were able to do using a moving average approach across a range of different durations.1 This revealed a classic power law relationship, where running intensity decreased consistently as a function of time (Figure 1, below).

Figure 1. Maximum running intensities of rugby league match-play across a range of rolling average durations. Data are presented as mean ± SD for each outcome variable.1
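To make the moving-average approach concrete, here is a minimal sketch in R of the calculation behind Figure 1, together with the duration model it feeds into (described further below). It assumes a 10 Hz GPS speed export for a single player and match; the synthetic speed trace, the chosen window durations and the 4-minute drill example are placeholders, not values from the original studies.

```r
# Minimal sketch: peak rolling-average running intensities across durations,
# and a power-law (log-log) model to estimate "game speed" for any drill length.
# Assumes 10 Hz GPS speed data in m/s; replace the synthetic trace with your own export.

library(zoo)   # rollmean() for moving averages

hz <- 10                                  # GPS sampling rate (samples per second)
set.seed(42)
speed <- pmax(0, rnorm(80 * 60 * hz, mean = 1.8, sd = 1.5))  # placeholder 80-min trace (m/s)

durations_min <- c(1, 2, 3, 4, 5, 10)     # rolling-average window lengths (minutes)

# Peak rolling-average speed for each window duration, expressed in m/min
peaks <- data.frame(
  duration_min   = durations_min,
  peak_m_per_min = sapply(durations_min, function(d) {
    max(zoo::rollmean(speed, k = d * 60 * hz)) * 60
  })
)

# Power-law relationship between peak intensity and duration (as in Figure 1):
# linear on the log-log scale
fit <- lm(log(peak_m_per_min) ~ log(duration_min), data = peaks)

# Estimated peak match intensity for any given drill duration,
# e.g. a 4-minute game-based HIIT block
game_speed <- function(mins) {
  exp(predict(fit, newdata = data.frame(duration_min = mins)))
}
game_speed(4)   # target m/min to benchmark a 4-min drill against match play
```

With real data you would repeat this per player and per variable of interest (speed, acceleration-based metrics, and so on), and then compare the rolling-average peaks of individual drills against these match-derived targets.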

By modeling this relationship, we were able to derive a simple calculation for estimating the peak running intensity of competition for any given duration, across the variables of interest.2 More detail about this technique can be found in a blog post I recently wrote, along with an associated spreadsheet for calculating game speed using your own data. Put simply, this method allows us to prescribe and monitor skill-based training drills and GBHIIT more accurately relative to the most intense periods of competition. Likewise, this process permits us to prescribe our GBHIIT so as to simultaneously develop physical capacities alongside the specific skill and game-based decision-making demands of team-sport play. Such a technique is clearly preferable to traditional conditioning from the point of view of the coaching staff, as the technical and tactical abilities are being constantly overloaded. Therefore, monitoring training relative to game speed is a simple, effective and invisible way to drive intensity during training.

In some cases, we may find that players are unable to reach the required intensities of competition during training. Such players may benefit from supplementary conditioning strategies (shuttle-based short-interval HIIT, for example) to increase their physical capacities and enable them to meet the intensity requirements of our skill-based training. It is important to note that these methodologies are not intended to replace traditional conditioning practices, but instead to serve as useful alternatives during the competitive phase, when time constraints are at their highest.

Problem 2: OK, so we are training at an appropriate intensity, but is it enough to sustain or further enhance our players' fitness levels?

Solution 2: The Training Efficiency Index (TEI)

Now that we know we are training at the right intensity, the next thing we need to assess is how our athletes are responding to that training. This is the focus of Chapter 9 in the forthcoming book and course you will find on HIIT Science. Quantifying external training load (Chapter 8) is relatively straightforward: ask a player to wear a GPS unit and we get a decent idea of what has occurred from an external load standpoint. Unfortunately, assessing the player's internal (physiological) load is more complicated. There are a limited number of ways to determine internal load, each with its own logistical and financial limitations. Heart rate (HR) may be the most useful, as the marker is relatively reliable and sensitive to fluctuations in intensity throughout a session. Nevertheless, the measure has a number of limitations, including its lagged response to changes in intensity (HR inertia), as well as skin-connectivity and player buy-in issues. Session rating of perceived exertion (RPE) is simple, free, and can serve as a useful global, non-specific measure. However, this method depends on athletes responding accurately, which can be difficult to achieve with large groups of athletes in a short space of time. As for fitness tests themselves, submaximal running tests have become increasingly popular due to their short duration, low physiological burden, and relative ease of implementation.3 These standardized tests control the volume and intensity of the external work completed, meaning fluctuations in the internal response (HR) from week to week may reflect positive or negative training effects.
However, we need to ask ourselves – is standardization of the external component of the test really necessary if we are constantly measuring it using GPS? Instead, by just assessing our internal load (i.e. HR response) in the context of the measured external load, we can attain a similar (and invisibly collected) snapshot assessment without needing to perform a fitness test at all (Figure 2).

Figure 2. A conceptual model for the integration of internal and external training loads as per the “invisible monitoring” framework.

Due to the generally strong relationship between most internal and external training load markers (Chapter 8), we are able to identify unexpected responses to training (termed training status, Chapter 9). In that sense, a greater-than-expected external training load for a given internal load is typically representative of a positive training effect (i.e. an increase in training efficiency). In contrast, a lower-than-expected external training load for a given internal load reflects a negative training response (i.e. a state of fatigue or a loss of fitness). In order to assess these deviations, we have proposed the Training Efficiency Index (TEI)4 as an appropriately scaled integration of internal and external load measures:

TEI = external load / internal load^x

where x is a scaling factor, obtained as the slope of the relationship between the log-transformed internal and external load variables. A detailed description of how to calculate this metric can be found here, including a free spreadsheet for use.

Primarily, the benefit of the TEI is that no additional intervention is required (i.e. it is monitored invisibly): simply by assessing internal and external load simultaneously during any on-field session (conditioning, skills training or matches), we can attain valuable information on the training status of our athletes. A recent study by Lacome et al.5 presents a comparable technique, in which predicted HR responses to training drills are compared against the actual HR results, similarly reducing the logistical burden of testing interventions. While an extra 10 minutes per week returned to an elite program might not seem like a big difference, those in the trenches, including those you'll read about in the sport application chapters of HIIT Science, will attest that this time saving has a positive impact on their playing groups.
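As a rough illustration of the idea (the exact derivation and scaling are detailed in the linked spreadsheet and paper4), a TEI-style calculation can be run on the session log most teams already keep. The column names and numbers below are made up for the sketch; any external load measure (e.g. high-speed running distance) and internal load measure (e.g. a heart-rate based load or session-RPE) could be substituted.

```r
# Sketch of a Training Efficiency Index (TEI)-style calculation from routine session data.
# external = external load per session (e.g. high-speed running distance, m)
# internal = internal load per session (e.g. summated heart-rate zones or sRPE)
sessions <- data.frame(
  external = c(820, 910, 760, 1005, 880, 940),
  internal = c(310, 355, 300, 390, 330, 345)
)

# Scaling factor x: slope of the log-log relationship between the
# internal and external load variables across the athlete's history
x <- coef(lm(log(external) ~ log(internal), data = sessions))[2]

# TEI: external output relative to its internal cost, with the expected
# internal-external relationship scaled out
sessions$TEI <- sessions$external / sessions$internal^x

# Flag sessions where efficiency sits well above or below the athlete's norm
z <- as.numeric(scale(sessions$TEI))
sessions$status <- ifelse(z > 1, "better than expected",
                   ifelse(z < -1, "worse than expected", "typical"))
sessions
```

In practice, x would be estimated from a much longer history (and ideally per athlete), and the same logic extends to predicted-versus-actual HR comparisons of the kind described by Lacome et al.5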
Problem 3: Our players are looking flat, but their fitness levels are good. Where are we going wrong?

Solution 3: Within-session fatigue/readiness monitoring

In theory, we could have the fittest and most skillful team in the world, but if we run our players into the ground, it's unlikely they will perform at their best on game day. Therefore, it could be suggested that monitoring fatigue and readiness on a regular basis can help ensure that players are operating at their optimal levels come game day. The possibilities for assessing fatigue and readiness are extensive, ranging from subjective wellness questionnaires to biochemical profiling and neuromuscular performance testing. However, as discussed in detail by Carling et al.6 recently, each of these interventions carries a significant burden, whether it be cost, implementation practicality, or simply a lack of buy-in from coaches and players, not to mention the questionable importance of fatigue monitoring in the real-world setting. High-level players are more often than not able to perform irrespective of slight neuromuscular impairments or elevated levels of acute muscle soreness. Unfortunately, good-quality, practically relevant research in such athletes is lacking due to the aforementioned perceived burden on athletes and coaches. This issue is difficult to circumvent, as sport science research cannot be the number one priority at this level. Therefore, we as practitioners must continue to strive for ways to extract relevant data from the information we typically collect on a day-to-day basis – i.e. "invisibly".

Specific to on-field performance, there are several possibilities for assessing fatigue/readiness during training and competition. For example, GPS-embedded accelerometers, coupled with customized data-processing software, enable practitioners to assess the neuromuscular and mechanical determinants of training and competition,7 which may offer greater precision for detecting fatigue than traditional time-motion analyses. Similarly, changes in the contribution of each vector (i.e. x, y or z) to the total PlayerLoad™ accumulated during match play have been shown to be associated with countermovement jump variables amongst top-level Australian soccer players.8 Given the primarily horizontal nature of force output in many field-based sports such as soccer and the football codes, assessing horizontal force production capabilities may reveal further insight into an athlete's readiness to perform. Practical field-based methods of assessing the mechanical profile of sprint acceleration have been developed,9 which allow the determination of horizontal force production during short but maximal sprint acceleration efforts. Although these data are useful for detecting fatigue,10 they can be difficult to implement practically due to the additional fatigue and injury risk, the need for players to sprint maximally, or simply the logistics of scheduling during the in-season period. Within-session estimates of the mechanical properties of sprinting would be ideal, though current tracking technologies do not yet seem able to assess these metrics accurately.
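As one example of how such within-session information can be pulled from data already being collected, below is a minimal R sketch of per-axis contributions to an accumulated accelerometer load, in the spirit of the PlayerLoad™ vector contributions examined by Rowell et al.8 The synthetic 100 Hz trace, the column names and the exact formulation are placeholders; vendors compute their proprietary metrics in their own (sometimes undisclosed) ways.

```r
# Sketch: per-axis contribution to accumulated accelerometer load,
# using a commonly reported PlayerLoad-style formulation.
# Assumes a 100 Hz tri-axial trace in g; replace the synthetic data with a real export.

hz <- 100
set.seed(1)
n  <- 10 * 60 * hz                        # 10 minutes of placeholder data
acc <- data.frame(ax = rnorm(n, 0, 0.6),  # medio-lateral
                  ay = rnorm(n, 0, 0.6),  # anterior-posterior
                  az = rnorm(n, 1, 0.6))  # vertical

# Sample-to-sample change in acceleration for each axis
d <- lapply(acc, diff)

# Accumulated load per axis, and the conventional tri-axial total
load_axis  <- sapply(d, function(x) sum(abs(x)) / 100)
load_total <- sum(sqrt(d$ax^2 + d$ay^2 + d$az^2)) / 100

# Relative contribution of each vector; shifts in these percentages across
# matches are the kind of signal linked to neuromuscular fatigue
contribution_pct <- round(100 * load_axis / sum(load_axis), 1)
contribution_pct
```

Tracking these percentages drill by drill or match by match requires no additional intervention from the players themselves.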
Summary

The examples I have discussed in this post are just a few of many in the team-sport context. The best sport science programs I have seen are highly integrated, and complement the goals of the team and the other members of the performance staff. Most importantly, the "invisible monitoring" concept outlined in this post ultimately allows us to reduce the testing burden placed on the athletes themselves. Since adopting this approach, I have been able to answer questions asked by both coaches and performance staff without taking time away from the development of the playing group. These invisible techniques have developed over time, and will continue to do so, thanks both to technological advancements and to our collective sharing, as practitioners, of a growing understanding of their efficacy in high-level sport.

About the author: Jace Delaney is a sport scientist currently working as the Director of Performance and Sports Science with the University of Oregon's athletic program. His website Sport Performance Explained is a platform where he and fellow colleague Matt Jones share their opinions on elite athletic performance, with an overarching focus on linking research to practice.

References

  1. Delaney JA et al. Acceleration-based running intensities of professional rugby league match play. Int J Sports Physiol Perform. 2016; 11, 802-809.
  2. Delaney JA et al. Modelling the decrement in running intensity within professional soccer players. Sci Med Football. 2017; 2(2), 86-92.
  3. Veugelers KR et al. Validity and reliability of a submaximal intermittent running test in elite Australian football players. J Strength Cond Res. 2016; 30(12), 3347-3353.
  4. Delaney JA et al. Quantifying the relationship between internal and external work in team sports: development of a novel training efficiency index. Sci Med Football. 2018; 2(2), 149-156.
  5. Lacome M et al. Monitoring players’ readiness using predicted heart rate responses to football drills. Int J Sports Physiol Perform. 2018.
  6. Carling C et al. Monitoring of post-match fatigue in professional soccer: welcome to the real world. Sports Med. 2018.
  7. Buchheit M et al. Neuromuscular responses to conditioned soccer sessions assessed via GPS-embedded accelerometers: Insights into tactical periodization. Int J Sports Physiol Perform. 2018; 13(5), 577-583.
  8. Rowell AE et al. A standardized small sided game can be used to monitor neuromuscular fatigue in professional A-League football players. Front Physiol. 2018.
  9. Samozino P et al. A simple method for measuring power, force, velocity properties, and mechanical effectiveness in sprint running. Scand J Med Sci Sports. 2016; 26(6), 648-658.
  10. Marrier B et al. Quantifying neuromuscular fatigue induced by an intense training session in rugby sevens. Int J Sports Physiol Perform. 2016; 12(2), 218-223.
