Bakken et al: "Does Lactate-Guided Threshold Interval Training within a High-Volume Low-Intensity Approach Represent the “Next Step” in the Evolution of Distance Running Training?"

Featuring the father of lactate training, Marius Bakken, himself. The study is limited to middle/long-distance running, but very applicable here. I’m not sure it’s breaking new ground so much as confirming what seems anecdotally known.

Full Text: https://www.mdpi.com/1660-4601/20/5/3782

Abstract
The aim of the present study was to describe a novel training model based on lactate-guided threshold interval training (LGTIT) within a high-volume, low-intensity approach, which characterizes the training pattern in some world-class middle- and long-distance runners and to review the potential physiological mechanisms explaining its effectiveness. This training model consists of performing three to four LGTIT sessions and one VO2max intensity session weekly. In addition, low intensity running is performed up to an overall volume of 150–180 km/week. During LGTIT sessions, the training pace is dictated by a blood lactate concentration target (i.e., internal rather than external training load), typically ranging from 2 to 4.5 mmol·L−1, measured every one to three repetitions. That intensity may allow for a more rapid recovery through a lower central and peripheral fatigue between high-intensity sessions compared with that of greater intensities and, therefore, a greater weekly volume of these specific workouts. The interval character of LGTIT allows for the achievement of high absolute training speeds and, thus, maximizing the number of motor units recruited, despite a relatively low metabolic intensity (i.e., threshold zone). This model may increase the mitochondrial proliferation through the optimization of both calcium and adenosine monophosphate activated protein kinase (AMPK) signaling pathways.

| Scale (6-zone) | 3-zone | Lactate (mmol·L−1) | HR (% max) | VO2max (%) | RPE (6–20) | Training methods |
| --- | --- | --- | --- | --- | --- | --- |
| SST (6) | 3 | n/a | n/a | n/a | n/a | Sprint |
| VHIT (5) | 3 | 8–18 | >97 | 94–140 | 18–20 | Lactate tolerance (i.e., 800 m and 1500 m pace) |
| HIT (4) | 3 | 4.5–8 | 92–97 | 88–94 | 16–18 | Intensive aerobic interval (i.e., 5000 m pace) |
| MIT (3) | 2 | 3.5–4.5 | 87–92 | 84–88 | 14–16 | Threshold training: interval running (10,000 m pace) |
| MIT (2) | 2 | 2–3.5 | 82–87 | 80–84 | 12–14 | Threshold training: continuous/interval running (marathon pace) |
| LIT (1) | 1 | 0.7–2 | 62–82 | 55–80 | 9–12 | Easy and moderate continuous running |
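For anyone wanting to map a meter reading onto the zones above, here is a minimal sketch in Python. The boundaries are taken straight from the lactate column of the table; the function name and the tie-breaking choice (a reading landing exactly on a boundary falls to the lower zone) are my own, not from the paper.

```python
def lactate_zone(mmol_per_l: float) -> str:
    """Map a blood lactate reading (mmol/L) onto the 6-zone scale above.

    Boundaries follow the lactate column of the table. Sprint/SST work
    (zone 6) is not lactate-guided, so anything outside the listed
    ranges is reported as such.
    """
    zones = [
        (0.7, 2.0, "LIT (1): easy and moderate continuous running"),
        (2.0, 3.5, "MIT (2): threshold, marathon pace"),
        (3.5, 4.5, "MIT (3): threshold intervals, 10,000 m pace"),
        (4.5, 8.0, "HIT (4): intensive aerobic intervals, 5000 m pace"),
        (8.0, 18.0, "VHIT (5): lactate tolerance, 800-1500 m pace"),
    ]
    for low, high, label in zones:
        if low <= mmol_per_l <= high:
            return label
    return "outside the lactate-guided ranges (recovery or SST/sprint work)"

# The 2-4.5 mmol/L LGTIT target from the abstract spans MIT (2) and MIT (3):
print(lactate_zone(2.8))  # MIT (2): threshold, marathon pace
```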

Thanks for the study.

Is Marius the father of lactate training, or the father of lactate training in the running world?
He seems a bit young to be the father of lactate, but my timeline isn’t very clear.

He went to a rival high school, I knew his name before I knew about the concept of a threshold run.

So: run a lot. Sometimes run hard, but not too hard, and use a lactate meter to make sure you don’t go too hard. Did I get it?

Fixed the link: https://www.mdpi.com/1660-4601/20/5/3782

Alex Hutchinson does his usual excellent review of the study,

https://www.outsideonline.com/...d-jakob-ingebrigtsen

Alex says,
The anecdotal reporting I’ve heard is that runners who are used to hammering interval workouts really struggle to go easy enough in these sessions, especially at first. But over the course of a few months, the pace they’re able to sustain without spiking their lactate levels quickens dramatically.

This sounds suspiciously like the last new thing, heart-rate (aka Maffetone) training, which also promised to improve your pace at lower effort levels after just a few months of carefully limiting intensity. But that didn’t work particularly well for anyone except Mark Allen and others who were able to do the 15–20 hours/week required…

Here again the training volume is about 110 miles per week, just for the running.
As Tom Waits used to sing, ‘how do we do it? Volume, volume, turn up the volume.’
Necessarily at those volumes the intensity has to be limited, or you destroy yourself.

Another interesting question to me as a hobby jogger, as Alex alluded to in the article -
Is the lactate testing necessary, or could the same effects be achieved doing the threshold workouts using RPE/critical speed to limit intensity?
This is probably irrelevant given I’m never getting near the required training volumes again…

Is the lactate testing necessary, or could the same effects be achieved doing the threshold workouts using RPE/critical speed to limit intensity?

This is what is most interesting to me about all of this as I will likely never be spending money, time or effort on taking lactate samples during workouts.

The part that stands out to me is the wide range of speeds used in the threshold training and just the large shift in thought to internal load rather than resulting speed dictating the type of workout.

I would be interested in an analysis of what types of work intervals, rest intervals, and external load combinations typically result in staying below LT2 and which result in exceeding it.

As they mention, I think HR can sort of be used to do this without measuring lactate. Find workout designs that involve speeds above critical speed but where HR doesn’t exceed what you’d expect for a typical just-sub-threshold effort.
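That heuristic can be written down as a tiny checker. This is purely illustrative: `Rep`, `classify_rep`, and the example numbers are mine, and the critical speed and HR cap would need to come from an athlete’s own testing, not from this thread.

```python
from dataclasses import dataclass

@dataclass
class Rep:
    speed_kmh: float  # average speed for the rep
    avg_hr: float     # average heart rate for the rep (bpm)

def classify_rep(rep: Rep, critical_speed_kmh: float, hr_cap: float) -> str:
    """Heuristic from the post above: a 'good' threshold rep runs faster
    than critical speed while holding HR under a just-sub-threshold cap."""
    if rep.speed_kmh <= critical_speed_kmh:
        return "below critical speed: too easy for a threshold session"
    if rep.avg_hr > hr_cap:
        return "HR over cap: likely drifting above LT2"
    return "in the intended window"

# Hypothetical runner: critical speed 16.5 km/h, sub-threshold HR cap 172 bpm.
print(classify_rep(Rep(17.2, 168), 16.5, 172))  # in the intended window
```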

I’ll second that.
HR can be monitored as often as the runner/cyclist wishes to view the data/display as a measure of training intensity, and it is an order of magnitude easier than stationary blood-letting. By all means do a treadmill lactate test to confirm/adjust the deflection point from a treadmill HR ramp test, but then use HR: that’s what I did. It needs to be moderated for altitude.
This also allows a coach to specify % rates for sessions that account for the individual’s HR range (HRrest–HRmax).
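The “% of the individual’s HR range” idea is the standard Karvonen (heart-rate-reserve) calculation. A sketch, with a hypothetical athlete’s numbers in the example:

```python
def karvonen_target(hr_rest: int, hr_max: int, fraction: float) -> int:
    """Karvonen / heart-rate-reserve method: apply a coach's % intensity
    to the individual's HR range (HRrest-HRmax), as described above."""
    if not 0.0 <= fraction <= 1.0:
        raise ValueError("fraction must be between 0 and 1")
    return round(hr_rest + fraction * (hr_max - hr_rest))

# Hypothetical athlete: HRrest 50, HRmax 190; an "85% of range" session.
print(karvonen_target(50, 190, 0.85))  # 169
```

Note how strongly the resting HR matters: for the poster below, with a resting HR of 28, the same 85% prescription lands at a very different absolute number than it would for a plain %HRmax prescription.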

I was an early adopter of the Polar HRM (?3000) for interval intensity control - mid-eighties, maybe - and have some great plots of races too: resting HR = 28, and a couple of final sprints at 206 and 207. I couldn’t get my HRmax above 195 on the steady uphill increasing-pace-to-exhaustion test, though.
I am truly amazed at Sanders’ HR (e.g., last night) when he must be way above threshold: the volume pumped must be huge; maybe that’s why he runs in his inimitable style at speed.

Is the lactate testing necessary, or could the same effects be achieved doing the threshold workouts using RPE/critical speed to limit intensity?

I’ve been working with a lactate meter since the end of last season, so I can give some insight. (for price, the total I’ve spent is ~$500)

The work/rest structure and the associated lactate measurements are complicated by the training load around the workout. Running sessions can range from 10x1k (5:45 pace, 60 s rest) to 4x15 min (6:05 pace, 3 min rest), both usually in the LT2 range but not always. Cycling work is longer but the same general idea. After a few months you can develop a feel for the right effort level, but simple things like temperature, wind, and work stress can throw it off.

HR could be a workable alternative, although to benefit from staying under threshold you would need to buffer by a fair amount (maybe 5bpm below measured LT2). The difference between a hard workout done at LT2 and a hard workout pushed above LT2 is noticeable and will add up over time.

All of this being said, I’m fairly confident I’ve maxed out my available time and workout energy. Every week is the same basic schedule, ~20 hours and 3-6hrs LT2 work. Rarely some speed thrown in. I don’t believe this system would have success in a low-volume environment.

A couple of thoughts.

  1. I do not believe blood lactate testing during and after every session is very useful for most high-level, experienced endurance athletes. The reason is that these athletes are almost always aware of what their heart rate and effort level are at threshold. It may be a useful tool for athletes who are consistently overtraining and have a difficult time holding appropriate intensities.
  2. If you read Bakken’s work, his belief is that threshold training is best done SLIGHTLY BELOW threshold pace. If you have an FTP of 330–340 W and a running threshold of 5:35 min/mile, most of your “threshold” training should be done at around 300–315 W and 6:05–5:50 min/mile. This is often termed “sweet spot” or “tempo” training. The reason is that you can accumulate a much larger training stimulus with less recovery time required. This would put most athletes at a blood lactate of around 2.5–3.5 mmol·L−1.
  3. He still argues for VO2max training. This is often done in the form of shorter burst interval sets, such as 30–40 seconds with equal rest. I believe the rationale is that this type of VO2max training can provide an adequate stimulus for improving maximal oxygen uptake while also working the neuromuscular system to a greater degree, as you can produce higher power than in a 2–4 minute effort. Additionally, such sessions are usually easier to handle mentally and easier to recover from.
  4. The sessions are best supported by an adequate volume of low-intensity work at less than 70 percent of max heart rate. Perhaps 3–5 hours of easy training for every hour of this type of threshold work…
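Point 2 can be turned into arithmetic. A sketch: the 91–95% band below is my back-calculation from the 330 W / 300–315 W and 5:35 / 5:50–6:05 examples in the post, not a figure from the paper.

```python
def sweet_spot_watts(ftp: float, low: float = 0.91,
                     high: float = 0.95) -> tuple[int, int]:
    """Sub-threshold power band expressed as a fraction of FTP."""
    return round(ftp * low), round(ftp * high)

def sweet_spot_pace(threshold_sec_per_mile: float, low: float = 0.91,
                    high: float = 0.95) -> tuple[str, str]:
    """Same idea for running: intensity scales with speed (1/pace), so the
    target pace is the threshold pace divided by the intensity fraction."""
    def fmt(sec: float) -> str:
        return f"{int(sec // 60)}:{int(sec % 60):02d}"
    # Faster end of the band first (higher fraction -> closer to threshold).
    return fmt(threshold_sec_per_mile / high), fmt(threshold_sec_per_mile / low)

print(sweet_spot_watts(330))         # lower/upper targets, about 300-314 W
print(sweet_spot_pace(5 * 60 + 35))  # ('5:52', '6:08') per mile
```

Dividing the pace (rather than multiplying it) is the key detail: a 9% drop in intensity makes the pace about 9% slower in seconds-per-mile terms, which reproduces the post’s 5:50–6:05 ballpark from a 5:35 threshold.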

well? does it?

I agree.

I see the most important elements of this model as:

  • define the work zones very well
  • avoid the no-man’s-land between thresholds
  • accumulate the work in the low-intensity zone, doing roughly 1 h of high intensity for every 3–5 h of training; of the high intensity, most at sub-threshold and the rest at VO2max

To maximize volume, reduce the recovery time needed: work at sweet spot for the threshold sessions and use HIIT for VO2max.

One of Bakken’s interesting descriptions of the work distribution was to concentrate the intensity days: four intensity sessions across two double days, with the rest of the week easy (instead of four intensity sessions on four different days).

  • avoid the no-man’s-land between thresholds

What do you mean by this? I think one of the main interesting aspects of this training approach is that it very much advocates work between the thresholds (which I think is what your line above says to avoid). It is the counter to a polarized training approach, which is interesting given that most of the endurance world is fixated on polarized training.

One of Bakken’s interesting descriptions of the work distribution was to concentrate the intensity days: four intensity sessions across two double days, with the rest of the week easy (instead of four intensity sessions on four different days).

It is 3 days of intensity because of the z5 work (see Saturday in Table 2), though the 2 threshold days do seem to be the focus and are certainly the most interesting aspect of this approach.

As I understand it: they train a lot below LT1, and the only interval training is intensity-controlled - intervals just below the LT2 threshold (I would say low Zone 4, or sweet spot) and HIIT intervals in Z5 / uphill runs.

As you say, my error: 5 sessions - 2 double days and 1 single. Two afternoons for resting… the rest is Z1, plus sprints (Z6) + strength + drills.

The no-man’s-land is above LT1 but below those 2.5–3.5 mmol efforts (which are reached only during intervals).

RE: VO2max sessions -

The weekly inclusion of VO2max work is pulled from the example of the Ingebrigtsens, training for the 1500 m. The approach seen from top triathletes on this system is a fair bit less, but a nonzero amount, of VO2 work - on the order of once a month for long course for the bulk of the year.

I don’t subscribe to the notion of inter-threshold paces being ‘no-man’s-land’. There is often a fair bit of LT1-specific work done by athletes in long-course training, which makes sense as it’s probably the most specific to that event.

Even details like double threshold days have room for adjustment to meet the demands of the event. For Ingebrigtsen, racing for about 3.5 minutes, the absolute quality of the workout is paramount. For even mid-distance triathletes, the priority shifts toward accumulated load. Putting a similar volume of threshold work into a single session can be favorable, especially if the total volume of threshold work is nearing the limit.

Controversial opinion:
1)
There are no two thresholds - there is only a curve. Short 40-minute lactate step tests hide the slow fatigue of the most efficient fibres as they hand over to the least efficient fibres. At the highest intensities this fatigue/handover happens within just a few minutes, so you see lactate rise in the test.
2)
All training is durability training. If you run reps at a pace that needs a fast-twitch fibre contribution, those fibres become more resilient. That high intensity is also simultaneously training your more efficient fibres maximally, but due to rapid fatigue at that output you are better off working at an output that barely needs fast-twitch fibres and collecting more minutes.
Due to glycogen availability and catabolic stresses, we keep the other curve-pushing at a low enough intensity not to impede the higher curve-pushing work.

@tmcmanners

Great discussion points!

Re. point #1.
There are distinct metabolic and hormonal changes once athletes cross LT1 and LT2. So while I agree that discrete stages in a lactate profile test tend to highlight distinct inflections, there are quantifiable metabolic changes that correlate with the “thresholds”.

Further - yes, with fatigue of the slow-twitch fibers (more efficient), the body will begin to recruit more fast-twitch fibers despite the intensity “on paper” only needing a slow-twitch contribution. However, at increasing intensities, additional motor units / fibers are recruited to meet the force demands placed on the body. You see increases in blood lactate because lactate is shuttled out of these glycolytic fibers, circulated in the bloodstream, and then oxidized in oxidative skeletal muscle, cardiac, liver, and other tissues (the lactate shuttle). This happens because glycolytic fast-twitch fibers lack the number of mitochondria needed to oxidize the lactate molecule. The build-up is not necessarily a bad thing; it is just the transfer to tissues that are able to utilize the lactate as fuel.

Re. Point #2.
This is debatable, because the body recruits motor units in a preferential order: the lowest-intensity activities get the most efficient fibers, and recruitment builds incrementally from there. Fast-twitch fiber recruitment happens very early in the intensity steps. This is essentially what you are seeing between LT1 and LT2: at LT1, more glycolytic fibers are recruited, which causes lactate to enter the bloodstream. However, it does not overwhelm the ability of other tissues to take it up from the bloodstream, and as such blood lactate remains steady below MLSS.

If an athlete does not spend time at >90% of VO2max intensity, then there are motor units of muscle fibers that are likely never recruited and never trained, because the workload is never high enough to warrant recruiting them. Yes, with fatigue and glycogen depletion of lower-level units, the body will eventually recruit more motor units than would otherwise be necessary; however, this only occurs after durations and glycogen depletions that are very stressful on the body, which may not be the best training strategy.

Now, there is certainly the “slow component” of VO2, and any exercise in the severe intensity domain will eventually elicit VO2max. However, this is also a larger stressor to place on the athlete, and their ability to handle a large volume of work in this fashion is quite limited - again likely leading to sub-optimal signaling and adaptation.

Disclosure:
I feel strongly about this one and figured this might be a fun place to put some more general thoughts on the interwebs. I mean no specific harm to the author of this thread, or of that paper, or any companies specifically involved.

I wrote two versions. Neither were edited or revised as thoughtfully as I have considered this subject. Pardon any lack of coherence, spelling, or grammatical issues.

**TL;DR version:**

Will it work for some folks? Sure probably. Most things work for some folks. Is it the next step in the evolution of scientific training? Probably not. It could be just another one-dimensional approach to add to the growing list of tracking-first, thoughtfulness second, training approaches. And there is possible harm from growing that list.

Rant version:

This novel approach is another “tracking first, thoughtful intervention second” thought paradigm. We need to stop letting training paradigm hypotheses, and all training and nutrition thinking, be dictated primarily by what we now have the capacity to measure. If we do not stop these lines of thinking, and think more critically about the whole, we will be increasingly misled by the now-exploding ability to measure new things.

The ability to measure something does not confer any special value to the measured thing, no matter how accurately, continuously, or granularly we are able to measure that thing. Nor does the ability to prescribe interventions using that thing as a measuring stick.

The fact that some of the earliest “things we could measure” were indeed quite meaningful for training and nutrition prescription (aka ‘intervention’) misled many would-be inventors over the decades that followed. The reason the early measured things changed the game in training was that they were closer to fundamental realities (basic physics), less reductionist (looking at overall body systems instead of singular molecules and concentrations), and that someone would only invent a measurement tool if they were pretty confident the measured thing would truly enhance performance. Now, compared to 20 years ago, the cost of invention is lower, capital has been flowing loosely to get things in front of broad markets quickly and publicly, and we’re seeing an explosion of the “let’s measure an alphabet soup, throw it at the wall, and see what sticks” approach.

The result is myopic recommendations based on overly reductionist tracking and measuring of often misleading biomarkers and other ‘variables’ which can be measured, graphed, derived, or otherwise correlated to various and sundry other things.

The “graph” is a hot-button seller. Everyone wants graphs. We want to see ourselves visualized. Analyzed. Measured and assessed. But before purchasing something to see a graph, we must ask ourselves: “if that graph were to tell me exactly and reliably X, and it told me to do exactly and clearly Y, precisely because of X, do I have any evidence that doing Y would actually make me better off, in light of the things that doing Y might otherwise replace or cause me to lose?” Here are things you might lose because you decide to do Y: freedom, time, other training characteristics or emphases, training variability, enjoyment of training, money. There are more. Each of these often independently outweighs any potential value of doing Y, when Y is based purely on a reductionist X.

At least we must not delude ourselves that just because we measure that thing very well, we will therefore be better off than if we did not measure that thing. Among the nerdier crowd, including me in my younger years, there is often a pronounced placebo effect simply from measuring more things. This is not true for about half the population. I’m not opposed to self-measurement for fun and the personal enjoyment it might bring to some of the nerdier among us, or for leveraging the placebo effect to your advantage. Just take care not to delude yourself in the process.

I *am* deeply opposed to the idea that measuring things automatically makes people better. It does not, and it very often - especially among people who will never post on this beloved Slowtwitch forum - makes people worse. Even the mere knowledge that other people are able to measure, and are measuring, more than someone is measuring is enough to have a negative placebo effect on folks who do not love analyzing data in their free time. Sometimes this negative effect - of either tracking, or the awareness of one’s own decision not to track oneself more comprehensively - can be quite strong, and it is certainly not limited to a small subset of the population. It probably affects more than half the population negatively, and their awareness of such negative effects may not match the reality.

The take-home: if you love data and love analyzing data, nerd out. Track and measure to your heart’s content. Just don’t delude yourself in doing so. If you do not love data and analyzing it, take heart, you don’t need to start using fancy tech to be tracking lactate, or blood sugar, or sweat rate, or heart rate variability, or sleep stages, or anything else. And you shouldn’t. You’re genuinely better off if you don’t, unless a medical provider who knows you well, and is looking at the whole you, says that you definitely should start measuring something regularly.

/rant

Totally agree - a great summary of how muscles get the work done differently at different intensities. I emphasise WHY training makes you better. One intensity is more or less stressful than you might expect, yes, but is the goal to maximise stress? Minimise stress? There’s a missing link when you train by ‘thresholds’ - why it leads to improvement. If someone’s program is 10 hours of Ironman pace weekly and they switch to 2 hours of threshold plus 8 hours easy, they may well get worse at Ironman - both because of reduced event-specific durability training and potentially a lower training load as well. The Norwegian method is designed for people with lots of time who are not in a race-specific phase.

Mostly saved, in case you edit in the future.

The more you post…the more I like you. :slight_smile:

“Why are you looking for your keys here? Because, this is where the light is!”

As I was writing that post, I was searching for that streetlight analogy. Couldn’t find it. lol.