Sorry if this duplicates topics that have been hammered here before, but I have some specific points of clarification that cut across several of them.
I recently moved to Texas from the Northeast in April. Temperatures in the six months before the move were low: highs in the 30s in December, January, and February, then in the 40s and 50s in March and April.
Since moving to Texas, it's been mid-90s into the 100s a lot, especially when accounting for the heat index. As such, my HR is understandably higher than it was in the cooler climate (roughly 20 BPM higher for the same effort).
Pivoting for a second to Lionel's recent video about leaving Tucson for Flagstaff: he talked about heat being a primary factor and not being able to train appropriately in it.
Now back to the question: is HR the source of truth for training zones, or is power/pace? If my Z2 pace in a cooler climate was 7:00 per mile, and now I need to run closer to 9:00 per mile to hit that same HR, which one is correctly hitting my training stimulus?

The data I've seen feels conflicting. On one hand, "HR is too erratic and not trustworthy"; on the other, HR is used to set training zones in the first place, and many Zone 2 advocates warn about people setting their Z2 pace too high and missing the proper aerobic adaptations. This extends to cardiac drift: as I run farther at a given pace and my HR drifts upward, am I leaving the aerobic zone, and should long runs therefore be decelerations? Or should I use zones set in good conditions to guide my training and ignore HR when it's hot?