Alrighty folks, this is just a friendly reminder that the title of the thread is "Energy consumption", not "Chip coverage".
Yesterday I drove back and forth from Stanford to Walnut Creek, almost identical trips. See screenshot.
Bottom line: I also see a mismatch between the percentage of battery used and the consumption reported for the 'journey' [kWh/100 miles], which results in considerably lower estimates of usable battery capacity: for both legs of the trip the implied capacity was ca. 74 kWh.
If I assume 84 kWh capacity and take the SoC difference at face value, I get ca. 50 kWh/100 miles. That's a lot worse than the reported 36 kWh/100 miles.
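To make the arithmetic concrete, here's a quick back-of-envelope sketch in Python. The SoC drop and the reported rate are from the trip data above; the per-leg distance is my own assumption (Stanford to Walnut Creek is roughly 60 miles one way), so the exact outputs will shift with the true mileage:

```python
# Back-of-envelope check of the trip numbers.
miles = 60.0           # ASSUMED one-way distance, Stanford <-> Walnut Creek
soc_used = 0.30        # ~29-30% of the pack used per leg
reported_rate = 36.0   # kWh/100 miles, as displayed by the car

# If the reported rate is right, the energy used per leg is:
energy_used = reported_rate * miles / 100        # kWh
# ...and the usable capacity implied by the SoC drop is:
implied_capacity = energy_used / soc_used        # kWh (~72 with these inputs)

# Flip it around: if the pack really holds ~84 kWh, the same SoC drop
# implies a higher real-world consumption rate than the car reports:
assumed_capacity = 84.0
actual_rate = assumed_capacity * soc_used / miles * 100   # kWh/100 miles

print(f"implied capacity: {implied_capacity:.0f} kWh")
print(f"rate assuming 84 kWh: {actual_rate:.0f} kWh/100 miles")
```

With my assumed distance this lands in the same ballpark as the figures above (implied capacity in the low 70s, and a real rate well above the reported 36); the precise numbers depend on the actual per-leg mileage.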
The end of the first 'journey' (which is also the start of the second) was at 50% SoC; the percentage used on each leg was 29 or 30. The kWh/100 miles figures are also very close. In other words, none of the estimates seem to depend on higher (first journey) vs. lower (second journey) SoC, so nothing suggests the SoC estimate itself is off.
This is puzzling. My best guess is that the SoC estimate is pretty good and that the kWh/100 miles figure is off on the optimistic side. Evidence in favor: on both trips, about 80% of the miles were driven at between 65 and 80 mph. I can't believe the average consumption was 36 kWh/100 miles. That's almost exactly my long-term average, which is dominated by commute miles. The calculated ca. 50 kWh/100 miles is much more realistic for yesterday's trip.
I believe that JLR's calculation of the average rate of consumption makes a rookie mistake: averaging the rate as a function of time instead of miles driven. Additional evidence for this hypothesis: the real-time consumption estimate [kWh/100 miles] drops rapidly after driving many miles at high speed and then just a few miles at lower speeds. I will put that to an actual test sometime this weekend.
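To illustrate the suspected mistake, here's a toy sketch (not JLR's actual code, and the trip segments are made up) of a drive with many fast miles followed by a few slow ones. Time-averaging the instantaneous rate overweights the slow, frugal minutes and comes out optimistic compared with the correct energy-per-distance figure:

```python
# Hypothetical two-segment trip: (speed mph, miles, rate kWh/100 miles)
segments = [
    (75.0, 50.0, 55.0),   # highway leg: fast and thirsty
    (25.0,  5.0, 20.0),   # surface streets: slow and frugal
]

total_kwh = sum(miles * rate / 100 for _, miles, rate in segments)
total_miles = sum(miles for _, miles, _ in segments)
total_hours = sum(miles / speed for speed, miles, _ in segments)

# Correct average: total energy divided by total distance
dist_avg = total_kwh / total_miles * 100

# Rookie mistake: time-weighted average of the instantaneous rate
time_avg = sum((miles / speed) * rate for speed, miles, rate in segments) / total_hours

print(f"distance-weighted: {dist_avg:.1f} kWh/100 miles")
print(f"time-weighted:     {time_avg:.1f} kWh/100 miles")
```

With these made-up numbers the time-weighted figure comes out several kWh/100 miles lower than the true distance-weighted one, and the gap grows the longer you idle along at low speed after a fast stretch, which matches the rapid drop in the displayed estimate described above.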