I’ve been playing with my Labradar data similarly. Brian Litz impressed on me that long-range accuracy is significantly influenced by BC variation, and I have a Labradar that can measure it. So I spent a lot of time calculating BC from the series summary data and the five downrange velocities recorded per shot. The BC seemed to vary a lot, much more than the vertical spread on target would imply, and after some deeper digging I think it’s a data issue much more than actual shot-to-shot BC variance.
The deeper dive revealed that Labradar uses very lazy math to calculate the downrange velocities. Each shot is recorded as a series of distances sampled at regular time intervals. A downrange velocity is simply (d2 - d1) / (t2 - t1), where 1 and 2 are two points collected near the chosen distance. Distance is the measured quantity, so measurement error shows up as variation in the distance values. When two consecutive distances in the shot data are accurate, the computed velocity is accurate; as error creeps into the distances, it causes the velocity data to wander high and low. The velocity errors build rapidly because the time intervals are so tight. The Labradar series summary shows you those downrange velocities without any indication of whether each one is a solid value or a snip of the noisy garbage. So if you pull up the data for an individual shot, you’ll see the more distant downrange velocity data jumping up and down around the actual values.
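To see why tight time intervals amplify ranging noise, here is a minimal sketch with made-up numbers (this is not Labradar's actual code; the 0.01 s interval and 0.05 ft ranging error are assumptions for illustration):

```python
import numpy as np

# Illustrative sketch: how small ranging errors blow up when velocity
# is taken as (d2 - d1) / (t2 - t1) over tight time intervals.
rng = np.random.default_rng(0)

dt = 0.01                     # sample interval in seconds (assumed)
t = np.arange(0.0, 0.5, dt)   # 50 samples over half a second
true_v = 900.0                # fps, held constant for simplicity
true_d = true_v * t           # ideal distance track, feet

noise = rng.normal(0.0, 0.05, size=t.size)  # ~0.05 ft ranging error (assumed)
d = true_d + noise

# Finite difference between consecutive samples, the "lazy math"
v = np.diff(d) / dt

# A 0.05 ft distance error over a 0.01 s interval is already a 5 fps
# swing, and errors on consecutive samples can add.
print(np.abs(v - true_v).max())
```

With noiseless distances the same difference quotient recovers the velocity exactly; the noise alone drives the swings.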
We have two choices: drop the noisy data and play with shorter ranges, or calculate the velocities using a better data-fitting model. The SNR data did not correlate crisply with my perception of where the velocities went unusable.
It didn’t take me long to move from the former to the latter. In Excel I calculate a fifth-order polynomial fit of the raw distance-vs.-time data, then get velocity vs. time by taking the derivative of the distance function. One limitation is that I can process only one shot at a time this way; processing a series takes maybe 20 seconds of copy-pasting per shot.
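The same fit-then-differentiate step can be sketched outside Excel. The times and distances below are synthetic stand-ins for one shot's track (the coefficients are invented, not real shot data):

```python
import numpy as np

# Fit distance-vs-time with a 5th-order polynomial, then differentiate
# the fitted polynomial to get a smooth velocity-vs-time function.
rng = np.random.default_rng(1)

t = np.linspace(0.0, 0.35, 36)               # sample times, s (assumed)
d = 900*t - 180*t**2 + 40*t**3               # plausible decelerating track (assumed)
d = d + rng.normal(0.0, 0.05, size=t.size)   # ranging noise, ft (assumed)

coeffs = np.polyfit(t, d, 5)     # distance(t) as a degree-5 polynomial
dist_fn = np.poly1d(coeffs)
vel_fn = dist_fn.deriv()         # velocity(t) = d'(t)

# The derivative of the fit is smooth, unlike sample-to-sample differences
print(float(vel_fn(0.1)))        # velocity estimate at t = 0.1 s, fps
```

Because differentiation happens on the fitted curve rather than on raw sample pairs, the point-to-point noise never reaches the velocity estimate.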
Explanation: Magenta = raw shot data. I copy and paste these columns to process an individual shot.
Yellow: some controls that set the bounds for the data pulled by the following functions
Green: My Distance vs. time and Velocity vs. time functions. Green column is populated using the velocity vs. time function.
Blue: Since I want to know velocity at specific distances, I calculated a 5th order poly of the velocity vs. distance data. (It would have been very tedious to figure out how to traverse my velocity data and interpolate between the two adjacent points of my velocity data.) From the velocity vs. distance function I pull velocity at 0, 20, 40, 60, 80, and 100y.
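The blue step, refitting velocity against distance so it can be read off at round yardages, can be sketched the same way. The d(t) and v(t) arrays below are consistent stand-ins for the fitted columns in the sheet, not real shot data:

```python
import numpy as np

# Refit velocity against distance, then evaluate at round yardages.
t = np.linspace(0.0, 0.37, 38)       # sample times, s (assumed)
d_yd = (900*t - 180*t**2) / 3.0      # fitted distance, converted ft -> yd (assumed)
v = 900 - 360*t                      # matching fitted velocity, fps (assumed)

coeffs = np.polyfit(d_yd, v, 5)      # velocity(distance) as a degree-5 polynomial
v_of_d = np.poly1d(coeffs)

# Pull velocity at the round distances of interest
for x in (0, 20, 40, 60, 80, 100):
    print(x, float(v_of_d(x)))
```

This sidesteps walking the velocity table and interpolating between adjacent points, as the post notes: one polynomial evaluation gives velocity at any distance inside the measured range.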
Orange: note the Labradar velocity going wonky after 0.4 seconds. This one appears to be from me shooting at a 100y target and the LR somehow recorded data after the pellet passed through. In most cases, the velocity flip-flops from high to low and gets really noisy. Looking at the velocity plot shows you quite clearly at what distance the Labradar data loses validity, and each shot is different.
My next step is to play around with the BC math posted here to try to calculate BC for the shot from my fitted velocity data. I had been doing that using ChairGun and it would be great to include it in my Excel processing.
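One common shape for that BC calculation (not necessarily the math posted in this thread) treats velocity as decaying roughly exponentially with distance under a constant drag coefficient, and takes BC as inversely proportional to the fitted decay rate. K_REF below is a placeholder that would come from the chosen drag model (e.g. the reference curve ChairGun uses); it is an assumption, not a published constant:

```python
import math

def decay_constant(v0, v1, dx_yd):
    """Per-yard velocity decay rate k from v(x) = v0 * exp(-k * x)."""
    return math.log(v0 / v1) / dx_yd

# Placeholder reference decay rate for the standard projectile (assumed,
# must be taken from your drag model, not from this sketch).
K_REF = 0.0005

# Fitted velocities at 0 and 100 yd from the polynomial step above
k = decay_constant(900.0, 850.0, 100.0)
bc = K_REF / k   # BC relative to the reference projectile's decay rate
```

The decay-constant math is exact under the exponential model; everything hinging on K_REF is drag-model-dependent and would need to match whatever model ChairGun (or the thread's posted math) assumes.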