It is well known that when setting up a chronograph, to minimise errors the axis of the bore should ideally be parallel to the chronograph sensor axis in both the vertical plane (cant up or down) and the horizontal plane (rotational slew or swivel).
However, just how accurate does the alignment of the chrony and bore axis need to be?
Not that anyone would ever set up a chronograph with a 30 or 45 degree misalignment, but how many would be quite satisfied if a rough eye check showed the chronograph out of alignment by, say, 5-10 degrees?
Well, the answer might be somewhat of a surprise!
Note: if a Chrony chronograph is misaligned, the inherent errors can only increase or decrease the apparent velocity readings, because the relative distance over which the projectile passes between the sensors is shortened or lengthened.
The sensor spacing itself cannot be changed (which is what would actually decrease or increase the displayed velocities), unless of course the actual sensors moved, which is impossible with a Chrony (without shooting it!).


Using a Chrony as an example, it is basically an electronic clock with two fixed sensors spaced 12 inches (1 foot) apart.
A projectile passing over the closest sensor starts the inbuilt timer; upon passing over the furthest sensor the timer is stopped, and the electronics then calculate and display the average velocity of the projectile over the distance between the two sensors.
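As a rough sketch of the calculation involved (the 12 inch spacing is the Chrony's, but the function name and the example timing below are just illustrative assumptions, not the actual firmware):

    # Minimal sketch of a screen-type chronograph's velocity calculation.
    # The spacing is the Chrony's 12 inches; the timing value is invented for the example.
    SENSOR_SPACING_FT = 1.0  # 12 inches = 1 foot between the two sensors

    def displayed_velocity_fps(elapsed_seconds):
        """Average velocity over the screens = sensor spacing / elapsed time."""
        return SENSOR_SPACING_FT / elapsed_seconds

    # A bullet taking 1/3000 of a second (about 333 microseconds) between screens
    # is displayed as 3000 fps.
    print(displayed_velocity_fps(1 / 3000))  # -> 3000.0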
If the axes of the chrony and the projectile path are parallel, then theoretically the velocity displayed should be accurate to within the factory specifications.
According to Chrony: "Accuracy: 99.5% or better. Displayed velocity will not differ from actual velocity by more than 1 part in 200, i.e., ±10 fps on a velocity reading of 2000 fps."
That in turn corresponds to potential errors of approximately ±15 fps at 3000 fps or ±20 fps at 4000 fps.
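Checking that arithmetic (the ±0.5 percent figure is simply the quoted 1-part-in-200 spec, applied here to a few example velocities):

    # Spec error of 1 part in 200 (0.5 percent) applied to some example velocities.
    SPEC_ERROR = 1 / 200  # per the quoted Chrony accuracy figure

    for velocity_fps in (2000, 3000, 4000):
        print(f"{velocity_fps} fps -> +/- {velocity_fps * SPEC_ERROR:.0f} fps")
    # 2000 fps -> +/- 10 fps
    # 3000 fps -> +/- 15 fps
    # 4000 fps -> +/- 20 fps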
So let's assume the chronograph's horizontal axis is tilted up or down by 10 degrees.
For this inclination the effective distance between the sensors, relative to the flight path of the projectile, is no longer 12 inches (1 foot) but is shortened or lengthened to 11.818 inches (12 × cos 10°) or 12.185 inches (12 / cos 10°).
This change in distance shifts the displayed velocity by roughly ±1.5 percent.
A 1.5 percent error equates to around 45 fps at 3000 fps (more precisely, a 1.5427 percent error, or 46.28 fps).
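A quick worked version of those figures (this is just the cosine geometry described above, with the 10 degree angle and 3000 fps taken as the example values):

    import math

    SPACING_IN = 12.0      # nominal sensor spacing, inches
    TILT_DEG = 10.0        # assumed misalignment angle
    VELOCITY_FPS = 3000.0  # example velocity

    tilt = math.radians(TILT_DEG)

    # Effective spacing relative to the flight path: the projected (shortened)
    # distance, or the actual (lengthened) path flown between the two screens.
    projected_in = SPACING_IN * math.cos(tilt)    # ~11.818 inches
    flight_path_in = SPACING_IN / math.cos(tilt)  # ~12.185 inches

    error_fraction = 1 / math.cos(tilt) - 1       # ~0.0154, i.e. about 1.5 percent
    error_fps = VELOCITY_FPS * error_fraction     # ~46.3 fps at 3000 fps

    print(projected_in, flight_path_in, error_fraction * 100, error_fps)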
If the chronograph axis were parallel to the projectile path vertically but slewed either way by 10 degrees horizontally, exactly the same induced error would apply, i.e. about 1.5 percent, or around 45 fps at 3000 fps.
What would the theoretical error be if both axes of chrony alignment were out by 10 degrees (up/down tilt and slew either left or right)?
According to the calculator (and a fair amount of guesswork and luck), the potential error is approximately 3 percent, which equates to around 90 fps at 3000 fps.
(That, by the way, matches the combination of the individual tilt and slew errors for a 10 degree misalignment, since the two cosine factors multiply: 1 / (cos 10° × cos 10°) ≈ 1.031.)
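The same geometry extended to both axes at once (this assumes the tilt and slew path-length factors simply multiply, which is what the figures above suggest):

    import math

    VELOCITY_FPS = 3000.0
    tilt = math.radians(10.0)  # up/down misalignment
    slew = math.radians(10.0)  # left/right misalignment

    # Combined path-length factor between the screens: 1 / (cos(tilt) * cos(slew))
    combined_error = 1 / (math.cos(tilt) * math.cos(slew)) - 1

    print(f"{combined_error * 100:.2f} percent")             # ~3.11 percent
    print(f"{VELOCITY_FPS * combined_error:.1f} fps error")  # ~93.3 fps at 3000 fps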
So it seems that a little extra care in aligning the axis of the bore with both axes of the chronograph is worth the effort, if minimising errors and obtaining accurate, meaningful readings are your priority.
Not forgetting, of course, that the total chronograph error must also include an allowance for the inherent electronic tolerance error in addition to any misalignment errors.
cheers
dave g