Comparison of All Test Optics: Test #3- Medium Range Positional Shooting

This test attempted to measure how friendly each scope was to shooting from multiple positions under a serious time crunch.  The target was the standard ~4.2” target I’ve been using, placed at 25 yards.  The drill is as follows: begin with a magazine of 5 rounds.  Fire 5 shots standing, then reload with a magazine of 10 rounds.  Fire 5 rounds from kneeling, then 5 rounds from prone.  A passing score requires all hits in 15 seconds or less.

Hits were measured as hits in the 5 zone, which is the primary target on the paper.  A hit is defined as more than half the bullet hole being in the scoring ring.  The target is printed on an 8.5” x 11” sheet of printer paper.  Here is the target:

New Target

I spoke on accuracy and reliability of test results in the last post.  This test turned out to be essentially invalid.  I don’t believe that the problem was in the accuracy of the testing methodology.  I think the test has the potential to measure what I intended to measure.

I believe the problem is that my consistency in solving this shooting problem was too erratic.  We all know that equipment is secondary to skill, within reason.  Equipment does make a difference, especially when there is a great disparity between one item and the next.  But when the equipment is comparable in class, the differences will be subtle.  If the shooter’s consistency isn’t well within that margin, it won’t be possible to see how the equipment affected the differences in performance.  That’s what happened here.

Here’s what I can say about what happened over the course of the testing:

I got faster.  That is good.

Test 3 Average Time

My hit rate went rather steadily down.  That is bad:

Test 3 Hit Rate

In the balance of speed and accuracy, my performance over time degraded.  That is bad:

Test 3 Raw Points Per Second

I was not great in the extreme close range type of shooting, but I was at least consistent enough to make a comparison.  Tests 4 and 5, in the 100-400 yard venue, played to my strengths.  This test fell in a style between the others.  The subconscious lesson that ‘worked’ for me in test #2 also affected my shooting in this test, but the context was so different as to produce a drastically different result.  What saves your life on Sunday will get you killed on Monday.

In order to get better at this test I would first use video to analyze where my gunhandling was wasting time.  Then I would reduce the distance (increasing the apparent target size) until my hit rate was 100% at the par time.  Then I would gradually extend the distance.  The idea is to begin where I actually am and steadily improve, pushing the edge of my envelope.

That is all.  Thanks for reading.


Comparison of All Test Optics: Test #2- Close Range Transitions

Links to Individual Optic Test Results:

U.S. Optics SR-8c
Swarovski Z6i
U.S. Optics SR-4c
SWFA SS HD 1-6×24

Thoughts on the Drill Itself

I’ve come to refer to this drill as the X-Box.  As I described before, this test uses directional transitions to test the friendliness of the optic to target acquisition.  Two target stands are placed approximately 7 yards apart.  Each target stand is 8′ tall and has a target at the bottom and a target near the top.  The shooter stands equidistant from each stand so that they are both approximately 10 yards from the shooter.


Although the distance of this drill is not all that different from test #1, which is at 7 yards, the drill itself is qualitatively quite different.  While test #1 occurs over a very short duration, is a very simple shooting task, and puts 100% of the score into the outcome of 1 shot, the X-Box drill is more complex (I would call it ‘moderate’).  Test #2 occurs over a longer duration with more shots, which lets the shooter leave behind the mental game of waiting on the timer, pre-planning, and so forth.  Since the drill has more rounds it is less sensitive to mistakes, and it provides a bigger picture of performance.

I would have liked to have the targets presented in a perfect square, but 21′ target stands are not realistic, and it was important to me to have a wide lateral span between targets to traverse.  I wanted the transition to be significant, and not simply a matter of picking up something that was already in my scope.  It is obvious that the up and down transitions were faster due to the much shorter distance between targets, approximately 6′ versus approximately 21′.  Shot #4, the first of the up/down transitions, was always the point in the drill when it felt like ‘it’ took over shooting for me and I just tried to stay out of the way.  The same phenomenon is probably what caused me to forget where I was going next in about 1 out of 4 runs.

Average Split Times:

Test 2 Average Split Times

What Makes an Optic do Better in This Drill?

I came into this drill expecting field of view to make a big difference.  Here are the published numbers for field of view at the lowest power setting of each optic from the manufacturer’s websites:

SR-8c     83.25′
Z6i       127.5′
SR-4c     110′
SWFA      95′

I could not find a number for the Aimpoint.

I came out of the tests with a different theory, which I have already discussed a bit in the overview of the SWFA scope.  With an unmagnified optic, and shooting with both eyes open, field of view inside the scope is not exactly what is important.  The total field of view is what matters.  Essentially, total field of view is everything in the shooter’s forward view minus whatever the rifle and the non-see-through parts of the scope obstruct.  I initially expressed that as: vision = field of view inside the scope + field of view outside the scope − the field of view obstructed by the rifle and scope.  Since then I’ve learned something about reticles and have to conclude that the reticle, although necessary, is essentially obstructive in nature.  That makes it necessary to amend the formula to something like this: vision = (field of view inside the scope − that which is obstructed by the reticle) + field of view outside the scope − the field of view obstructed by the rifle and scope.
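The amended formula can be sketched numerically.  To be clear, every number below is a made-up placeholder for illustration, not a measured value; I have no way to actually measure these areas:

```python
# Sketch of the amended total-field-of-view formula. All quantities are
# in arbitrary 'view area' units, and all of the example numbers are
# hypothetical placeholders, not measurements.

def total_vision(fov_in_scope, reticle_obstruction,
                 fov_outside_scope, rifle_and_scope_obstruction):
    """vision = (FOV inside scope - reticle obstruction)
                + FOV outside scope
                - FOV obstructed by the rifle and scope body"""
    return ((fov_in_scope - reticle_obstruction)
            + fov_outside_scope
            - rifle_and_scope_obstruction)

# Hypothetical example: a modest in-scope view, a thin reticle, a wide
# both-eyes-open view outside the tube, and the rifle/scope blocking some.
print(total_vision(fov_in_scope=10, reticle_obstruction=1,
                   fov_outside_scope=80, rifle_and_scope_obstruction=15))
# 74
```

The point of writing it this way is that a thicker reticle or a bulkier scope body subtracts from the total even when the published in-scope field of view is generous.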

In light of my theory, and speaking of field of view, I should note that I did not shoot this course of fire with any of the variable power optics at the lowest power setting.  I have already noted several times that at the lowest power setting the image size appeared to be smaller than what I see with the naked eye.  Therefore the optics were “turned up” a bit, in each case somewhere between 1 and 2.  The SR-8c had to be turned up to about 1.6x, the SR-4c barely at all, and the two 1-6x scopes in between those extremes.

In terms of total field of view, I would rank them as follows, as viewed with a perfect sight picture, from best to worst: Z6i, SR-8c, SR-4c, T1, and SWFA.  I actually referred to through-the-scope photos to come up with that, although it is still based on my impression.  I was surprised to see that the SR-8c was actually better than the SR-4c in that respect, but the SR-4c had a thicker ‘flange’ of material between the in-scope image and the view outside of the scope.  What likely led to that confusion was performance in another key area: ease of eyebox acquisition.

The eyebox is the area that the eye needs to be in to see a perfect edge to edge image through the scope.  This varies quite a bit between optics, and usually between power settings within any single variable power optic.  It is probably as important a quality as any for being able to transition quickly and accurately.

Ideally, the eye discovers and acquires a target, and the rifle is presented so that the scope image (via the eyebox) arrives at the eye, hopefully presenting an acceptable sight picture.  The tighter the eyebox, the more exacting that procedure must be, and therefore perhaps the slower and more difficult.  The better trained the shooter is to acquire a consistent cheekweld, the less of an issue this will likely be, but no one is perfect all the time, and not all positions are as easy as others.  I cannot rate the eyeboxes of the test scopes objectively at this time, only from memory.  I would rate them as follows, from best to worst: T1, Z6i, SR-4c, SWFA, and SR-8c, with the last two being close to a tie.  None of them were bad at all, but the T1, being unmagnified, really didn’t have an ‘eyebox’ per se, and the Z6i is incredible.  The SR-4c is quite good as well.

Accuracy and Reliability of the Test

So I went shooting with different optics and kept good track of how I did.  How does that make for a meaningful evaluation of an optic?  By itself, it doesn’t.

Like rifle shooting, a good test should be accurate, meaning that it should shed light on a particular aspect of performance of the tester’s choosing.  Since this drill involved transitions, it should be a good indicator of how the optic affected the shooter’s ability to perform transitions with that optic.

Another important quality in a test is reliability of the data.  How much confidence can be placed in the findings?  I have already pointed out that it would have been a lot better with a large number of shooters doing the same drill, as this would have been more likely to provide results useful for the average shooter.  In lieu of having a large number of shooters, I had to rely on my own consistency in shooting and ability to analyze what I experience.  In this test there was a consistency issue, but I was very aware of it as it occurred.

The Beginning and the End of My Time on the Plateau

This test involved hitting a plateau in the middle of the testing and moving out of it just at the end.  The first optic I tested in this drill coincided with the first time I shot the drill.  In hindsight that was not a great plan, but I figured at the time that the only times I would shoot the drill would be in testing.  I felt that the infrequency of shooting this drill would preclude an increase in skill.  I was wrong.

You may recall that in the Z6i test I remarked, “On runs 1 and 2 I turned in good times, but nothing out of the ordinary. Run 3 felt normal but was significantly faster for me. On run 4 I could feel that I was moving at a comparatively smoking hot pace…”  This was an instance of learning taking place.  That individual test affected the rest of the optics in this test through the T1, which was the second to last.

I left the plateau the day that I re-tested the SR-8c.  I believe this can be explained by the fact that this was the last optic in the test that I had to do ‘work’ on and collect numbers for.  As this process was new to me, and at that point I had a lot of numbers I wasn’t sure how best to interpret, it was no small relief to be done with the actual tests.  I was beginning to relax.  You can actually see that in the results from Test #1, where my hit rate came in a little lower than normal.  Since the tests are qualitatively different, they demanded different levels of focus to shoot well (intensity vs. open focus), and I think that is what allowed me to do so much better in the SR-8c retest in this instance.

Plateau Graph


Average total time

In the previous drill I think that the time was a better indicator than the hits.  In this drill I don’t think that I can assign more value to either, but will still present each as an average total time for a single run with each optic.

Test 2 Average Total Time

Taking the “plateau graph” into consideration when looking at this graph, I think what can be clearly stated is that the Aimpoint was fast and the SWFA slow in comparison to the others.  I would also speculate that since the Z6i was basically on the verge of the plateau as I entered it, it would be running closer in time to the SR-4c, but I don’t know if it would match or surpass it.

The SR-8c is an interesting case, as it came in last in the initial test and a close second in the retest.  I would say it would probably run a close third behind the Z6i and SR-4c.

On Paper:

SR-8 X Drill

Total x-box SR-4c

Test 2 Targets

Aimpoint T1 Micro

Test 2 Aimpoint

U.S. Optics SR-8c Retest

SR-8c Recap Test 2

Hit Rate

Test 2 Hit Rate

The graphs are interesting to me, because they make it much easier to see things I was not all that aware of before, even though I had the results on a spreadsheet.  The adaptation I referenced earlier with respect to my shooting in this drill pertained mostly to speed.  It would also seem clear that this graph somewhat coincides with the “plateau graph” above, the notable exceptions being the T1 and SR-4c, which were among the fastest.  With the T1 especially, I believe that the relative inability to see detail hampered my efforts in this drill.

Remember that I was puzzled by the overall lackluster performance of the Swarovski Z6i in Test #1.  After a lot of guessing, I made a comment, “The other thing I have wondered is if I could just see better and it caused me to be more discriminating in my decision to fire.”  I think that is close, but not quite accurate.  It would be more accurate to say that each shooting problem demands a certain amount of visual information.  It might seem like more is always better in terms of the speed of observation to action (Boyd’s cycle would be a better way to understand that process), but I don’t think that is the case.  As with many other things, I think that precisely enough information makes for the most streamlined action cycle that still arrives at an appropriate solution.  Even in test #1, while the T1 made a very respectable showing as far as hit rates, the hits were not all of as high a quality as with other optics in terms of points, which took the quality of the hits into account.  As the distance is increased and the complexity of the problem made “deeper,” the requirement for information is intensified.  I believe this is why the T1 began to show its deficiencies in comparison to the other optics.

Remember that the SWFA scope was given an accuracy boost due to my taking part in some training in the days preceding these tests with that scope.  The training addressed similar shooting problems as these.  It could not be helped, and I mention it only to aid your interpretation of the results.

I should point out the obvious in saying that the ~4.2” targets are smaller than necessary for most people’s requirements for hitting the heart/lung vital zone of a large animal.  If that is the case for you it would probably be safe to double the distances I worked at.

Average Total Points

Test 2 Total Average Points Per String

The points measure is similar to the hit ratio, except that it places more weight on a center hit than an edge hit.  It also awards 1 point for hits in the larger circle outside the primary target.  Note that a single 1-point shot drags the score down enough that it takes several hits nearer the center to make up for it.  The red dashed line in the graph represents the score if all nine shots scored the minimum for a hit, which is 5 points per shot.
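As a sketch of how that scoring works: the 5-point minimum hit and the 1-point outer circle come from the target as described; the 10-point center value in the example string below is my assumption, purely for illustration:

```python
# Sketch of the string scoring. The 5-point minimum hit and 1-point
# outer circle match the target description; the 10-point center value
# in the example is a hypothetical assumption.

def score_string(shots, string_length=9, min_hit_points=5):
    """Sum a string's points and compute the 'red dashed line': the
    score if every shot were a minimum 5-point hit."""
    total = sum(shots)
    dashed_line = string_length * min_hit_points  # 9 shots x 5 pts = 45
    return total, dashed_line

# Hypothetical 9-shot string: seven minimum hits, one assumed 10-point
# center hit, and one 1-point shot on the outer circle.
total, dashed = score_string([5, 5, 5, 5, 5, 5, 5, 10, 1])
print(total, dashed)  # 46 45
```

Notice that the one 1-point shot costs 4 points relative to the minimum hit, while the center hit only gains 5 back, which is why a single dropped shot is so hard to recover from.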

Hits Per Second

Test 2 Hits per second

Finally it becomes apparent how good my last runs with the SR-8c were in comparison to the others.  I think the graph illustrates how efficient the U.S. Optics scopes were as performers.  Taking my “plateau graph” into account, I would say that the Z6i was right up there as well.  I already remarked in the SWFA individual test results on how the reticle was overwhelming my ability to receive the information I needed.  With the Aimpoint I just couldn’t see as well.

Points Per Second

Test 2 Points Per Second

In this measure the SR-4c again shows its dominance.  It just had some better hits, although its hit ratio was not as high as the SR-8c retest.


In looking at the totality of the tests, if I had to actually put money down on the optimum optic for this application, multiple targets at relatively close range, I would say that the Swarovski Z6i probably has the best balance of attributes to allow the shooter to work.  I think that the two U.S. Optics scopes allow very close performance to the Z6i, with the SR-4c having a slight edge over the SR-8c.

Comparison of All Test Optics: Test #1

Links to individual test results:

U.S. Optics SR-8c, 1-8×27
Swarovski Z6i, 1-6×24
U.S. Optics SR-4c, 1-4×22
SWFA SS HD, 1-6×24

The idea for this test was simple: how fast can I bring the rifle up, get a sight picture, and fire one shot at a close range target?  In this case the distance was 7 yards and the target size approximately 4.2”.

I worked at this skill prior to beginning the testing, to the point that I felt I would not endanger the validity of the testing process by doing something as egregious as improving my performance over the course of the tests.  I don’t believe the process was perfect, as one error over the course of 20 shots would affect my hit rate by 5%.  I don’t believe the speed was affected by variables as much as the accuracy was, as a few times I could just tell that I messed up and missed.
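The 5% figure follows directly from the string length; with 20 scored shots, each individual hit or miss moves the rate by 1/20:

```python
# With a 20-shot string, each shot is worth 1/20 = 5% of the hit rate,
# so a single error shifts the result by a full five percentage points.

SHOTS = 20
per_shot = 100 / SHOTS
print(per_shot)            # 5.0 percent per shot
print((19 / SHOTS) * 100)  # one miss drops a clean run to 95.0 percent
```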


What I find striking is the similarity in speed between the Aimpoint T1 and the SR-4c.  Conventional wisdom says that these variable 1x scopes are almost as fast as a non-magnified red dot.  As far as speed they were basically indistinguishable.  This trend continued beyond this test and wasn’t confined to speed.

I’m also incredulous at the Z6i being the slowest.  I have doubts about what caused it, because I can’t understand why an optic with such a clear, bright image, wide field of view, minimalistic and out of the way reticle, daylight bright illumination, and generous eyebox could turn in such low numbers.  The thing is, I can’t find an explanation for it on my end.

I do wonder whether some aspect of the Z6i made it slower for me in this test.  Note that both of the slowest scopes, the Z6i and the SWFA SS HD, were those that I found to have the clearest, brightest images.  The Swarovski really is just amazing to look through.  I would not be far off to say that it looks better than real life.  I wonder if it’s just a little too much to adapt to in a small amount of time.  The Aimpoint is much faster, and really isn’t impressive at all optically.  When it comes down to it though, I still simply cannot accept that the Z6i could really be that much slower.  I would need to do more work with it to believe what the results I have are showing.  The other thing I have wondered is if I could just see better and it caused me to be more discriminating in my decision to fire.

Test 1 fastest slowest

Looking at the fastest and slowest times with each optic, I think what you are seeing is a graphic estimation of my personal maximum deviation for this test.  What I do see again, however, is that the Aimpoint and SR-4c stand out from the rest and must have offered something that allowed me to gain about a tenth of a second over all the others.  As for the slowest, the 1.36 of the SWFA was the 20th rep in a perfect run.  I didn’t want to ruin it.

On Paper:

7 Yard Snapshots- resized
SWFA SS HD 1-6×24:


Aimpoint T1 Micro:

Test 1 Aimpoint

SR-8c Retest:

SR-8c Test 1 recap

I probably need to clarify again what I count as a hit in a given scoring zone, as different people have different methodologies.  In my mind, if the tip of the bullet misses the target it is not a hit, but a grazing shot.  On the other hand, if the tip is inside the scoring zone I count it.  Therefore, if over half of the bullet hole is in the scoring ring I count it as a hit.  If I couldn’t tell I give the benefit of the doubt to the shooter (me), which I think is the best policy in general.
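For a hole much smaller than the ring, the half-the-hole rule works out to asking whether the center of the bullet hole lies inside the scoring ring.  A sketch of that check, where the 0.224” hole diameter is my assumption for .223/5.56 and the example coordinates are hypothetical:

```python
import math

# Sketch of the scoring rule: a hit counts when more than half of the
# bullet hole is inside the ring, which for a hole much smaller than
# the ring is equivalent to the hole's center being inside the ring.
# The 0.224" hole diameter is an assumption for .223/5.56 ammunition.

RING_DIAMETER = 4.2     # inches, the ~4.2" primary target
HOLE_DIAMETER = 0.224   # inches, assumed bullet hole size

def is_hit(x, y):
    """(x, y) is the hole center relative to the ring center, inches."""
    return math.hypot(x, y) <= RING_DIAMETER / 2

print(is_hit(0.0, 0.0))  # center punch -> True
print(is_hit(2.0, 0.5))  # ~2.06" out, inside the 2.1" radius -> True
print(is_hit(2.0, 1.0))  # ~2.24" out -> False, a graze at best
```

Borderline holes, where the center sits almost exactly on the ring line, are exactly the cases where I gave the benefit of the doubt to the shooter.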

Looking at the targets, it comes to my mind that the Swarovski got robbed of a perfect score by my trigger finger.  I didn’t take a lot of notes about the tests, so when I see that phrase “I had one wild miss due to a trigger control mishap,” I can’t help but feel as though I let it down.  That illustrates the big weakness with this test, which is me.

As I look at each target, I interpret the wide misses, such as the Z6i, the top hit on the SR-4c target, and the same type of hit on the SR-8c retest, as gross errors on my part.  I haven’t massaged the numbers to reflect this interpretation, but I just want to point it out.  I don’t know what to say about the other misses, so I just accept them.  I probably just thought I saw an acceptable sight picture and went for it.  The mistake could have come from either part of that process (the seeing or the acting).

To keep you from having to count misses and points, here are some graphs for you:

Hit Rate

The best performers in terms of hit rate were the slowest ones.  That shouldn’t be surprising.  It’s possible I just took the time I needed to get better hits.  The SWFA did have the benefit of some extra practice on my part in the week leading up to the test.  I had expected a higher hit ratio than normal, although I didn’t expect a perfect one, and I expected to be faster, which I wasn’t.  There wasn’t much I could do about the extra week of shooting prior to the test.  My schedule just worked out that way.

Total Points

The points tell a similar story, although not the same, as points reward center hits as ‘better’ than edge hits.  The target also awards 1 point in an ~8″ ring, in deference to the universality of the paper plate as a target (alternately read as a body shot).  Note that due to the slightly offset group of the SWFA, the Z6i still beat it in points, as 95% of the Z6i’s group was better centered in the sweet spot of the hit zone.  Likewise, the SR-4c and the Aimpoint ran neck and neck in points, although the Aimpoint had a 5% higher hit ratio.

Because, as I pointed out when describing these tests initially, speed and accuracy are both important, I think the most telling results are those that show them factored in together.  I did this in two ways, one with points as marked on the target and one with simple hits.  Here are the hits per second:

Hit Factor

Interestingly, this was a battle between the SWFA and the SR-4c, which in this test embodied accuracy and speed respectively.  Anyone who shoots USPSA and has an eye for stage strategy will have an idea that each course of fire usually has a sweet spot in how it needs to be shot on the continuum between pure speed and pure accuracy.  In this case the accuracy won out.  It would be interesting to go back through the testing process with that bit of knowledge.
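The USPSA-style measure folded into these graphs is just points divided by time.  A sketch with two entirely hypothetical strings shows how an accurate-but-slow run and a fast-but-loose run can land at nearly the same spot on that continuum:

```python
# Hit factor = points / time, the USPSA-style way of folding speed and
# accuracy together. Both example strings below are hypothetical.

def hit_factor(points, seconds):
    return points / seconds

accurate_slow = hit_factor(points=95, seconds=11.0)  # cleaner hits, slower
fast_loose    = hit_factor(points=82, seconds=9.5)   # faster, looser hits
print(round(accurate_slow, 2), round(fast_loose, 2))  # 8.64 8.63
```

When the two styles come out that close, a small target or a tight par time is what tips the sweet spot toward accuracy, which is what happened in this test.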

The SR-8c seems to be showing a bit of comparative weakness at close range, especially in the re-test, which really suffered from some close misses that hurt its points.  I don’t think its performance up close was particularly bad, especially considering the versatility of the scope, but for a shooter with a bias toward the close range end of the spectrum, other options might be preferable.

Point Factor

The points per second measure enabled a scope to make a better showing if it got some hits nearer to the center of the target.  This did make a slight bit of difference, such as with the Z6i gaining back some of its lost ground and nudging right up to the SR-4c.  The Aimpoint also loses quite a bit of its ground in this measure.  To me, that is an indication that I just couldn’t see quite as well with it, which is what I felt at the time.  Apparently I could see well enough to get a hit most of the time, but not to make those hits of higher quality, if you believe in that sort of thing.

One thing I would like to revisit is the subject of illumination.  All of the scopes except for the SWFA had daytime bright illumination in the form of a single dot, similar to the Aimpoint.  I shot this with illumination activated on all the scopes except for the SWFA.  The SWFA was one of the slower scopes.  Some might wonder how the other scopes would do without their illuminated dot.  While I didn’t run all of the scopes without illumination to test this, I did so with the SR-4c.  This was at the request of someone on an email list, so I didn’t save all the data or the target, but I do have the average time and hit rate.

SR-4c with and without illumination

The SR-4c lost some of its luster without the dot.  It was significantly harder to use, although I wouldn’t go so far as to say it was hard to use.  I don’t know exactly how meaningful this is.  I can see only a few instances in which one might have to use the optic without the dot: the battery could be dead with no replacement handy; the user might not have time to activate the illumination, or may have forgotten to do so; or the illumination might have timed out and turned itself off.  Of these possibilities, I did have the illumination time out a couple of times with the SR-8c and the SR-4c.  I did not experience a battery failure in either of them in the 6 months I had them here, probably due in large part to the auto shut off feature.  If the possibility of a failure in the illumination system is a big issue for you, the SWFA obviously might be a good choice, and I think that the second focal plane reticle of the Z6i would be less affected by a loss of illumination than the first focal plane reticles (with the probable exception of the SWFA).

Which one would I choose if I had to pick?  Based on the test results alone, it would appear as though the SWFA had the best balance of speed and accuracy; I didn’t miss any shots with it, after all.  After I factor in what I know about the testing process, I would pick the SR-4c or the Aimpoint for close range performance only, without making considerations for what versatility any of the scopes might add at longer ranges.  Those two are fast.  I think with more practice the accuracy could be improved, and if I’d shot either of them at the time I shot the SWFA, the hit ratio would likely have been better.

Thoughts on the Testing Process

Before I get into laying out the results of the tests and making comparisons of how the optics did, I’d like to share some of my thoughts on the testing process, and the strengths and weaknesses of my methodology.  First of all, it was a great opportunity to have access to an array of scopes that I would otherwise likely never get my hands on, but for the thoughtfulness and generosity of my friend at U.S. Optics and Ilya, the optics guru/webmaster at Optics Thoughts.  This turned out to be a lot more than just a chance to play with what for me could be viewed as prohibitively priced “toys”; it was a chance to learn a great deal about shooting, specifically the seeing part of it.

I tried to approach the tests in a way that I haven’t seen done before.  One of the reasons I started this blog was that I kept looking for information on certain topics that just didn’t seem to be available.  What I tried to do in these tests was evaluate gear not in terms of how neat it was, or how nice a certain facet of it appeared upon examination, but how the product as a whole affected specific aspects of my shooting.  Evaluating optical clarity or brightness, for example, may be important, and probably does relate to some aspect of work in the field.  But to take an optic and see exactly how it performs in relation to competing optics where the rubber hits the road, and to do it in a carefully measured and analyzed way, is not something I have seen before.  This small article explains the strengths and shortcomings of my attempt.

The Bad:

The primary issue with the testing is one of sample size, in the number of shooters participating in the study (n=1).  I could also say that it would be better to have access to more samples of each scope, which it would, but I don’t think that scopes are as variable a good as, say, guitars.  Also, I wasn’t really trying to pick the scopes apart mechanically or optically, just seeing how well their designs and features enabled me to shoot.

With more shooters, the results of the tests would be much more likely to apply to other shooters.  As it is, they relate specifically to me.  That places the burden on me to explain how the subjective interacted with the objective test results.  Luckily, I think my mind is geared towards analyses such as these.  Hopefully I can bring something meaningful out of the results for you.

There were some issues with consistency in methodology.  These were mostly things that couldn’t be avoided.  My trigger broke during optic #2, test #3.  The replacement was quite different.

I wasn’t able to use the same mount for the SR-8c as I was for all the other optics.  I don’t think this turned out to be a big deal, but it was something different, and less ideal.  When I re-tested the SR-8c there was really no need to re-adapt to it.

One of the major wildcards was me.  Originally my intent was to get up to speed with the AR so that I was at a plateau.  It turned out that I hit that plateau pretty fast, and at a much lower level of skill than I had hoped.  Overall my level of skill was quite stable, with some unintended learning and adaptation going on in test #2, the X-Box drill.

The tests took place over a rather long span of time, probably about two and a half to three months.  I would have liked to spend a significant amount of time with each optic, because I think that really getting to know the strengths and weaknesses of a system takes a lot of time.  The problem with all that time is that it’s hard to maintain a static level of skill for that long.  I did my best, but the subconscious just wants to adapt.

The major differences between testing the optics themselves were that I spent a lot more time with the SR-8c than any other optic, and that the frequency of testing sped up at about the time I tested the SR-4c.  I also spent the week shooting just prior to my test of the SWFA scope, which I’m positive skewed the accuracy numbers favorably for that scope.

I made a few minor shooting errors in testing.  Larger sample sizes of shooters probably would have ironed this problem out.  There are times when I just messed up and, for example, screwed up the hit ratio in the 7 yard testing of the Swarovski Z6i.  I got my hold wrong with the SR-4c.  I used the wrong mark to hold with the SWFA scope.  Test #2 typically had at least one “short circuit” moment with each scope except for the SR-8c.  Some of those were more egregious than others.  I’ll note them in the tests.

Better ammo would have been nice, especially for the 100 yard precision test and the ‘long’ range transitions.  I do think that the XM193 is consistent, especially within the same lot, which is what I used for the testing.

The Good:

First of all, and most importantly, I think that the tests were valid, except for test #3, DD25.  I just couldn’t shoot it well enough to be consistent.  Otherwise I think the tests essentially got at what I wanted them to get at.  I will re-state the purpose of each test as I cover the results.

The state of the barrel’s cleanliness was extremely consistent, as I cleaned the barrel with Patch Out and Accelerator prior to beginning each round of tests.  By the time I got to the point where precision mattered, the barrel was thoroughly fouled, but not so much as to affect the precision negatively.

The tests were conducted with care to ensure uniformity in the procedure of setup and administration.  The same type of targets were used for each optic.  I shot them in the same places, at the same distances.  The times of day were generally the same.  The conditions didn’t change as much as they might have over such a large span of time.  When they did, such as the grass getting longer, I figured out ways to overcome that without compromising the tests, such as using my vehicle to shoot out of instead of the ground.

One thing I did that I haven’t discussed yet was that I tested the Aimpoint T1 Micro at the first 3 tests as sort of a non-magnified ‘control’ optic.  I also re-tested the SR-8c at the first three tests to see how my skills changed over the course of the testing process.  I’ll bring those results into play as I go over those specific tests.

Finally, for most of the tests I think that my shooting has been consistent enough to see the difference in the optics.  When I was less than consistent I’m at least able to recognize that the testing was essentially invalid, as with DD25, or when it changed, as with the X-Box test.

I realized that I’ve been throwing out a lot of densely packed information, and perhaps not in the most digestible fashion.  I decided to try to follow the format in which Cal over at Precision Rifle Blog presents his scope tests, which are, by the way, done in a much more systematic and rigorous manner.  Hopefully you’ll be able to make sense of the results, which are coming up next.


Dr. Scopelove, or How I Learned to Stop Worrying and Love Parallax


(From the undiscovered, secret “Art of the Rifle” archive vault of shooting past. I thought I would take a break from the number crunching of scope testing and write about something quick and easy. Apparently I have a mental disorder that compels me to collect and crunch numbers.)


Parallax seems to be on the minds of more shooters as they start to notice that a lot of scopes have side focus knobs. There’s a knob there that says “Parallax” with yard numbers and such, so it has to be a big deal. I must have noticed that knob at some point, as evidenced by this old article on my blog.

Before I entered the abnormal mental state of coming up with numbers to crunch in relation to shot group stats, I had a different abnormal mental state of coming up with creative shooter-related diagnoses for what turned out to be a gun that needed some accuracy work. One of the things I got hung up on was parallax, as the scope on that rifle, formerly the FN PBR-XP, currently the “Mark Deux”, is not equipped with a side focus knob. Sometimes not having the means to adjust something can give one the idea that the lack of that ability is a serious and significant deficiency.

Earlier in the year I was loaned a sample of the upgraded SWFA 3-15×42. Perhaps upgraded is the wrong word, because I don’t know that it’s actually intended to replace the 3-9×42, but that would just make sense. As Nigel might say, “This one goes to fifteen. Well, that’s six more, innit?” I thought it would make sense to compare them and see how that turned out.

The newer 3-15×42 has a side focus knob. One of the things I wanted to compare was whether that resulted in better capability for precision. Before I did that, I needed to establish a baseline with my tried and true 3-9×42. What follows is my submission of evidence that I actually made my journey to the dark land of parallax and returned to tell the tale…

Plan A: “No plan survives first contact with the enemy.”

Armed with the On Target TDS program and some of the proprietary targets that go with it, I set out to attempt to induce parallax error at 100 yards. Those of you who are smarter than me (only the 7,000,000,000 who have that distinction) may have already spotted the flaw.

I have to reverse engineer my grand plans to some degree, because this happened a couple months ago, but the plan seems to have been to shoot 3-4 sessions of 12 shot experiments. Each target consisted of three columns of four bulls. For those actually keeping track that’s one shot per bull. Since I had three columns, for the left column I would put my eye as far left in the scope’s eyebox as I could while still being able to see the target. For the center column I carefully aligned my eye in the center. For the right column I moved my eye to the far right. I shot them from left to right, which would mean that each group was formed in a “round robin” fashion to reduce the possibility that something would make one group better than the others due to some outside condition. After a few outings I should have collected large enough sample sizes to be able to tell something.

After shooting the 12 shots I compiled the results into three groups of four shots, one for each column, using On Target TDS. On Target allows for precise measurement of group size in extreme spread and mean radius, the location of the exact group center, and the distance of the group center from the point of aim in terms of horizontal, vertical, and total deviation.
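Since these group statistics come up throughout the results, here is a minimal sketch of how I read those terms, computed from hypothetical shot coordinates. The coordinates are invented for illustration, and this is my plain-English reading of the definitions, not On Target TDS’s actual internals.

```python
import math

# Hypothetical shot coordinates in inches, relative to the point of aim (0, 0)
shots = [(0.4, -0.2), (-0.3, 0.5), (0.1, 0.9), (-0.6, 0.1)]

# Group center: the mean of the shot coordinates
cx = sum(x for x, _ in shots) / len(shots)
cy = sum(y for _, y in shots) / len(shots)

# Extreme spread: the largest center-to-center distance between any two shots
extreme_spread = max(
    math.dist(a, b) for i, a in enumerate(shots) for b in shots[i + 1:]
)

# Mean radius: the average distance of each shot from the group center
mean_radius = sum(math.dist(s, (cx, cy)) for s in shots) / len(shots)

# Deviation of the group center from the point of aim
horizontal, vertical = abs(cx), abs(cy)
total_deviation = math.hypot(cx, cy)
```

Mean radius tends to be the more stable of the two group-size numbers, since extreme spread depends entirely on the two worst shots.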

The raw targets look like this:


Here are the composite groups:


Parallax Left


Parallax Center


Parallax Right


Parallax Sum

After analyzing my first target I remember being confused. In retrospect I should not have been concluding anything after only four shots per group, but I did not see what I had expected to see, which was three distinct points of impact correlating directly with the directional change in eye position. Nothing in life ever turns out that perfectly.

Later in the day I tried again with exactly the same format. This time I felt like I shot a little better. Note that in this case the group centers were all very similar. There are differences in group size, but I felt that could be easily attributable to having more difficulty with sight picture and eyestrain at the edge of the eyebox.


PM Parallax Left


PM Parallax Center


PM Parallax Right


PM Total Parallax

You might have noticed little to no shift in the point of impact in the last three shot groups and the composite total.  I noticed that too.  I now had a total of 24 rounds telling me something different than I expected. I started considering something that I should have thought of before shooting. I emailed SWFA and asked them what the parallax was set to in the scope. Skylar answered the same day with the following: “The SS 3-9 has a fixed parallax set at 100 yards.” Trying to induce parallax error at the precise distance the scope is set to be parallax free is not a really good idea, for those of you who are sane and well-adjusted.

Plan B: Trying to induce parallax at a distance other than the factory setting.

After hearing from Skylar, I set about researching the problem a little more. I found something interesting on the internet here: (scroll down to post #6 by LouBoyd).

Using his equation, RT = RS(Dt-Ds)/Ds, where RS is the radius of the objective lens, Ds is the distance at which the scope is parallax free, Dt is the target distance, and RT is the radius of the resulting error circle at the target, I figured out that at 200 yards the possible parallax error was equal to the radius of my objective lens, or approximately 0.827”. I used 200 yards because of my overdue shooting goal, which is to be able to hit a 4” target within that distance under a wide variety of conditions (understated). 0.827” isn’t much, but I decided to test at 200 anyway.
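For anyone who wants to plug in their own numbers, the formula fits in a few lines. The 21 mm objective radius below is an assumption derived from the 42 mm objective of the 3-9×42; everything else is straight from the equation.

```python
# LouBoyd's formula: RT = RS * (Dt - Ds) / Ds, where RS is the objective
# radius, Ds the parallax-free distance, and Dt the target distance.
# The result is in the same units as the objective radius.
def max_parallax_error(objective_radius_in, target_yd, parallax_free_yd):
    return objective_radius_in * (target_yd - parallax_free_yd) / parallax_free_yd

# A 42 mm objective has a 21 mm radius, about 0.827 inches
radius_in = 21 / 25.4

error_at_200 = max_parallax_error(radius_in, 200, 100)  # about 0.827 inches
error_at_100 = max_parallax_error(radius_in, 100, 100)  # zero
```

Note that the error goes to zero at the parallax-free distance, which is exactly why Plan A at 100 yards couldn’t have worked.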

I placed three targets at 200 and shot them in round robin fashion, with 6 shots each. Here you go:


200 Parallax Left


200 Parallax Center


200 Parallax Right

Here is a composite target of all 18 shots:

Total Parallax 200

Here is a chart of the deviation, in MOA, of each group.

Parallax Chart

Here is a chart of the mean radius of each group, also expressed in MOA:

Mean Radius Parallax Tests

After seeing that the group that should have been pretty good was not, that the groups that should not have looked as good did not look worse, and that there was no particular shift in point of impact, and after seeing that my groups at 200 yards, where I tried to induce parallax, were better than the groups I fired at the distance at which my scope is parallax free… I decided the following:

While parallax is a real thing, the importance of it isn’t universal to all shooters. For long range it’s going to be more important. For my purposes, there are more important things to work on, like follow through, which is what I think happened with the “center eye” target at 200. Once again, it comes down to fundamentals.

SWFA 1-6×24 HD Test Results

There were a few issues with the testing of this scope that had nothing to do with the scope.  On Monday of that week I cut my support hand badly.  I had also been doing some specific training during the week leading up to the tests.  That put me in a time crunch, so I couldn’t get stitches for my hand, and the wound kept getting reopened during the training.  This training, coincidentally enough, concentrated on hitting small targets at close range.  Misses were heavily discouraged; it was worth taking the time to get the hit.  The day prior to the test I practiced field movement with my bolt gun and all my gear, leading up to shots on the 4”-ish targets in sub-optimal terrain under time constraints.

None of the other tests had much of anything going on before them, so it seems likely that these results were affected.  Going into the testing I expected that there could be a difference.

I had also started to shorten the interval between scope tests at about the time I tested the SR-4c.  This was not really intentional; I started feeling pressure to get the scopes back to their owners, who had been gracious enough to let me try them out while I took way too long.  My expectation going in was increased speed and accuracy on tests #1 and #2.

Test 1: Single Shots at 7 Yards

Upon taking sight of the targets after setting them up, I discovered that I couldn’t see the illumination.  I double checked that it was on, which it was.  I could tell that the reticle took on a slightly crimson hue with the illumination on, but for all intents and purposes it was worthless.  I ran the tests with it off, which made this the only optic to have been officially tested without illumination.  Upon reflection, testing without illumination would have been worthwhile to include for all the scopes, but that is another time and money thing where a line had to be drawn.  I did shoot this course with the SR-4c without illumination as an unofficial experiment; I’ll share those results later.

This test had the distinction of being my only completely clean run at it over the duration of all 20 shots.


Test 1 numbers

In terms of speed, the SWFA 1-6×24 was in between the SR-8c and the Z6i.  The hit ratio was the best out of all the optics tested.  I don’t know how much, if any, of that to ascribe to having had a rifle in my hand for a large portion of the four days preceding the test.  In terms of the scores that factored in time, this scope was on top at that point.

What’s odd to me is that during the test it felt as though I was fighting the reticle, but I ended up shooting more accurately.  I tend to think that my level of skill was “warmer than ambient” due to the work I had done in the preceding days, but I think the only fair way to deal with that is just to take the results as they are.

Test 2: Transitions- the X-Box Drill


Like test 1, I felt like I was fighting my way through this drill to some degree.  How I think it should work is that both eyes should be open and my eyes should pick up the next target and drive the point of aim to it without losing the scope’s eyebox at any time in the process.  I experienced that to the greatest degree with the Z6i.  I had a hard time maintaining the eyebox in this test.  The other thing was that I ended up shutting my left eye because I felt like I was fighting through too much information.

Test 2 Targets

Test 2 numbers

As with test 1, my hit rate was better than any of the others tested so far.  The hits per second score was 6 thousandths less than the Z6i.  The other scores were below the SR-4c and the Z6i, but above the SR-8c, which had the honor of being tested the first time I shot this drill.

Almost every time I shot this drill, I would have at least one “short circuit” type moment where I suddenly realized I was heading to the wrong target or just forgot where to go next.  This first happened with the Z6i, and I actually went to the trouble of figuring out how to ‘correct’ the numbers.  In the end I’m going to leave them as I shot them but point the errors out.  Judging by the times, I had two minor “where am I going” moments that added about a half second each with this scope.  These are not as costly as the “oh crap I’m heading to the wrong target, which one am I supposed to be going to, am I sure?” type mistakes that can add about a second and a half each.

Test 3: DD25

Test 3 Targets

Test 3 Numbers

This drill was enough to unravel whatever accuracy edge I might have trained in over the course of the week leading up to it.  Again, you can see the trend that I started pushing my times at the expense of accuracy.  Eventually you’ll see a nifty graph that shows this course of fire getting progressively worse with each time I shot it.  This has nothing to do with the scopes, but perhaps something to do with my attempt to keep my skills at all these tests as static as possible in order to keep the results accurate across the board.  What actually happened was that increasing speed was paying off for me in the X-Box test and I tried to do the same thing subconsciously with this one.  This being a different type of shooting altogether, it didn’t pay off to speed up.

Test 4: Groups at 100

This was the big chink in the armor for the SWFA scope.  The target I use for this test is not really too hard to see, as it is made up of a black circle and dot on a white piece of paper.  The problem is that the circle and dot are relatively fine.  The boldness and density of the SWFA reticle made it difficult to discern the target.  This was ironic, because the SWFA’s reticle bracketed the target circle almost perfectly, the circle being 3.5”.  A mil is 3.6” at 100 yards, and factoring in the line thickness it was just about, as Ace Ventura would say, “Like a glove.”
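The 3.6” figure follows directly from the definition of a milliradian, which subtends one thousandth of the range. A couple of lines of code confirm the bracketing claim:

```python
# One mil subtends 1/1000 of the distance to the target.
# In inches: distance in yards * 36 inches per yard / 1000.
def mil_subtension_in(distance_yd, mils=1.0):
    return distance_yd * 36 / 1000 * mils

# At 100 yards, 1 mil is 3.6 inches, which brackets a 3.5 inch circle
# almost exactly once reticle line thickness is factored in.
one_mil_at_100 = mil_subtension_in(100)
```

The same function works for any reticle hold, e.g. `mil_subtension_in(200, 0.5)` for a half-mil hash at 200 yards.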

Here’s an illustration of what I was up against:

6x view

And just a bit closer to what I see:

This should really be a double tripod job, one for the scope and one for the camera.  Alas, I have not a single tripod.

In contrast, I happened to have the SR-8c still on hand.  Its reticle is a little more sparse, since it lacks the half mil hashes, which in my opinion are not only unnecessary, but detrimental.

At 6x:

 SR-8c at 6x

SR-8c at 6x close up

That’s what an extra $1500 will get you.  Yes, the crosshairs do obscure the target center, but I believe that the line thickness may be less, and/or the reticle might not be as dark.  In any event it’s easier to see the target.  At 8x it was significantly easier, as can be seen (approximately) below:

SR-8c at 8x close up

Here are the SWFA’s numbers:

 Group 1

Group 2

Group 3

30 Round Group

Test 4 Numbers

Test 5: “Long” Range Transitions

I’ll lay out the procedure for you again.  I have four targets at the following distances: 170, 230, 270, and 330 yards.  I have put them in a different order, left to right, for each optic test.  I have shot the same order, left to right, in the same way every time, in a manner that best balances the permutations for each transition.  I have a total of 36 rounds, 18 in each of two magazines, which gives me a total of 9 shots on each target.  I evaluated the targets in terms of points on target, group size in inches (extreme spread and mean radius), and deviation of the group center from the point of aim (horizontal, vertical, and total).



I had hypothesized that this optic would do very well on this test.  I have felt that the reticle is too dense for most applications, but that the density would pay off at longer ranges.  It turned out that it basically held its own against much more expensive scopes.

I made an error with the first four shots, being the first shot on each target.  Although I knew that the reticle had half mil hashes, I still reverted to reading each hash as a mil, causing my holdover to be less than it should have been.  To compensate for this I left the lowest shot from each target out when I plotted them with On Target.  This gives the SWFA scope a bit of an unfair advantage in the extreme spread numbers, but not a huge one.

Where the scope shone was in the precision of the groups.  The spoiler alert is that the SWFA showed the best average extreme spread (by 0.055 MOA) and was a close second in mean radius.  It was consistently last or second to last in terms of deviation, though, which put it in third place for points.





Here are the numbers.

Test 5 Numbers


All in all, this was a solid scope held back by a busy, “over bold” (not what happened to Mr. T- that was over gold) reticle.  Upgrading to a simpler reticle and daylight bright illumination would very likely put this scope’s performance up there with scopes at over double the price.  In my opinion, doing all that and putting the reticle in the second focal plane would put this on my list of scopes I would buy.  That sounds quite similar to the features of the Vortex Razor 1-6, but I wasn’t able to test that scope to say for sure.

Coming up soon I will break down the performance of the scopes in each test for comparison and further analysis so you can see exactly how they compared and maybe get an idea of why.

Test Optic 4: SWFA 1-6×24 HD


You might be getting sick of the scope reviews. That’s alright, because I talked with Steve and he said he liked them. Steve, if you get a chance email me about 3-Gun stuff. Back to our regularly scheduled programming.


I should begin by saying that prior to putting this scope on a rifle, I had been spoiled by scopes that cost $2500 (x2), and $2000. This scope, the SWFA 1-6×24 HD runs at about $1000 and I have seen it on special for about $800. I was very curious to see how it performed in comparison. I go back and forth between wanting the absolute best (typically thinking it must be the most expensive) equipment and thinking I should be able to do more with less, so this was interesting.

I received all of the scopes that I tested within a short time span, probably a few days (it was a while ago). Before I mounted any of them on a rifle I just looked through them at ‘stuff’. At the time I was very impressed with the image quality of the SWFA. Not being an optics guru, I’m not going to try to describe it in more detail than by saying I thought it was nearly the equal of the Swarovski, and noticeably brighter than the USO scopes. Temper that with the admonition that I’m nowhere near the optics connoisseur that I am a trigger aficionado.

trigger aficionado

In the interim after receiving the scopes and before I mounted the SWFA, I spent months playing with the SR-8c, the Z6i, and the SR-4c. The SR-8c was the first one I mounted and spent the most time with. It basically set the bar, so I think I should try to explain how that bar was set.

My impression of the U.S. Optics scopes is that they are well designed, well executed, and have nothing at all gimmicky about them. I think that they are meant to fulfill the expectations of an end user who knows what he’s doing and who is willing to shell out the mackerels for the best he can get. The only trends they need to worry about are performance related. The Swarovski was similarly lacking in anything other than what would make it functionally excellent in its intended genre.

What all the high priced scopes had in common were relatively minimalistic reticles that stayed out of the way of their daylight bright illumination. It didn’t take much time on the range with these scopes to appreciate the utility of an uncluttered visual workspace. This also had an influence on how I handle mechanical offset. In my mind, having had the chance to try it, the superiority of a single illuminated dot combined with a clean, clear field of view is obvious. I disagree with some other reviewers on this, who feel that reticle trumps illumination. I, of course, am right, no matter how much I may enjoy and learn from their reviews (my wife said to make sure it’s obvious I’m joking so I don’t look like an ass).

The context of where the bar was set led me to an immediate dislike of two of the characteristics of the SWFA 1-6 as soon as I mounted it. The knobs and scope caps are unnecessarily large and the reticle is too busy. I’ll explain.

Big adjustment knobs seem to be part of some trend in the tactical scope market, particularly with precision rifle scopes. I own one of those scopes myself, the Vortex Razor 5-20×50, which practically has a beer can sized knob. I don’t think it’s that big a deal with a long range scope, but in a scope with a minimum power of 1x there are some assumptions about the venue of use, namely, that it’s made for use at close range. Close range means that things happen quickly and probably dynamically. Success under those conditions calls for keeping one’s capacity for observation and adaptation so high that Scotty would be telling Captain Kirk that the ship is breaking up (“The visual circuits can’t handle much more of this!”).

If for some reason the target is no longer in the field of view by the time the rifle is raised (maybe it has moved), or perhaps a second target is detected, the ability to use vision = field of view inside the scope + field of view outside the scope – field of view obstructed by rifle and scope. Therefore the lower profile the scope can be while still fulfilling its functional requirements the better.

Having said all that, although I saw the large knobs and disliked them, I think they were far enough forward to be out of view. Just to prove I’m still human, I continue to dislike them because I think they were made that way to be trendy. Everyone knows that big knobs are cool (like, duh).

The reticle was a different matter.  This was the scope I wanted to test so badly before I had the opportunity to test any of these, because I thought that the reticle design was brilliant.  We’re talking about a first focal plane scope with a big, obvious circle to line up at close range, where the big circle completely disappears when the scope is dialed up to maximum magnification!!!  The rest of the people in the running for the Nobel Prize might as well forget about it.  That was how certain I was that this was the best answer.

It was a big surprise when I finally mounted the scope on the X-15 and sighted through it. “Dang. It’s kind of hard to see the target.” It’s not precisely correct that it was hard to see the target; it would be more precise to say that there was a lot of other stuff to see in the scope that was not the target.  I was not prepared for the reticle to come up as part of the vision = field of view inside the scope + field of view outside the scope – field of view obstructed by the rifle and scope equation.

I also learned something about my reticle preferences for a scope in the 1-?x role.  I don’t like circles.  That’s ironic because I’ve been using an EOTech for about 8 years now.  Having used the single dot, the circle just seems to create visual noise that fulfills no actual function.  I’m not reticle ranging at close range (if ever).  I’m not using reticle holds at close range.  I don’t need to have some magical alignment circle to line up the scope, because the scope is a big circle if I look through it correctly.  All I see is that recoil creates a giant, bright, vibrating ring right smack in my field of view when maintaining a continuous sight picture in rapid fire.  The dot provides perfectly adequate usefulness with little to no uselessness.

The other primary difference with the SWFA is that the illumination isn’t always visible. The term “daylight bright” has been all the hubbub over at the yonder gun forums for a while now, and after all this scope testing I finally can concede that the concept is more than a marketing ploy.

In some situations in daylight the illumination appears to be nice and bright. Other times it just seems to wash out. Ironically, it’s the prominence of the reticle that allows for consistency in performance whether the illumination is visible or not. So in one way, the over-bold reticle can be seen as a design compromise to make up for the lack of a daylight bright dot, which would increase the cost of the scope.

The only other thing that I found even slightly wanting was that I preferred the illumination modules on the other scopes to the conventional knob on the SWFA. I think I killed two 2032 batteries in the short time I was in possession of it. Some people prefer the knob, as something that can be grabbed and turned, rather than toggled or pressed, but I liked the other scopes’ systems better.

I ended up writing mostly about the things I didn’t like about the scope.  It would only be fair to point out that I’ve recently noticed that the way I perceived these scopes for testing was significantly different than if I had bought one myself.  Ironically, I was less willing to accommodate any flaw.

I should point out that, even given the things I don’t like about it (which are completely subjective), the scope is well made and will likely be able to fulfill the requirements it was made to.  The testing will unravel that bit.

The scope seems to be a quality piece. There were no issues in mounting or setting it up. The build quality seems very good.  Oh, and the knobs clicked like chocolate wafers atop clouds of cotton candy.  There.  No scope review is complete without going into gyrations over clicks.

Test results coming up.



U.S. Optics SR-4c Test Results

I apologize for taking so long between posts lately.  It’s been busy, and crunching the numbers for these test results is a tedious task.

Test 1: Single shots at 7 yards


Over 20 shots my average time was 0.9305 seconds. This was 0.105 seconds faster than the SR-8c (10.14%) and 0.1545 faster than the Swarovski Z6i (14.24%). The fastest hit time was 0.75 seconds and the slowest was 1.28 seconds (shot #1- cold). The standard deviation for the times was 0.135 (SR-8c was 0.1415, Z6i was 0.1329). The hit rate was 85%, which was the same as the SR-8c, and just less than the Z6i, which had a hit rate of 95%. I believe that the speed number is more indicative of the performance than the hit rate, as hits are dependent on me not making mistakes. Misses typically occur because of some obvious lapse on my part, but I’ll leave that for you to decide.

The total points over 20 rounds were 95 and the average points per shot was 4.95. The ‘standard’ is 5 points per shot. This was the first optic to fall below that standard.  My standard for scoring is that over half the hole is in the scoring zone.

My hits over time score, what I’m calling “hit factor” was .91. Again, one hit per second would yield a score of 1. My points over time (point factor) was 5.32. Both of these scores were the best out of any of the optics tested so far. This illustrates that by factoring in time, even a slightly worse accuracy score can be overcome with some extra pep in the step.

This optic came out to be what I would consider significantly faster than any of the others so far. I probably collected enough data to see if the difference was actually statistically significant, but I have to confess that I don’t remember enough from one of the only useful classes I took in college.
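For the curious, the numbers above are actually enough to run that check. Here is a minimal sketch of Welch’s two-sample t-test built only from the reported means and standard deviations, under the simplifying assumptions that the times are roughly normal and the shots independent (which strings of fire never perfectly are):

```python
import math

# Reported figures: SR-4c averaged 0.9305 s (SD 0.135) over 20 shots;
# the SR-8c averaged 0.105 s slower, with a reported SD of 0.1415.
n = 20
mean_sr4c, sd_sr4c = 0.9305, 0.135
mean_sr8c, sd_sr8c = 1.0355, 0.1415

# Welch's t-statistic for the difference between two independent means
se = math.sqrt(sd_sr4c**2 / n + sd_sr8c**2 / n)
t = (mean_sr8c - mean_sr4c) / se
```

The t-statistic lands near 2.4, beyond the roughly 2.02 two-sided critical value for about 38 degrees of freedom at the 5% level, so under those assumptions the speed difference does look statistically significant rather than noise.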


Test 2: X-Box


I don’t recall that the feeling of shooting with this scope was markedly different than the SR-8, which is similarly laid out, but the proof is in the puddin’. The SR-4c was fast. My average time with the SR-4c was about a second faster than with the Swarovski Z6i, which had been the fastest up to that point.

Total x-box SR-4c

Average transition times for all four runs:

1. 1.22  Upper left (start)
2. 1.26  Upper right (right)
3. 1.10  Lower left (diagonal- down/left)
4. 0.84  Upper left (up)
5. 1.13  Lower right (diagonal- down/right)
6. 1.07  Lower Left (left)
7. 1.30  Upper right (diagonal- up/right)
8. 1.05  Lower right (down)
9. 1.09  Upper Left (diagonal- up/left)

Total Average Time: 10.04 seconds
Total Average Points: 43.25
Average points per shot: 4.81
Average hit rate: 81%
Average hits per second: 0.72
Average points per second: 4.31
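The derived scores in that list follow directly from the raw averages. A quick sketch reproduces them; the small rounding differences come from averaging the individual runs rather than working from the totals.

```python
# Test 2 averages as reported above (times in seconds, 9 shots per run)
total_time = 10.04
total_points = 43.25
shots = 9
hit_rate = 0.81

points_per_shot = total_points / shots           # about 4.81
hits_per_second = hit_rate * shots / total_time  # about 0.72-0.73
points_per_second = total_points / total_time    # about 4.31
```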

While the actual hit rate was lower than with the previous two scopes, which were tied at 86.11%, the times, which were again significantly faster, more than made up for it according to the scoring system I’ve devised. Both the hits per second and points per second were higher than with either of the previously tested scopes.

Test 3: DD25

I’ve determined that for me, at my current skill level, this drill is not a good measure of anything. It seems as though I can be quite consistent in types of shooting that I feel like I have some competence in. My performance in this drill is completely erratic, far beyond any differences that the optics might bring. The only trend I see is that I became quicker and less accurate the more times I shot it. I would need to work at this a lot to gain any competence with it, but part of my testing protocol is that I tried not to work toward improvement that would skew the results. I include the results only for the sake of completeness (and to keep me humble, as is the running theme of this blog).

DD25 SR-4c
Looking at this now I realize how sloppy I’ve gotten at this drill since then. Not good.

I completed 2 runs of this drill. My average raw points were 65 of the 120 possible (75 required to pass). My average time was 27.25 seconds.  My average corrected points after penalties were 48.  My average hit rate was 73.33% (11/15).  My average hits per second was 0.4036697.  My average points per minute were 103.54 of the 300 minimum passing (34.51%).  Again, I shot faster, but this time it wasn’t fast enough to make up for the lower hit rate and points.

The trend with tests #2 and #3 is that I became more willing to take risks to shoot quicker. It seemed to pay off with test #2 but with test #3 being farther out it had the opposite effect. If I really wanted to get better at this I would move the target closer so I could get all my hits and gradually move it back.

Test #4: Groups at 100.

I began to feel a slight disadvantage with the reduction in maximum power in comparison to the other scopes while shooting groups at 100, but like many feelings it wasn’t justified by the results (which is why I’m going to rely on test results rather than fancy pictures and subjective opinion to do most of the talking here). There wasn’t much of a meaningful difference on paper in comparison to the other scopes, other than the extreme spread of the composite 30 round group was larger, which appeared to be caused by a high outlier in the second 10 shot group.  Other than that, some of the individual groups were smaller and some were larger.  The average mean radius was smaller than with the SR-8c, and the total mean radius was almost identical to that scope.  The second focal plane Swarovski Z6i was still the top performer in this category at this point in the testing.


Group 1

Group 2


30 Rounds


Group | Extreme Spread | Mean Radius

1 | 2.713 | 0.725
2 | 3.639 | 0.862
3 | 2.279 | 0.662

Average Group

Extreme Spread | Mean Radius
2.849 | 0.805

Total 30 Round Group

Extreme Spread | Mean Radius
4.357 | 0.906
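One note on reading numbers like these: a composite group will always show an extreme spread at least as large as the largest individual group, because pooling shots can only add new shot pairs. A quick sketch with hypothetical coordinates (not my actual shot data) demonstrates the effect:

```python
import math

# Hypothetical shot coordinates in inches for three small groups, with the
# second group shifted high to play the role of the outlier group.
groups = [
    [(0.0, 0.0), (0.5, 0.3), (0.2, -0.4)],
    [(1.0, 1.2), (1.4, 0.9), (0.8, 1.5)],
    [(0.1, 0.2), (-0.3, 0.1), (0.3, -0.2)],
]

def extreme_spread(shots):
    # Largest center-to-center distance between any two shots
    return max(math.dist(a, b) for i, a in enumerate(shots) for b in shots[i + 1:])

individual = [extreme_spread(g) for g in groups]
composite = extreme_spread([s for g in groups for s in g])

# Pooling only adds candidate shot pairs, so this always holds:
assert composite >= max(individual)
```

This is why a single high shot in one group can blow up the 30 round composite figure even when the average of the individual groups looks reasonable.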


Test #5: “Long” Range Transitions

This, unfortunately, is the only photo I took of the rifle with the SR-4c on it.

A close up of the targets. The order, left to right was 270, 230, 330, 170.

I did not come into this test with any preconceived notions about scope power and performance, but I came out of the test with some ideas about it.

I made a couple mistakes in the test. First, I made a clerical error that led to a holdover error at the closest target (170). I wrote down 0.5 (which I think was the bullet drop in inches) vs. 0.1, which is the actual holdover in mils. Therefore my hold was too high. This shouldn’t have made too much of a difference, only about 2.44 inches if my hold had otherwise been perfect, but something else went wrong.  Three of my shots were off paper, and the center of the group that remained was much higher than it should have been.
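That 2.44 inch figure checks out against the mil math, since one mil subtends a thousandth of the range (range in yards times 36/1000, in inches):

```python
# Holding 0.5 mil instead of the correct 0.1 mil is a 0.4 mil excess.
distance_yd = 170
excess_mil = 0.5 - 0.1

# One mil at 170 yards subtends 170 * 36 / 1000 = 6.12 inches, so the
# excess hold shifts impact up by roughly 2.45 inches, right in line
# with the figure quoted above.
shift_in = excess_mil * distance_yd * 36 / 1000
```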

Interestingly, at the time I was shooting this course of fire I did not expect to have poor results at all. I felt that although it was a little more difficult to see the targets, I could see them well enough. I also felt as though my holdovers were on and my follow-through was as good as it gets with me. When I noticed my mistake with the holdover midway through the course of fire, I decided I should continue with the incorrect hold at 170, and that I could adjust the group down afterward to figure my points. It just didn’t work out quite as well as I thought.

The second issue that popped up in this test was with the iPhone shot timer. I took a brief moment to check my holdover, which necessitated using a different app. I didn’t think that the shot timer would be affected, but it was. Therefore I only have time data for 15 of the 36 shots. At least I can still get an average time per shot, which was 10.17 seconds.  That was faster than with any of the previous scopes, although it did not include the reload, which in this case was rather fast and efficient.

LR 170

LR 230

LR 270

LR 330

Distance (yd) | Extreme Spread (in.) | Extreme Spread (MOA) | Mean Radius (MOA)

170                                              Invalid Sample
230                  7.167″                           2.989                          0.899
270                10.528″                           3.378                          0.933
330                  8.788″                           2.567                          0.760

Distance (yd) | Points | Vert. Deviation* | Hor. Deviation | Total Deviation

170              14                           Invalid Sample
230              12              0.832                0.503               0.972
270              14              0.883                0.783               1.180
330                6              1.285                1.391               1.894

*Deviation of group center from intended point of aim (MOA)
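Converting the inch measurements to the angular (MOA) figures in the table just normalizes group size for distance: one true MOA subtends roughly 1.047″ per 100 yards. A sketch of that conversion (my own arithmetic; the table's values may use a slightly different constant or rounding):

```python
def inches_to_moa(inches: float, yards: float) -> float:
    """Convert a linear measurement on target to minutes of angle.

    Assumes true MOA, ~1.047 inches per 100 yards.
    """
    inches_per_moa = (yards / 100.0) * 1.047
    return inches / inches_per_moa

# The 230-yard extreme spread from the table above:
print(round(inches_to_moa(7.167, 230), 3))  # 2.976
```

That comes out close to the table's 2.989 MOA at 230 yards, so the author's conversion constant was evidently in the same ballpark.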

I guess I could say that the good news is that some of the group sizes weren’t too bad. Some of the groups were better than with the Swarovski.  In fact, the average mean radius was better with the SR-4c than with the Swaro.  So in terms of precision the SR-4c did as well as can be expected when shooting a 4.2″ target at 4x out to 330 yards with reject military surplus ball ammo.

The bad news would be in the accuracy department (nearness of group center to point of aim).  I used points in this test to indicate accuracy, and the performance basically fell off the wrong end of the chart in this case.  Even after adjusting my 170-yard target down by the equivalent of my hold error, the total score for all the targets was 46 points, compared with 107 and 102 for the SR-8c and the Z6i respectively.  It wasn’t wind.  It wasn’t mirage.  The earth wasn’t spinning any differently than before.  I think I demonstrated my ability to use reticle holdovers in the Z6i test, so I don’t know what really went wrong.

In most respects the maximum 4x of the SR-4c fared pretty well with the requirements I placed upon it. I was quite disappointed when I completed the last test; the decently sized groups just didn’t quite go in exactly the right place. I suspect that this could be mitigated with some time and training with the scope, but these tests are intended to get an idea of how easy the scopes are to use in comparison with each other. I found out that power matters, and it actually makes a pretty significant difference in how easy the optic is to use.