Comparison of All Test Optics: Test #1

Links to individual test results:

U.S. Optics SR-8c, 1-8×27
Swarovski Z6i, 1-6×24
U.S. Optics SR-4c, 1-4×22
SWFA SS HD, 1-6×24

The idea for this test was simple: how fast can I bring the rifle up, get a sight picture, and fire one shot at a close-range target?  In this case the distance was 7 yards and the target size was approximately 4.2”.

I practiced this skill before beginning the testing, to the point that I felt I would not endanger the validity of the process by doing something as egregious as improving dramatically over the course of the tests.  I don’t believe the process was perfect; a single error over the course of 20 shots would shift my hit rate by 5%.  I don’t believe speed was affected by these variables as much as accuracy was, as a few times I could simply tell that I had messed up and missed.


What I find striking is the similarity in speed between the Aimpoint T1 and the SR-4c.  Conventional wisdom says that these variable 1× scopes are almost as fast as a non-magnified red dot, and in terms of speed they were basically indistinguishable.  This trend continued beyond this test and wasn’t confined to speed.

I’m also incredulous that the Z6i was the slowest.  I have doubts about the cause, because I can’t understand why an optic with such a clear, bright image, wide field of view, minimalist and unobtrusive reticle, daylight-bright illumination, and generous eyebox could turn in such low numbers.  The thing is, I can’t find an explanation on my end either.

I do wonder whether some aspect of the Z6i made it slower for me in this test.  Note that both of the slowest scopes, the Z6i and the SWFA SS HD, were the ones I found to have the clearest, brightest images.  The Swarovski really is just amazing to look through; I would not be far off to say it looks better than real life.  I wonder if it’s just a little too much to adapt to in a small amount of time.  The Aimpoint is much faster, and really isn’t impressive at all optically.  When it comes down to it, though, I still simply cannot accept that the Z6i could really be that much slower.  I would need to do more work with it before believing what these results are showing.  The other thing I have wondered is whether I could simply see better through it, and that made me more discriminating in my decision to fire.

Test 1 fastest and slowest

Looking at the fastest and slowest times with each optic, I think what you are seeing is a graphic estimate of my personal maximum deviation for this test.  What I do see again, however, is that the Aimpoint and SR-4c stand out from the rest and must have offered something that let me gain about a tenth of a second over all the others.  As for the slowest times, the 1.36 of the SWFA was the 20th rep of a perfect run; I didn’t want to ruin it.

On Paper:

7 Yard Snapshots

SWFA SS HD 1-6×24:

Aimpoint T1 Micro:

Test 1 Aimpoint

SR-8c Retest:

SR-8c Test 1 recap

I probably need to clarify again what I count as a hit in a given scoring zone, as different people use different methodologies.  In my mind, if the tip of the bullet misses the target it is not a hit but a grazing shot.  On the other hand, if the tip is inside the scoring zone, I count it.  In practice this means that if over half of the bullet hole is in the scoring ring, I count it as a hit.  If I couldn’t tell, I gave the benefit of the doubt to the shooter (me), which I think is the best policy in general.
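That scoring rule can be sketched in code.  This is only a minimal illustration with hypothetical names and numbers, not part of any actual test tooling; it assumes a circular scoring ring centered at the origin and hole positions measured in inches.  Since the ring edge is nearly straight at bullet-hole scale, "over half of the hole inside" is equivalent to "hole center inside," with borderline calls going to the shooter:

```python
from math import hypot

def is_hit(hole_x, hole_y, ring_radius, tolerance=0.02):
    """Score one bullet hole against a circular ring centered at (0, 0).

    A hole whose center falls inside the ring counts as a hit (the
    'over half inside' rule).  Holes within the measuring tolerance of
    the line get the benefit of the doubt.
    """
    dist = hypot(hole_x, hole_y)  # distance from ring center, inches
    return dist <= ring_radius + tolerance

# Hypothetical 2.1" ring radius, matching the ~4.2" target above
print(is_hit(2.05, 0.0, 2.1))  # clearly inside -> True
print(is_hit(0.0, 2.11, 2.1))  # borderline -> benefit of the doubt -> True
print(is_hit(3.0, 0.0, 2.1))   # clearly outside -> False
```

The tolerance value here is a stand-in for "I couldn't tell"; in practice that call is made by eye at the target, not with calipers.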

Looking at the targets, it strikes me that the Swarovski was robbed of a perfect score by my trigger finger.  I didn’t take a lot of notes during the tests, so when I see the phrase “I had one wild miss due to a trigger control mishap,” I can’t help but feel as though I let it down.  That illustrates the big weakness of this test, which is me.

As I look at each target, I interpret the wide misses, such as the Z6i, the top hit on the SR-4c target, and the same type of hit on the SR-8c retest, as gross errors on my part.  I haven’t massaged the numbers to reflect this interpretation, but I just want to point it out.  I don’t know what to say about the other misses, so I just accept them.  I probably just thought I saw an acceptable sight picture and went for it.  The mistake could have come from either part of that process (the seeing or the acting).

To keep you from having to count misses and points, here are some graphs for you:

Hit Rate

The best performers in terms of hit rate were the slowest ones, which shouldn’t be surprising; it’s possible I just took the time I needed to get better hits.  The SWFA did have the benefit of some extra practice I got in during the week leading up to the test, and I had expected a higher hit ratio than normal, although I didn’t expect a perfect one.  I also expected to be faster, which I wasn’t.  There wasn’t much I could do about the extra week of shooting prior to the test; my schedule just worked out that way.

Total Points

The points tell a similar story, though not an identical one, since points reward center hits as ‘better’ than edge hits.  The target also awards 1 point to an ~8″ ring, in deference to the universality of the paper plate as a target (alternately read as a body shot).  Note that due to the slightly offset group of the SWFA, the Z6i still beat it in points, as 95% of the Z6i’s group was better centered in the sweet spot of the hit zone.  Likewise, the SR-4c and the Aimpoint ran neck and neck in points, although the Aimpoint had a 5% higher hit ratio.

Because, as I pointed out when describing these tests initially, speed and accuracy are both important, I think the most telling results are those that show them factored in together.  I did this in two ways, one with points as marked on the target and one with simple hits.  Here are the hits per second:

Hit Factor
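The combined measures are simple to compute: divide hits (or points) by the total time spent shooting.  Here is a minimal sketch; the numbers below are made up for illustration and are not my actual test data:

```python
def hit_factor(hits, times):
    """Hits per second: total hits divided by total time of all reps."""
    return hits / sum(times)

def point_factor(points, times):
    """Points per second: total points divided by total time of all reps."""
    return points / sum(times)

# Made-up example: 20 reps at 1.1 s each, 18 hits, 85 points
times = [1.1] * 20
print(round(hit_factor(18, times), 3))    # 18 / 22.0 -> 0.818
print(round(point_factor(85, times), 3))  # 85 / 22.0 -> 3.864
```

Because time sits in the denominator, a fast optic can make up for a few dropped hits, which is exactly the speed-versus-accuracy tradeoff discussed below.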

Interestingly, this was a battle between the SWFA and the SR-4c, which in this test embodied accuracy and speed respectively.  Anyone who shoots USPSA and has an eye for stage strategy will have an idea that each course of fire usually has a sweet spot in how it needs to be shot on the continuum between pure speed and pure accuracy.  In this case the accuracy won out.  It would be interesting to go back through the testing process with that bit of knowledge.

The SR-8c seems to show a bit of comparative weakness at close range, especially in the re-test, which really suffered from some close misses that hurt its points.  I don’t think its performance up close was particularly bad, especially considering the versatility of the scope, but for a shooter with a bias toward the close-range end of the spectrum, other options might be preferable.

Point Factor

The points per second measure enabled a scope to make a better showing if it got some hits nearer to the center of the target.  This did make a slight bit of difference, such as with the Z6i gaining back some of its lost ground and nudging right up to the SR-4c.  The Aimpoint also loses quite a bit of its ground in this measure.  To me, that is an indication that I just couldn’t see quite as well with it, which is what I felt at the time.  Apparently I could see well enough to get a hit most of the time, but not to make those hits of higher quality, if you believe in that sort of thing.

One thing I would like to revisit is the subject of illumination.  All of the scopes except for the SWFA had daytime bright illumination in the form of a single dot, similar to the Aimpoint.  I shot this with illumination activated on all the scopes except for the SWFA.  The SWFA was one of the slower scopes.  Some might wonder how the other scopes would do without their illuminated dot.  While I didn’t run all of the scopes without illumination to test this, I did so with the SR-4c.  This was at the request of someone on an email list, so I didn’t save all the data or the target, but I do have the average time and hit rate.

SR-4c with and without illumination

The SR-4c lost some of its luster without the dot.  It was significantly harder to use, although I wouldn’t go so far as to say it was hard to use.  I don’t know exactly how meaningful this is.  I can see only a few instances in which one might have to use the optic without the dot: the battery could be dead with no replacement handy; the user might not have time to activate the illumination, or may have forgotten to do so; or the illumination might have timed out and turned itself off.  Of these possibilities, I did have the illumination time out a couple of times with the SR-8c and the SR-4c.  I did not experience a battery failure in either of them in the six months I had them here, probably due in large part to the auto shut-off feature.  If the possibility of an illumination system failure is a big issue for you, the SWFA obviously might be a good choice, and I think the second focal plane reticle of the Z6i would be less affected by a loss of illumination than the first focal plane reticles (with the probable exception of the SWFA).

Which one would I choose if I had to pick from these?  Based on the test results alone, it would appear that the SWFA had the best balance of speed and accuracy; I didn’t miss any shots with it, after all.  After factoring in what I know about the testing process, I would pick the SR-4c or the Aimpoint for close-range performance only, without considering what versatility any of the scopes might add at longer ranges.  Those two are fast.  I think with more practice the accuracy could be improved, and if I’d shot either of those at the time I shot the SWFA, the hit ratio would likely have been better.

8 thoughts on “Comparison of All Test Optics: Test #1”

  1. Really interesting read… great work !
    Could you do the same test at 25m, then 100m, and then 300m?
    That should give a very comprehensive data set for 3-Gun competition.
    Pity you haven’t also got an:
    – EOTech
    – Bushnell 1-4 PCL – low end ($299)
    – Leupold Mk6 1-6 CMR-W – high end ($1,999)
    – S&B 1-8×24 PM ShortDot – dual reticle – very high end ($3,799)
    …which would probably cover the range completely, and give a real-world indication of the application distance of each type of optic.

    • Rob,

      I did different tests at different distances. I agree about having other scopes to test. I would have liked to have done them all. A person could probably do anything with unlimited money, time, and support. I’m still working on those. In the meantime, I’m happy with what I do have.

      • Have you thought about talking to scope suppliers to have them provide you with scopes, etc., to test?
        Your articles are very well written, well researched, and very informative. I’m kinda surprised no one has approached you to test more stuff.

        • I tried with Bushnell and got no response. I probably would have had more luck with a smaller company like Vortex. Ultimately time was also a factor. Sightron makes an interesting 1-7, Minox has a 1-8, and there were some other choices that interested me.

          Maybe I could do it again now that I’ve ironed out some of the difficulties, but I think to get an apples to apples comparison the time window needs to be as narrow as possible.
