There are a lot of aspects of shooting, gear, tactics, etc. that have generated (and in some cases still do generate) a fair amount of controversy.
Examples include Weaver vs. isosceles stances, point shooting versus aimed fire, high ready versus low ready, whether competition is good or bad for real-world performance, appendix carry or hip carry, retention holsters or not and what level of retention, magazine position / facing direction, and the list goes on.
Some of these may ultimately boil down to not much more than personal preference. However, a great deal of the angst usually stems from how a single question is answered.
“What does success look like?”
When everybody is using the same measuring stick, most of the time controversies quickly subside. Performance tends to speak for itself. However, in the real world, this is rarely, if ever, the case.
For example, a relatively common disagreement in the training industry is over varying methods for sending a locked-back slide or bolt forward after conducting a reload. There are two basic variations for doing this. The first is using the slide stop lever or equivalent to send the slide or bolt forward, typically using a thumb.
The second is using the support hand to “slingshot” the slide or bolt forward by pulling it back, releasing it, and letting the recoil spring send it home. There are several variations, but that is enough for this discussion.
Without getting into why individuals prefer one technique over another (there are treatises available on the subject if you care to look for them), at the end of the day it is a question of which one works.
Why the disagreement? Because there is no agreement on what “works” means.
If “works” is defined as performing a reload from slide lock with a split time of, say, 1.2 seconds or less, then the slingshot method simply does not work.
There is not really any disagreement that the “thumb method” is faster. However, there is no widespread agreement that faster means “works.”
Many, if not most, serious disagreements in the training industry, particularly those related to fundamental skill performance, basic equipment configurations, or tactics, boil down to something similar. That is, an inability to agree on what succeeding means.
If you have read much of our material in the past several years you are probably familiar with our view of this issue already. This inability to agree on what succeeding means is not, by itself, the problem.
Rather, this is a symptom of the problem. The real problem is our collective inability as an industry to measure things that matter with respect to real-world shooting performance.
In the competitive disciplines, it is a pure balance of speed and accuracy that matters (at least from a measurement standpoint). There is a defined scoring system, depending on the game being played.
Until a competitor reaches a very high level of performance—one that most people who shoot are not equipped to even understand conversations about—there is no real substantive argument about what works better. What works better is what most reliably produces the best score within the established rules.
On the tactical and personal defense side, things are far less clear. Varying objectives, situations and environments are part of this. However, a substantial issue across the industry at large is the fact that our ability to measure success in training has long been limited to the same tools that are used for measuring in competition. This means only speed and accuracy.
There is general agreement that pure speed is not everything in a tactical world. Yet, there is also general agreement that speed matters. There is further agreement that accuracy is not everything in a gunfight. However, there is likewise agreement that accuracy does matter.
This has created something of a conundrum—one that has been addressed for decades by the development of shooting standards. These usually consist of discrete skill performance in specific areas, or the performance of specific sequences of skills.
For example, the following could be a standard. At a specific distance, draw from a specific holster configuration and fire a defined number of rounds at a specific target to achieve a specific standard of accuracy within a specific time frame.
An excellent book on this topic, and a good resource for a wide variety of pistol standards, is Strategies and Standards for Defensive Handgunning by Karl Rehn and John Daub.
The objective of standards is to provide a measurement tool that gives students concrete objectives in service of being “prepared” to “succeed” in the real world. Part of the idea is that the standards provide the balance between speed and accuracy.
These standards are useful tools. We like them. We use them. We have trained to them. We have also developed them ourselves when designing training programs.
What all of them (ours included) fail to do, however, is provide common answers to the question, “What does it mean to succeed?”
Each individual standards course can define success within its own parameters, but it cannot answer the real question.
For example, if two rounds in two seconds at seven yards into the A Zone of an IPSC target is the standard, then equipment, techniques, and skills that accomplish this task equal success, at least on that standard.
Is this actually success? Let the debate rage. Frankly, nobody knows. The only real answer is “It depends.” This is why we have a problem.
As we demonstrate in detail in our 2016 book Building Shooters, the reality is that none of these standards can claim to require use of the same neurological and physiological systems and processes that are required during real-world tactical performance. They simply do not.
The reason for this is not human failure. It is because the training and measurement tools available are not capable of addressing these issues.
Therefore, standards today cannot provide an answer to this question. What the standards do is generate a substitute question that can be answered with the tools we currently have.
Standards like these are good tools for training, and it seems like everyone in the industry is now making their own version of them. We are not criticizing this, lest there be confusion. We understand the business case. Also, standards are the best live-fire tools we have available today for developing and measuring skill.
If we want to advance the tactical and defensive side of the training industry, however, we cannot do it just by developing new standards. What we need are new tools and methods for measuring performance.