The National Highway Traffic Safety Administration does not investigate every motor vehicle crash; there are far too many to review each one. Instead, the NHTSA relies on a sample of crashes to analyze overall trends. More than 20 sites have been designated for this sampling, including Chicago and Los Angeles. Since 1988, the NHTSA has analyzed approximately 4,700 crashes per year nationwide.
The NHTSA has put a new system in place that substantially reduced the number of car and truck crashes reviewed. The Government Accountability Office recently disclosed the change, reporting that the NHTSA reviewed roughly 3,400 crashes in 2013. The reduction was attributed to rising costs and a flat budget. The NHTSA is attempting to maintain the quality of its crash information while reviewing fewer crashes. The GAO declared the new approach “reasonable,” citing a White House-imposed restriction on travel expenses for federal agencies.
Sample size affects the value of the data collected. Certain types of accidents are commonplace and may be adequately understood from a smaller sample. Other types of injury and collision are rare and require larger samples for a full review. Part of the change in data collection was to focus on accidents involving serious injuries and newer vehicles. The old sampling system captured roughly 7 percent of such accidents; the new system is intended to capture at least 10 percent.
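To see why rare crash types demand larger samples, a rough back-of-the-envelope sketch helps: the expected number of sampled cases of a given crash type is its share of all crashes multiplied by the sampling rate and the yearly crash total. The crash totals, shares, and rates below are purely illustrative assumptions, not figures from the GAO report or the NHTSA.

```python
# Back-of-the-envelope illustration (hypothetical figures, not GAO or NHTSA data):
# how many cases of a given crash type would land in a yearly sample.

def expected_cases(total_crashes, crash_type_share, sampling_rate):
    """Expected number of sampled cases of one crash type.

    total_crashes    -- all reported crashes in a year (assumed)
    crash_type_share -- fraction of crashes that are of this type (assumed)
    sampling_rate    -- fraction of qualifying crashes that get reviewed (assumed)
    """
    return total_crashes * crash_type_share * sampling_rate

# Assume 5,000,000 reported crashes a year (illustrative only).
TOTAL = 5_000_000

# A common crash type (say, 10% of all crashes) is still well represented
# even at a very low overall sampling rate.
print(expected_cases(TOTAL, 0.10, 0.0007))   # ~350 sampled cases

# A rare crash type (say, 0.1% of all crashes) yields only a handful of
# cases at that same rate, which is why targeting a higher rate for
# serious-injury crashes in newer vehicles matters.
print(expected_cases(TOTAL, 0.001, 0.0007))  # ~3-4 sampled cases
print(expected_cases(TOTAL, 0.001, 0.10))    # ~500 cases at a 10% rate
```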
Source: The Detroit News, “NHTSA revising how it analyzes crash data,” by David Shepardson, 6 March 2015