The controversy over how air traffic controllers are recruited in the United States shows no signs of abating. Since the story broke in the consumer media, the Transportation Department’s internal watchdog has launched an investigation into hiring practices at the Federal Aviation Administration (FAA), and several high-profile “resignations” have followed. Some are now calling for hearings that would force FAA leaders to answer to the American people.
Training is an essential part of any successful business. A recent survey found that 40% of workers who receive poor training leave their jobs within the first year. Depending on the profession, however, training can also be a costly affair. The FAA spends $93,000 per Air Traffic Control (ATC) trainee annually; in comparison, most businesses spend just $1,208. Such a wide fiscal discrepancy reflects the uniqueness of a job where mistakes can be fatal, and in which close calls make headlines. Preselecting the candidates most likely to succeed, however, can help minimize these costs. This approach strikes a balance between fiscal and operational concerns. It also inextricably ties the employer’s fate to that of the trainee: trainee success means employer success.
The FAA has historically used the Air Traffic Selection and Training (AT-SAT) test to screen potential recruits. Since its introduction in 2002, more than 22,000 applicants have taken the test, and more than 6,800 controllers have been hired as a result. In 2014, however, the FAA introduced a new Biographical Assessment (BA). Although this move was criticized, the FAA argues that the BA “measures qualities known to predict air traffic controller success” and has been “validated based on years of extensive research.” But do these claims stand up to scrutiny?
AT-SAT consists of eight subtests that measure, amongst other things, an applicant’s ability to scan and interpret instrument readings; detect targets that change over time; and determine the angles of intersecting lines—in other words, abilities that play a vital role in ATC. The predictive power of AT-SAT is well-documented. A 2013 FAA study found a positive relationship between test performance and training outcome. Higher AT-SAT scores meant a greater likelihood of being certified as a controller. Researchers concluded that the available evidence “supports the validity of AT-SAT as a personnel selection procedure for the [ATC] occupation.” Multiple investigations over the last 15 years have come to the same conclusion.
The BA, on the other hand, is something of a different beast. It asks questions such as, “How would you describe your ideal job?”; “What has been the major cause of your failures?”; and perhaps most notably, “The number of different high school sports I participated in was: A) 4 or more . . . B) 3 . . . C) 2 . . . D) 1 . . . E) Didn’t play sports.”
Though seemingly trivial, such questions may well help preselect the best and brightest candidates. However, the lack of publicly available FAA data makes it difficult to establish whether this is the case. The only known study tackling the subject concluded that certain BA-related questions “did little” to improve the FAA’s ability to preselect applicants, and that the evidence for using such questions was “weak.” These findings are not lost on lawmakers such as Congressman Randy Hultgren, who is now co-sponsoring legislation that would force the FAA to abandon the BA altogether.
When this story broke last May, Fox Business interviewed Matthew Douglas, a 26-year-old from Washington State who completed the BA. When reviewing the questions posed by the new test, he asked a simple yet important one of his own: “How does this relate to the job?”
The answer lies somewhere within the walls of the FAA. The agency has a long and distinguished history of making taxpayer-funded research data publicly available. So why not now?
This article first appeared in The Hill.