Back in 2011, I posted an entry to the Tutor.com blog that was deleted after we sold the company. Today I found a copy of that post thanks to the “Wayback Machine” that archives the internet. Here is that post in its unaltered entirety.
We’ve had the pleasure of posting about some of Bart Epstein’s adventures in the past. From volunteering as a pilot for Angel Flight to his latest experience helping out NASA, he always has a story to tell. And what makes these stories so great is that Bart always shares what he learned. Below he writes about his recent journey to NASA’s Langley Research Center, where he flew a new flight simulator as part of a research project. Check it out and see how Bart proves you never stop learning!
Last week I had the unique experience of being the first civilian pilot to fly a new flight simulator at NASA’s Langley Research Center in Hampton, Virginia. I spent two full days flying dozens of procedures as part of their Interface Comparison Experiment (ICE) study, the goal of which is to evaluate how pilots of different levels of skill and experience react to two completely different automation control interfaces.
As you might imagine, the ideal interface should be easy enough for a rookie to fly the plane safely yet powerful enough to handle the demands of the pros. It needs to have just enough buttons and dials without being overly complicated. For two days, NASA gave me all sorts of scenarios to fly and then watched me carefully, graded my performance, and asked me hundreds of questions about the experience and my perspective on the pros and cons of each user interface. It was fascinating.
To be honest, when I first heard that this project would be in a simulator I was a little disappointed. I wanted another chance to fly NASA’s Cessna 206 the way I did when I was a pilot in their Synthetic Vision study, but after a few minutes in the simulator I forgot all about the “real” plane. The simulator was not only very cool (with huge visual displays and advanced “force feedback” control sticks), but advanced simulators also make it possible to fly multiple procedures in a short amount of time. After completing a simulated “approach” to an airport, the researchers could push a button, instantly transport me back to the starting point, and have me fly it all over again with slightly different parameters. This means more science in less time.
Spending time in NASA’s Human and Autonomous Vehicle Systems lab also meant spending all day on the NASA Langley campus and getting escorted around to see fun things such as their hypersonic wind tunnel and the famous Landing and Impact Research Facility (where the lunar landers were originally tested). While there, I got to eat lunch in the NASA cafeteria, though I skipped the Shuttle Dog with Chili in favor of the Satellite Salad Bar in order to minimize the chances of getting sick in the simulator, which would have been awkward to put it mildly. Some of the pictures I took are below.
While at Langley, I even got within 50 feet of the new Orion Crew Exploration Vehicle (CEV). This spacecraft is expected to ferry crews of astronauts to lunar orbit and eventually return astronauts to Earth on the final leg of a human Mars mission. Had I arrived one day earlier I might have been able to watch them shoot this video of the CEV being dropped into the giant pool at the Landing and Impact Research Facility!
For those of you who may be wondering whether I must be some kind of special pilot to be asked to test these new systems, the answer is exactly the opposite. I get recruited for these projects because I am NOT an airline pilot or a military pilot with tens of thousands of hours of flight time. Rather, NASA wants feedback from a good cross section of pilots, including pilots (like me) who are instrument-rated commercial pilots with a lot of experience but who do not fly for an airline or the military.
When the study is complete it will be interesting to learn what other pilots who tested the new integrated system thought of it, and to see what the system looks like when it eventually comes to market.
Bart Epstein is Senior VP & General Manager of the Tutor.com for U.S. Military Families program. In addition to occasionally testing for NASA, he serves as a volunteer pilot for Angel Flight and as President of Gifts for the Homeless in Washington, D.C.
For the more technically inclined, keep reading below for an in-depth review of the simulator Bart flew and all the cool gadgets.
One interface I flew was the Mode Control Panel currently installed in Boeing 737 aircraft. Since I fly single-engine airplanes, this system was new to me, but I found it easy to learn in a few hours. You hit the heading select button (HDG SEL), dial in your desired heading, and the plane flies in that exact direction, hopefully toward the path you previously programmed into your Flight Management System. When you get close to your desired flight path, you engage the lateral navigation (L-NAV) button, and the autopilot captures the lateral navigation path and flies the plane along it, assuming you are close enough to the proper flight path when you engage the L-NAV feature. (The principle for establishing a descent profile is essentially the same, using the ALT HLD and V-NAV features.)
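For readers who think in code, the HDG SEL / L-NAV hand-off described above can be sketched as a tiny mode machine. This is purely an illustrative toy, not Boeing’s or NASA’s actual autopilot logic; the class name, the one-mile capture threshold, and the method names are all invented for the example:

```python
# Toy sketch of Mode Control Panel mode logic (illustrative only):
# the autopilot flies the dialed heading until L-NAV is armed AND the
# aircraft is close enough to the programmed FMS path, at which point
# L-NAV "captures" and the autopilot steers along the path instead.

CAPTURE_DISTANCE_NM = 1.0  # hypothetical capture threshold, in nautical miles

class ModeControlPanel:
    def __init__(self):
        self.selected_heading = 0      # degrees, set with the heading dial
        self.lnav_armed = False
        self.active_mode = "HDG SEL"

    def heading_select(self, heading_deg):
        """HDG SEL: fly the dialed heading directly."""
        self.selected_heading = heading_deg % 360
        self.active_mode = "HDG SEL"

    def arm_lnav(self):
        """Press L-NAV: arm lateral navigation; it only captures when close to the path."""
        self.lnav_armed = True

    def update(self, cross_track_nm):
        """Called each cycle with the distance from the FMS path; switches modes on capture."""
        if self.lnav_armed and abs(cross_track_nm) <= CAPTURE_DISTANCE_NM:
            self.active_mode = "LNAV"
            self.lnav_armed = False
        return self.active_mode

mcp = ModeControlPanel()
mcp.heading_select(250)   # dial in 250 degrees and hit HDG SEL
mcp.arm_lnav()            # press L-NAV to arm it
print(mcp.update(3.2))    # still far from the path -> "HDG SEL"
print(mcp.update(0.6))    # within capture distance -> "LNAV"
```

The key behavior the sketch captures is the one Bart describes: pressing L-NAV does nothing by itself; the mode change only happens once the aircraft is near the programmed path.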
One serious downside of this system, however, is that it requires taking one’s hand off the control stick to turn dials and push buttons. Another is that it requires pilots to turn their heads to separately program the Flight Management System. (For maximum safety, pilots prefer to be looking out their windows or at their primary flight instruments as much as possible.)
The other interface I flew was NASA’s experimental “integrated” system, which has all of its input controls built into a series of buttons, switches, and a trigger on the flight stick. I didn’t get a good picture of it, but it looks much like what you’d see on a home computer simulator setup, with the pilot providing system inputs by clicking buttons, pushing different directions on a “hat switch,” and pulling the trigger to engage the autopilot.
The system is still in development but I found it to be highly promising. Without having to turn my head or take my hand off the flight stick, I was able to select published portions of instrument approach procedures, activate them, and then engage the autopilot to fly them. On the other hand, there are still kinks to work out of the system, and part of my feedback to NASA was that the system would benefit from some tweaks to reduce unintentional climbs during turns, as well as an additional input dial or switch to more easily set heading.
Here’s a link to more pictures if anyone is interested: