Friday, January 27, 2017

3.6 – Research: UAS Integration in the NAS


3.4 – Research: UAS Integration in the NAS

Gregory Laxton

ASCI 638 – Human Factors in Unmanned Aerospace Systems

Embry-Riddle Aeronautical University-Worldwide

January 29, 2017



            The Federal Aviation Administration (FAA) Next Generation (NextGen) air transportation system was started in 2003 and was originally intended to improve the “capacity, efficiency, and safety” (Liddle & Millett, 2015, p. vii) of the National Airspace System (NAS). In addition, it aims to reduce carbon emissions and lower pollution. Many components fall under the “NextGen” moniker at the FAA. For example, the agency would like to speed departures and arrivals by increasing digital communications between FAA controllers and users of the NAS (Federal Aviation Administration, 2017). The FAA hopes that performance-based navigation (PBN), along with required new technology onboard aircraft, will allow more takeoffs and landings from existing airports, increasing capacity. The FAA also wants to improve navigation with more direct routing and increase the number of aircraft that can take off and land each hour on existing runways. It is a very ambitious plan.

            The en route flow improvements the FAA hopes to make will use Time Based Flow Management (TBFM) and Automatic Dependent Surveillance-Broadcast (ADS-B) to increase efficiency across the country (Federal Aviation Administration, 2017). On arrival, one way the FAA hopes to smooth flow into congested airports is by creating new waypoints, continuous descent profiles, and specific arrival times at those points. There are many more pieces of the NextGen puzzle, such as sending taxi instructions to pilots prior to landing in hopes of expediting aircraft off the runway and minimizing confusion with controllers.

Not everyone is pleased with the new departure and arrival flows, which can be very different from long-standing flight patterns. They may increase capacity at the airport, but if air traffic has increased 500% over your home, you may not be a fan of NextGen. Phoenix residents are upset over exactly this issue: they were not consulted by the FAA before the new flight paths were implemented, and noise complaints in the affected areas have risen dramatically. The mayor of Phoenix said he felt blindsided by the FAA (CBS News, 2015).

            The FAA, like every federal agency, has a limited budget and must prioritize resources. NextGen is an expensive goal, and the FAA said in 2015 it was forced to choose between ongoing maintenance of the existing infrastructure and “keeping NextGen progress” on schedule (Broderick, 2015).

One NextGen technology that may help integrate UAS into the NAS is the proposed National Airspace System Voice System (NVS) (Federal Aviation Administration, 2017). NVS should allow controllers and aircraft to communicate via router-based technology, essentially bypassing the line-of-sight VHF procedures in place now. Under the proposal, a controller will be able to talk with an aircraft anywhere in the system, not just in its own geographic region. This could benefit UAS operators flying beyond line of sight (BLOS). For example, if I can fly a UAS three states away and still use router-based communication to speak with a local controller, it removes an obstacle for UAS operating in the same airspace as manned aircraft. This would be a crucial benefit for UAS operators: if the ground control station (GCS) is in Nevada but the UAS is overflying Texas, this technology, if implemented, would allow the GCS to speak directly with Fort Worth Center, just like the commercial manned aircraft overhead. It doesn't solve all the communication problems, but it helps.

From a human factors perspective, communications between air traffic controllers and UAS GCS operators will be a challenge. It is not hard to foresee missed transmissions when a UAS operator needs a physiological break and is away from the station. If datalink sends text messages, the operator could execute the instructions immediately upon returning to position, but the response may still lag in comparison to manned aircraft.
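As a rough illustration of that lag, here is a minimal sketch of queued datalink instructions waiting for an absent operator. This is a hypothetical model only; the message texts, timestamps, and function names are my own assumptions, not part of any FAA or datalink specification.

```python
from dataclasses import dataclass

@dataclass
class DatalinkMessage:
    text: str
    received_at: float  # seconds since mission start (hypothetical clock)

def response_delays(queue, operator_back_at):
    """Delay between each message's receipt and the earliest moment an
    operator returning at operator_back_at could act on it."""
    return [max(operator_back_at - m.received_at, 0.0) for m in queue]

# Two controller instructions arrive while the operator is on a break...
queue = [
    DatalinkMessage("CLIMB AND MAINTAIN FL190", received_at=100.0),
    DatalinkMessage("CONTACT CENTER 134.85", received_at=130.0),
]

# ...and the operator returns to the GCS at t = 160 s.
print(response_delays(queue, operator_back_at=160.0))  # [60.0, 30.0]
```

A manned-aircraft crew, by contrast, would normally acknowledge each instruction within seconds of receipt.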






References



Broderick, S. (2015, February 5). FAA budget request balances current needs, NextGen. Aviation Week. Retrieved from http://aviationweek.com/aftermarket-solutions/faa-budget-request-balances-current-needs-nextgen

CBS News. (2015, January 30). FAA's new air traffic control system NextGen causing major noise pollution. Retrieved from http://www.cbsnews.com/news/faa-new-air-traffic-control-system-nextgen-causing-major-noise-pollution/

Federal Aviation Administration. (2017, January 12). Next Generation Air Transportation System (NextGen). Retrieved from https://www.faa.gov/nextgen/

Liddle, D. E., & Millett, L. I. (Eds.). (2015). A review of the next generation air transportation system: Implications and importance of system architecture. Washington, DC: The National Academies Press.

National Research Council. (2015). Transformation in the air: A review of the FAA's certification research plan. Washington, DC: National Academy of Sciences.

Thursday, January 19, 2017

2.5 - Research: UAS GCS Human Factors Issue


UAS Ground Control Station (GCS) and Human Factors

The UAS GCS selected for this paper is Insitu's Common Open Mission Management Command and Control (ICOMC2). The ICOMC2 is portable and can be installed and operated from a laptop. Insitu says the ICOMC2 can control multiple unmanned vehicles from a single workstation and provides the operator with a video overlay function (Insitu, 2016).

The ICOMC2 display is configurable by the operator. For example, the user can show engine parameters in one window and use drop-down menus to select specific vehicle states such as airspeed and altitude in another. The operator may choose to display a map overlay in yet another window and a sensor feed in a third. These windows can be adjusted and manipulated with common laptop keyboard, mouse, and touchpad controls (Insitu, 2016). For navigation of the UA, the user can select a waypoint on the map or have the aircraft fly a pre-programmed route.

Insitu advertises an Augmented Video Overlay System (AVOS) that provides operators with various overlays on a video sensor feed in the ICOMC2 system. It can add terrain elevation, acoustic detectability, and other satellite data such as borders or restricted airspace (Insitu, 2016). This capability should increase situational awareness for the user, helping them navigate safely around the search area. This display method should improve “operator imagery interpretation” (Cooke, 2006, p. 153) and increase situational awareness compared to a top-down view. However, environmental conditions such as cloud cover or nighttime operations may negate the AVOS advantages, just as in manned aircraft. Technology may offer mitigation in the form of a virtual reality headset such as the Oculus Rift, a COTS device that retails for $699 (Greenwald, 2016). Virtual reality may offer enough of an immersive experience that the operator can maintain a higher level of situational awareness.

The ICOMC2 is a laptop-based display capable of controlling multiple UA at once (Insitu, 2016). This implies a considerable workload increase for the operator as each UA is added to the mission, a concern from a human factors perspective. An operator trying to control multiple aircraft would quickly find it difficult to maintain situational awareness, especially if one of the UAs experiences a malfunction. The problem aircraft may absorb the operator's attention, leaving too little spare mental capacity to make time-critical decisions for the other aircraft: classic task saturation, and a negative human factors outcome.

Another possible negative human factors issue with the ICOMC2 is the laptop interface described above. The user manipulates the display while operating the various compatible UAs. Familiarity with laptop controls is nearly universal; everyone understands how a mouse or touchpad controls functions on a screen. The downside, from a human factors point of view, is that these are not normal aircraft controls for trained pilots. For example, a pilot pulls back on the controls in an aircraft to increase altitude. On the ICOMC2, or any laptop-controlled UAS, it may take a mouse command, or even typing in the desired altitude, before the UA begins a climb.

A conventional laptop human-machine interface (HMI) also lacks tactile feedback. A manned-aircraft pilot may receive audible and tactile alerts when approaching a stall, something likely not available from a standard laptop configuration. One mitigation strategy would be to supplement the laptop with more conventional aircraft-style controls, such as a stick and throttle, for the operator.

References
Cooke, N. J. (2006). Human factors of remotely operated vehicles. Amsterdam, United Kingdom: Elsevier JAI.

Greenwald, W. (2016, December 20). The Best VR (Virtual Reality) Headsets of 2017. Retrieved from http://www.pcmag.com/article/342537/

Insitu. (2016). Insitu - Insitu Common Open Mission Management Command and Control (ICOMC2). Retrieved from https://insitu.com/information-delivery/command-and-control/icomc2#2