Monday, April 15, 2019

Lockheed Martin's 360-Degree Pilot Visual System Completes First Flight On Bell V-280 Valor

NASHVILLE, Tenn., April 15, 2019 /PRNewswire/ -- Lockheed Martin's (NYSE: LMT) Pilotage Distributed Aperture Sensor (PDAS) system took flight for the first time aboard the V-280 Valor, Bell Helicopter's next-generation tiltrotor aircraft, in a series of flights over Fort Worth, Texas, in March. PDAS is a multi-functional sensor system that generates high-resolution, 360-degree imagery around the aircraft to enhance situational awareness for pilots and other users.
The PDAS system captured complete spherical infrared imagery and generated real-time imagery for the crew while operating in a high-speed, tactically relevant flight environment.
"Conducting PDAS flight tests on the V-280 is an exciting first step toward delivering a level of situational awareness unavailable on today's Army rotorcraft," said Rita Flaherty, strategy & business development vice president at Lockheed Martin Missiles and Fire Control. "With its embedded, multi-functional sensors, PDAS is the ideal foundation for an integrated survivability suite that will enable Army aircrews to own any environment and universally detect and defeat incoming threats."
Specifically designed for current and future vertical lift aircraft, PDAS consists of six infrared sensors distributed around the aircraft and linked to aircrew helmets and cockpit displays via an open-architecture processor.
During testing, engineers demonstrated PDAS's ability to provide simultaneous coverage to multiple independent displays. Aircrews benefit from its all-weather pilotage imagery, while transported ground troops can survey the surrounding environment for tactical information and threats. PDAS currently generates imagery for two users, but the system will ultimately support up to six, which could include pilots in other aircraft and mission commanders on the ground.
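To picture how one shared sensor feed can serve several independent users, consider the minimal sketch below. It is purely illustrative and not Lockheed Martin's implementation: it assumes the six sensor feeds have already been stitched into a single equirectangular panorama, and the frame resolution, user names, and look angles are invented for the example. Each user is simply handed a different viewport cut from the same frame.

# Illustrative sketch only: several independent viewports served from one
# shared 360-degree frame, similar in spirit to the multi-user coverage
# described in the article. Frame format and parameters are assumptions.
import numpy as np

PANORAMA_H, PANORAMA_W = 1024, 4096   # hypothetical stitched-frame resolution

def extract_view(panorama, azimuth_deg, fov_deg=90.0):
    """Return the horizontal slice of the panorama centered on a user's look azimuth."""
    h, w = panorama.shape[:2]
    half_width = int(w * fov_deg / 360.0) // 2
    center = int((azimuth_deg % 360.0) / 360.0 * w)
    cols = [(center + offset) % w for offset in range(-half_width, half_width)]
    return panorama[:, cols]

# One shared frame, several independent users, each with their own look direction.
frame = np.zeros((PANORAMA_H, PANORAMA_W, 3), dtype=np.uint8)
users = {"pilot": 0.0, "copilot": 45.0, "crew_chief": 180.0}
views = {name: extract_view(frame, azimuth) for name, azimuth in users.items()}
for name, view in views.items():
    print(name, view.shape)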
Planned upgrades will demonstrate additional integrated survivability suite capabilities such as Multi-Modal Sensor Fusion (MMSF). MMSF blends data from multiple types of sensors to restore aircrew situational awareness in degraded visual environments and enables navigation in GPS-denied zones.
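The core idea behind fusing multiple sensor modalities can be sketched very simply; the example below is not the MMSF algorithm, only a toy confidence-weighted blend of co-registered frames. The modality names and confidence values are assumptions chosen to show how a degraded channel (for example, visible imagery in brownout) can be down-weighted so the remaining sensors carry the picture.

# Toy sketch of multi-modal blending, not Lockheed Martin's MMSF: average
# co-registered single-channel frames, weighted by a per-modality confidence.
import numpy as np

def fuse_frames(frames, confidence):
    """Confidence-weighted average of co-registered single-channel frames."""
    total = sum(confidence.values())
    fused = np.zeros_like(next(iter(frames.values())), dtype=np.float64)
    for name, frame in frames.items():
        fused += (confidence[name] / total) * frame.astype(np.float64)
    return fused.astype(np.uint8)

# Hypothetical modalities; in a brownout the visible channel is down-weighted.
infrared = np.random.randint(0, 256, (480, 640), dtype=np.uint8)
visible = np.random.randint(0, 256, (480, 640), dtype=np.uint8)
fused = fuse_frames({"infrared": infrared, "visible": visible},
                    confidence={"infrared": 0.8, "visible": 0.2})
print(fused.shape, fused.dtype)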