Visual Servoing of Aerospace Vehicles

Unmanned vehicles are increasingly being used to perform both military and civilian surveillance, inspection, and monitoring missions. Their levels of autonomy range from fully tele-operated to fully autonomous, requiring only very high-level mission commands. The greater the human-directed control an unmanned vehicle requires, the larger its support crew must be.

The AVS Lab is researching semi-autonomous visual sensing and control techniques for a free-flying craft/end-effector operating relative to another spacecraft. Consider the scenario where an unmanned space vehicle is to operate in close proximity to a larger craft, as illustrated in the figure on the right. The target craft can be either collaborative or non-collaborative in this effort. The goal is for the operator to select a visual feature using the scout's on-board camera, and then instruct the craft to maneuver relative to this visual target.

To measure the relative motion of the trailing vehicle with respect to the lead vehicle, a passive visual sensing approach is investigated in which a video camera observes the target spacecraft. A distinct passive visual feature is then tracked in real time. Note that the target does not emit an electronic signature as active optical beacons do, thus reducing the overall signature emission of the vehicles. Rather, these targets would be uniquely colored shapes that are tracked robustly.
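As a rough illustration of this kind of passive color tracking, the following is a minimal sketch that segments a uniquely colored target by HSV thresholding with OpenCV. It is a simple stand-in for the lab's actual visual-snake tracker, and the color bounds are placeholder values, not lab parameters.

```python
# Sketch: locate a uniquely colored target in a camera frame by HSV
# color thresholding. Stand-in for the lab's visual-snake tracker;
# the HSV bounds below are illustrative placeholders.
import cv2
import numpy as np

def find_colored_target(frame_bgr, lower_hsv=(100, 120, 60), upper_hsv=(130, 255, 255)):
    """Return the contour of the largest color-matched region, or None."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(lower_hsv), np.array(upper_hsv))
    # Remove speckle noise so a single coherent blob remains.
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    return max(contours, key=cv2.contourArea)
```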

The control objective is to use the 2D visual snake information to maintain a specific position relative to the secondary craft, or to perform small proximity operations in this neighborhood using the visual target as a reference landmark. By computing the zeroth-, first-, and second-order area moments of the visual snake contour, the target heading direction can be estimated, as well as some depth and orientation information.
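A short sketch of how these moments yield heading, orientation, and depth cues is given below. It uses OpenCV's standard contour moments and the classical principal-axis formula; the focal length and true target area are hypothetical parameters introduced for illustration, not values from the lab's system.

```python
# Sketch: extract centroid (heading cue), orientation, and a coarse
# range estimate from the 0th-, 1st-, and 2nd-order area moments of a
# tracked contour. focal_px and true_area_m2 are placeholder values.
import math
import cv2

def target_cues(contour, focal_px=800.0, true_area_m2=0.05):
    m = cv2.moments(contour)
    if m["m00"] == 0:
        return None
    cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]  # centroid [px]
    # Principal-axis orientation from the central second moments.
    theta = 0.5 * math.atan2(2.0 * m["mu11"], m["mu20"] - m["mu02"])
    # Coarse depth: projected pixel area scales as (focal/range)^2,
    # so range = focal * sqrt(true_area / pixel_area).
    range_m = focal_px * math.sqrt(true_area_m2 / m["m00"])
    return (cx, cy), theta, range_m
```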

The ongoing research is establishing a novel hybrid hardware-software test bed platform to help develop and evaluate relative motion sensing concepts. An advanced Sandia-developed software framework called UMBRA is used to build a highly modular simulation in which physical and software versions of hardware can easily be interchanged. For example, scenarios are envisioned where physical vehicles and sensors interact with software-simulated virtual vehicles and environments, as illustrated in the figure on the left. Testing the planar or near-planar motion of aerospace vehicles in this test bed provides a cost-effective means to evaluate proposed sensing and control strategies before implementing them in costly three-dimensional scenarios or actual mission flights. The unmanned ground vehicles (UGVs) provide two translational degrees of freedom and one heading degree of freedom. By adding a pan-and-tilt unit to the UGV, the sensor heading can be controlled independently of the vehicle heading, and small out-of-plane rotations can also be simulated by tilting the sensors up and down.
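To make the test bed's degrees of freedom concrete, here is a minimal kinematic sketch of such a UGV standing in for a spacecraft: unicycle motion supplies (x, y, heading), while an assumed pan-and-tilt unit points the camera independently, with tilt emulating small out-of-plane rotations. All names and rates are illustrative.

```python
# Sketch: planar UGV kinematics plus independent sensor pointing.
# The UGV contributes two translations and a heading; the pan-and-tilt
# unit decouples camera pointing from vehicle heading.
import math
from dataclasses import dataclass

@dataclass
class UGVState:
    x: float = 0.0        # inertial position [m]
    y: float = 0.0
    heading: float = 0.0  # vehicle yaw [rad]
    pan: float = 0.0      # camera pan relative to vehicle [rad]
    tilt: float = 0.0     # camera tilt, emulates out-of-plane rotation [rad]

def step(s, v, omega, pan_rate, tilt_rate, dt):
    """Integrate unicycle kinematics and sensor pointing over one step."""
    s.x += v * math.cos(s.heading) * dt
    s.y += v * math.sin(s.heading) * dt
    s.heading += omega * dt
    s.pan += pan_rate * dt    # camera heading = vehicle heading + pan
    s.tilt += tilt_rate * dt
    return s
```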

The current AVS lab implementation has demonstrated hybrid camera/UGV simulations in which virtual camera and UGV models have been created with interfaces identical to those of the UMBRA-controlled camera and UGV hardware. Simulation test runs of such visual control operations can be seen on the media page.
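The value of identical interfaces is that the controller cannot tell whether it is driving hardware or a simulation. The sketch below illustrates this swap pattern in the abstract; it mimics the modularity described above but uses hypothetical class and method names, not the actual UMBRA API.

```python
# Sketch: hardware and simulated modules expose the same interface, so
# a hybrid simulation can mix physical and virtual components freely.
# Class and method names here are hypothetical, not UMBRA's.
from abc import ABC, abstractmethod

class Camera(ABC):
    @abstractmethod
    def grab_frame(self): ...

class HardwareCamera(Camera):
    def grab_frame(self):
        # Would read a frame from the physical camera driver here.
        raise NotImplementedError("hardware driver not shown in this sketch")

class VirtualCamera(Camera):
    def __init__(self, renderer):
        self.renderer = renderer  # hypothetical virtual-scene renderer
    def grab_frame(self):
        return self.renderer.render()

def control_loop(camera: Camera):
    # The visual controller never knows if the frame is real or rendered.
    frame = camera.grab_frame()
    # ... run the visual tracker and feedback law on `frame` ...
```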

Current projects are investigating visual control of unmanned ground vehicles to enable semi-autonomous convoy operations, visual spacecraft proximity flying, as well as aerial refueling of UAVs. In all these projects, the AVS lab's ability to provide rapid hardware, software, and hybrid simulation capabilities is allowing exciting and effective visual control research to be conducted.

