A Survey of Unmanned Aerial Vehicles (UAV) for Traffic Surveillance

Anuj Puri

Department of Computer Science and Engineering

University of South Florida

4202 E Fowler Ave, Tampa, FL 33620

Abstract

The United States Department of Transportation (DOT) has for several years been interested in obtaining data on traffic trends and in monitoring and controlling traffic in real time. Currently, there are several methods by which the DOT regulates and monitors road transport. Cameras mounted on towers, detectors embedded in pavements, pneumatic tubes, and manned aircraft have proven to be expensive and time-consuming solution candidates. Aerial monitoring, however, has the potential to yield detailed information to help traffic planners as well as commuters. Unmanned Aerial Vehicles (UAVs) may provide a "bird's eye view" for traffic surveillance, road condition assessment, and emergency response. The purpose of this technical report is to provide a survey of research related to the application of UAVs for traffic management.

1. Introduction

The increase in the number of vehicles on roadway networks has led transportation management agencies to adopt technological advances that support better decisions. The mission of roadway transportation agencies is evolving from solely providing roadway infrastructure to focusing on the needs of the traveling public, management and operations, and improved performance of the surface transportation system. This requires the collection of precise and accurate information about the state of traffic and road conditions. Timely information is also required in case of emergencies (accidents, oil leaks, etc.). In accidents, response time is the most critical factor in victim survivability.

Traditional traffic sensing technologies, including inductive loop detectors and video cameras, are positioned at fixed locations in the transportation network. Data related to traffic flow is currently obtained from detectors embedded in pavements or pneumatic tubes stretched across roads. Such methods are neither time-efficient nor cost-effective. While these detectors do provide useful information and data about traffic flows at particular points, they generally do not provide useful data about traffic flows over space. The detectors cannot be moved; further, they cannot provide information such as vehicle trajectories, routing information, and paths through the network.

Several ongoing research projects have been working on technologies that improve surveillance techniques for traffic management. Travel time estimation algorithms, such as the extrapolation method and platoon matching, have been developed based on measurable point parameters such as volume, lane occupancy, or vehicle headways. Image matching algorithms are used to match vehicle images or signatures captured at two consecutive observation points.
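As a rough illustration of the platoon/signature matching idea described above, the sketch below pairs vehicle signatures recorded at an upstream and a downstream detector station and derives link travel times from the matched timestamps. The function name, the scalar signature representation, and the thresholds are illustrative assumptions rather than details of any system surveyed here.

```python
# Minimal sketch of signature (platoon) matching between two detector
# stations. Each detection is a (timestamp_s, signature) pair; the
# "signature" could be vehicle length, occupancy pulse width, etc.
# All names and thresholds are illustrative assumptions.

def match_travel_times(upstream, downstream,
                       min_tt=10.0, max_tt=300.0, sig_tol=0.5):
    """Match downstream detections to upstream ones and return travel times.

    upstream, downstream: lists of (timestamp_s, signature), sorted by time.
    min_tt, max_tt: plausible travel-time window in seconds.
    sig_tol: maximum signature difference to accept a match.
    """
    travel_times = []
    used = set()
    for t_down, sig_down in downstream:
        best = None
        for i, (t_up, sig_up) in enumerate(upstream):
            if i in used:
                continue
            tt = t_down - t_up
            if tt < min_tt or tt > max_tt:
                continue
            diff = abs(sig_down - sig_up)
            if diff <= sig_tol and (best is None or diff < best[1]):
                best = (i, diff, tt)
        if best is not None:
            used.add(best[0])
            travel_times.append(best[2])
    return travel_times


if __name__ == "__main__":
    up = [(0.0, 4.2), (2.0, 5.1), (5.0, 4.8)]
    down = [(62.0, 4.3), (66.0, 5.0), (70.0, 4.7)]
    tts = match_travel_times(up, down)
    if tts:
        print("mean travel time: %.1f s" % (sum(tts) / len(tts)))
```

In practice the signature would be a richer feature (vehicle length, magnetic profile, or an image descriptor), and the matching window would be derived from prevailing speeds rather than fixed constants.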

An aerial view provides a better perspective, with the ability to cover a large area and focus resources on current problems; it is mobile and can be present where and when it is needed. Satellites were initially considered for traffic surveillance purposes, but the transitory nature of satellite orbits makes it difficult to obtain the right imagery to address continuous problems such as traffic tracking [24]. Cloud cover also degrades image quality in bad weather. Some private companies have been flying manned aircraft for commercial use and surveys, but this approach has not proven cost-effective. Moreover, manned aircraft cannot be flown in bad weather or in regions that are potentially unsafe for the operators.

UAVs may be employed for a wide range of transportation operations and planning applications: incident response, monitoring of freeway conditions, coordination among a network of traffic signals, traveler information, emergency vehicle guidance, tracking of vehicle movements in an intersection, measurement of typical roadway usage, monitoring of parking lot utilization, and estimation of Origin-Destination (OD) flows [5]. The advantage of UAVs is that they can move at higher speeds than ground vehicles, as they are not restricted to traveling on the road network. Unmanned vehicles have advantages over manned vehicles because most functions and operations can be implemented at a much lower cost, faster, and more safely. UAVs may potentially fly in conditions that are too dangerous for a manned aircraft, such as evacuation conditions or very bad weather. UAVs are programmed off-line and controlled in real time to navigate and to collect transportation surveillance data. A UAV can view a whole network of roads at a time and inform the base station of emergency or accident sites. It also permits a timely view of a disaster area to assess the severity of damage. The base station can then choose the best route and inform police vehicles.

UAVs are equipped with a variety of multiple and interchangeable imaging devices, including day and night real-time video cameras; sensors such as digital video, infrared cameras, multi-spectral and hyper-spectral sensors, thermal sensors, synthetic aperture radar, moving target indicator radar, laser scanners, chemical, biological and radiological sensors, and road weather information system (RWIS) sensors to record necessary information such as weather, fire and flood conditions; and communications hardware to relay data to the ground station [2], [5]. With advances in digital sensing platforms, image processing, and computational speed, there are significant opportunities to automate traffic data collection. Different UAVs have different data collection capabilities: some have real-time data transfer capabilities to the ground station, while others are capable of storing high quality video or images on-board.

2. UAVs Overview

UAVs are semi-autonomous or fully autonomous aircraft that can carry cameras, sensors, communication equipment, or other payloads. UAV prototypes were used in World War I and II, and UAVs have been a topic of research for military applications since the 1950s. In the last decade, the Defense Advanced Research Projects Agency (DARPA) initiated several projects to increase the use of UAVs in military applications [1]. More recently, interest has grown in diverse civilian, federal, and commercial applications, such as traffic monitoring.

UAVs are classified as either rotary-wing or fixed-wing. Fixed-wing vehicles are simple to control, have better endurance, and are well suited for wide-area surveillance and tracking applications. Another advantage of fixed-wing vehicles is that they can image targets at long distances. One disadvantage, though, is reaction time: turning a fixed-wing vehicle takes time and space before it regains its course. Rotary-wing vehicles are also known as Vertical Takeoff and Landing (VTOL) vehicles. They have the advantage of minimal launch time and do not need much space for takeoff and landing. They offer high maneuverability and the ability to hover. Rotary-wing vehicles carry short-range radars and cameras to detect traffic movement. The drawback of this type of vehicle is that the rotary motion leads to vibration.

Vehicle           Endurance (hours)   Payload Weight (kg)   Altitude Capacity (ft)
Aerosonde         40                  1                     20,000
Altus2            24                  150                   65,000
AV Black Widow    5                   0                     1,000
AV Dragoneye      1                   0.5                   3,000
AV Pointer        1.5                 0.9                   3,000
AV Puma           4                   0.9                   3,000
AV Raven          1.25                0.2                   3,000
BQM-34            1.25                214                   60,000
Chiron            8                   318                   19,000
Darkstar          8                   455                   45,000
Exdrone           2.5                 11                    10,000
Global Hawk       42                  891                   65,000
Gnat 750          48                  64                    25,000
Helios            17+                 -                     97,000
MLB Bat           6                   1.8                   9,000
MLB Volcano       10                  9                     9,000
Pathfinder        16                  40                    70,000
Pioneer           5.5                 34                    12,000
RMAX              1                   28                    500
Predator          29                  318                   40,000+
Shadow 200        4                   23                    15,000
Shadow 600        14                  45                    17,000

Table 1: Capabilities and characteristics of UAV systems presented and discussed during the UAV 2003 workshop [28].

UAVs differ significantly in payload weight carrying capability, payload accommodation (volume, environment), mission profile (altitude, range, duration), and command, control, and data acquisition capabilities. A summary of UAV capabilities and characteristics presented in [28] is shown in Table 1.

The smallest vehicles are Micro UAVs (MAVs), like the AV Black Widow, developed for military surveillance, law enforcement, and civilian rescue efforts. Their payloads are just a few grams, with vehicle sizes of a few centimeters. Larger than MAVs are Small UAVs (SUAVs), like the MLB Bat. SUAVs are widely used for traffic surveillance oriented research, as they are designed for small regional scales and carry a payload of a few kilograms. They are portable, flexible, and autonomous in their applications. Medium altitude and medium endurance UAVs (MUAVs) are used for regional scale observations; they can be used for applications such as mapping and monitoring of fire hazards, weather phenomena, etc. UAVs that operate in the High Altitude Long Endurance (HALE) range, like the Helios, are used for applications such as mapping, communication, and monitoring of the earth's surface and the atmosphere, as they can operate at altitudes up to 100,000 feet.

3. Barriers to UAV Deployment

UAVs fall under the direct jurisdiction and control of the Federal Aviation Administration (FAA), which has not yet issued governing regulations concerning their use. The FAA requires that UAVs have onboard "detect, see and avoid" (DSA) capabilities to prevent in-air collisions. In addition, the Federal Communications Commission (FCC) regulates all non-Federal areas of communications and radio/television transmission in the United States, and wireless transmissions to and from UAVs must meet all applicable FCC rules [2]. A fail-safe option for the mission must automatically apply if the ground-to-UAV communication link is lost, to prevent hazards from a UAV crashing to the ground.
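The report does not prescribe how such a fail-safe should be implemented; the fragment below is only a minimal, hypothetical sketch of a lost-link monitor in which the vehicle switches to a pre-programmed recovery mode when ground-station heartbeats stop arriving. The timeout value and mode names are assumptions, not FAA-specified values.

```python
# Hypothetical lost-link monitor sketch: if no heartbeat arrives from
# the ground station within a timeout, the vehicle switches to a
# pre-programmed fail-safe action (e.g., loiter, then return to launch).
import time

class LostLinkMonitor:
    def __init__(self, timeout_s=10.0):
        self.timeout_s = timeout_s
        self.last_heartbeat = time.monotonic()
        self.mode = "NORMAL"

    def on_heartbeat(self):
        """Call whenever a valid packet is received from the ground station."""
        self.last_heartbeat = time.monotonic()
        if self.mode == "FAILSAFE":
            self.mode = "NORMAL"          # link recovered

    def check(self):
        """Call periodically from the main control loop."""
        if time.monotonic() - self.last_heartbeat > self.timeout_s:
            self.mode = "FAILSAFE"        # trigger the recovery behavior
        return self.mode
```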

Apart from getting clearance from the FAA and the FCC, some other key issues that need to be addressed for the successful deployment and acceptance of UAVs are:

Physical Layer

The setup requires ground base station locations, such as microwave towers. Issues include bandwidth requirements, channel characteristics, transceiver design, the range from the aerial platform to the ground base stations, and power and fuel consumption.

Communication Properties Issues

The UAV and the base station must have the ability to transmit and receive video, data, and control signals in a reliable and failsafe way. Issues to be considered here are high-bandwidth requirements, asymmetric data communications, integration with ground sensors, potential real-time communications with an incident commander, and distributed data exchange.

Communication Network Layer Issues

Issues such as network configuration and reconfiguration, fixed infrastructure versus ad hoc networks, adaptive quality-of-service, mobility management (location update and handoff), and ground station (tower) location and distribution need to be covered for proper communication between the ground base stations and the UAV.

There are several more issues, such as spectrum allocation (unlicensed versus licensed), data security, and political and public acceptability, which need to be taken care of for the successful deployment of UAVs in civil airspace. Ground crew training and pilot certifications are required to fly the UAV. Various economic factors are also involved, such as the system and lifecycle cost of hardware, software, and data products, and the training and certification of ground crew, analysts, etc. Yet the most important issue remains the safety of flying the UAV in civil airspace; it should not pose a hazard to other aircraft, ground vehicles, people, or facilities.

Many agencies, industries, and universities, along with the FAA, have made efforts to develop alternative regulatory tools for UAV deployment. The DOD has developed and updated its 25-year strategic UAV technology deployment roadmap, which could benefit manufacturers of civilian and commercial UAVs [32]. The ACCESS 5 regulatory UAV road-mapping effort is funded by NASA, DOD, and industry (UNITE), with FAA participation; it focuses on the high-end UAVs used primarily by DOD. Several voluntary standards and professional associations (ASTM, RTCA, AIAA, ICAO) have formed UAV standards committees to develop appropriate UAV safe operability standards for the FAA.

4. Existing Systems and Current Research Work

Several types of aerial surveys have been used or tested to collect data related to traffic management. The method of using fixed-wing aircraft to collect congestion and traffic information was in use as early as 1965 by a transportation consultant in Maryland. Researchers from the University of Karlsruhe in Germany examined the matching of vehicle images from aircraft in 1987. New methods for improving this technology are under development at various universities around the world. Researchers have experimented with fixed-wing aircraft, helicopters, observation balloons, and satellites.

Fixed-wing and rotary-wing vehicles have been used as experimental aircraft at several universities. Bridgewater State College, Geodata Systems, and the MLB Company developed small winged craft with live video feeds and high-resolution still imagery, and examined the suitability of the data for various applications. Iowa State University investigated camera-equipped helium balloons that could be launched at short notice from pickup trucks. This section covers some of the ongoing research work at several universities, such as the University of Florida, Ohio State University, Linkoping University (LiU) in Sweden, Georgia Tech, Stanford, and Carnegie Mellon University.

4.1 University of Florida – Airborne Traffic Surveillance Systems (ATSS)

ATSS is a project initiated by the University of Florida (UFL). The ATSS research team includes the UFL research team, the Florida Department of Transportation (FDOT), Tallahassee Commercial Airport, and the University of North Florida Road Weather Information System (RWIS) research team [1]. FDOT organized a proof-of-concept test and chose UFL as the primary contractor for conducting this project [4].

The primary interest of this project for FDOT is the monitoring of remote and rural areas of the state of Florida. The ATSS proof-of-concept project also aims to evaluate the feasibility of the wireless communication systems, as well as the switching of the video. The project serves as a case study for the use of UAVs in remote sensing and multimodal transportation.

The SRA/Aerosonde was chosen as the UAV vendor. The Aerosonde UAV is a fixed-wing vehicle made in Australia and operated by Aerosonde Pty Ltd (AePL). It can fly for over 32 hours at altitudes between 300 and 20,000 feet above the ground, where it is largely invisible during daylight hours. The Aerosonde employs a Sony XC555 video camera, which captures video of the traffic on the highway, and a pair of Vaisala RSS901 weather sondes to gather freeway surveillance and RWIS data for transmission to the FDOT microwave towers [4]. The data and video are transmitted using a 2.4 GHz wireless link.

Figure 1: The Aerosonde UAV

The proof-of-concept test was intended to show that the UAV can fly for a certain distance collecting traffic information and successfully transmit it to the base stations. A small segment of highway between two of FDOT's microwave towers, at Lake City and White Springs, was chosen. The UAV is expected to capture and transmit video in real time while it flies along the highway. The aim is to investigate the integration of ATSS into FDOT's existing microwave network, Traffic Management Centers (TMCs), and the State Emergency Operations Center (SEOC).

Figure 2: UAV captures video on highway [1].

The base station consists of video encoders, which receive the video from the UAV, encode it, and transfer it to the FDOT network. The two towers transmit different signals with different signal strengths. These signals and data are received by the SEOC. Based on the signal strength and the designated handoff algorithm, the SEOC switches the video signals and displays the highway traffic video received over the stronger signal.

Figure 3: Video Encoding and Recording at the Microwave Tower [1].

Figure 4: Video Decoding, Switching and Display at the SEOC [1].

UFL has developed two software programs, SignalReader and VideoProcessor, for efficient communication and processing of the video signals. SignalReader reads the signal strength reported by the video receiver, uses an internal algorithm to parse the signals into the correct format, decodes them accurately, and transmits the signal strength value over the microwave IP network using TCP client sockets. VideoProcessor receives the video signals from the two microwave towers, encodes them in Windows Media format, and uses an embedded multimedia player to play the streaming video. It also switches between the video signals based on a handoff algorithm built into the program. Simulated tests were performed in December 2003 and January 2004, using the communications equipment, FDOT's microwave IP network, and the UFL-developed software, to demonstrate the feasibility of the project. Another simulated test was performed on site in April 2004, with the UFL research team testing the equipment and software at Lake City, White Springs, and the SEOC. These tests demonstrated that the ATSS setup is capable of supporting ground communication between the towers and the SEOC.
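The report does not describe the handoff algorithm inside VideoProcessor, so the following is only a hedged sketch of the kind of signal-strength-based switching it could perform, using a hysteresis margin to avoid rapid flip-flopping between the Lake City and White Springs feeds. The class name, tower labels, and margin are assumed for illustration.

```python
# Hypothetical signal-strength handoff with hysteresis between two
# video feeds. The actual UFL handoff algorithm is not documented here;
# the 3 dB margin and initial source are assumed values.

class VideoHandoff:
    def __init__(self, hysteresis_db=3.0):
        self.hysteresis_db = hysteresis_db
        self.active_tower = "LakeCity"        # assumed initial source

    def update(self, rssi_lake_city, rssi_white_springs):
        """Pick the video source given the latest signal strengths (dBm)."""
        if self.active_tower == "LakeCity":
            if rssi_white_springs > rssi_lake_city + self.hysteresis_db:
                self.active_tower = "WhiteSprings"
        else:
            if rssi_lake_city > rssi_white_springs + self.hysteresis_db:
                self.active_tower = "LakeCity"
        return self.active_tower


if __name__ == "__main__":
    handoff = VideoHandoff()
    for lc, ws in [(-60, -70), (-65, -64), (-72, -63)]:
        print(handoff.update(lc, ws))
```

The hysteresis term is the key design choice: without it, two nearly equal signal strengths would cause the displayed feed to switch on every measurement.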

4.2 WITAS Unmanned Aerial Vehicle Project

The Wallenberg Laboratory for Information Technology and Autonomous Systems (WITAS) is conducting a long-term basic research project on Unmanned Aerial Vehicles at Linkoping University (LiU), Sweden [17]. The project is multi-disciplinary and carried out in cooperation with a number of universities in Europe, the USA, and South America. The goal of this project is to develop the technologies and functionalities necessary for the successful deployment of a fully autonomous UAV operating over diverse geographical terrain containing road and traffic networks. It involves the integration of autonomy with an active vision system, consisting of digital video and IR cameras, and a ground control dialogue system.

The UAV is intended to navigate autonomously at different altitudes, plan for mission goals such as locating, identifying, tracking and monitoring different vehicle types, and construct internal representations of its focus of attention for use in achieving its mission goals [17], [20]. The project also aims for identifying complex patterns of behavior such as vehicle overtaking, traversing of intersections, parking lot activities, etc. The main goals of this ongoing research project are:

- Development of reliable software and hardware architectures with both deliberative and reactive components for autonomous control of UAV platforms;
- Development of sensory platforms and sensory interpretation techniques, with an emphasis on active vision systems, to deal with real-time constraints in processing sensory data;
- Development of efficient inferencing and algorithmic techniques to access geographic, spatial and temporal information of both a dynamic and static character associated with the operational environment;
- Development of planning, prediction and chronicle recognition techniques to guide the UAV and predict and act upon behaviors of vehicles on the ground; and
- Development of simulation, specification and verification techniques and modeling tools specific to the complex environments and functionalities associated with the project.

WITAS uses a generic UAV setup consisting of an air vehicle with a still or video camera, a tactical control station with one or more humans in the loop, and a data link between the station and the air vehicle used for downloading images and data and for uploading navigation and camera control commands. WITAS is currently collaborating with Scandicraft Systems, a university spin-off company that develops autonomous mini-helicopters [17]. The Apid Mk III has a payload of 20 kg including fuel. WITAS is also considering using Yamaha RMAX Aero Robots, which have a payload of around 30 kg, allowing for an extra camera housing and on-board system.

Figure 5: The Scandicraft Apid Mk III UAV.

The WITAS project is divided into four stages. The first stage involves the collection of a library of video sequences of various vehicle scenarios and traffic patterns. In stage two, a mathematical model of the helicopter platform is derived, which is used as the basis for experimentation and the development of robust fuzzy controllers for the platform. The project is currently at the end of stage two and the beginning of stage three. Stage three includes the basic development of the on-board system, which will initially be used from the ground to control the Scandicraft platform. The input to the ground system consists of helicopter state and sensor information in addition to analogue video received via a radio link; the output to the helicopter platform consists of flight control and camera control commands. The fourth and final stage will integrate the system developed in stage three on-board the platform, where both semi- and fully autonomous experimentation will ensue.

The project uses an Intelligent Vehicle Control Architecture (IVCA), a multi-layered hybrid deliberative/reactive software architecture. The architecture contains two main information repositories, the Knowledge Structure Repository (KSR) and the Geographic Data Repository (GDR). The deliberative and reactive layers of the architecture communicate directly with the core vision system. The vision system tries to determine the position, velocity, color, and type of the vehicle or vehicles in the foci of attention. This involves accurately determining the position of the UAV and the camera angles, mapping positions in image coordinates to geographical coordinates, anchoring identified objects into qualitative descriptions of road segments, estimating absolute and relative motions of objects, and indexing or matching the view from the camera with the information in the GDR so as to derive additional information about a situation, or to generate additional constraints to assist the operations carried out in the vision system.
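The mapping from image coordinates to geographical coordinates mentioned above can be illustrated with a generic flat-ground pinhole construction: intersect the viewing ray of a pixel with the ground plane, given the UAV position and camera attitude. This is a textbook sketch under assumed camera parameters and coordinate conventions, not the actual WITAS vision pipeline.

```python
# Flat-ground geolocation sketch: project an image pixel to local ground
# coordinates given UAV position and camera attitude. Focal length,
# principal point, angles and frames are illustrative assumptions.
import numpy as np

def rotation_matrix(roll, pitch, yaw):
    """Camera-to-world rotation from Euler angles (radians), Z up."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def pixel_to_ground(u, v, cam_pos, roll, pitch, yaw,
                    f_px=800.0, cx=320.0, cy=240.0):
    """Intersect the pixel's viewing ray with the flat ground plane z = 0."""
    # Viewing ray in the camera frame (camera looks along -z when level).
    d_cam = np.array([(u - cx) / f_px, (v - cy) / f_px, -1.0])
    d_world = rotation_matrix(roll, pitch, yaw) @ d_cam
    if d_world[2] >= 0:
        return None                     # ray does not hit the ground
    s = -cam_pos[2] / d_world[2]        # scale factor to reach z = 0
    return cam_pos + s * d_world

if __name__ == "__main__":
    uav = np.array([0.0, 0.0, 150.0])   # 150 m above the local origin
    print(pixel_to_ground(400, 300, uav, roll=0.0, pitch=0.1, yaw=0.0))
```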

A model-based distributed simulation environment was developed to support the design and evaluation of the software architectures and helicopter controllers. A set of scenarios was devised to test various functionalities of the architecture. For the purpose of generating a realistic simulation environment, all the data is collected using manned helicopters and post-processed off-line.

Figure 6: Virtual Simulation: Traffic/Tunnel Scenario; Pseudo-Virtual Simulation over Stockholm.

4.3 Ohio State University

This research is carried out by an Ohio Department of Transportation research consortium led by Ohio State University (OSU) [5]. The project is pioneered by the National Consortium on Remote Sensing in Transportation (NCRST). The UAV used for experimentation by OSU is based on the BAT III technology provided by the MLB Company; it carries a payload of two video cameras and can fly at an altitude of 500 ft at an air speed of 30 mph.

Figure 7: MLB BAT 3 Technology.

The BAT technology acquires vehicle information using video and sensors such as GPS, and communicates with the base station over a 2.4 GHz data link.

Figure 8: BAT Technology used in OSU research.

The field experiments were conducted in July 2003 in Columbus, OH, on different freeway scenarios, collecting information on freeway conditions, intersection movements, network paths, and parking lot monitoring. The UAV flew at an altitude of 500 ft and an air speed of 30 mph while transmitting the video images collected by its on-board camera to the ground station in real time. The UAV flew over a freeway for the purpose of observing flows, speeds, densities, off-ramp weaving, and vehicle trajectories. It also observed the flows, turning movements, and queue lengths at intersections while gathering information on a network consisting of seven intersections [5]. Information collected by such flights can be useful in assessing and predicting network conditions, which can be used by the state DOT to control real-time signal timing depending on link speeds, link densities, and queue lengths. In the final scenario, the UAV made a tour of surface parking lots to assess their utilization. The information gathered from such a scenario can be helpful in space planning and distribution, and can also provide quasi-real-time information to travelers.

Figure 9: Views of the SR 315 freeway interchange with Lane Ave captured in real-time from a UAV. (Left): Wide angle view looking south while flying along the freeway (Right): telephoto view looking south while flying along the freeway.

Figure 10: An example of circling a facility with the UAV, showing queue lengths and turning movements.

Figure 11: An example of (Left): circling a network with the UAV, (Right): utilization of three parking lots.

The team at Ohio State University is now focusing on learning, discovering, and developing potential benefits of UAV applications to transportation surveillance, and on quantifying the value of those potential benefits. The field experiment provides a strong indication that the application of UAV technology to surface transportation surveillance is viable and potentially beneficial. It was observed during these experiments that the UAV followed its pre-programmed flight plan, covering the locations of interest accurately. However, various refinements are yet to be made based on the observations from the experiments. Better resolution is required to identify distinguishing characteristics of individual vehicles, and radio interference was observed beyond a distance of 1 mile, corrupting the images. Such problems need to be addressed by utilizing a dedicated communication channel.

4.4 Georgia Tech’s Traffic Surveillance Drone

The Traffic Surveillance Drone is a project funded by the Georgia Department of Transportation and the Federal Highway Administration's Priority Technology Program. It is being developed at the Georgia Tech Research Institute's (GTRI) Advanced Vehicle Development and Integration Laboratory. The focus of this research is the development of a generic VTOL UAV test-bed that may be used to flight-test other research projects such as advanced controllers, fault-tolerance algorithms, and autonomous operation algorithms. The drone is being designed and tested with affordability and safety in mind, making it attractive to law enforcement agencies, emergency search and rescue teams, and highway departments.

Figure 12: Traffic Surveillance Drone developed at Georgia Tech.

A militarized version of the drone, known as Dragon Stalker, has capabilities that make it attractive for low-intensity conflicts and urban warfare. Both versions of the drone are VTOL vehicles and hence do not need much space for takeoff and landing.

The drone will be able to relay live video and two-way audio from the site of traffic incidents back into the state's Advanced Traffic Management System (ATMS). The images would be relayed from 5 to 10 miles away via a spread spectrum link. The initial effort involves the development of a working prototype to demonstrate traffic data collection capabilities; it will be capable of 30 minutes of flight at a maximum speed of 30 knots. The projection is that later versions of this VTOL vehicle will be fully autonomous. The team is focusing on developing software-enabled control methods for complex dynamic systems, with an application focus on intelligent UAVs. Further plans include developing intelligent, agent-based mission planning algorithms in order to achieve dynamic performance and flight control command generation under various aircraft dynamics and environmental constraints.

4.5 University of California, Berkeley: Ultimate Auto-Pilot

The University of California at Berkeley is building intelligent guidance systems for UAVs, which may be used for monitoring traffic conditions, collecting data from environmental sensors, etc. The project is sponsored by the Office of Naval Research’s (ONR) Autonomous Intelligent Network and Systems (AINS) program.

The first goal of this project is to develop path-planning strategies for a UAV to track a ground vehicle. An algorithm based on a waypoint strategy was created. The computer vision system detects natural features of the scene and tracks the roadway in order to determine the relative yaw and lateral displacement between the aircraft and the road. The UAV flies in a sinusoidal manner at a constant velocity while tracking the ground vehicle, which has varying speed. If the ground vehicle is not moving, or its speed is under a selected threshold, the UAV starts to follow a circular path or rose curve trajectory. The effect of wind disturbances is taken into account to offset the planned UAV trajectory. The path-planning algorithm has been developed, tested, and debugged using the "controller development platform" [12].
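To make the tracking behavior concrete, the sketch below generates waypoints that weave sinusoidally about the target's direction of travel while it is moving and fall back to a circular orbit when its speed drops below a threshold. The amplitudes, radii, speeds, and the threshold are assumed values for illustration, not parameters of the Berkeley planner.

```python
# Illustrative waypoint generator for target tracking: sinusoidal
# weaving about a moving target's heading, circular orbit when the
# target is (nearly) stopped. All parameter values are assumptions.
import math

def next_waypoint(t, target_pos, target_heading, target_speed,
                  speed_threshold=2.0, amplitude=60.0, wavelength=200.0,
                  orbit_radius=80.0, uav_speed=20.0):
    """Return an (x, y) waypoint for time t (seconds)."""
    tx, ty = target_pos
    if target_speed < speed_threshold:
        # Target is (nearly) stopped: orbit it on a circle.
        omega = uav_speed / orbit_radius
        return (tx + orbit_radius * math.cos(omega * t),
                ty + orbit_radius * math.sin(omega * t))
    # Target is moving: weave sinusoidally about its direction of travel.
    along = (uav_speed * t) % wavelength
    offset = amplitude * math.sin(2.0 * math.pi * along / wavelength)
    hx, hy = math.cos(target_heading), math.sin(target_heading)
    px, py = -hy, hx                    # unit vector perpendicular to heading
    return (tx + along * hx + offset * px,
            ty + along * hy + offset * py)

if __name__ == "__main__":
    print(next_waypoint(5.0, (100.0, 50.0), math.pi / 4, 15.0))  # weaving
    print(next_waypoint(5.0, (100.0, 50.0), math.pi / 4, 0.0))   # orbiting
```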

Figure 13: Experimental autonomous aircraft: Sig Rascal radio-controlled airplane

It was observed that GPS alone is not sufficient for most surveillance applications. The Berkeley team's approach is to augment GPS with machine vision software and an off-the-shelf video camera [10]. Another approach was taken by the UC Berkeley based Partners for Advanced Transit and Highways (PATH), which developed software that distinguishes the road from the surrounding area based on differences in contrast.
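PATH's contrast-based road detection software is not described in detail in the sources cited here; the fragment below is only a crude illustration of the underlying idea, in which an image patch assumed to contain road is sampled and pixels of similar intensity are marked as road. The seed region, tolerance, and synthetic test frame are all assumptions.

```python
# Crude contrast-based road segmentation sketch: sample the intensity
# statistics of a patch assumed to be road and mark similar pixels.
import numpy as np

def segment_road(gray, seed_box=(0.4, 0.6, 0.7, 0.9), k=2.0):
    """gray: 2-D uint8 array. seed_box: fractional (x0, x1, y0, y1) region
    assumed to contain road pixels. Returns a boolean road mask."""
    h, w = gray.shape
    x0, x1, y0, y1 = seed_box
    seed = gray[int(y0 * h):int(y1 * h), int(x0 * w):int(x1 * w)].astype(float)
    mu, sigma = seed.mean(), seed.std() + 1e-6
    return np.abs(gray.astype(float) - mu) < k * sigma

if __name__ == "__main__":
    # Synthetic frame: brighter "road" stripe on darker textured ground.
    img = np.random.normal(60, 10, (240, 320)).clip(0, 255)
    img[:, 120:200] = np.random.normal(140, 5, (240, 80))
    mask = segment_road(img.astype(np.uint8))
    print("road fraction: %.2f" % mask.mean())
```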

The fundamental research thrust is to make UAVs capable of self-directed navigation, and to develop a higher layer of coordinated team control that autonomously integrates teams of self-navigating platforms to execute operations such as rapid real-time mapping, continuous distributed surveillance, and on-demand search and rescue.

Figure 14: Picture of video taken from a camera on the bottom of the UAV. Image taken from [9].

Berkeley's UAV research is also focused on the "canyon problem": a UAV monitoring traffic may find itself between buildings lining the road it is tracking. Algorithms need to be developed for the safe and efficient operation of the UAV in such scenarios.

Another research effort at UC Berkeley, the Berkeley Aerobot Project (BEAR), focuses on the development of intelligent control architectures for UAVs. The issues under consideration are multi-agent, multi-modal control and visual servoing.

Figure 15: Berkeley's BEAR project.

4.6 European Commission’s COMETS Project

COMETS is a research project involving LAAS (Laboratoire d’Architecture et d’Analyse des Systemes), CNRS (the French National Scientific Research Centre); the Real-Time Systems and Robotics group (Prozessdatenverarbeitung und Robotik – PDV) of the Technische Universitat Berlin; ADAI (Associação para o Desenvolvimento da Aerodinamica Industrial – Association for the Development of Industrial Aerodynamics); CVL (Computer Vision Laboratory) at Linkoping University; and HELIV (Helivision). The main objective of COMETS is to design and implement a distributed control system for cooperative detection and monitoring using heterogeneous UAVs.

Both fixed-wing and rotary-wing UAVs are projected to be used. The three vehicles involved in the COMETS project are the MARVIN autonomous helicopter (TUB), the Karma blimp (LAAS), and a remotely piloted helicopter (HELIV). MARVIN (Multi-purpose Aerial Robot Vehicle with Intelligent Navigation) is a helicopter with a maximum payload of 5 kg [34]. The Karma blimp is an airship with speeds up to 45 km/h, an endurance of 40 minutes, and a maximum payload of 5 kg.

Figure 16: UAVs used in the COMETS project.

The mission goals involve the development of new control architectures and control techniques for real-time coordination and control. Further, the project will focus on integrating distributed sensing techniques and real-time image processing capabilities, as well as on the autonomous control of the helicopters (rotary-wing) and the airship.

The operational environment is an area with widely varying geographical terrain containing traffic networks in city, suburban, and rural areas. The project places emphasis on cooperative environment perception: detection and monitoring perception tools, and cooperative terrain mapping.

The objectives of the UAV in the COMETS project are:

- To monitor the traffic situation;
- To identify and track individual vehicles;
- To identify episodic behavior of both individual vehicles and groups of vehicles;
- To gather data pertaining to road network use and abuse;
- To provide assistance to emergency services; and
- To serve as a mobile sensory platform with real-time information gathering and processing capabilities.

Figure 17: UAV being used to survey traffic trends

Some of the objectives of this research include: management; specifications; architecture (design, development, communication, and interaction and cooperation paradigms); central real-time coordination and control (mission planning, mission monitoring and control, a real-time simulator, and tele-operation tools); distributed reliable autonomous real-time control (helicopter upgrading and integration, fault detection and reliability tools, UAV control methods); cooperative environment perception (detection and monitoring perception tools, cooperative terrain mapping); testing and validation; field experimentation and demonstration (preparation and coordination, field experiments, post-experiment analysis); and dissemination and exploitation.

In the COMETS architecture, each UAV in the flying segment is endowed with: a) its Onboard Proprietary Components, which gather the various functions specific to the UAV (flight control, data acquisition, and possibly data processing); b) a Generic Supervisor, which interfaces the UAV with the other COMETS sub-systems (the ground segment and other UAVs) and controls its activities; and c) a deliberative layer, which provides autonomous decisional capabilities to the UAV. Communications in COMETS are realized via a distributed shared memory: the Blackboard Communication System.
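To clarify the blackboard idea, the following is a minimal single-process sketch of the pattern: producers post keyed entries to a shared store and consumers subscribe to the keys they care about. The real Blackboard Communication System is distributed across the flying and ground segments; the class and key names here are purely illustrative.

```python
# Minimal single-process sketch of the blackboard pattern: a shared,
# thread-safe store with publish/subscribe semantics. Names are
# illustrative, not the actual COMETS interfaces.
import threading
from collections import defaultdict

class Blackboard:
    def __init__(self):
        self._lock = threading.Lock()
        self._data = {}
        self._subscribers = defaultdict(list)

    def post(self, key, value):
        """Write an entry and notify subscribers of that key."""
        with self._lock:
            self._data[key] = value
            callbacks = list(self._subscribers[key])
        for cb in callbacks:
            cb(key, value)

    def read(self, key, default=None):
        with self._lock:
            return self._data.get(key, default)

    def subscribe(self, key, callback):
        with self._lock:
            self._subscribers[key].append(callback)


if __name__ == "__main__":
    bb = Blackboard()
    bb.subscribe("uav1/detection",
                 lambda k, v: print("ground segment received:", v))
    bb.post("uav1/detection", {"type": "car", "lat": 40.1, "lon": -8.2})
```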

Figure 18: Communication between ground station and aircraft [33].

The first general experiments for the project took place in May 2003 in Lousa, Portugal, one year after the project began. Individual UAVs were tested, and coordination experiments were carried out. The results were reported to be very satisfactory, as they provided a great deal of information to guide the research and development in the second year of the project.

4.7 Bridgewater State College/ UMass-Boston

The project team consists of the U.S. Department of Transportation's (USDOT) Research and Special Programs Administration, the National Aeronautics and Space Administration, Bridgewater State College in Bridgewater, MA, the University of Massachusetts at Boston (UMass-Boston), and the MLB Company of Palo Alto, CA. The aim of this project was to develop an autonomous UAV to collect and interpret real-time traffic data.

In August 2002, USDOT launched the UAV over Boston, MA, where it provided real-time video to a ground station on the UMass-Boston campus. An MLB Bat 3 UAV was used for this experiment. Using remote-sensing technologies, the UAV performed an aerial survey, documented new land use, provided a clear picture of a central geographic point, and gathered multimodal data using its road-following capabilities [30]. The UAV also produced real-time imagery of a commuter train and of traffic conflicts on a major arterial roadway.

Preliminary findings show that remote-sensing images from UAVs could provide an unprecedented level of detailed information to traffic and transit managers and transportation researchers, especially if used with automatic vehicle location systems and other flow-monitoring technologies.

4.8 Airborne Data Acquisition System (ADAS) for Traffic Surveillance

A National Consortium on Remote Sensing in Transportation (NCRST) Technology Application Partner is making use of an unmanned ADAS for traffic surveillance, monitoring, and management. ADAS comprises an aircraft, a sensor pod, and a ground station. The aircraft can fly for more than two hours with a sensor payload of 20 lbs.

A demonstration of capabilities was performed in cooperation with the Virginia Department of Transportation in December 2001 and April 2002 [31]. The goal was to demonstrate the feasibility of ADAS for traffic surveillance. A section of I-64 in Tidewater, Virginia, was chosen as the test-bed. The imagery demonstrated the use of UAVs for real-time traffic surveillance and for monitoring traffic incidents, signals, and environmental conditions of roadside areas.

Figure 19: ADAS for Traffic Surveillance [31].

4.9 Stanford University’s Hummingbird Project

The Hummingbird Project at the Aerospace Robotics Laboratory (ARL) is building its own helicopter, the Hummingbird, which can perform desired tasks thanks to its advanced capabilities and great flexibility. The ultimate goal of this research is to demonstrate the practicality of using inexpensive robot helicopters to perform tasks without the need for highly trained human operators. Initial work in this project focused on demonstrating the feasibility of using Carrier-Phase Differential GPS (CDGPS) as a sensor system for attitude and position control as well as navigation.

The Hummingbird is a small autonomous helicopter built by a team at the ARL. Navigational sensing is provided entirely by a pair of Trimble Global Positioning System receivers using differential carrier phase calculations. Four antennas are used to sense attitude as well as position with GPS. An additional sensor gathers information about the environment, which helps in object location, identification, and retrieval.

Precision flight was experimentally demonstrated by performing autonomous hover, autonomous retrieval of a ferromagnetic disk using a magnet-tether manipulator, and autonomous landing tasks.

Figure 20: Hummingbird Helicopter in autonomous flight.

4.10 Carnegie Mellon University’s Autonomous Helicopter Project

The main goal of this project is to develop a vision-based robot helicopter that can autonomously carry out a well-structured set of mission goals in any weather conditions using only on-board intelligence and computing power. The mission goals include automatic start and takeoff; flying to a designated area on a prescribed path while avoiding obstacles; searching for and locating an object of interest; visually locking onto and tracking the object; sending back images to the ground station while tracking the object; and safely returning home and landing.

One of the proposed applications of this helicopter is law enforcement. Vision-based robot helicopters can fly overhead to aid police in dangerous high-speed chases or criminal search operations. Another function is to relay images from trouble spots for timely assessment of the situation by human experts, who can then dispatch police units to the area.
