Equipment and Capabilities

The ASIM lab is equipped with state-of-the-art equipment to conduct research in multi-agent robotics, autonomous and driverless vehicles, advanced driver assistance systems (ADAS), neuro-engineering and brain signal processing, and driver monitoring. We use three safe testing and evaluation paradigms for algorithm development and evaluation:

1. Scaled-down emulation of a smart city with mobile robots

2. Driving simulator

3. Autonomous vehicle platform

Autonomous and Tele-Operated Mobile Robots

In addition to computational modeling and simulation, the lab is equipped with a number of robots capable of tele-operation and autonomous motion. Mobile robots with onboard processors are augmented with a host of sensors and wireless communication equipment to operate independently and in coordinated assemblies.

ASIMcar

ASIMcar is a small-scale platform for Connected Autonomous Vehicle (CAV) research that outperforms currently available commercial options in several important benchmarks. The platform is built around a radio-control (RC) car with high-performance brushless DC motors that allow the vehicle to reach a maximum speed of 70 mph, expanding the possibilities for higher-speed research applications. The platform also carries a robust sensor suite and a state-of-the-art embedded GPU for onboard computation, allowing real-time control across a wide range of challenging operations. For demonstration and comparison, lane keeping was implemented and evaluated on the platform as an Advanced Driver Assistance System (ADAS) function. Other commercially available mobile robots are expensive, offer limited capabilities, are harder to modify for various research needs, are more difficult to interface with other robots (cars), and may include proprietary software/hardware that renders them less flexible and less adaptable to specific research needs. ASIMcar overcomes these limitations and provides a highly flexible, cost-effective alternative for automated/autonomous and connected vehicle research and development.
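To give a flavor of the kind of lane-keeping computation involved (this is an illustrative sketch, not the controller reported in our paper), a Stanley-style steering law combines the vehicle's heading error with a speed-scaled correction for its lateral offset from the lane center; the gain and steering limit below are assumed values:

```python
import math

def stanley_steering(heading_err: float, cross_track_err: float,
                     speed: float, k: float = 1.0,
                     max_steer: float = 0.5) -> float:
    """Stanley-style lane-keeping law: steer to cancel the heading error
    plus a speed-scaled correction for the lateral (cross-track) offset.
    The gain k and steering limit are illustrative, not ASIMcar's tuning."""
    steer = heading_err + math.atan2(k * cross_track_err, speed)
    return max(-max_steer, min(max_steer, steer))  # respect servo limits

# Example: 0.1 rad heading error, 0.5 m left of lane center, at 2 m/s
delta = stanley_steering(0.1, 0.5, 2.0)
```

Dividing the cross-track term by speed makes the correction gentler at higher speeds, which matters on a platform capable of 70 mph.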

You can learn more about ASIMcar in our published paper.

Smart City Scaled-Down Emulation

A small emulated track space was created in an indoor lab to allow testing of mobility functions in a communication-enabled environment. Continuous tracks, intersections, building blocks, and road features are reproduced in a scaled-down model space in which the mobile robots can be tested. Lane markings and scaled-size traffic control devices (stop signs, etc.) are installed for the robots' (vehicles') vision processing. Although the robots are equipped with outdoor GPS, an indoor GPS-like function is required for localization; an overhead camera combined with a digitized track map provides it. Proximity sensors emulate traffic loop detectors to detect the presence of robots at intersections and estimate their speeds. Robots communicate with each other and, when required, with a central station.
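The overhead-camera GPS-like function can be sketched as a planar homography that maps camera pixels onto the digitized track map. The matrix values and pixel coordinates below are illustrative assumptions; in practice the homography would be calibrated from several known reference points on the track:

```python
import numpy as np

# Hypothetical homography H mapping overhead-camera pixels to track-map
# coordinates in meters (~1 cm per pixel with an origin offset). A real H
# would be fit from four or more surveyed reference points on the track.
H = np.array([
    [0.01, 0.0,  -3.2],
    [0.0,  0.01, -2.4],
    [0.0,  0.0,   1.0],
])

def pixel_to_track(u: float, v: float, H: np.ndarray) -> tuple[float, float]:
    """Project an image pixel (u, v) to track-map coordinates (x, y)."""
    p = H @ np.array([u, v, 1.0])
    return (p[0] / p[2], p[1] / p[2])  # normalize homogeneous coordinates

# A robot detected at pixel (640, 360) maps to a position on the track map.
x, y = pixel_to_track(640, 360, H)
```

Because the track plane is flat and the camera is fixed, a single calibrated homography suffices to localize every robot in view, which is what makes the overhead camera a workable indoor GPS substitute.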

Several other features complete the scaled-down testing environment. The robots' vision systems can maintain lane keeping or road following using trajectory plans. Robots can be tested for tele-operation and autonomous driving using vision, GPS, and other sensors. Functions such as lane-change maneuvers, collision avoidance, car-following, platooning, intersection control, and numerous other connected vehicle functions can be tested in this environment. We continue to build out and enhance this environment as project needs arise.
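As a minimal sketch of the car-following and platooning behavior exercised on the track (the gains, time gap, and simulation parameters here are illustrative assumptions, not the lab's actual controller), a constant-time-gap law drives the inter-vehicle gap toward a speed-dependent target while matching the lead robot's speed:

```python
def time_gap_accel(gap, ego_speed, lead_speed,
                   h=1.0, standstill=0.3, kp=0.5, kv=0.8):
    """Constant-time-gap car-following law (illustrative gains):
    drive the gap toward standstill + h * ego_speed while matching
    the lead robot's speed."""
    desired_gap = standstill + h * ego_speed
    return kp * (gap - desired_gap) + kv * (lead_speed - ego_speed)

# Two-robot platoon: lead cruises at 1 m/s, follower starts stopped
# 2 m behind; simple Euler integration of the follower's motion.
dt, gap, ego_speed, lead_speed = 0.05, 2.0, 0.0, 1.0
for _ in range(600):  # 30 s of simulated time
    a = time_gap_accel(gap, ego_speed, lead_speed)
    ego_speed += a * dt
    gap += (lead_speed - ego_speed) * dt

# The follower settles at the lead's speed with the 1.3 m desired gap.
```

The same law chains down a string of robots, each following its predecessor, which is how a multi-vehicle platoon can be emulated on the scaled track.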

Driving Simulators

Full Cabin Driving Simulator

A full-cabin driving simulator (developed by Realtime Technologies) simulates driver-controlled and autonomous driving in a smart traffic environment, with several other advanced simulation capabilities. The simulator offers a fully immersive full-cabin environment with realistic control features; full vehicle dynamics with adjustable parameters; curved projection with a 180-degree driver field of view; day, night, rain, snow, and fog driving environments; an extensive library of roads, infrastructure, and traffic control devices; and advanced data acquisition systems. A unique feature is the ability for multiple drivers to drive within the same scenario from different networked simulators.

Desktop Simulator (Networked)

A desktop simulator running the same software is networked with the full-cabin simulator to allow simultaneous driver-in-the-loop or autonomous driving on the same road/infrastructure scenarios. Multiple drivers can drive within the same scenario, enabling extensive testing of ADAS functions in a safe environment.

Driver (Human) Monitoring Systems

The lab is fully equipped with human (driver) monitoring systems, such as eye-tracking and EEG brain-signal sensors, for a variety of perception-response, controls, and ADAS experiments. In collaboration with other departmental and College of Engineering labs, we have access to an extensive and comprehensive set of physiological monitoring systems.

Autonomous Vehicle

In collaboration with other ME colleagues and laboratories, an autonomous vehicle has been developed based on Virginia Tech's earlier vehicle platform (Victor Tango), which placed third nationally in the DARPA Urban Challenge in November 2007. The vehicle has been updated with new sensors and adaptable, up-to-date capabilities to support experimentation with various sensory platforms, sensor fusion methods, decision algorithms, and control systems for driverless and connected vehicle research and testing.