Category: Autonomous Driving

Path planning using the OSM XML map

One of the main objectives of the AVP project is to create maps of car parks. Parkopedia is committed to working with our Open Source partners through the Autoware Foundation and has therefore released three maps of car parks to the community under the Creative Commons BY-NC-SA 4.0 licence.

The maps are designed to be machine readable and are supplied in the OpenStreetMap XML format. This format is widely used and forms the basis for the OpenStreetMap maps that anyone can contribute to using tools such as the Java OpenStreetMap editor (JOSM).

Our maps are designed to be useful for Automated Driving, which is why we’ve decided to use the Lanelet2 library as the data model for maps within the Autonomous Valet Parking prototype vehicle.
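For illustration, a lanelet in this format is stored as an ordinary OSM relation whose members are the left and right boundary ways. The fragment below is a minimal hand-written sketch (the coordinates and IDs are made up, not taken from the released maps):

```xml
<osm version="0.6">
  <node id="1" lat="37.38236" lon="-121.90916"/>
  <node id="2" lat="37.38240" lon="-121.90910"/>
  <node id="3" lat="37.38237" lon="-121.90918"/>
  <node id="4" lat="37.38241" lon="-121.90912"/>
  <way id="10"> <!-- left boundary -->
    <nd ref="1"/> <nd ref="2"/>
    <tag k="type" v="line_thin"/>
  </way>
  <way id="11"> <!-- right boundary -->
    <nd ref="3"/> <nd ref="4"/>
    <tag k="type" v="line_thin"/>
  </way>
  <relation id="100">
    <member type="way" ref="10" role="left"/>
    <member type="way" ref="11" role="right"/>
    <tag k="type" v="lanelet"/>
    <tag k="subtype" v="road"/>
  </relation>
</osm>
```

Because lanelets are plain relations, they show up in JOSM like any other OSM data and can be inspected and edited there.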

You can download the maps here, and the following code can be used to plan a path using the Lanelet2 library.

# libs
import os
import lanelet2

# load the map, for example the AutonomouStuff car park
osm_path = os.path.join(os.path.dirname(os.path.abspath('')), "AutonomouStuff_20191119_134123.osm")
print("using OSM: %s (exists? %s)" % (osm_path, os.path.exists(osm_path)))

# load the map relative to an origin (latitude, longitude, altitude)
lorigin = lanelet2.io.Origin(37.3823636, -121.9091568, 0.0)
lmap = lanelet2.io.load(osm_path, lorigin)

# ... and traffic rules (Germany is the only supported location, for now)
trafficRules = lanelet2.traffic_rules.create(lanelet2.traffic_rules.Locations.Germany,
                                             lanelet2.traffic_rules.Participants.Vehicle)

# create the routing graph, then select start and end lanelets for the shortest path
graph = lanelet2.routing.RoutingGraph(lmap, trafficRules)
startLane = lmap.laneletLayer[2797]  # lanelet IDs
endLane = lmap.laneletLayer[2938]

sp = None
rt = graph.getRoute(startLane, endLane)
if rt is None:
    print("error: no route was calculated")
else:
    sp = rt.shortestPath()
    if sp is None:
        print("error: no shortest path was calculated")
    else:
        print([llet.id for llet in sp.getRemainingLane(startLane)])

# save the path in another OSM map with a special tag to highlight it
if sp is not None:
    for llet in sp.getRemainingLane(startLane):
        lmap.laneletLayer[llet.id].attributes["shortestPath"] = "True"
    projector = lanelet2.projection.MercatorProjector(lorigin)
    sp_path = os.path.join(os.path.dirname(osm_path),
                           os.path.basename(osm_path).split(".")[0] + "_shortestpath.osm")
    lanelet2.io.write(sp_path, lmap, projector)

# now display both maps in JOSM to see the generated path over the original map
# Ctrl+F -->  type:relation "type"="lanelet" "shortestPath"="True"
# and the path will be highlighted as in the image below

Happy path planning!

Halfway and a successful public demo!

The Autonomous Valet Parking project is a 30-month project funded by InnovateUK and the Centre for Connected and Autonomous Vehicles, due to end on 31 October 2020. We are five quarters in and a lot has been achieved:

  • We have created our first maps of car parks which we are trialling with customers,
  • The vision-based localisation algorithms are well advanced,
  • The stakeholder engagement work is complete,
  • The autonomous software to power our StreetDrone is ready to take out for a demo,
  • The safety case has ensured that we are in a position to demo this safely. 

At the CENEX-CAM event that took place at Millbrook Proving Ground, UK, on 4-5 September 2019, we showed the progress made at this halfway point of the project. The main objective was to demonstrate that the Autoware-based software on the StreetDrone is able to control the vehicle by following waypoints consistently and accurately. The demonstration scenario consisted of three parts and reflects how we believe AVP will be used in real life. In the demo, you can see:

  1. The vehicle following a pre-defined set of waypoints to the designated parking spot, having been dropped off at a designated drop-off zone by its driver,
  2. The vehicle exiting the parking spot and driving to the pick-up zone (where the vehicle’s regular driver would collect it),
  3. A test of our automatic emergency braking, using the front-centre ultrasonic sensor on the vehicle.

This public demo was an important milestone for us to demonstrate our ability to control the vehicle using a PID controller for longitudinal (throttle) control and pure-pursuit for lateral (steering) control.

Localisation is done using the LiDAR with NDT matching. At this stage of the project we’ve limited the speed to 1 m/s; this will double to 2 m/s (about 5 mph) in the future.
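As an illustration of that control scheme, here is a minimal sketch of a PID speed controller paired with pure-pursuit steering for a bicycle model. This is not the project’s actual controller; the gains, wheelbase and waypoint values are purely illustrative:

```python
import math

class PID:
    """Textbook PID controller with a rectangular-rule integral."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def step(self, error, dt):
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

def pure_pursuit_steering(pose, goal, wheelbase):
    """Steering angle that drives a bicycle model towards a look-ahead waypoint.

    pose: (x, y, heading) in the map frame; goal: look-ahead waypoint (x, y).
    """
    x, y, heading = pose
    # transform the look-ahead point into the vehicle frame
    dx, dy = goal[0] - x, goal[1] - y
    lx = math.cos(heading) * dx + math.sin(heading) * dy
    ly = -math.sin(heading) * dx + math.cos(heading) * dy
    ld2 = lx * lx + ly * ly  # squared look-ahead distance
    if ld2 < 1e-6:
        return 0.0  # already at the waypoint
    curvature = 2.0 * ly / ld2
    return math.atan(wheelbase * curvature)

# usage: track the 1 m/s speed limit and steer towards a waypoint ahead-left
speed_pid = PID(kp=0.8, ki=0.1, kd=0.05)
throttle = speed_pid.step(error=1.0 - 0.6, dt=0.1)  # target 1 m/s, measured 0.6 m/s
steer = pure_pursuit_steering(pose=(0.0, 0.0, 0.0), goal=(5.0, 1.0), wheelbase=2.3)
```

The pure-pursuit law steers the vehicle onto a circular arc that passes through the look-ahead point; tuning the look-ahead distance trades responsiveness against oscillation around the path.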

We are using the SAE convention for marking lamps, with green for manual control, blue when under autonomous control and red when an error state occurs. Adding the RGB LED lighting was done alongside the development work to enable switching between forwards and reverse in software while still in autonomous mode.

The safety case for the project combines operational and system safety. On the operational side we have a safety driver who can take over when a dangerous situation presents, and we also have system safety using the LiDAR and ultrasonic sensors, which will bring the vehicle to a stop to avoid driving into a hazard. We demonstrated Automated Emergency Braking using the ultrasonic sensor, following the testing and preparation done previously at Turweston.
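The emergency-braking trigger can be sketched as a simple range check against the vehicle’s stopping distance (assumed logic and parameters for illustration, not the project’s actual safety implementation):

```python
def aeb_command(range_m, speed_mps, decel_mps2=3.0, margin_m=0.5):
    """Return True if the vehicle should brake to a stop.

    Brakes when the measured ultrasonic range is within the kinematic
    stopping distance (v^2 / 2a) plus a fixed safety margin.
    """
    stopping_distance = speed_mps ** 2 / (2.0 * decel_mps2)
    return range_m <= stopping_distance + margin_m

# at the demo speed of 1 m/s, with these assumed values, the trigger
# distance works out to roughly 0.67 m
```

In practice the raw ultrasonic readings would also need filtering for dropouts and spurious echoes before feeding a check like this.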

Overall, the demo (done five times in two days) was well received, and we saw good levels of interest from delegates at the event, with lots of questions being asked about the project. It was a pleasure to speak to media and to delegates.

We’ve learned a lot from our friends at Ordnance Survey and we look forward to hosting them and others at the upcoming Autoware Hackathon. Many thanks to them for all the help and for storing our StreetDrone overnight under their gazebo!

Over the remaining 13 months of the project, we will be working on navigation and localisation using maps, with a final demonstration of the end solution due to take place in Autumn 2020.

Autoware workshop @ Intelligent Vehicles Symposium 2019

IV2019 brings together researchers and practitioners from universities, industry and government agencies worldwide to share and discuss the latest advances in theory and technology related to intelligent vehicles.

The Autoware Foundation is hosting a workshop on Sunday 9th June 2019, with the aim of discussing the current state of development of Autoware.AI and Autoware.Auto, and considering various technical directions that the Foundation is looking to pursue. Parkopedia’s contribution is around maps and specifically, the integration of indoor maps for Autonomous Valet Parking.

Parkopedia’s Angelo Mastroberardino will be presenting our work on maps, answering questions like “Why do we need these maps?”, “How do we represent geometry and road markings within maps?”, and naturally leading towards the question of how we use these maps for path planning within indoor car parks.

Later, Dr Brian Holt will be joining Tier 4, Apex.AI, Open Robotics, TRI-AD, and Intel on a panel to discuss Autoware and its impact on autonomous driving.

Parkopedia joined the Autoware Foundation as a premium founding member because we believe in open source as a force multiplier for building amazing software. We are contributing maps, including one of the AutonomouStuff car park in San Jose, USA, which you can download for use with your own self-driving car in simulation. Find out more.

First Autonomous Test

We successfully completed our first tests at Turweston Aerodrome last week.

Unloading the StreetDrone.ONE

The plan was to check and ensure the robustness of the drive-by-wire system, to train our safety drivers and to do basic path following.

We also took the opportunity to collect some data from the ultrasonic sensors that are on the StreetDrone.ONE which we will use for system safety.

Ultrasonic data collection
Drive-by-wire testing

For testing the drive-by-wire system, we carried out a number of test runs using teleoperation by the driver. We drove the track forwards and backwards, changing steering at various speeds, and the system performed satisfactorily. We also performed a full brake test to work out a safe driving speed and stopping distance in the case of an emergency stop. Further details are presented in this blog post.

Path following with PID feedback control and basic GPS and IMU localisation

On the final day, we tested basic path following to make sure everything worked together. We integrated the drive-by-wire (dbw) system with a path follower, a PID motion controller and basic GPS and IMU localisation in this open-space environment.

We managed to achieve the objective of testing the dbw system with feedback control for path following. However, precision was not as high as we expected: basic IMU and GPS localisation does not give very accurate positioning, and tends to drift or jump around within about 5 m of the true position. To resolve this issue, we are working on better localisation using RTK GPS (such as a simpleRTK2B) with RTK corrections delivered over NTRIP.

Our next test will focus on path following using the high quality localisation and we also hope to start with path planning within the open space. More updates will follow!

© 2019 AVP
