Street-level Laser Point Clouds and Imagery for Smart Parking

Did you know that you can waste a week or more per year hunting for a parking space in busy cities? Nowadays, smart parking apps offer a much-needed answer to the problem – and mobile mapping laser point clouds and imagery are core ingredients of the solution. Besides saving time by guiding you straight to an available spot close to your destination, smart parking apps also reduce unnecessary fuel consumption, air pollution and greenhouse gas emissions.

One such app is AppyParking by London-based AppyWay. The AppyParking app, which is available as a free download for iPhone and Android devices, allows users to see all on-street and off-street parking areas, restrictions and operating hours. Before arrival, motorists can make informed decisions about where to look for a parking place and can find the nearest and cheapest options.

At the very core of such a smart parking app lies an accurate, detailed and up-to-date map. Ideally, the map should not only cover the locations and outlines of the parking zones, but also provide additional information on their characteristics or features. The signs placed at or near the parking spaces are a major source of such features. In the case of AppyParking, the data used to create the map consisted of street-level laser point clouds and imagery. The survey was conducted by Getmapping Plc – a company headquartered in Fleet, Hampshire, UK, that specializes in aerial photogrammetry, mobile mapping, Lidar, digital mapping and web-based services across Europe and Africa.
 

Mobile Mapping System

Getmapping Plc operates multiple Pegasus:Two Ultimate mobile mapping systems (MMSs). This survey was conducted using a Pegasus:Two Ultimate MMS (Figure 1) and a Pegasus:Two upgraded to the Ultimate version. The cameras of the Ultimate have a high dynamic range thanks to a large sensor-to-pixel ratio and a dual-light sensor. This high dynamic range enables crisp images to be captured in a variety of lighting conditions and at various vehicle speeds. The image quality is further improved by the camera sensor resolution of 12MP. Onboard JPEG compression allows massive amounts of imagery to be stored on the removable drive on the spot without compromising image quality; the drive connects seamlessly to any PC or server via a USB 3.0 interface. Compression is a prerequisite for prolonged surveys without interruptions, since the cameras produce imagery with three times the resolution of standard systems. The side cameras capture eight frames per second (FPS) with a field of view of 61 x 47 degrees. At a distance of 10m from the camera, the maximum ground sample distance (GSD) is 3mm. The fish-eye camera system, which consists of two cameras mounted back to back, provides seamless 24MP imagery with a 360-degree field of view. The dual fish-eye camera system is aligned with the laser scanner, enabling colourization of the laser point clouds (Figures 2 and 3).
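
As a rough plausibility check of the quoted figures, the sketch below estimates the pixel footprint of a 12MP side camera at 10m range from the 61 x 47-degree field of view. The 4,000 x 3,000 pixel layout is an assumption (a typical 4:3 arrangement for a 12MP sensor), not a published specification.

    import math

    # Hedged back-of-the-envelope check: pixel footprint of a side camera at 10 m.
    # Assumption: a 12MP sensor laid out as roughly 4000 x 3000 pixels (4:3);
    # the article only states 12MP and a 61 x 47 degree field of view.
    fov_h_deg, fov_v_deg = 61.0, 47.0
    pixels_h, pixels_v = 4000, 3000
    range_m = 10.0

    # Width and height of the camera footprint on a plane 10 m away.
    width_m = 2 * range_m * math.tan(math.radians(fov_h_deg / 2))
    height_m = 2 * range_m * math.tan(math.radians(fov_v_deg / 2))

    gsd_h_mm = width_m / pixels_h * 1000
    gsd_v_mm = height_m / pixels_v * 1000
    print(f"approx. GSD at 10 m: {gsd_h_mm:.1f} mm x {gsd_v_mm:.1f} mm")
    # Both values come out close to the 3 mm GSD quoted in the text.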

Survey

The MMS capture of London’s boroughs was conducted in two steps. In the first step, six boroughs were surveyed in autumn 2017. Three surveyors drove approximately 50km per day to capture 25km of road trajectories, usually in back-and-forth passes. The survey continued throughout the winter, even when light conditions were poor. No surveyors on foot or traffic management crews were needed in the street – all data was captured from the vehicle. The captured images and laser point clouds were subsequently converted into maps during the first six months of 2018. Two office-based operators processed the data and stored it on hard disks using Leica Pegasus:Manager, which leverages the latest system calibration methodologies to precisely overlay imagery and point cloud data.
 
The second step, in which a further 13 boroughs were surveyed, was completed by mid-2018. Many roads needed second passes to eliminate or reduce occlusions. The capture rates varied significantly depending on the borough and the time of day. Each day’s data was processed within four days of being captured. The total time span between survey planning and delivery of the end products was three weeks. To give an impression of the amount of data captured: an MMS survey of 120km of roads resulted in around 0.5TB of raw data.
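
For a sense of scale, the figures above imply a raw-data volume per surveyed kilometre; the short sketch below works this out using only the numbers quoted in the text, so the result is an approximation.

    # Rough per-kilometre raw-data volume implied by the figures in the text:
    # roughly 0.5 TB of raw data for 120 km of surveyed road.
    raw_tb = 0.5
    road_km = 120
    gb_per_km = raw_tb * 1024 / road_km
    print(f"~{gb_per_km:.1f} GB of raw data per surveyed km")  # roughly 4 GB/km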

Mapping

The lines, symbols, polygons and point features making up the parking zones were extracted manually and in a (semi-)automated manner from the point clouds and 360-degree fish-eye imagery using Leica’s MapFactory, which is embedded in Esri ArcMap. The extracted data can be easily imported into Esri solutions or other GIS platforms for further processing and usage. The information on parking signs was extracted from the imagery and attributed to the relevant parking zone. The outlines of each parking zone were represented by polygons. Wherever outlines were occluded (i.e. visibility was obstructed by cars or other objects in the line of sight), the polygons were interactively collected and refined; for example, visible lines were extended and intersected with each other to obtain the coordinates of invisible corner points of parking zones.
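
The corner-recovery step lends itself to a simple geometric illustration. The sketch below is not MapFactory functionality, just a minimal example of the underlying idea: extend two visible boundary lines of a parking bay and intersect them to estimate an occluded corner. The coordinates and names are purely illustrative.

    # Minimal sketch: recover an occluded parking-bay corner by extending two
    # visible boundary segments and intersecting the infinite lines they define.

    def intersect_lines(p1, p2, p3, p4):
        """Intersection of the infinite lines through (p1, p2) and (p3, p4).

        Each point is an (x, y) tuple in a projected coordinate system (metres).
        Returns None if the lines are (nearly) parallel.
        """
        x1, y1 = p1
        x2, y2 = p2
        x3, y3 = p3
        x4, y4 = p4
        # Denominator of the standard two-line intersection formula.
        denom = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
        if abs(denom) < 1e-9:
            return None  # parallel or coincident lines: no unique corner
        det12 = x1 * y2 - y1 * x2
        det34 = x3 * y4 - y3 * x4
        x = (det12 * (x3 - x4) - (x1 - x2) * det34) / denom
        y = (det12 * (y3 - y4) - (y1 - y2) * det34) / denom
        return (x, y)

    # Two visible edges of a bay whose shared corner is hidden by a parked car.
    kerb_edge = ((0.0, 0.0), (8.0, 0.1))     # marking along the kerb
    end_marking = ((9.5, 3.0), (9.6, 1.0))   # the bay's end line
    corner = intersect_lines(*kerb_edge, *end_marking)
    print(corner)  # estimated coordinates of the occluded corner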
 

Results and Challenges

The relative accuracy of outlines belonging to the same parking zone is better than two centimetres, and the absolute accuracy lies between three and ten centimetres depending on the area. Achieving this high absolute accuracy over the entire territory entailed measuring 250 ground control points (GCPs) as a reference, distributed throughout the urban canyons of the city of London. During the MMS surveying step, which took 100 days in total, 19 London boroughs and five cities were captured (Getmapping also surveyed Brighton, Cambridge, Oxford, Portsmouth and Coventry to support smart parking). The total length of the MMS trajectory was more than 6,500km including multiple passes. The one million laser points captured per second, together with the imagery, resulted in 24TB of raw data, which expanded to 50TB after processing and mapping. For each parking zone, 27 features were extracted from the point clouds and imagery. The combination of 360-degree panoramic imagery supplemented by four 12MP side cameras ensured that all parking signs were clearly identifiable. All data requested by the client was delivered ahead of schedule. In terms of challenges, continuous and reliable high-accuracy GNSS positioning in urban environments is made difficult by multipath signals and frequent GNSS outages. A high-grade inertial measurement unit (IMU) and sufficient GCPs are crucial to resolve these issues. A further challenge is that poor weather and difficult light conditions may hinder the extraction of all the information on parking signs from the images.
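
As a rough cross-check, the per-kilometre estimate derived earlier can be scaled to the full trajectory length and compared with the reported raw-data total; this is an approximation only, since repeat passes and varying capture settings affect the real volume.

    # Consistency check between the two raw-data figures quoted in the article:
    # ~0.5 TB per 120 km (earlier) versus ~24 TB for the >6,500 km of trajectory.
    gb_per_km = 0.5 * 1024 / 120      # ~4.3 GB per km, from the 120 km example
    total_km = 6500                   # total MMS trajectory, including repeat passes
    estimated_tb = gb_per_km * total_km / 1024
    print(f"estimated raw data: ~{estimated_tb:.0f} TB (article reports ~24 TB)")
    # The estimate lands in the same range as the reported total.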

Concluding Remarks

The imagery and point clouds can also be used for asset management, drainage projects, road safety improvements, highway maintenance, 5G telecoms and environmental analysis. Based on all the collected data and extracted features, AppyWay is able to deliver highly accurate and detailed traffic management data to its smart parking systems.
