At Perception4D, staying at the forefront of 3D LiDAR expertise means keeping a close eye on the latest research in the field.
Just two days ago, the team at #StachnissLab introduced KISS-SLAM, their latest #SLAM algorithm, claiming it requires no parameter tuning.
We had just acquired a new 5 km urban dataset featuring narrow streets and steep slopes: the perfect test for KISS-SLAM.
The results? Outstanding! The algorithm successfully detected all necessary loop closures and produced a clean, accurate trajectory. Impressive work from the team!
A glimpse into a building block of a bigger project: LiDAR point cloud segmentation, using only the 3D points from an Ouster OS1-128 LiDAR.
That's a daily task for autonomous vehicles, and Perception4D applies it to many other use cases.
Benchmarking and retraining deep learning networks on in-house data, from the older RandLA-Net and PointPillars to KPConv and SuperPoint Transformer.
Direct uses range from cleaning the scanned point cloud to computing the road surface or cable lengths. And that's a building block for other projects.
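As a flavour of what a road-surface step can involve (a minimal sketch only, not Perception4D's actual pipeline), a classic approach is a RANSAC ground-plane fit over the raw 3D points:

```python
import numpy as np

def fit_ground_plane(points, n_iters=200, inlier_thresh=0.05, rng=None):
    """RANSAC plane fit over an (N, 3) point cloud.
    Returns ((normal, d), inlier_mask) where normal . p + d = 0 and the
    mask marks points within inlier_thresh metres of the plane.
    Illustrative sketch: real pipelines add normal-direction checks,
    tiling for sloped streets, and refinement by least squares."""
    rng = np.random.default_rng(rng)
    best_mask, best_model = None, None
    for _ in range(n_iters):
        # Sample 3 distinct points and build a candidate plane.
        p = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(p[1] - p[0], p[2] - p[0])
        norm = np.linalg.norm(normal)
        if norm < 1e-9:  # degenerate (collinear) sample
            continue
        normal /= norm
        d = -normal @ p[0]
        dist = np.abs(points @ normal + d)
        mask = dist < inlier_thresh
        if best_mask is None or mask.sum() > best_mask.sum():
            best_mask, best_model = mask, (normal, d)
    return best_model, best_mask
```

The inlier mask gives the road-surface points; everything else goes on to segmentation or cleaning.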
Excited to share Perception4D's improvements in 3D object detection and tracking, thanks to a close collaboration with MasterMind, LLC, which has very nice LiDAR scanning use cases: 2x Ouster OS1-128 and a whopping 72 MP Ladybug6 spherical camera (from FLIR Systems, not used here).
Leveraging Mastermind's data and hardware expertise with our advanced algorithms, we're pushing the boundaries in accuracy and efficiency for real-world applications like autonomous vehicles and urban planning.
Uses range from cleaning the scanned point cloud to tracking and analyzing vehicle and pedestrian trajectories.
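The tracking side boils down to associating detections across consecutive LiDAR frames. A minimal sketch (illustrative only; this is not the method used on Mastermind's data) is greedy nearest-neighbour association of detection centroids:

```python
import numpy as np

def associate(prev_centroids, new_centroids, max_dist=2.0):
    """Greedy nearest-neighbour association between (N, 3) and (M, 3)
    detection centroids of two consecutive frames. Returns a list of
    (prev_idx, new_idx) pairs; unmatched detections would start new
    tracks. Real trackers add a motion model (e.g. a Kalman filter)
    and globally optimal assignment on top of this idea."""
    pairs = []
    if len(prev_centroids) and len(new_centroids):
        # Pairwise distance matrix between old and new centroids.
        diff = prev_centroids[:, None, :] - new_centroids[None, :, :]
        dist = np.linalg.norm(diff, axis=-1)
        while True:
            i, j = np.unravel_index(np.argmin(dist), dist.shape)
            if dist[i, j] > max_dist:
                break
            pairs.append((int(i), int(j)))
            dist[i, :] = np.inf  # each track/detection used at most once
            dist[:, j] = np.inf
    return pairs
```

Chaining these pairings over time yields per-object trajectories that can then be analyzed.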
It's time for a recap of our amazing 2023 year at Perception4D!
We have been very busy on many projects involving 3D visualization, LiDAR-based SLAM and machine learning applied to 3D point clouds (semantic segmentation and object detection). Check this short video for an overview of some of our results.
❤️ A heartfelt thank you to our incredible team, our partners, and everyone who contributed to making 2023 an unforgettable year for Perception4D.
We love catching up on the latest breakthroughs and meeting the driving forces of research in computer vision and robotics. The program is so rich in our main subjects of interest: #3D #perception, #segmentation and object #detection in LiDAR point clouds, #SLAM…
Reconnect and network: Bastien and Manon are excited to catch up with old acquaintances and meet new friends in the industry. Whether you've collaborated in the past or share a common interest in #LiDAR, #Robotics, or #computervision, this is a fantastic opportunity to exchange ideas and experiences.
Simulator for a dual 360° LiDAR car scanner, used to tune sensor positions and scanning coverage. Proof of concept done in a few hours using #Unity3D, building on past #UE5 work. (Sorry for the quick-and-shaky video.) Stay tuned for the corresponding real-world data.
Added value:
▪ Design: Choose the best LiDAR positioning for robot field-of-view coverage.
▪ Robotics: Test algorithms with a controlled, synthetic LiDAR data stream.
▪ Surveillance: Place the LiDAR at the best position to cover an intersection or parking spots.
▪ LiDAR manufacturers: Optimize laser directions and assembly for specific needs.
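The geometry behind such coverage checks can be sketched in a few lines, assuming an idealised 128-beam spinning LiDAR with a 45° vertical field of view (numbers loosely modelled on an OS1-128; a real sensor has a calibrated, non-uniform beam table):

```python
import numpy as np

def beam_directions(n_beams=128, n_cols=1024, vfov_deg=45.0):
    """Unit ray directions for one revolution of an idealised spinning
    LiDAR: n_beams elevation angles spread over vfov_deg, and n_cols
    azimuth steps over 360 degrees. Shape: (n_beams, n_cols, 3)."""
    elev = np.deg2rad(np.linspace(-vfov_deg / 2, vfov_deg / 2, n_beams))
    azim = np.deg2rad(np.linspace(0.0, 360.0, n_cols, endpoint=False))
    el, az = np.meshgrid(elev, azim, indexing="ij")
    return np.stack([np.cos(el) * np.cos(az),
                     np.cos(el) * np.sin(az),
                     np.sin(el)], axis=-1)

def sees(sensor_pos, target, dirs, half_angle_deg=0.2):
    """True if some ray from sensor_pos passes within half_angle_deg of
    the target point -- a crude, occlusion-free stand-in for the
    coverage tests a full simulator performs with ray casting."""
    v = target - sensor_pos
    v = v / np.linalg.norm(v)
    cos_ang = (dirs * v).sum(axis=-1)
    return bool((cos_ang > np.cos(np.deg2rad(half_angle_deg))).any())
```

Evaluating `sees` for candidate mounting positions over a grid of targets gives a quick coverage map before any hardware is bolted on; a Unity or UE5 scene adds the occlusion and material effects this sketch ignores.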
Combined expertise in robotics, LiDAR and visualization lets us prototype 4D (3D + time) data analysis in hours.
Perception4D takes your R&D concept to a tailored prototype in record time.
Manon Cortial-Picard shares her experiences after 6 months as an associate at Perception4D! So much learning about #SLAM, #LiDAR and #3DPointClouds (fun stuff), but also about handling the French administration and the subtleties of CMake (less fun). She is ready to tackle all your fun #3D, #AI or #AMR projects!