Funding: Co-supported by the Chinese National Programs for Fundamental Strengthening Research (No. 2020-jcjq-zd-133-11), the National Natural Science Foundation of China (No. 62073163), the Aeronautic Science Foundation of China (No. 202055052003), the 2023 Fundamental Research Funds for the Central Universities of China (No. QZPY202310), and the Changzhou Sci&Tech Program Funds (No. CJ20241081).
Abstract: Autonomous driving requires highly accurate environmental perception, and traditional localization methods depend heavily on Global Navigation Satellite Systems (GNSS). However, in complex environments such as urban areas, GNSS signals are often attenuated or blocked, making it difficult to provide low-cost, real-time navigation solutions for precise environment perception. A novel multistate constraint Kalman filter-based Vision/Inertial Navigation System (INS)/GNSS fusion algorithm is explored to address these limitations and improve localization accuracy in GNSS-challenged environments. In contrast to conventional approaches that unify the motion state representation within a single Cartesian coordinate system, our method propagates covariance matrices independently across multiple coordinate systems, enabling real-time navigation state estimation tailored for autonomous driving. Furthermore, we introduce a novel covariance matrix-based navigation error fusion strategy that dynamically corrects navigation parameters and facilitates the seamless integration of absolute and relative information from multiple sources. Real-world experiments were carried out to validate the performance of the designed integration algorithm. The results indicate that the algorithm effectively combines visual and GNSS measurements, meeting the real-time and accuracy requirements for autonomous navigation in GNSS-challenged environments.
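The covariance matrix-based fusion of absolute (GNSS) and relative (visual-inertial) information described in the abstract can be illustrated, in a much-simplified form, by the standard covariance-weighted combination of two independent Gaussian position estimates. This is a minimal sketch for intuition only; the function name `fuse_estimates` and all numeric values are hypothetical, and the paper's actual filter operates on full navigation states across multiple coordinate systems rather than on two standalone position fixes.

```python
import numpy as np

def fuse_estimates(x_a, P_a, x_b, P_b):
    """Covariance-weighted fusion of two independent Gaussian estimates.

    The estimate with the smaller covariance (higher confidence) receives
    the larger weight; the fused covariance is smaller than either input.
    """
    P_a_inv = np.linalg.inv(P_a)
    P_b_inv = np.linalg.inv(P_b)
    # Information (inverse-covariance) form of the fusion
    P_fused = np.linalg.inv(P_a_inv + P_b_inv)
    x_fused = P_fused @ (P_a_inv @ x_a + P_b_inv @ x_b)
    return x_fused, P_fused

# Hypothetical GNSS fix: good horizontally, noisy vertically (units: m, m^2)
x_gnss = np.array([10.0, 5.0, 2.0])
P_gnss = np.diag([0.5, 0.5, 4.0])

# Hypothetical visual-inertial estimate: locally smooth but slowly drifting
x_vio = np.array([10.4, 4.8, 1.5])
P_vio = np.diag([1.0, 1.0, 1.0])

x, P = fuse_estimates(x_gnss, P_gnss, x_vio, P_vio)
```

Each fused component lands between the two inputs, pulled toward whichever source is more confident in that axis, which is the basic mechanism by which a covariance-based strategy lets absolute and relative measurements correct one another.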