Abstract
Indoor Wi-Fi received signal strength indicator (RSSI) fingerprinting is widely used in location-based services, but it faces challenges such as difficult data collection and severe RSSI fluctuations under dynamic environmental changes, which make high-accuracy localization hard to achieve. To address the poor positioning accuracy caused by data scarcity and environmental dynamics, a dual-encoder structure is adopted to process RSSI data and location coordinates independently, a geographic information loss function is introduced, and a location conditional variational autoencoder (LCVAE) model is constructed to generate geographically accurate fingerprint data that enhances the localization model. A shared convolutional neural network (CNN) feature-extraction layer is further designed to integrate classification and regression, yielding a multi-task indoor Wi-Fi fingerprint positioning method based on LCVAE-CNN. Experimental results show that the proposed LCVAE-CNN method achieves a floor classification accuracy of 98.80% and a mean positioning error (MPE) of 6.79 m on the UJIIndoorLoc dataset, and 97.22% and 5.44 m on the Tampere dataset. Compared with five existing methods, it improves floor classification accuracy by at least 1.9 percentage points and reduces MPE by at least 19%.
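The record above does not give the exact form of the geographic information loss. A minimal NumPy sketch of how such a term could be combined with the usual VAE objective (the function name, weights `beta`/`gamma`, and shapes are illustrative assumptions, not the paper's formulation):

```python
import numpy as np

def lcvae_loss(rssi_true, rssi_recon, mu, logvar, coords_true, coords_pred,
               beta=1.0, gamma=1.0):
    """Hypothetical LCVAE-style objective: reconstruction + KL + geographic term.

    The geographic term penalizes the Euclidean distance between the true
    collection coordinates and the coordinates recovered from the latent code,
    pushing generated fingerprints to stay geographically consistent.
    Weights beta/gamma are illustrative, not from the paper.
    """
    recon = np.mean((rssi_true - rssi_recon) ** 2)               # RSSI reconstruction (MSE)
    kl = -0.5 * np.mean(1 + logvar - mu ** 2 - np.exp(logvar))   # KL(q(z|x) || N(0, I))
    geo = np.mean(np.linalg.norm(coords_true - coords_pred, axis=1))  # mean distance, meters
    return recon + beta * kl + gamma * geo
```

With a perfect reconstruction and a standard-normal posterior, only the geographic term contributes, so the loss reduces to the mean coordinate error.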
Indoor Wi-Fi received signal strength indicator (RSSI) fingerprinting widely supports location-based services. However, it faces challenges such as the difficulty of data collection and severe RSSI fluctuations caused by dynamic environmental changes, which hinder high-accuracy localization. To improve localization accuracy under data scarcity and dynamic environments, this paper proposed a dual-encoder structure that independently processed RSSI data and location coordinates. The study introduced a geographic information loss function and constructed a location conditional variational autoencoder (LCVAE) model to generate fingerprint data with geographic accuracy, enhancing the localization model's performance. Additionally, the research designed a shared convolutional neural network (CNN) feature-extraction layer integrating both classification and regression functions, and presented a multi-task indoor Wi-Fi fingerprint positioning method based on LCVAE-CNN. Experimental results show that the proposed LCVAE-CNN method achieves a floor classification accuracy of 98.80% and a mean positioning error (MPE) of 6.79 m on the UJIIndoorLoc dataset, and 97.22% and 5.44 m respectively on the Tampere dataset. Compared to five existing methods, the approach improves floor classification accuracy by at least 1.9 percentage points and reduces MPE by a minimum of 19%.
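The two metrics reported above can be computed as follows (a small NumPy sketch; the function names are mine, not from the paper):

```python
import numpy as np

def floor_accuracy(floor_true, floor_pred):
    """Fraction of samples whose predicted floor label matches the true floor."""
    return float(np.mean(np.asarray(floor_true) == np.asarray(floor_pred)))

def mean_positioning_error(coords_true, coords_pred):
    """MPE: mean Euclidean distance (in meters) between true and estimated positions."""
    d = np.linalg.norm(np.asarray(coords_true) - np.asarray(coords_pred), axis=1)
    return float(d.mean())
```

For example, predicting 3 of 4 floors correctly gives an accuracy of 0.75, and per-sample position errors of 5 m and 0 m give an MPE of 2.5 m.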
Authors
Wu Shixun
Zeng Xinrui
Xu Kai
Lan Zhangli
Zhang Miao
Jin Yue
Wu Shixun; Zeng Xinrui; Xu Kai; Lan Zhangli; Zhang Miao; Jin Yue (School of Information Science & Engineering, Chongqing Jiaotong University, Chongqing 400074, China)
Source
《计算机应用研究》
Peking University Core Journal (PKU Core)
2025, Issue 6, pp. 1844-1851 (8 pages)
Application Research of Computers
Funding
Supported by the Natural Science Foundation of Chongqing (CSTB2024NSCQ-MSX0275).