High-accuracy indoor location technology using simple visual labels
Author: XIONG Yu, GAO Feng, MA Jie

Affiliation: 1. Technology Center of Passenger Car, Dongfeng Liuzhou Automobile Co., Ltd., Liuzhou, Guangxi 515005, P. R. China; 2. College of Mechanical and Vehicle Engineering, Chongqing University, Chongqing 400044, P. R. China

CLC Number: TH7

Fund Project: Supported by the Open Fund of State Key Laboratory (KFY2209), the Chongqing Automotive Collaborative Innovation Center (2022CDJDX-004), and the Chongqing Technology Innovation and Application Development Project (CSTB2022TIAD-KPX0139).

    Abstract:

    To achieve a low-cost, high-precision indoor location system, this study designs a method using simple visual labels while balancing computational complexity and practical requirements. Only color and shape features are used for label detection, minimizing both detection complexity and data storage needs. To deal with the nonunique solutions caused by the simplified label features, a rapid query and matching method is proposed that incorporates the camera's field of view and the label's azimuth. Furthermore, a pose and position estimation method using a weighted least squares algorithm is developed. This method is integrated with an interactive algorithm guided by a designed switching strategy. These techniques strike an effective balance between algorithm complexity and location accuracy. Simulation and experimental results show that the proposed method effectively resolves singularity issues in overdetermined equations and attenuates the negative effects of poorly distributed label groups. Compared with ultra-wideband technology, the proposed approach reduces location error by more than 62%.
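    The weighted least-squares position estimation mentioned in the abstract can be illustrated with a minimal sketch. This is not the paper's exact formulation: the range-based linearization, the label coordinates, and the per-equation weights below are all assumptions made for illustration. Given ranges to labels at known positions, subtracting the first range equation from the rest yields a linear system that a weighted least-squares solve can handle.

    ```python
    import numpy as np

    def wls_position(labels, dists, weights=None):
        """Estimate a 2-D position from ranges to labels at known positions
        via weighted least squares (illustrative sketch only).

        labels : (n, 2) known label coordinates
        dists  : (n,)   measured distances to each label
        weights: (n-1,) confidence weight for each linearized equation
        """
        labels = np.asarray(labels, dtype=float)
        dists = np.asarray(dists, dtype=float)
        if weights is None:
            weights = np.ones(len(labels) - 1)
        x1, y1 = labels[0]
        d1 = dists[0]
        # Subtract the first range equation from the others to linearize:
        # 2(xi-x1)x + 2(yi-y1)y = d1^2 - di^2 + xi^2 + yi^2 - x1^2 - y1^2
        A = 2.0 * (labels[1:] - labels[0])
        b = (d1**2 - dists[1:]**2
             + np.sum(labels[1:]**2, axis=1) - (x1**2 + y1**2))
        W = np.diag(weights)
        # Weighted normal equations: (A^T W A) p = A^T W b
        return np.linalg.solve(A.T @ W @ A, A.T @ W @ b)

    # Four hypothetical labels and exact ranges from the true position (3, 4)
    labels = [(0, 0), (10, 0), (0, 10), (10, 10)]
    true_p = np.array([3.0, 4.0])
    dists = [np.hypot(*(true_p - np.array(l))) for l in labels]
    print(wls_position(labels, dists))  # ≈ [3. 4.]
    ```

    Down-weighting equations that involve distant or poorly placed labels is one way such a scheme can attenuate the effect of a badly distributed label group, which is the role the abstract attributes to the weighting.
    
    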

Get Citation

XIONG Yu, GAO Feng, MA Jie. High-accuracy indoor location technology using simple visual labels [J]. Journal of Chongqing University, 2025, 48(1): 45-53.

History
  • Received: March 18, 2023
  • Online: February 19, 2025