IEEE/CAA Journal of Automatica Sinica
A journal of IEEE and CAA, publishing high-quality papers in English on original theoretical/experimental research and development in all areas of automation.
Volume 11, Issue 8, Aug. 2024

  • JCR Impact Factor: 15.3, Top 1 (SCI Q1)
  • CiteScore: 23.5, Top 2% (Q1)
  • Google Scholar h5-index: 77, Top 5
Citation: M. Vargas, C. Vivas, and T. Alamo, “Optimal positioning strategy for multi-camera zooming drones,” IEEE/CAA J. Autom. Sinica, vol. 11, no. 8, pp. 1802–1818, Aug. 2024. doi: 10.1109/JAS.2024.124455

Optimal Positioning Strategy for Multi-Camera Zooming Drones

doi: 10.1109/JAS.2024.124455
Funds: This work was supported by grants PID2022-142946NA-I00 and PID2022-141159OB-I00, funded by MICIU/AEI/10.13039/501100011033 and by ERDF/EU
  • In the context of multiple-target tracking and surveillance applications, this paper investigates the challenge of determining the optimal positioning of a single autonomous aerial vehicle (or agent) equipped with multiple independently steerable zooming cameras to effectively monitor a set of targets of interest. Each camera is dedicated to tracking a specific target or cluster of targets. The key innovation of this study, in comparison to existing approaches, lies in incorporating the zooming factor of the onboard cameras into the optimization problem. This enhancement offers greater flexibility during mission execution by allowing the autonomous agent to trade adjustments of the cameras' focal lengths against the real-world distances to the corresponding targets, thereby providing additional degrees of freedom in the optimization problem. The proposed optimization framework aims to strike a balance among various factors, including distance to the targets, verticality of viewpoints, and the required focal length for each camera. The primary focus of this paper is to establish the theoretical groundwork for addressing the non-convex nature of the optimization problem arising from these considerations. To this end, we develop an original convex approximation strategy. The paper also includes simulations of diverse scenarios, featuring varying numbers of onboard tracking cameras and target motion profiles, to validate the effectiveness of the proposed approach.
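A minimal numerical sketch of the trade-off described above, under illustrative assumptions only (it is not the authors' formulation): a single drone position p = (x, y, z) is chosen to balance, for each ground target, the 3-D distance, the verticality of the line of sight, and the focal length a camera would need to keep that target at a fixed apparent size. The target coordinates, the trade-off weights, the focal-length model f ≈ k·r, and the use of a local solver (scipy.optimize.minimize) on the non-convex weighted cost are all assumptions of this sketch; the paper instead develops a dedicated convex approximation to avoid relying on such local searches.

```python
# Toy illustration (not the paper's method): pick one drone position that
# balances distance, viewpoint verticality, and required focal length for
# several ground targets. Weights, coordinates, and the focal-length model
# are hypothetical assumptions made for this sketch only.
import numpy as np
from scipy.optimize import minimize

targets = np.array([[0.0, 0.0], [40.0, 10.0], [15.0, 35.0]])  # ground targets (x, y), z = 0
w_dist, w_vert, w_focal = 1.0, 50.0, 0.5   # hypothetical trade-off weights
k_focal = 0.08                             # assumed focal length ~ k * range (fixed apparent size)
z_min = 10.0                               # minimum allowed flight altitude

def cost(p):
    """Non-convex weighted cost over all targets for drone position p = (x, y, z)."""
    d = np.linalg.norm(targets - p[:2], axis=1)   # horizontal distances to targets
    r = np.sqrt(d**2 + p[2]**2)                   # 3-D ranges to targets
    verticality = 1.0 - p[2] / r                  # 0 when looking straight down, grows with obliquity
    focal = k_focal * r                           # zoom needed grows with range
    return np.sum(w_dist * r**2 + w_vert * verticality + w_focal * focal**2)

# Local solver on the non-convex cost; the paper's convex approximation is
# designed precisely so that such local searches are not needed.
res = minimize(cost, x0=np.array([10.0, 10.0, 30.0]),
               bounds=[(None, None), (None, None), (z_min, None)])
print("drone position:", np.round(res.x, 2), " cost:", round(float(res.fun), 2))
```

Raising the verticality weight in this toy cost favors hovering higher above the targets (more vertical views) at the price of larger ranges and longer required focal lengths; this coupling between position and zoom is the kind of trade-off the proposed framework resolves jointly for all onboard cameras.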

     

  • 1 Without loss of generality, each target (or cluster) is allotted a number that matches the number of the tracking camera to which it is assigned.
  • [1]
    M. Schwager, B. J. Julian, M. Angermann, and D. Rus, “Eyes in the sky: Decentralized control for the deployment of robotic camera networks,” Proc. IEEE, vol. 99, no. 9, pp. 1541–1561, Sep. 2011. doi: 10.1109/JPROC.2011.2158377
    [2]
    I. Sa and P. Corke, “Vertical infrastructure inspection using a quadcopter and shared autonomy control,” in Field and Service Robotics: Results of the 8th International Conference, K. Yoshida and S. Tadokoro, Eds. Berlin, Heidelberg, Germany: Springer, 2014, pp. 219–232.
    [3]
    S. Jordan, J. Moore, S. Hovet, J. Box, J. Perry, K. Kirsche, D. Lewis, and Z. T. H. Tse, “State-of-the-art technologies for UAV inspections,” IET Radar, Sonar Navig., vol. 12, no. 2, pp. 151–164, Feb. 2018. doi: 10.1049/iet-rsn.2017.0251
    [4]
    E. Salahat, C. A. Asselineau, J. Coventry, and R. Mahony, “Waypoint planning for autonomous aerial inspection of large-scale solar farms,” in Proc. 45th Annu. Conf. IEEE Industrial Electronics Society, Lisbon, Portugal, 2019.
    [5]
    J. Sun, B. Li, Y. Jiang, and C.-Y. Wen, “A camera-based target detection and positioning UAV system for search and rescue (SAR) purposes,” Sensors, vol. 16, no. 11, p. 1778, Oct. 2016. doi: 10.3390/s16111778
    [6]
    M. A. Khan, W. Ectors, T. Bellemans, D. Janssens, and G. Wets, “UAV-based traffic analysis: A universal guiding framework based on literature survey,” Transp. Res. Proc., vol. 22, pp. 541–550, Dec. 2017.
    [7]
    P. Kumar, S. Sonkar, A. K. Ghosh, and D. Philip, “Real-time vision-based tracking of a moving terrain target from light weight fixed wing UAV using gimbal control,” in Proc. 7th Int. Conf. Control, Decision and Information Technologies, Prague, Czech Republic, 2020, pp. 154–159.
    [8]
    I. Ahmed, S. Din, G. Jeon, F. Piccialli, and G. Fortino, “Towards collaborative robotics in top view surveillance: A framework for multiple object tracking by detection using deep learning,” IEEE/CAA J. Autom. Sinica, vol. 8, no. 7, pp. 1253–1270, Jul. 2021. doi: 10.1109/JAS.2020.1003453
    [9]
    P. Sun, S. Li, B. Zhu, Z. Zuo, and X. Xia, “Vision-based fixed-time uncooperative aerial target tracking for UAV,” IEEE/CAA J. Autom. Sinica, vol. 10, no. 5, pp. 1322–1324, May 2023. doi: 10.1109/JAS.2023.123510
    [10]
    R. Miller, G. Mooty, and J. M. Hilkert, “Gimbal system configurations and line-of-sight control techniques for small UAV applications,” in Proc. SPIE 8713, Airborne Intelligence, Surveillance, Reconnaissance Systems and Applications X, Baltimore, USA, 2013, pp. 39–53.
    [11]
    H. Choi and Y. Kim, “UAV guidance using a monocular-vision sensor for aerial target tracking,” Control Eng. Pract., vol. 22, pp. 10–19, Jan. 2014. doi: 10.1016/j.conengprac.2013.09.006
    [12]
    N. Farmani, L. Sun, and D. Pack, “An optimal sensor management technique for unmanned aerial vehicles tracking multiple mobile ground targets,” in Proc. Int. Conf. Unmanned Aircraft Systems, Orlando, USA, 2014, pp. 570–576.
    [13]
    H. H. Helgesen, F. S. Leira, T. I. Fossen, and T. A. Johansen, “Tracking of ocean surface objects from unmanned aerial vehicles with a pan/tilt unit using a thermal camera,” J. Intell. Robot. Syst., vol. 91, no. 3–4, pp. 775–793, Sep. 2018. doi: 10.1007/s10846-017-0722-3
    [14]
    S. Wang, F. Jiang, B. Zhang, R. Ma, and Q. Hao, “Development of UAV-based target tracking and recognition systems,” IEEE Trans. Intell. Transport. Syst., vol. 21, no. 8, pp. 3409–3422, Aug. 2020. doi: 10.1109/TITS.2019.2927838
    [15]
    C. Robin and S. Lacroix, “Multi-robot target detection and tracking: Taxonomy and survey,” Auton. Robots, vol. 40, no. 4, pp. 729–760, Apr. 2016. doi: 10.1007/s10514-015-9491-7
    [16]
    M. Senanayake, I. Senthooran, J. C. Barca, H. Chung, J. Kamruzzaman, and M. Murshed, “Search and tracking algorithms for swarms of robots: A survey,” Rob. Auton. Syst., vol. 75, pp. 422–434, 2016. doi: 10.1016/j.robot.2015.08.010
    [17]
    R. Sharma and D. Pack, “Cooperative sensor resource management to aid multi target geolocalization using a team of small fixed-wing unmanned aerial vehicles,” in Proc. AIAA Guidance, Navigation, and Control Conf., Boston, USA, 2013, pp. 4706.
    [18]
    Y. Zhao, X. Wang, C. Wang, Y. Cong, and L. Shen, “Systemic design of distributed multi-UAV cooperative decision-making for multi-target tracking,” Auton. Agents Multi-Agent Syst., vol. 33, no. 1, pp. 132–158, Mar. 2019.
    [19]
    S. Baek and G. York, “Optimal sensor management for multiple target tracking using cooperative unmanned aerial vehicles,” in Proc. Int. Conf. Unmanned Aircraft Systems, Athens, Greece, 2020, pp. 1294–1300.
    [20]
    M. Vargas, C. Vivas, F. R. Rubio, and M. G. Ortega, “Flying chameleons: A new concept for minimum-deployment, multiple-target tracking drones,” Sensors, vol. 22, no. 6, p. 2359, Mar. 2022. doi: 10.3390/s22062359
    [21]
    C. Chen, Y. Tian, L. Lin, S. Chen, H. Li, Y. Wang, and K. Su, “Obtaining world coordinate information of UAV in GNSS denied environments,” Sensors, vol. 20, no. 8, p. 224, Apr. 2020.
    [22]
    N. Bisagno, A. Xamin, F. De Natale, N. Conci, and B. Rinner, “Dynamic camera reconfiguration with reinforcement learning and stochastic methods for crowd surveillance,” Sensors, vol. 20, no. 17, p. 4691, Aug. 2020. doi: 10.3390/s20174691
    [23]
    Z. Wang and X. Xu, “A survey on electronic image stabilization,” J. Image Graphics, vol. 15, no. 3, pp. 470–480, Mar. 2010.
    [24]
    C. Dahlin Rodin, F. A. de Alcantara Andrade, A. R. Hovenburg, and T. A. Johansen, “A survey of practical design considerations of optical imaging stabilization systems for small unmanned aerial systems,” Sensors, vol. 19, no. 21, p. 4800, Nov. 2019. doi: 10.3390/s19214800
    [25]
    X. Xiao, J. Dufek, T. Woodbury, and R. Murphy, “UAV assisted USV visual navigation for marine mass casualty incident response,” in Proc. IEEE/RSJ Int. Conf. Intelligent Robots and Systems, Vancouver, Canada, 2017, pp. 6105–6110.
    [26]
    C. Burke, P. R. McWhirter, J. Veitch-Michaelis, O. McAree, H. A. G. Pointon, S. Wich, and S. Longmore, “Requirements and limitations of thermal drones for effective search and rescue in marine and coastal areas,” Drones, vol. 3, no. 4, p. 78, Oct. 2019. doi: 10.3390/drones3040078
    [27]
    H. H. Helgesen, T. H. Bryne, E. F. Wilthil, and T. A. Johansen, “Camera-based tracking of floating objects using fixed-wing UAVs,” J. Intell. Robot. Syst., vol. 102, no. 4, p. 80, Jul. 2021. doi: 10.1007/s10846-021-01432-z
    [28]
    L. Fusini, T. I. Fossen, and T. A. Johansen, “Nonlinear observers for GNSS- and camera-aided inertial navigation of a fixed-wing UAV,” IEEE Trans. Control Syst. Technol., vol. 26, no. 5, pp. 1884–1891, Sep. 2018. doi: 10.1109/TCST.2017.2735363
    [29]
    M. P. Atkinson, M. Kress, and R. Szechtman, “Maritime transportation of illegal drugs from South America,” Int. J. Drug Policy, vol. 39, pp. 43–51, Jan. 2017. doi: 10.1016/j.drugpo.2016.07.010
    [30]
    P. Campana, “Human smuggling: Structure and mechanisms,” Crime Justice, vol. 49, pp. 471–519, May 2020. doi: 10.1086/708663
    [31]
    N. Mahajan, A. Chauhan, and M. Kajal, “An introduction to deep learning-based object recognition and tracking for enabling defense applications,” in Advances in Aerial Sensing and Imaging, S. Kumar, N. R. Moparthi, A. Bhola, R. Kaur, A. Senthil, and K. M. V. V. Prasad, Eds. Scrivener Publishing LLC, 2024, pp. 109–127.
    [32]
    S. Bajaj, S. D. Bopardikar, E. Torng, A. Von Moll, and D. W. Casbeer, “Multivehicle perimeter defense in conical environments,” IEEE Trans. Robot., vol. 40, pp. 1439–1456, Jan. 2024. doi: 10.1109/TRO.2024.3351556
    [33]
    M. A. Ma’sum, M. K. Arrofi, G. Jati, F. Arifin, M. N. Kurniawan, P. Mursanto, and W. Jatmiko, “Simulation of intelligent unmanned aerial vehicle (UAV) for military surveillance,” in Proc. Int. Conf. Advanced Computer Science and Information Systems, Sanur Bali, Indonesia, 2013, pp. 161–166.
    [34]
    X. C. Ding, A. R. Rahmani, and M. Egerstedt, “Multi-UAV convoy protection: An optimal approach to path planning and coordination,” IEEE Trans. Robot., vol. 26, no. 2, pp. 256–268, Apr. 2010. doi: 10.1109/TRO.2010.2042325
    [35]
    X. Li and A. V. Savkin, “Networked unmanned aerial vehicles for surveillance and monitoring: A survey,” Future Internet, vol. 13, no. 7, p. 174, Jul. 2021. doi: 10.3390/fi13070174
    [36]
    I. Mademlis, V. Mygdalis, N. Nikolaidis, M. Montagnuolo, F. Negro, A. Messina, and I. Pitas, “High-level multiple-UAV cinematography tools for covering outdoor events,” IEEE Trans. Broadcast., vol. 65, no. 3, pp. 627–635, Sep. 2019. doi: 10.1109/TBC.2019.2892585
    [37]
    I. Sa and H. S. Ahn, “Visual 3D model-based tracking toward autonomous live sports broadcasting using a VTOL unmanned aerial vehicle in GPS-impaired environments,” Int. J. Comput. Appl., vol. 122, no. 7, pp. 1–7, Jul. 2015.
    [38]
    I. Mademlis, V. Mygdalis, N. Nikolaidis, and I. Pitas, “Challenges in autonomous UAV cinematography: An overview,” in Proc. IEEE Int. Conf. Multimedia and Expo, San Diego, USA, 2018, pp. 1–6.
    [39]
    R. Bonatti, C. Ho, W. Wang, S. Choudhury, and S. Scherer, “Towards a robust aerial cinematography platform: Localizing and tracking moving targets in unstructured environments,” in Proc. IEEE/RSJ Int. Conf. Intelligent Robots and Systems, Macau, China, 2019, pp. 229–236.
    [40]
    L. F. Gonzalez, G. A. Montes, E. Puig, S. Johnson, K. Mengersen, and K. J. Gaston, “Unmanned aerial vehicles (UAVs) and artificial intelligence revolutionizing wildlife monitoring and conservation,” Sensors, vol. 16, no. 1, p. 97, Jan. 2016. doi: 10.3390/s16010097
    [41]
    V. Panadeiro, A. Rodriguez, J. Henry, D. Wlodkowic, and M. Andersson, “A review of 28 free animal-tracking software applications: Current features and limitations,” Lab Anim., vol. 50, no. 9, pp. 246–254, Jul. 2021. doi: 10.1038/s41684-021-00811-1
    [42]
    I. Bozcan and E. Kayacan, “AU-AIR: A multi-modal unmanned aerial vehicle dataset for low altitude traffic surveillance,” in Proc. IEEE Int. Conf. Robotics and Automation, Paris, France, 2020, pp. 8504–8510.
    [43]
    M. A. Khan, W. Ectors, T. Bellemans, D. Janssens, and G. Wets, “Unmanned aerial vehicle-based traffic analysis: Methodological framework for automated multivehicle trajectory extraction,” Transp. Res. Rec.: J. Transp. Res. Board, vol. 2626, no. 1, pp. 25–33, Jan. 2017. doi: 10.3141/2626-04
    [44]
    M. A. K. Jaradat, M. H. Garibeh, and E. A. Feilat, “Autonomous mobile robot dynamic motion planning using hybrid fuzzy potential field,” Soft Comput., vol. 16, no. 1, pp. 153–164, Jan. 2012. doi: 10.1007/s00500-011-0742-z
    [45]
    J. Tang, Q. Pan, Z. Chen, G. Liu, G. Yang, F. Zhu, and S. Lao, “An improved artificial electric field algorithm for robot path planning,” IEEE Trans. Aerosp. Electron. Syst., vol. 60, no. 2, pp. 2292–2304, Apr. 2024.
    [46]
    H. Chen, X.-M. Wang, and Y. Li, “A survey of autonomous control for UAV,” in Proc. Int. Conf. Artificial Intelligence and Computational Intelligence, Shanghai, China, 2009, pp. 267–271.
    [47]
    Z. Zuo, C. Liu, Q.-L. Han, and J. Song, “Unmanned aerial vehicles: Control methods and future challenges,” IEEE/CAA J. Autom. Sinica, vol. 9, no. 4, pp. 601–614, Apr. 2022. doi: 10.1109/JAS.2022.105410
    [48]
    A. Altan and R. Hacıoğlu, “Model predictive control of three-axis gimbal system mounted on UAV for real-time target tracking under external disturbances,” Mech. Syst. Signal Process., vol. 138, p. 106548, Apr. 2020. doi: 10.1016/j.ymssp.2019.106548
    [49]
    P. Tokekar, V. Isler, and A. Franchi, “Multi-target visual tracking with aerial robots,” in Proc. IEEE/RSJ Int. Conf. Intelligent Robots and Systems, Chicago, USA, 2014, pp. 3067–3072.
    [50]
    N. Farmani, L. Sun, and D. J. Pack, “A scalable multitarget tracking system for cooperative unmanned aerial vehicles,” IEEE Trans. Aerosp. Electron. Syst., vol. 53, no. 4, pp. 1947–1961, Aug. 2017. doi: 10.1109/TAES.2017.2677746
    [51]
    S. Boyd and L. Vandenberghe, Convex Optimization. Cambridge, USA: Cambridge University Press, 2004.
    [52]
    L. G. Haçijan, “A polynomial algorithm in linear programming,” Dokl. Akad. Nauk SSSR, vol. 244, no. 5, pp. 1093–1096, 1979.
    [53]
    R. G. Bland, D. Goldfarb, and M. J. Todd, “Feature article–The ellipsoid method: A survey,” Oper. Res., vol. 29, no. 6, pp. 1039–1091, Dec. 1981. doi: 10.1287/opre.29.6.1039
  • Supplementary material: 2.AdditionalMaterial.zip

    Figures (12), Tables (1)

    Article Metrics

    Article views: 186; PDF downloads: 36

    Highlights

    • Proposes an optimal positioning strategy for a single autonomous aerial agent performing multi-target tracking
    • Introduces a novel approach that integrates multiple independently steerable zooming cameras
    • The objective is the effective, flexible monitoring of multiple targets of interest
    • Formulates a novel optimization problem that balances distance to targets, viewpoint verticality, and focal lengths
    • A proposed convex approximation addresses the resulting non-convex optimization problem
