2D Mapping and boundary detection using 2D LIDAR sensor for prototyping Autonomous PETIS (Programable Vehicle with Integrated Sensor)


Hanugra Aulia Sidharta, Sidharta Sidharta, Wina Permana Sari


PETIS (Programable Vehicle with Integrated Sensor) is a research project whose goal is to build a robot that moves independently for a specific purpose. Due to the complexity of PETIS, the research is divided into several stages. This research focuses on the sense of sight for PETIS; LIDAR was chosen because it is flexible and comprehensive. Many LIDAR sensors are available in the marketplace; the LDS-01, produced by ROBOTIS, is one of the low-cost commercial options. Compared with other sensors that cost more than $1000, the LDS-01 costs less than $500. This study focuses on reading the LDS-01 sensor, including the hardware, the software connection, and data handling. Based on this research, the LDS-01 LIDAR sensor can detect obstacles at a minimum distance of 29.9 cm and a maximum of 290.7 cm, whereas according to its datasheet the LDS-01 should work from 12 cm to 350 cm.
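The measured usable range above (29.9 cm to 290.7 cm, versus the 12–350 cm stated in the datasheet) implies a simple validity filter in the data-handling step. The following is a minimal sketch, not the authors' code: it assumes one distance reading per degree (the LDS-01 outputs 360 samples per revolution), and all function and variable names are illustrative.

```python
# Illustrative sketch: discard LDS-01 readings outside the usable
# range measured in this study (29.9 cm to 290.7 cm).

VALID_MIN_CM = 29.9   # closest obstacle reliably detected in the study
VALID_MAX_CM = 290.7  # farthest obstacle reliably detected in the study

def filter_scan(distances_cm):
    """Return (angle_deg, distance_cm) pairs for readings inside the
    usable range, assuming one reading per degree of rotation."""
    return [(angle, d) for angle, d in enumerate(distances_cm)
            if VALID_MIN_CM <= d <= VALID_MAX_CM]

# Example: a partial scan where some readings fall outside the window.
scan = [0.0, 15.0, 50.0, 120.5, 300.0, 290.7]
print(filter_scan(scan))  # → [(2, 50.0), (3, 120.5), (5, 290.7)]
```

Readings of 0.0 (no return) and values beyond the measured maximum are dropped rather than passed to the mapping step.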











Kinetik : Game Technology, Information System, Computer Network, Computing, Electronics, and Control by http://kinetik.umm.ac.id is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.