Full Robot Car using Arduino-Based Hand Gestures


Jati Widyo Leksono
Agung Samudra
Nanndo Yannuansa
Ahmad Fauzi

Abstract

Robot car technology has continued to advance rapidly from year to year. Today's robots are no longer controlled only with a conventional remote control; they can take commands directly from a human operator. This work develops a robot car that runs according to hand gestures.
A sensor attached to the palm of the hand reads the direction of movement: up, down, left, and right. The transmitter module sends the gesture readings wirelessly to a receiver mounted on the robot. The robot's control logic interprets the received data to decide the action: move forward, move backward, turn right, turn left, or stop.
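
The article does not reproduce its firmware here, so the sketch below is only a minimal illustration of how such a transmitter could work. It assumes the standard Arduino Wire library for the MPU-6050 and the widely used RF24 library for the NRF24L01; the CE/CSN pins (9/10), the pipe address, and the +/-8000-count tilt thresholds are hypothetical choices, not values from the paper.

// Transmitter (hand unit): illustrative sketch only, not the authors' code.
#include <Wire.h>
#include <SPI.h>
#include <RF24.h>

const byte MPU_ADDR = 0x68;       // default MPU-6050 I2C address
RF24 radio(9, 10);                // CE, CSN (hypothetical wiring)
const byte address[6] = "CAR01";  // arbitrary 5-byte pipe address

// Command codes shared with the receiver (illustrative convention)
enum Command : byte { STOP = 0, FORWARD, BACKWARD, LEFT, RIGHT };

void setup() {
  Wire.begin();
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x6B);               // PWR_MGMT_1 register
  Wire.write(0);                  // wake the MPU-6050 from sleep
  Wire.endTransmission(true);

  radio.begin();
  radio.openWritingPipe(address);
  radio.stopListening();          // act as transmitter
}

void loop() {
  // Read raw accelerometer X and Y (registers 0x3B..0x3E)
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x3B);
  Wire.endTransmission(false);
  Wire.requestFrom((int)MPU_ADDR, 4, true);
  int16_t ax = Wire.read() << 8;  // X high byte
  ax |= Wire.read();              // X low byte
  int16_t ay = Wire.read() << 8;  // Y high byte
  ay |= Wire.read();              // Y low byte

  // Classify palm tilt into one of five commands; thresholds are guesses
  byte cmd = STOP;
  if      (ax >  8000) cmd = FORWARD;
  else if (ax < -8000) cmd = BACKWARD;
  else if (ay >  8000) cmd = RIGHT;
  else if (ay < -8000) cmd = LEFT;

  radio.write(&cmd, sizeof(cmd)); // one command byte per cycle
  delay(50);
}

Sending a single command byte keeps the radio payload small; the raw axis data could equally be transmitted and classified on the car side instead.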
Hand gestures are sensed with an MPU-6050 module and transmitted over radio frequency with an NRF24L01 module. Testing shows that the sensor module is able to read the position of a point on the x, y, and z axes. The measured response time between the sensor and the car is 0.45 s, and in speed testing the robot car drives at an average of 3,496.50 rpm to 4,596.50 rpm.
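
A matching receiver on the car can then map each command byte to a motor action. The abstract does not specify the motor driver, so the sketch below assumes an L298N-style dual H-bridge on hypothetical pins 4-7, again using the RF24 library with the same pipe address and command codes as the transmitter sketch above.

// Receiver (robot car): illustrative sketch only, not the authors' code.
#include <SPI.h>
#include <RF24.h>

RF24 radio(9, 10);                // CE, CSN (hypothetical wiring)
const byte address[6] = "CAR01";  // must match the transmitter's address

enum Command : byte { STOP = 0, FORWARD, BACKWARD, LEFT, RIGHT };

// L298N-style H-bridge direction pins (hypothetical wiring)
const int L_IN1 = 4, L_IN2 = 5, R_IN1 = 6, R_IN2 = 7;

// Set each wheel's forward/reverse inputs
void drive(int lFwd, int lRev, int rFwd, int rRev) {
  digitalWrite(L_IN1, lFwd);
  digitalWrite(L_IN2, lRev);
  digitalWrite(R_IN1, rFwd);
  digitalWrite(R_IN2, rRev);
}

void setup() {
  pinMode(L_IN1, OUTPUT); pinMode(L_IN2, OUTPUT);
  pinMode(R_IN1, OUTPUT); pinMode(R_IN2, OUTPUT);
  radio.begin();
  radio.openReadingPipe(1, address);
  radio.startListening();         // act as receiver
}

void loop() {
  if (radio.available()) {
    byte cmd = STOP;
    radio.read(&cmd, sizeof(cmd));
    switch (cmd) {
      case FORWARD:  drive(HIGH, LOW, HIGH, LOW); break; // both wheels forward
      case BACKWARD: drive(LOW, HIGH, LOW, HIGH); break; // both wheels reverse
      case LEFT:     drive(LOW, HIGH, HIGH, LOW); break; // pivot left
      case RIGHT:    drive(HIGH, LOW, LOW, HIGH); break; // pivot right
      default:       drive(LOW, LOW, LOW, LOW);   break; // stop
    }
  }
}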
 


Article Details

How to Cite
Leksono, J., Samudra, A., Yannuansa, N., & Fauzi, A. (2020). Full Robot Car using Arduino-Based Hand Gestures. Electro Luceat, 6(2), 228-235. https://doi.org/10.32531/jelekn.v6i2.253
Section
Articles
Author Biographies

Agung Samudra, Universitas Hasyim Asy’ari Tebuireng Jombang

Mechanical Engineering, Universitas Hasyim Asy’ari Tebuireng Jombang

Nanndo Yannuansa, Universitas Hasyim Asy’ari Tebuireng Jombang

Electrical Engineering, Universitas Hasyim Asy’ari Tebuireng Jombang

Ahmad Fauzi, Universitas Hasyim Asy’ari Tebuireng Jombang

Electrical Engineering, Universitas Hasyim Asy’ari Tebuireng Jombang
