Navigation Control of a Multi-Functional Eye Robot
Keywords:
Eye robot, artificial intelligence, UGV, optical character recognition, manual navigation

Abstract
The field of robotics has advanced rapidly over the past few decades. Robots are now used in many areas of science as well as in warfare, and research suggests that in the near future they will be able to serve in combat roles. Several countries and their armies have already deployed military robots. However, robots still have drawbacks, such as inefficiency and an inability to operate under abnormal conditions; advances in artificial intelligence may resolve these issues in the future. The main focus of this paper is to provide a low-cost, long-range, and efficient mechanical and software design for an Eye Robot. Combining robotics and image processing with artificial-intelligence-based path navigation techniques, the robot (including its robotic arm and camera) is controlled manually through a 2.4 GHz RF module. In autonomous mode, the robot navigates along a path assigned to it: the path is drawn in a VB-based application and then transferred to the robot wirelessly or through a serial port. Wi-Fi-based video streaming with Optical Character Recognition (OCR) can also be viewed on remote devices such as laptops.
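As an illustration of how a path drawn in the desktop application could be serialized and transferred to the robot over the serial port, the following Python sketch assumes the pyserial library and a simple newline-delimited "x,y" waypoint format; the waypoint format, port name, and baud rate are assumptions made for illustration and are not specified in the paper.

    # Illustrative sketch: send a hand-drawn path to the robot as
    # comma-separated "x,y" waypoints over a serial link.
    # Waypoint format, port name, and baud rate are assumed, not from the paper.
    import serial  # pyserial

    def send_path(waypoints, port="/dev/ttyUSB0", baud=9600):
        """Transmit (x, y) waypoints one per line, then an END marker."""
        with serial.Serial(port, baud, timeout=1) as link:
            for x, y in waypoints:
                link.write(f"{x},{y}\n".encode("ascii"))
            link.write(b"END\n")

    if __name__ == "__main__":
        # Example path sampled from a drawing canvas (hypothetical coordinates).
        path = [(0, 0), (10, 0), (10, 10), (20, 10)]
        send_path(path)

On the robot side, the firmware would parse each received line into a waypoint until the END marker arrives; any acknowledgement or framing scheme would depend on the actual microcontroller implementation.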