Browsing by Author "Quraishi, Imran Akram (11ET41)"

Now showing 1 - 1 of 1
  • Item
    Gesture-based wireless single-armed robot in Cartesian 3D space using Kinect
    (AIKTC, 2015-05) Alvi, Rizwan; Memon, Md. Farhan Md. Fareed (11ET33); Quraishi, Imran Akram (11ET41); Shaikh, Bilal Anees (12ET86)
    Human-Machine Interaction (HMI) has always played an important role in everyday life, motivating research in the area of intelligent service robots. Conventional methods such as remote controllers or wearables cannot cater to the high demands of some scenarios, so the challenge is to develop vision-based gesture recognition techniques. This project describes our work on an Arduino-based wheeled, one-armed prototype robot controlled through various arm and leg gestures. For gesture recognition, we make use of the skeletal tracking ability of Kinect, a Microsoft product. Bluetooth is used to make the controls wireless. Since the operation is not line-of-sight, the robot also captures video of its surroundings and transmits it over radio frequency in real time for display on the operator's screen. Guided by this video feed, the operator steers the robot and uses its arm to pick and place objects through predetermined gestures.
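
The pipeline the abstract describes (skeletal joints in, drive commands out over Bluetooth) can be illustrated with a short sketch. The following Python example is hypothetical and not the project's actual code: the joint names, the head-height threshold, and the single-character command protocol are all assumptions, and pyserial stands in for whichever Bluetooth serial link the robot exposes.

# Minimal sketch (assumed, not the authors' implementation): maps
# Kinect-style skeletal joint positions to single-character drive
# commands and sends them over a Bluetooth serial port.
# Requires: pip install pyserial (only for the actual serial send).

def classify_gesture(joints):
    """Return a command from joint positions given as (x, y, z) tuples.

    y grows upward and z is the distance from the sensor, mirroring
    the Kinect skeleton coordinate convention.
    """
    head_y = joints["head"][1]
    right_y = joints["right_hand"][1]
    left_y = joints["left_hand"][1]

    if right_y > head_y and left_y > head_y:
        return "S"  # both hands raised: stop
    if right_y > head_y:
        return "R"  # right hand raised: turn right
    if left_y > head_y:
        return "L"  # left hand raised: turn left
    return "F"      # neutral pose: move forward

def send_command(cmd, port="/dev/rfcomm0", baud=9600):
    """Write one command byte to the robot over a Bluetooth serial link."""
    import serial  # pyserial
    with serial.Serial(port, baud, timeout=1) as link:
        link.write(cmd.encode("ascii"))

if __name__ == "__main__":
    # Hard-coded sample skeleton: right hand raised above the head.
    sample = {
        "head": (0.0, 1.6, 2.0),
        "right_hand": (0.3, 1.9, 2.0),
        "left_hand": (-0.3, 1.0, 2.0),
    }
    print(classify_gesture(sample))             # prints "R"
    # send_command(classify_gesture(sample))    # uncomment with a paired robot

In a real setup the joint dictionary would be refreshed each frame from a Kinect SDK wrapper, and the command would be resent only when the classified gesture changes, to avoid flooding the Bluetooth link.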
