MiRo-Training
The aim of the project is to develop a software architecture for interacting with MiRo using vocal and gestural commands.
File List
Here is a list of all documented files with brief descriptions:
- actions
  - scripts
    - bad.py: implements the action corresponding to the command "Bad".
    - gbb_miro.py: subscribes to the linear and angular velocities mapped by the imu_data_map node and publishes a platform_control message. A platform_control message contains the linear and angular velocities, the lighting pattern of MiRo's body, and other types of messages.
    - good.py: implements the action corresponding to the command "Good".
    - imu_data_map.py: subscribes to the smartwatch's accelerometer data and publishes suitably mapped linear and angular velocities.
    - kill.py: implements the action corresponding to the command "Kill".
    - play.py: implements the action corresponding to the command "Play".
    - sleep.py: implements the action corresponding to the command "Sleep". MiRo closes its eyes, lights up in aquamarine, lowers its tail, and tilts its head downward.
- command_handler
  - scripts
    - command_recognition.py: recognizes a vocal command received as a String and publishes the associated robot action. The vocal command "Miro" brings the robot to a default configuration and enables the processing of further commands.
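The gating behaviour described for command_recognition.py, where the wake word "Miro" resets the robot and enables subsequent commands, can be sketched as follows. This is a minimal illustration of the logic only: the class and constant names (`CommandGate`, `KNOWN_COMMANDS`, `WAKE_WORD`) are hypothetical, and the real node wraps equivalent logic in a rospy subscriber/publisher pair rather than plain function calls.

```python
# Hypothetical sketch of the command-gating logic in command_recognition.py.
# The actual node subscribes to a std_msgs/String topic and publishes the
# chosen action; here the ROS plumbing is omitted so the logic stands alone.

KNOWN_COMMANDS = {"Good", "Bad", "Play", "Sleep", "Kill"}  # mapped to action nodes
WAKE_WORD = "Miro"

class CommandGate:
    def __init__(self):
        # Commands are ignored until the wake word has been heard.
        self.enabled = False

    def handle(self, word):
        """Return the action to trigger for a vocal command, or None."""
        if word == WAKE_WORD:
            # "Miro" brings the robot to a default configuration and
            # enables the processing of further commands.
            self.enabled = True
            return "default"
        if self.enabled and word in KNOWN_COMMANDS:
            # e.g. "Good" would trigger the action implemented in good.py.
            return word.lower()
        return None  # unknown word, or gate still closed
```

For example, a command received before the wake word is dropped, while the same command after "Miro" yields the corresponding action name.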