# MiRo-Training
The aim of this project is to develop a software architecture for interacting with MiRo using vocal and gestural commands.
| Node | Description |
| --- | --- |
| `bad.py` | Implements the action corresponding to the command "Bad". |
| `gbb_miro.py` | Subscribes to the linear and angular velocities mapped by the `imu_data_map` node and publishes a `platform_control` message. A `platform_control` message contains linear and angular velocities, the lighting pattern of MiRo's body, and other types of data. |
| `good.py` | Implements the action corresponding to the command "Good". |
| `imu_data_map.py` | Subscribes to the smartwatch's accelerometer data and publishes the linear and angular velocities, suitably mapped. |
| `kill.py` | Implements the action corresponding to the command "Kill". |
| `play.py` | Implements the action corresponding to the command "Play". |
| `sleep.py` | Implements the action corresponding to the command "Sleep": MiRo closes its eyes, lights up in aquamarine, lowers its tail, and inclines and lowers its head. |
| `command_recognition.py` | Recognizes vocal commands converted to `String` messages and publishes the associated robot action. The vocal command "Miro" brings the robot into a default configuration and enables the processing of further commands. |
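To illustrate the kind of mapping `imu_data_map.py` performs, here is a minimal sketch of converting accelerometer readings into clamped linear and angular velocities. The function name, axis conventions, scale factors, and velocity limits are illustrative assumptions, not the project's actual parameters.

```python
# Hypothetical sketch of the accelerometer-to-velocity mapping done by
# imu_data_map.py. Axis conventions, gains, and limits are assumptions.

def map_accel_to_velocities(ax, ay,
                            max_linear=0.4, max_angular=1.0,
                            accel_range=9.81):
    """Map smartwatch accelerometer readings (m/s^2) to MiRo velocities.

    Tilting the watch forward/backward (ax) drives the linear velocity;
    tilting it left/right (ay) drives the angular velocity. Both outputs
    are clamped to the platform's limits.
    """
    def clamp(value, limit):
        return max(-limit, min(limit, value))

    linear = clamp(ax / accel_range * max_linear, max_linear)
    angular = clamp(ay / accel_range * max_angular, max_angular)
    return linear, angular
```

In the real node these values would be filled into the velocity fields of a `platform_control` message and published for `gbb_miro.py` to consume.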
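The gating behaviour of `command_recognition.py` (the wake word "Miro" resets the robot and enables the other commands) can be sketched as a small state machine. The command names come from the node list above; the class and method names are assumptions for illustration.

```python
# Hypothetical sketch of the gating logic in command_recognition.py:
# "Miro" brings the robot to its default configuration and enables the
# other commands; words received before that are ignored.

class CommandGate:
    COMMANDS = {"Good", "Bad", "Play", "Sleep", "Kill"}

    def __init__(self):
        self.enabled = False

    def handle(self, word):
        """Return the action to publish for a recognized word, or None."""
        if word == "Miro":
            self.enabled = True      # wake word: go to default configuration
            return "default"
        if self.enabled and word in self.COMMANDS:
            return word.lower()      # e.g. trigger the matching action node
        return None                  # unknown word, or gate not yet enabled
```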