MiRo-Training
The aim of the project is to develop a software architecture for interacting with MiRo using vocal and gestural commands.
The node gbb_miro.py subscribes to the linear and angular velocities mapped by the imu_data_map node and publishes a platform_control message. A platform_control message contains linear and angular velocities, the lighting pattern of MiRo's body, and other types of commands.
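As a purely illustrative, hypothetical sketch of how such a message might be filled in (the field names body_vel and lights_raw are assumed from the MiRo MDK's miro_msgs/platform_control definition and may not match the real message exactly):

    # Hypothetical sketch: filling a platform_control command for MiRo.
    from miro_msgs.msg import platform_control

    q = platform_control()
    q.body_vel.linear.x = 0.1       # forward speed, assumed to be in m/s
    q.body_vel.angular.z = -0.5     # negative angular.z: clockwise, i.e. turn right
    q.lights_raw = [0, 255, 0] * 6  # assumed 6 body LEDs x RGB: all green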
Classes
    class gbb_miro.GestureBased
        The class GestureBased implements the gesture-based behavior.

Variables
    tuple gbb_miro.gesture_based = GestureBased()
The node gbb_miro.py subscribes to the linear and angular velocities mapped by the imu_data_map node and publishes a platform_control message. A platform_control message contains linear and angular velocities, the lighting pattern of MiRo's body, and other types of commands.
In more detail, the node:
Subscribes to the topic /imu_mapping
Reads from that topic the smartwatch values already mapped onto MiRo's body velocity
Publishes on /gbb a platform_control message containing MiRo's body velocity and a lighting pattern driven by the smartwatch commands (see the sketch below)
For example: if the smartwatch command is a rotation to the right, MiRo turns right and the right side of its body lights up
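A minimal sketch of this subscribe/publish flow is given below. It assumes /imu_mapping carries a geometry_msgs/Twist and reuses the assumed miro_msgs/platform_control fields body_vel and lights_raw; the LED layout (6 LEDs x RGB, left side first) and the sign convention for turning are also assumptions, not taken from the actual gbb_miro.py source.

    #!/usr/bin/env python
    # Hypothetical sketch of the gesture-based behavior node, not the actual gbb_miro.py.
    import rospy
    from geometry_msgs.msg import Twist          # assumed message type on /imu_mapping
    from miro_msgs.msg import platform_control   # MiRo MDK command message

    class GestureBased:
        """Relays mapped smartwatch velocities to MiRo as platform_control commands."""

        def __init__(self):
            self.pub = rospy.Publisher('/gbb', platform_control, queue_size=1)
            rospy.Subscriber('/imu_mapping', Twist, self.callback)

        def callback(self, msg):
            q = platform_control()
            q.body_vel = msg                      # mapped linear/angular body velocity
            lights = [0] * 18                     # assumed: 6 LEDs x RGB, left side then right side
            if msg.angular.z < 0:                 # negative angular.z: turning right
                lights[9:18] = [0, 255, 0] * 3    # light the right side green
            elif msg.angular.z > 0:               # positive angular.z: turning left
                lights[0:9] = [0, 255, 0] * 3     # light the left side green
            q.lights_raw = lights
            self.pub.publish(q)

    if __name__ == '__main__':
        rospy.init_node('gbb_miro')
        gesture_based = GestureBased()
        rospy.spin()

Sending the velocity and the lighting pattern in the same platform_control message keeps the visual feedback in sync with the motion command.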
Definition in file gbb_miro.py.