# Choregraphe
Choregraphe uses a simple structure in which you connect different blocks together to chain actions sequentially or run them in parallel. However, don't be fooled by these simple-looking blocks: they are only there to give a better overview of long programs. If you double-click a block, it reveals its Python source code, so in reality each block is a class that defines a certain action. You can even create your own blocks if you want to.

Each block has certain inputs and outputs. An output is triggered when some criterion is met inside the block, and any block connected to the activated output is then started. The most common output is `onStopped`, which triggers when the block has completed. Blocks can also pass data through their outputs, for example recognized text or the number of recognized faces. Some blocks can stay active indefinitely and trigger certain output ports regularly. We will see some examples soon. You can also double-click any input/output during runtime to trigger it manually!
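To make the input/output wiring concrete, here is a minimal self-contained sketch. Inside Choregraphe a box inherits from the `GeneratedClass` base class and fires its output ports by calling methods like `self.onStopped()`; the `Box` base class and the `WordRecognizer` box below are hypothetical stand-ins that only mimic that wiring so the example runs outside Choregraphe.

```python
# Hypothetical stand-in for a Choregraphe box: in Choregraphe, outputs are
# methods on the box and connections are drawn in the editor; here we model
# a connection as a callback registered on an output name.
class Box:
    def __init__(self):
        self._connections = {}  # output name -> list of callbacks

    def connect(self, output, callback):
        self._connections.setdefault(output, []).append(callback)

    def fire(self, output, value=None):
        # Firing an output starts every block connected to it.
        for callback in self._connections.get(output, []):
            callback(value)


class WordRecognizer(Box):
    """Sketch of a box that fires a data-carrying output, then `onStopped`."""

    def onInput_onStart(self, word):
        # A real recognition box would listen to the microphone; this sketch
        # just passes the input straight through to illustrate the ports.
        self.fire("wordRecognized", word)  # output carrying data
        self.fire("onStopped")             # the usual "I'm done" output


# Wire the outputs to downstream actions, then trigger the input port.
box = WordRecognizer()
heard = []
box.connect("wordRecognized", lambda w: heard.append(w))
box.connect("onStopped", lambda _: heard.append("done"))
box.onInput_onStart("yes")
print(heard)  # -> ['yes', 'done']
```

Double-clicking an input/output in Choregraphe at runtime corresponds to calling `onInput_onStart` or `fire` by hand here.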

Each program you create is actually a "new" block of its own, with input and output ports. You can therefore import an entire program as a block into another program. You can run your program directly, but you can also make it trigger on certain criteria by connecting blocks only to special input ports. When any block connected to your program's output triggers it, the whole program stops, even if some blocks have not yet finished running. Like individual blocks, your program can also output data or raise different events if you like. In short, the possibilities are endless!

### Virtual robot
It would be nice to verify certain behavior, such as any kind of movement, before running it on the real robot. Choregraphe offers a so-called virtual robot exactly for this purpose. To initialize and connect to your virtual robot, follow these steps:

* `Edit > Preferences > Virtual robot` - Change the robot to Pepper.
* `Connect > Connect to virtual robot`

This robot has limited capabilities compared to the real one: visual, auditory, and tactile functions do not work. The virtual robot is really only meant for checking that you are not producing any dangerous movements. We will see a third-party simulator later on, which allows you to simulate other behavior as well. When you connect to the real robot, the virtual robot mirrors the real one, and you can, for example, toggle the camera views. You can also configure detected faces to be shown in the 3D world.

### Examples
This folder contains three examples. Note that only the `motion` program can run on the virtual robot.

> This program uses facial recognition to estimate a person's age from their face. Once the age has been estimated, the robot says it out loud.

> Shows different types of motion that can be created. Note the special animation blocks that appear in this program, which are essentially timelines of joint values. Choregraphe can record movements that you create by manually moving the robot's limbs, which you can then save as a block.

> Pepper will ask you if he/she should dance. You can respond with 'yes' or 'no'.

### Editing blocks
After you have seen some NAOqi examples, you can start editing and coding your own blocks in Choregraphe. To make your own block, insert a template block from `Box library > Programming > Templates > Python Script`. You can add more inputs and outputs by pressing the `+` button in the inspector on the right. You can also create a diagram to build a custom block out of a combination of existing blocks. The structure is intuitive, but if you are struggling, remember to have a look at the source code of existing blocks!
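The Python Script template generates a class skeleton along the following lines. Inside Choregraphe the `GeneratedClass` base class is provided at runtime and one output method (such as `onStopped`) exists per port you add in the inspector; since that base class is not available outside Choregraphe, a small stub stands in for it in this sketch.

```python
# Stub for Choregraphe's GeneratedClass so the sketch runs standalone.
# In Choregraphe, the real base class generates one method per output
# port automatically; here we hand-write onStopped for illustration.
class GeneratedClass(object):
    def __init__(self):
        self.stopped_with = None

    def onStopped(self, value=None):  # output port, may carry data
        self.stopped_with = value


class MyClass(GeneratedClass):
    """Rough shape of the `Python Script` template box."""

    def __init__(self):
        GeneratedClass.__init__(self)

    def onLoad(self):
        # Called when the behavior loads; initialize resources here.
        self.count = 0

    def onUnload(self):
        # Called when the box is unloaded; release resources here.
        pass

    def onInput_onStart(self, p=None):
        # One handler per input port, named onInput_<portName>.
        self.count += 1
        self.onStopped(self.count)  # fire the output, optionally with data

    def onInput_onStop(self):
        self.onUnload()
        self.onStopped()


box = MyClass()
box.onLoad()
box.onInput_onStart()
print(box.stopped_with)  # -> 1
```

Each input port you add with the `+` button gets a matching `onInput_<name>` handler, and each output port gets a callable method, which is exactly the wiring you see when you open the source code of existing blocks.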

### More information