Commit 154e786a authored by Boksebeld, N.H.J.

Added old Choregraphe examples and the introductory README for Choregraphe

parent 04880af7
# Choregraphe
Choregraphe uses a simple structure in which you connect blocks together to run actions sequentially or in parallel. However, don’t be fooled by these simple-looking blocks. They are only there to give a better overview of long programs/actions. If you double-click on a block, it reveals its source code in Python. So in reality, each block is actually a class that defines a certain action. You can even create your own blocks if you want to.
Each block has certain inputs and outputs. An output is triggered when some criterion is met inside the block, and any block connected to the activated output is then started. The most common output is `onStopped`, which triggers once the block has completed. Blocks can also pass data through these outputs, for example recognized text or the number of recognized faces. Some blocks can stay active indefinitely and will trigger certain output ports regularly. We will see some examples soon. You can also double-click on any input/output during runtime to trigger it manually!
Each program you create is actually a “new” block of its own, with input and output ports. Therefore, you can import an entire program as a block into another program. You can run your program directly, but you can also have it trigger only on certain criteria by connecting blocks exclusively to special input ports. When a block connected to your program’s output is triggered, the whole program stops, even if some blocks have not finished running yet. Just like individual blocks, your program can also output data or emit different events if you like. In short, the possibilities are endless!
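For illustration, here is a minimal sketch of the kind of Python class you will find when you double-click a box. It is not taken from one of the boxes in this repository, but it only uses names that Choregraphe's box environment provides (`GeneratedClass`, `ALProxy`, the `onInput_*` methods and the `onStopped` output):

```python
class MyClass(GeneratedClass):
    def __init__(self):
        GeneratedClass.__init__(self)

    def onLoad(self):
        # Runs once when the behavior containing the box is loaded.
        self.tts = ALProxy("ALTextToSpeech")

    def onUnload(self):
        pass

    def onInput_onStart(self):
        # Called when a signal arrives on the onStart input port.
        self.tts.say("Hello from a custom box")
        self.onStopped()  # fire the onStopped output so connected blocks can start

    def onInput_onStop(self):
        self.onUnload()
        self.onStopped()
```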
### Virtual robot
It is useful to verify certain behavior, especially any kind of movement, before running it on the real robot. Choregraphe offers a so-called virtual robot exactly for this purpose. To initialize and connect to your virtual robot, follow these steps:
* `Edit > Preferences > Virtual robot` - Change the robot to Pepper.
* `Connect > Connect to virtual robot`
This robot has limited capabilities compared to the real robot: visual, auditory and tactile functions do not work. The virtual robot is really only meant for checking that you are not producing any dangerous movements. We will see a third-party simulator later on, which allows you to simulate other behavior as well. When you connect to the real robot, this robot view mirrors the real robot, and you can, for example, toggle the camera views. You can also configure detected faces to be shown in the 3D world.
### Examples
This folder contains three examples. Note that only the `motion` program can run on the virtual robot.
`get_age`
> This program uses facial recognition to estimate someone’s age from their face. Once the age has been estimated, the robot says it out loud.
`motion`
> Shows different types of motion that can be created. Note the special animation blocks that appear in this program, which are basically timelines of joint values. Choregraphe offers the functionality to record movement that you create by manually moving the limbs, which you can then save as a block. A rough sketch of what such a timeline amounts to in code follows after these example descriptions.
`speech_dance`
> Pepper will ask you if he/she should dance. You can respond with 'yes' or 'no'.
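The animation blocks in `motion` are essentially recorded joint-angle keyframes played back over time. Conceptually (this is a hand-written sketch with made-up values, not the code Choregraphe generates), such a timeline boils down to an `ALMotion` call like this:

```python
# Sketch: play two joint trajectories in parallel via ALMotion.
# Angles are in radians, times in seconds; all values below are invented.
motion = ALProxy("ALMotion")
names = ["HeadYaw", "HeadPitch"]
angleLists = [[0.5, -0.5, 0.0],   # HeadYaw keyframes
              [-0.2, 0.1, 0.0]]   # HeadPitch keyframes
timeLists = [[1.0, 2.0, 3.0],     # when each HeadYaw keyframe is reached
             [1.0, 2.0, 3.0]]     # when each HeadPitch keyframe is reached
motion.angleInterpolation(names, angleLists, timeLists, True)  # True = absolute angles
```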
### Editing blocks
After you have seen some NAOqi examples, you can start editing and coding your own blocks in Choregraphe. To make your own block, insert a template block from `Box library > Programming > Templates > Python Script`. You can add more inputs and outputs by pressing the `+` button in the inspector on the right. You can also create a diagram to build a custom block out of a combination of existing blocks. The structure is intuitive, but if you are struggling, remember to have a look at the source code of existing blocks!
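As a sketch of how extra ports show up in the code (the port names `number` and `result` below are made up for this example; the `onInput_<name>` and `self.<output>()` patterns are the same ones used by the boxes in this repository):

```python
class MyClass(GeneratedClass):
    def __init__(self):
        GeneratedClass.__init__(self)

    def onLoad(self):
        pass

    def onUnload(self):
        pass

    # Choregraphe generates one onInput_<name> method per input port.
    # "number" is a hypothetical extra input added via the inspector.
    def onInput_number(self, p):
        doubled = 2 * p
        # "result" is a hypothetical extra output port; calling it sends the
        # value to whatever block is connected to that output.
        self.result(doubled)
        self.onStopped()

    def onInput_onStop(self):
        self.onUnload()
        self.onStopped()
```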
### More information
http://doc.aldebaran.com/2-5/software/choregraphe/index.html
<?xml version="1.0" encoding="UTF-8" ?><ChoregrapheProject xmlns="http://www.aldebaran-robotics.com/schema/choregraphe/project.xsd" xar_version="3"><Box name="root" id="-1" localization="8" tooltip="Root box of Choregraphe&apos;s behavior. Highest level possible." x="0" y="0"><bitmap>media/images/box/root.png</bitmap><script language="4"><content><![CDATA[]]></content></script><Input name="onLoad" type="1" type_size="1" nature="0" inner="1" tooltip="Signal sent when diagram is loaded." id="1" /><Input name="onStart" type="1" type_size="1" nature="2" inner="0" tooltip="Box behavior starts when a signal is received on this input." id="2" /><Input name="onStop" type="1" type_size="1" nature="3" inner="0" tooltip="Box behavior stops when a signal is received on this input." id="3" /><Output name="onStopped" type="1" type_size="1" nature="1" inner="0" tooltip="Signal sent when box behavior is finished." id="4" /><Timeline enable="0"><BehaviorLayer name="behavior_layer1"><BehaviorKeyframe name="keyframe1" index="1"><Diagram><Box name="WakeUp" id="2" localization="0" tooltip="Call a Wake Up process.&#x0A;Stiff all joints and apply stand Init posture if the robot is Stand" x="242" y="106"><bitmap>media/images/box/movement/stiffness.png</bitmap><script language="4"><content><![CDATA[class MyClass(GeneratedClass):
    def __init__(self):
        GeneratedClass.__init__(self)
        pass
    def onLoad(self):
        self.motion = ALProxy("ALMotion")
        pass
    def onUnload(self):
        pass
    def onInput_onStart(self):
        self.motion.wakeUp()
        self.onStopped() #~ activate output of the box
        pass
    def onInput_onStop(self):
        self.onUnload() #~ it is recommended to call onUnload of this box in a onStop method, as the code written in onUnload is used to stop the box as well
pass]]></content></script><Input name="onLoad" type="1" type_size="1" nature="0" inner="1" tooltip="Signal sent when diagram is loaded." id="1" /><Input name="onStart" type="1" type_size="1" nature="2" inner="0" tooltip="Box behavior starts when a signal is received on this input." id="2" /><Input name="onStop" type="1" type_size="1" nature="3" inner="0" tooltip="Box behavior stops when a signal is received on this input." id="3" /><Output name="onStopped" type="1" type_size="1" nature="1" inner="0" tooltip="Signal sent when box behavior is finished." id="4" /><Resource name="All motors" type="Lock" timeout="0" /><Resource name="Stiffness" type="Lock" timeout="0" /></Box><Box name="Basic Awareness" id="4" localization="8" tooltip="This box is an interface to the module ALBasicAwareness.&#x0A;&#x0A;It is a simple way to make the robot establish and keep eye contact with people.&#x0A;&#x0A;V1.1.0" x="408" y="111"><bitmap>media/images/box/tracker/basicawareness.png</bitmap><script language="4"><content><![CDATA[class MyClass(GeneratedClass):
    def __init__(self):
        GeneratedClass.__init__(self)
    def onLoad(self):
        #put initialization code here
        try:
            self.awareness = ALProxy('ALBasicAwareness')
        except Exception as e:
            self.awareness = None
            self.logger.error(e)
        self.memory = ALProxy('ALMemory')
        self.isRunning = False
        self.trackedHuman = -1
        import threading
        self.subscribingLock = threading.Lock()
        self.BIND_PYTHON(self.getName(), "setParameter")
    def onUnload(self):
        if self.isRunning:
            if self.awareness:
                self.awareness.stopAwareness()
            self.setALMemorySubscription(False)
            self.isRunning = False
    def onInput_onStart(self):
        if self.isRunning:
            return # already running, nothing to do
        self.isRunning = True
        self.trackedHuman = -1
        if self.awareness:
            self.awareness.setEngagementMode(self.getParameter('Engagement Mode'))
            self.awareness.setTrackingMode(self.getParameter('Tracking Mode'))
            self.awareness.setStimulusDetectionEnabled('Sound', self.getParameter('Sound Stimulus'))
            self.awareness.setStimulusDetectionEnabled('Movement', self.getParameter('Movement Stimulus'))
            self.awareness.setStimulusDetectionEnabled('People', self.getParameter('People Stimulus'))
            self.awareness.setStimulusDetectionEnabled('Touch', self.getParameter('Touch Stimulus'))
            self.setALMemorySubscription(True)
            self.awareness.startAwareness()
    def onInput_onStop(self):
        if not self.isRunning:
            return # already stopped, nothing to do
        self.onUnload()
        self.onStopped()
    def setParameter(self, parameterName, newValue):
        GeneratedClass.setParameter(self, parameterName, newValue)
        if self.awareness:
            if parameterName == 'Sound Stimulus':
                self.awareness.setStimulusDetectionEnabled('Sound', newValue)
            elif parameterName == 'Movement Stimulus':
                self.awareness.setStimulusDetectionEnabled('Movement', newValue)
            elif parameterName == 'People Stimulus':
                self.awareness.setStimulusDetectionEnabled('People', newValue)
            elif parameterName == 'Touch Stimulus':
                self.awareness.setStimulusDetectionEnabled('Touch', newValue)
    # callbacks for ALBasicAwareness events
    def onStimulusDetected(self, eventName, stimulusName, subscriberIdentifier):
        self.StimulusDetected(stimulusName)
    def onHumanTracked(self, eventName, humanID, subscriberIdentifier):
        self.trackedHuman = humanID
        self.HumanTracked(humanID)
    def onHumanLost(self, eventName, subscriberIdentifier):
        self.HumanLost(self.trackedHuman)
        self.trackedHuman = -1
    def setALMemorySubscription(self, subscribe):
        self.subscribingLock.acquire()
        if subscribe:
            self.memory.subscribeToEvent('ALBasicAwareness/StimulusDetected', self.getName(), 'onStimulusDetected')
            self.memory.subscribeToEvent('ALBasicAwareness/HumanTracked', self.getName(), 'onHumanTracked')
            self.memory.subscribeToEvent('ALBasicAwareness/HumanLost', self.getName(), 'onHumanLost')
        else:
            self.memory.unsubscribeToEvent('ALBasicAwareness/StimulusDetected', self.getName())
            self.memory.unsubscribeToEvent('ALBasicAwareness/HumanTracked', self.getName())
            self.memory.unsubscribeToEvent('ALBasicAwareness/HumanLost', self.getName())
self.subscribingLock.release()]]></content></script><Input name="onLoad" type="1" type_size="1" nature="0" inner="1" tooltip="Signal sent when diagram is loaded." id="1" /><Input name="onStart" type="1" type_size="1" nature="2" inner="0" tooltip="Starts the Basic Awareness with the given Engagement and Tracking mode parameters, using the given stimuli." id="2" /><Input name="onStop" type="1" type_size="1" nature="3" inner="0" tooltip="Stops the Basic Awareness." id="3" /><Output name="onStopped" type="1" type_size="1" nature="1" inner="0" tooltip="Signal sent when box behavior is finished." id="4" /><Output name="StimulusDetected" type="3" type_size="1" nature="2" inner="0" tooltip="This output is stimulated when BasicAwareness detects a stimulus amongst the tracked stimulus.&#x0A;&#x0A;The output data is the stimulus&apos; name." id="5" /><Output name="HumanTracked" type="2" type_size="1" nature="2" inner="0" tooltip="This output is triggered when ALBasicAwareness detects a stimulus that is confirmed to be a human.&#x0A;&#x0A;The output data is the ID corresponding to the tracked human. It is shared with PeoplePerception and can be used there. This output is triggered with -1 if ALBasicAwareness tried to detect a human but failed." id="6" /><Output name="HumanLost" type="2" type_size="1" nature="2" inner="0" tooltip="This output is triggered when the human currently tracked is lost.&#x0A;&#x0A; The output data is the ID corresponding to the lost human. It can be reused in PeoplePerception." id="7" /><Parameter name="Engagement Mode" inherits_from_parent="0" content_type="3" value="FullyEngaged" default_value="Unengaged" custom_choice="0" tooltip='The engagement mode specifies how &quot;focused&quot; the robot is on the engaged person.' id="8"><Choice value="Unengaged" /><Choice value="FullyEngaged" /><Choice value="SemiEngaged" /></Parameter><Parameter name="Tracking Mode" inherits_from_parent="0" content_type="3" value="Head" default_value="Head" custom_choice="0" tooltip="The tracking mode describes how the robot keeps eye contact with an engaged person." id="9"><Choice value="Head" /><Choice value="BodyRotation" /><Choice value="WholeBody" /></Parameter><Parameter name="Sound Stimulus" inherits_from_parent="0" content_type="0" value="0" default_value="1" tooltip="" id="10" /><Parameter name="Movement Stimulus" inherits_from_parent="0" content_type="0" value="0" default_value="1" tooltip="" id="11" /><Parameter name="People Stimulus" inherits_from_parent="0" content_type="0" value="1" default_value="1" tooltip="" id="12" /><Parameter name="Touch Stimulus" inherits_from_parent="0" content_type="0" value="0" default_value="1" tooltip="" id="13" /></Box><Box name="Get Age" id="1" localization="8" tooltip="This box returns the age of the person in front of the robot.&#x0A;The detection fails when there are more or less than one person in front of the robot or when the timeout is exceeded.&#x0A;&#x0A;It is possible to set up the Confidence Threshold and the Timeout parameters for this box. " x="592" y="110"><bitmap>media/images/box/interaction/age.png</bitmap><script language="4"><content><![CDATA[class MyClass(GeneratedClass):
    def __init__(self):
        GeneratedClass.__init__(self)
    def onLoad(self):
        try:
            self.faceC = ALProxy("ALFaceCharacteristics")
        except Exception as e:
            raise RuntimeError(str(e) + "Make sure you're not connected to a virtual robot." )
        self.confidence = self.getParameter("Confidence Threshold")
        self.age = 0
        self.counter = 0
        self.bIsRunning = False
        self.delayed = []
        self.errorMes = ""
    def onUnload(self):
        self.counter = 0
        self.age = 0
        self.bIsRunning = False
        self.cancelDelays()
    def onInput_onStart(self):
        try:
            #start timer
            import qi
            import functools
            import time  # needed for the time.sleep call below
            delay_future = qi.async(self.onTimeout, delay=int(self.getParameter("Timeout (s)") * 1000 * 1000))
            self.delayed.append(delay_future)
            bound_clean = functools.partial(self.cleanDelay, delay_future)
            delay_future.addCallback(bound_clean)
            self.bIsRunning = True
            while self.bIsRunning:
                if self.counter < 4:
                    try:
                        #identify user
                        ids = ALMemory.getData("PeoplePerception/PeopleList")
                        if len(ids) == 0:
                            self.errorMes = "No face detected"
                            self.onUnload()
                        elif len(ids) > 1:
                            self.errorMes = "Multiple faces detected"
                            self.onUnload()
                        else:
                            #analyze age properties
                            self.faceC.analyzeFaceCharacteristics(ids[0])
                            time.sleep(0.1)
                            value = ALMemory.getData("PeoplePerception/Person/"+str(ids[0])+"/AgeProperties")
                            if value[1] > self.confidence:
                                self.age += value[0]
                                self.counter += 1
                    except:
                        ids = []
                else:
                    #calculate mean value
                    self.age /= 4
                    self.onStopped(int(self.age))
                    self.onUnload()
                    return
            raise RuntimeError(self.errorMes)
        except Exception as e:
            raise RuntimeError(str(e))
        self.onUnload()
    def onTimeout(self):
        self.errorMes = "Timeout"
        self.onUnload()
    def cleanDelay(self, fut, fut_ref):
        self.delayed.remove(fut)
    def cancelDelays(self):
        cancel_list = list(self.delayed)
        for d in cancel_list:
            d.cancel()
    def onInput_onStop(self):
self.onUnload()]]></content></script><Input name="onLoad" type="1" type_size="1" nature="0" inner="1" tooltip="Signal sent when diagram is loaded." id="1" /><Input name="onStart" type="1" type_size="1" nature="2" inner="0" tooltip="Box behavior starts when a signal is received on this input." id="2" /><Input name="onStop" type="1" type_size="1" nature="3" inner="0" tooltip="Box behavior stops when a signal is received on this input." id="3" /><Output name="onStopped" type="2" type_size="1" nature="1" inner="0" tooltip="Returns a number between 0 and 75 indicating the age of the person in front of the robot.&#x0A;&#x0A;Tip:&#x0A;Connect this output to If box to compare the age with a defined value" id="4" /><Output name="onError" type="3" type_size="1" nature="1" inner="0" tooltip='Triggered when age detection failed. &#x0A;Possible error messages:&#x0A;- &quot;No face detected&quot;&#x0A;- &quot;Multiple faces detected&quot;&#x0A;- &quot;Timeout&quot;' id="5" /><Parameter name="Confidence Threshold" inherits_from_parent="0" content_type="2" value="0.35" default_value="0.6" min="0" max="1" tooltip="Set the confidence threshold for the age detection." id="6" /><Parameter name="Timeout (s)" inherits_from_parent="0" content_type="2" value="10" default_value="5" min="1" max="60" tooltip="" id="7" /></Box><Box name="Say Text" id="3" localization="8" tooltip="Say the text received on its input." x="787" y="110"><bitmap>media/images/box/interaction/say.png</bitmap><script language="4"><content><![CDATA[import time
class MyClass(GeneratedClass):
    def __init__(self):
        GeneratedClass.__init__(self, False)
        self.tts = ALProxy('ALTextToSpeech')
        self.ttsStop = ALProxy('ALTextToSpeech', True) #Create another proxy as wait is blocking if audioout is remote
    def onLoad(self):
        self.bIsRunning = False
        self.ids = []
    def onUnload(self):
        for id in self.ids:
            try:
                self.ttsStop.stop(id)
            except:
                pass
        while( self.bIsRunning ):
            time.sleep( 0.2 )
    def onInput_onStart(self, p):
        self.bIsRunning = True
        try:
            sentence = "\RSPD="+ str( self.getParameter("Speed (%)") ) + "\ "
            sentence += "\VCT="+ str( self.getParameter("Voice shaping (%)") ) + "\ "
            sentence += "You look like you are %s years old" % str(p)
            sentence += "\RST\ "
            id = self.tts.post.say(str(sentence))
            self.ids.append(id)
            self.tts.wait(id, 0)
        finally:
            try:
                self.ids.remove(id)
            except:
                pass
            if( self.ids == [] ):
                self.onStopped() # activate output of the box
            self.bIsRunning = False
    def onInput_onStop(self):
self.onUnload()]]></content></script><Input name="onLoad" type="1" type_size="1" nature="0" inner="1" tooltip="Signal sent when Diagram is loaded." id="1" /><Input name="onStart" type="3" type_size="1" nature="2" inner="0" tooltip="Box behavior starts when a signal is received on this Input." id="2" /><Input name="onStop" type="1" type_size="1" nature="3" inner="0" tooltip="Box behavior stops when a signal is received on this Input." id="3" /><Output name="onStopped" type="1" type_size="1" nature="1" inner="0" tooltip="Signal sent when Box behavior is finished." id="4" /><Parameter name="Voice shaping (%)" inherits_from_parent="1" content_type="1" value="100" default_value="100" min="50" max="150" tooltip='Used to modify at runtime the voice feature (tone, speed). In a slighty&#x0A;different way than pitch and speed, it gives a kind of &quot;gender or age&#x0A;modification&quot; effect.&#x0A;&#x0A;For instance, a quite good male derivation of female voice can be&#x0A;obtained setting this parameter to 78%.&#x0A;&#x0A;Note: For a better effect, you can compensate this parameter with the&#x0A;speed parameter. For example, if you want to decrease by 20% the voice&#x0A;shaping, you will have to increase by 20% the speed to keep a constant&#x0A;average speed.' id="5" /><Parameter name="Speed (%)" inherits_from_parent="1" content_type="1" value="100" default_value="100" min="50" max="200" tooltip="Changes the speed of the voice.&#x0A;&#x0A;Note: For a better effect, you can compensate this parameter with the voice&#x0A;shaping parameter. For example, if you want to increase by 20% the speed, you&#x0A;will have to decrease by 20% the voice shaping to keep a constant average&#x0A;speed." id="6" /><Resource name="Speech" type="Lock" timeout="0" /></Box><Link inputowner="2" indexofinput="2" outputowner="0" indexofoutput="2" /><Link inputowner="4" indexofinput="2" outputowner="2" indexofoutput="4" /><Link inputowner="1" indexofinput="2" outputowner="4" indexofoutput="6" /><Link inputowner="3" indexofinput="2" outputowner="1" indexofoutput="4" /><Link inputowner="0" indexofinput="4" outputowner="3" indexofoutput="4" /></Diagram></BehaviorKeyframe></BehaviorLayer></Timeline></Box></ChoregrapheProject>
<?xml version="1.0" encoding="UTF-8" ?>
<Package name="get_age" format_version="4">
<Manifest src="manifest.xml" />
<BehaviorDescriptions>
<BehaviorDescription name="behavior" src="behavior_1" xar="behavior.xar" />
</BehaviorDescriptions>
<Dialogs />
<Resources />
<Topics />
<IgnoredPaths />
<Translations auto-fill="en_US">
<Translation name="translation_en_US" src="translations/translation_en_US.ts" language="en_US" />
</Translations>
</Package>
<?xml version='1.0' encoding='UTF-8'?>
<package uuid="get_age-e0a234" version="0.0.0">
<names>
<name lang="en_US">Untitled</name>
</names>
<supportedLanguages>
<language>en_US</language>
</supportedLanguages>
<descriptionLanguages>
<language>en_US</language>
</descriptionLanguages>
<contents>
<behaviorContent path="behavior_1">
<userRequestable/>
<nature>interactive</nature>
<permissions/>
</behaviorContent>
</contents>
</package>
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE TS>
<TS version="2.1" language="en_US"/>
<?xml version="1.0" encoding="UTF-8" ?><ChoregrapheProject xmlns="http://www.aldebaran-robotics.com/schema/choregraphe/project.xsd" xar_version="3"><Box name="root" id="-1" localization="8" tooltip="Root box of Choregraphe&apos;s behavior. Highest level possible." x="0" y="0"><bitmap>media/images/box/root.png</bitmap><script language="4"><content><![CDATA[]]></content></script><Input name="onLoad" type="1" type_size="1" nature="0" inner="1" tooltip="Signal sent when diagram is loaded." id="1" /><Input name="onStart" type="1" type_size="1" nature="2" inner="0" tooltip="Box behavior starts when a signal is received on this input." id="2" /><Input name="onStop" type="1" type_size="1" nature="3" inner="0" tooltip="Box behavior stops when a signal is received on this input." id="3" /><Output name="onStopped" type="1" type_size="1" nature="1" inner="0" tooltip="Signal sent when box behavior is finished." id="4" /><Timeline enable="0"><BehaviorLayer name="behavior_layer1"><BehaviorKeyframe name="keyframe1" index="1"><Diagram><Box name="WakeUp" id="2" localization="0" tooltip="Call a Wake Up process.&#x0A;Stiff all joints and apply stand Init posture if the robot is Stand" x="146" y="251"><bitmap>media/images/box/movement/stiffness.png</bitmap><script language="4"><content><![CDATA[class MyClass(GeneratedClass):
    def __init__(self):
        GeneratedClass.__init__(self)
        pass
    def onLoad(self):
        self.motion = ALProxy("ALMotion")
        pass
    def onUnload(self):
        pass
    def onInput_onStart(self):
        self.motion.wakeUp()
        self.onStopped() #~ activate output of the box
        pass
    def onInput_onStop(self):
        self.onUnload() #~ it is recommended to call onUnload of this box in a onStop method, as the code written in onUnload is used to stop the box as well
pass]]></content></script><Input name="onLoad" type="1" type_size="1" nature="0" inner="1" tooltip="Signal sent when diagram is loaded." id="1" /><Input name="onStart" type="1" type_size="1" nature="2" inner="0" tooltip="Box behavior starts when a signal is received on this input." id="2" /><Input name="onStop" type="1" type_size="1" nature="3" inner="0" tooltip="Box behavior stops when a signal is received on this input." id="3" /><Output name="onStopped" type="1" type_size="1" nature="1" inner="0" tooltip="Signal sent when box behavior is finished." id="4" /><Resource name="All motors" type="Lock" timeout="0" /><Resource name="Stiffness" type="Lock" timeout="0" /></Box><Box name="Hands" id="1" localization="8" tooltip="the robot stiffens the motors of one or both of his hands so that he can open or close&#x0A;it/them. Then he relaxes the motors of his hand(s)." x="297" y="256"><bitmap>media/images/box/movement/move_arm.png</bitmap><script language="4"><content><![CDATA[class MyClass(GeneratedClass):
    def __init__(self):
        GeneratedClass.__init__(self, False)
    def onLoad(self):
        self.motion = ALProxy( "ALMotion" )
        self.bIsRunning = False
    def onUnload(self):
        self.bIsRunning = False
    def onInput_onStart(self):
        if( self.bIsRunning ):
            return
        self.bIsRunning = True
        try:
            hands = []
            if( self.getParameter("Side") in ["Left", "Both"] ):
                hands.append( "LHand" )
            if( self.getParameter("Side") in ["Right", "Both"] ):
                hands.append( "RHand" )
            ids = []
            for hand in hands:
                if( self.getParameter("Action") == "Open the hand" ):
                    ids.append( self.motion.post.openHand(hand) )
                else:
                    ids.append( self.motion.post.closeHand(hand) )
            for id in ids:
                self.motion.wait( id, 0 )
        finally:
            self.bIsRunning = False
self.onDone() # activate output of the box]]></content></script><Input name="onLoad" type="1" type_size="1" nature="0" inner="1" tooltip="Signal sent when diagram is loaded." id="1" /><Input name="onStart" type="1" type_size="1" nature="2" inner="0" tooltip="Box behavior starts when a signal is received on this input." id="2" /><Output name="onDone" type="1" type_size="1" nature="1" inner="0" tooltip="Signal sent when box behavior is finished." id="3" /><Parameter name="Side" inherits_from_parent="0" content_type="3" value="Both" default_value="Both" custom_choice="0" tooltip="Choose the hand to move or both of them." id="4"><Choice value="Both" /><Choice value="Left" /><Choice value="Right" /></Parameter><Parameter name="Action" inherits_from_parent="0" content_type="3" value="Open the hand" default_value="Open the hand" custom_choice="0" tooltip="Action you want to make on the robot&apos;s hand." id="5"><Choice value="Open the hand" /><Choice value="Close the hand" /></Parameter></Box><Box name="Move To" id="3" localization="8" tooltip="Make the robot move to a configured point relative to its current location." x="454" y="255"><bitmap>media/images/box/movement/walk_forward.png</bitmap><script language="4"><content><![CDATA[
class MyClass(GeneratedClass):
    def __init__(self):
        GeneratedClass.__init__(self, False)
        self.motion = ALProxy("ALMotion")
        self.positionErrorThresholdPos = 0.01
        self.positionErrorThresholdAng = 0.03
    def onLoad(self):
        pass
    def onUnload(self):
        self.motion.moveToward(0.0, 0.0, 0.0)
    def onInput_onStart(self):
        import almath
        # The command position estimation will be set to the sensor position
        # when the robot starts moving, so we use sensors first and commands later.
        initPosition = almath.Pose2D(self.motion.getRobotPosition(True))
        targetDistance = almath.Pose2D(self.getParameter("Distance X (m)"),
                                       self.getParameter("Distance Y (m)"),
                                       self.getParameter("Theta (deg)") * almath.PI / 180)
        expectedEndPosition = initPosition * targetDistance
        enableArms = self.getParameter("Arms movement enabled")
        self.motion.setMoveArmsEnabled(enableArms, enableArms)
        self.motion.moveTo(self.getParameter("Distance X (m)"),
                           self.getParameter("Distance Y (m)"),
                           self.getParameter("Theta (deg)") * almath.PI / 180)
        # The move is finished so output
        realEndPosition = almath.Pose2D(self.motion.getRobotPosition(False))
        positionError = realEndPosition.diff(expectedEndPosition)
        positionError.theta = almath.modulo2PI(positionError.theta)
        if (abs(positionError.x) < self.positionErrorThresholdPos
            and abs(positionError.y) < self.positionErrorThresholdPos
            and abs(positionError.theta) < self.positionErrorThresholdAng):
            self.onArrivedAtDestination()
        else:
            self.onStoppedBeforeArriving(positionError.toVector())
    def onInput_onStop(self):
self.onUnload()]]></content></script><Input name="onLoad" type="1" type_size="1" nature="0" inner="1" tooltip="Signal sent when Diagram is loaded." id="1" /><Input name="onStart" type="1" type_size="1" nature="2" inner="0" tooltip="Box behavior starts when a signal is received on this input." id="2" /><Input name="onStop" type="1" type_size="1" nature="3" inner="0" tooltip="Box behavior stops when a signal is received on this input." id="3" /><Output name="onArrivedAtDestination" type="1" type_size="1" nature="1" inner="0" tooltip="Signal sent when the robot arrives at its destination." id="4" /><Output name="onStoppedBeforeArriving" type="0" type_size="1" nature="1" inner="0" tooltip="Signal sent when the robot stops before arriving to its destination. Returns a vector [x (m), y (m), theta(rad)] with the remaining distance up to the destination. This distance is expressed in the ROBOT frame." id="5" /><Parameter name="Distance X (m)" inherits_from_parent="0" content_type="2" value="1" default_value="0.2" min="-5" max="10" tooltip="The distance in meters for forward/backward motion. Positive value&#x0A;means forward, negative value means backward." id="6" /><Parameter name="Distance Y (m)" inherits_from_parent="0" content_type="2" value="0" default_value="0" min="-5" max="5" tooltip="The distance in meters for lateral motion. Positive value means left, negative&#x0A;value means right." id="7" /><Parameter name="Theta (deg)" inherits_from_parent="0" content_type="2" value="0" default_value="0" min="-180" max="180" tooltip="The orientation in degrees for final rotation. Positive value means anticlockwise,&#x0A;negative value means clockwise." id="8" /><Parameter name="Arms movement enabled" inherits_from_parent="0" content_type="0" value="1" default_value="1" tooltip="Enables natural motion of the arms." id="9" /><Resource name="Legs" type="Lock" timeout="0" /></Box><Box name="Look At" id="9" localization="-1" tooltip="This box makes the robot look at a desired position." x="641" y="411"><bitmap>media/images/box/movement/move_head.png</bitmap><script language="4"><content><![CDATA[import time
class MyClass(GeneratedClass):
    def __init__(self):
        GeneratedClass.__init__(self)
        self.tracker = ALProxy( "ALTracker" )
        self.x = 0.0
        self.y = 0.0
        self.z = 0.0
        self.maxSpeed = 0.5
        self.useWholeBody = False
        self.frame = 0 #FRAME TORSO
    def onLoad(self):
        self.BIND_PYTHON(self.getName(), "setParameter")
    def onUnload(self):
        pass
    def onInput_onStart(self):
        self.x = self.getParameter("X (m)")
        self.y = self.getParameter("Y (m)")
        self.z = self.getParameter("Z (m)")
        self.maxSpeed = self.getParameter("Speed (%)") / 100.0
        self.useWholeBody = self.getParameter("WholeBody")
        frameStr = self.getParameter("Frame")
        if frameStr == "Torso":
            self.frame = 0
        elif frameStr == "World":
            self.frame = 1
        elif frameStr == "Robot":
            self.frame = 2
        self.tracker.lookAt([self.x, self.y, self.z], self.frame, self.maxSpeed, self.useWholeBody)
        self.onStopped()
    def onInput_onStop(self):
        self.onUnload()
        pass
    def setParameter(self, parameterName, newValue):
        GeneratedClass.setParameter(self, parameterName, newValue)
        if (parameterName == "X (m)"):
            self.x = newValue
            self.tracker.lookAt([self.x, self.y, self.z], self.frame, self.maxSpeed, self.useWholeBody)
            self.onStopped()
            return
        if (parameterName == "Y (m)"):
            self.y = newValue
            self.tracker.lookAt([self.x, self.y, self.z], self.frame, self.maxSpeed, self.useWholeBody)
            self.onStopped()
            return
        if (parameterName == "Z (m)"):
            self.z = newValue
            self.tracker.lookAt([self.x, self.y, self.z], self.frame, self.maxSpeed, self.useWholeBody)
            self.onStopped()
            return
        if (parameterName == "Speed (%)"):
            self.maxSpeed = newValue / 100.0
            return
        if (parameterName == "WholeBody"):
            self.useWholeBody = newValue
            return
        if (parameterName == "Frame"):
            if(newValue == "Torso"):
                self.frame = 0
            elif newValue == "World":
                self.frame = 1
            elif newValue == "Robot":
self.frame = 2]]></content></script><Input name="onLoad" type="1" type_size="1" nature="0" inner="1" tooltip="Signal sent when diagram is loaded." id="1" /><Input name="onStart" type="1" type_size="1" nature="2" inner="0" tooltip="Box behavior starts when a signal is received on this input." id="2" /><Input name="onStop" type="1" type_size="1" nature="3" inner="0" tooltip="Box behavior stops when a signal is received on this input." id="3" /><Output name="onStopped" type="1" type_size="1" nature="1" inner="0" tooltip="Signal sent when box behavior is finished." id="4" /><Parameter name="X (m)" inherits_from_parent="0" content_type="2" value="1" default_value="1" min="0.001" max="10" tooltip="X coordinate of the target to look at." id="5" /><Parameter name="Y (m)" inherits_from_parent="0" content_type="2" value="1" default_value="0" min="-10" max="10" tooltip="Y coordinate of the target to look at." id="6" /><Parameter name="Z (m)" inherits_from_parent="0" content_type="2" value="0" default_value="0" min="-10" max="10" tooltip="Z coordinate of the target to look at." id="7" /><Parameter name="Speed (%)" inherits_from_parent="0" content_type="1" value="26" default_value="50" min="1" max="100" tooltip="Speed to move the head towards the desired position." id="8" /><Parameter name="WholeBody" inherits_from_parent="0" content_type="0" value="0" default_value="0" tooltip="Use whole body constraints" id="9" /><Parameter name="Frame" inherits_from_parent="0" content_type="3" value="Torso" default_value="Torso" custom_choice="0" tooltip="Select the frame of target." id="10"><Choice value="Torso" /><Choice value="World" /><Choice value="Robot" /></Parameter></Box><Box name="Point At" id="10" localization="-1" tooltip="This box makes the robot point to a desired position." x="642" y="129"><bitmap>media/images/box/movement/move_arm.png</bitmap><script language="4"><content><![CDATA[import time
class MyClass(GeneratedClass):
    def __init__(self):
        GeneratedClass.__init__(self)
        self.tracker = ALProxy( "ALTracker" )
        self.x = 0.0
        self.y = 0.0
        self.z = 0.0
        self.maxSpeed = 0.5
        self.effector = "Arms"
        self.frame = 0 #FRAME TORSO
    def onLoad(self):
        self.BIND_PYTHON(self.getName(), "setParameter")
    def onUnload(self):
        pass
    def onInput_onStart(self):
        self.x = self.getParameter("X (m)")
        self.y = self.getParameter("Y (m)")
        self.z = self.getParameter("Z (m)")
        self.maxSpeed = self.getParameter("Speed (%)") / 100.0
        self.effector = self.getParameter("Effector")
        frameStr = self.getParameter("Frame")
        if frameStr == "Torso":
            self.frame = 0
        elif frameStr == "World":
            self.frame = 1
        elif frameStr == "Robot":
            self.frame = 2
        self.tracker.pointAt(self.effector, [self.x, self.y, self.z], self.frame, self.maxSpeed)
        self.onStopped()
    def onInput_onStop(self):
        self.onUnload()
        pass
    def setParameter(self, parameterName, newValue):
        GeneratedClass.setParameter(self, parameterName, newValue)
        if (parameterName == "X (m)"):
            self.x = newValue
            self.tracker.pointAt(self.effector, [self.x, self.y, self.z], self.frame, self.maxSpeed)
            self.onStopped()
            return
        if (parameterName == "Y (m)"):
            self.y = newValue
            self.tracker.pointAt(self.effector, [self.x, self.y, self.z], self.frame, self.maxSpeed)
            self.onStopped()
            return
        if (parameterName == "Z (m)"):
            self.z = newValue
            self.tracker.pointAt(self.effector, [self.x, self.y, self.z], self.frame, self.maxSpeed)
            self.onStopped()
            return
        if (parameterName == "Speed (%)"):
            self.maxSpeed = newValue / 100.0
            return
        if (parameterName == "Effector"):
            self.effector = newValue
            self.tracker.pointAt(self.effector, [self.x, self.y, self.z], self.frame, self.maxSpeed)
            self.onStopped()
            return
        if (parameterName == "Frame"):
            if(newValue == "Torso"):
                self.frame = 0
            elif newValue == "World":
                self.frame = 1
            elif newValue == "Robot":
self.frame = 2]]></content></script><Input name="onLoad" type="1" type_size="1" nature="0" inner="1" tooltip="Signal sent when diagram is loaded." id="1" /><Input name="onStart" type="1" type_size="1" nature="2" inner="0" tooltip="Box behavior starts when a signal is received on this input." id="2" /><Input name="onStop" type="1" type_size="1" nature="3" inner="0" tooltip="Box behavior stops when a signal is received on this input." id="3" /><Output name="onStopped" type="1" type_size="1" nature="1" inner="0" tooltip="Signal sent when box behavior is finished." id="4" /><Parameter name="X (m)" inherits_from_parent="0" content_type="2" value="1" default_value="1" min="0.001" max="10" tooltip="X coordinate of the target to point at." id="5" /><Parameter name="Y (m)" inherits_from_parent="0" content_type="2" value="1" default_value="0" min="-10" max="10" tooltip="Y coordinate of the target to point at." id="6" /><Parameter name="Z (m)" inherits_from_parent="0" content_type="2" value="0" default_value="0" min="-10" max="10" tooltip="Z coordinate of the target to point at." id="7" /><Parameter name="Speed (%)" inherits_from_parent="0" content_type="1" value="49" default_value="50" min="1" max="100" tooltip="Speed to look at the desired position" id="8" /><Parameter name="Effector" inherits_from_parent="0" content_type="3" value="Arms" default_value="Arms" custom_choice="0" tooltip="Effector to use" id="9"><Choice value="Arms" /><Choice value="LArm" /><Choice value="RArm" /></Parameter><Parameter name="Frame" inherits_from_parent="0" content_type="3" value="Torso" default_value="Torso" custom_choice="0" tooltip="Select the frame of target." id="10"><Choice value="Torso" /><Choice value="World" /><Choice value="Robot" /></Parameter></Box><Box name="WideBothArmsCircle_LeanLeft_01" id="5" localization="8" tooltip="ID : #01F 0029&#x0A;&#x0A;===================&#x0A;&#x0A;Tags : &#x0A;- Wow&#x0A;- Awesome&#x0A;- Great!&#x0A;- Yoohoo&#x0A;- Impressive&#x0A;&#x0A;===================&#x0A;&#x0A;Common dialog : No&#x0A;&#x0A;===================&#x0A;&#x0A;Start stance : LeanLeft&#x0A;End stance : LeanLeft" x="1009" y="270"><bitmap>media/images/box/movement/move.png</bitmap><script language="4"><content><![CDATA[]]></content></script><Input name="onLoad" type="1" type_size="1" nature="0" inner="1" tooltip="Signal sent when diagram is loaded." id="1" /><Input name="onStart" type="1" type_size="1" nature="2" inner="0" tooltip="Box behavior starts when a signal is received on this input." id="2" /><Input name="onStop" type="1" type_size="1" nature="3" inner="0" tooltip="Box behavior stops when a signal is received on this input." id="3" /><Output name="onStopped" type="1" type_size="1" nature="1" inner="0" tooltip="Signal sent when box behavior is finished." 
id="4" /><Timeline enable="1" fps="25" start_frame="1" end_frame="-1" size="57"><ActuatorList model=""><ActuatorCurve name="value" actuator="HeadPitch" mute="0" unit="0"><Key frame="15" value="-5.77728" /><Key frame="22" value="0.838254" /><Key frame="32" value="-33.6618" /><Key frame="38" value="-17.8618" /><Key frame="55" value="-19.5763" /></ActuatorCurve><ActuatorCurve name="value" actuator="HeadYaw" mute="0" unit="0"><Key frame="15" value="3.95273" /><Key frame="22" value="4.83163" /><Key frame="32" value="7.11683" /><Key frame="38" value="5.88634" /><Key frame="55" value="5.18321" /></ActuatorCurve><ActuatorCurve name="value" actuator="LElbowRoll" mute="0" unit="0"><Key frame="14" value="-79.1002" /><Key frame="23" value="-88.5" /><Key frame="30" value="-80.3306" /><Key frame="40" value="-53.3" /><Key frame="57" value="-46.4923" /></ActuatorCurve><ActuatorCurve name="value" actuator="LElbowYaw" mute="0" unit="0"><Key frame="14" value="-80.1254" /><Key frame="23" value="-90.8344" /><Key frame="30" value="-111.712" /><Key frame="40" value="-98.5203" /><Key frame="57" value="-100.462" /></ActuatorCurve><ActuatorCurve name="value" actuator="LHand" mute="0" unit="1"><Key frame="14" value="0.2484" /><Key frame="23" value="0.2496" /><Key frame="30" value="0.86" /><Key frame="40" value="0.76" /><Key frame="57" value="0.7656" /></ActuatorCurve><ActuatorCurve name="value" actuator="LShoulderPitch" mute="0" unit="0"><Key frame="14" value="58.8851" /><Key frame="23" value="46.2286" /><Key frame="30" value="26.7" /><Key frame="40" value="14.6" /><Key frame="57" value="43.4161" /></ActuatorCurve><ActuatorCurve name="value" actuator="LShoulderRoll" mute="0" unit="0"><Key frame="14" value="24.673" /><Key frame="23" value="18.4308" /><Key frame="30" value="50.8141" /><Key frame="40" value="62.3702" /><Key frame="57" value="61.5592" /></ActuatorCurve><ActuatorCurve name="value" actuator="LWristYaw" mute="0" u