Each robot comes equipped with a suite of sensors. For example, the Pioneer robots have sonar sensors, bumpers, and a camera, while the Khepera robots have infrared sensors, light sensors, and a camera. The Pioneer's sonar sensors and the Khepera's IR sensors are both active range finders that can be used to sense the distance to obstacles. Although these two robots carry different kinds of sensors, Pyro unifies the way you use them in your programs. In this section, we will first introduce Pyro's range sensor abstractions, then present the basic sensors found on Kheperas and Pioneers and get you familiar with them. We will also include the interfaces for vision sensing.
Let's start up Pyro in the Stage simulator with the tutorial world. You can do this, as you've seen earlier, by selecting a server, a world, and a robot through the GUI. It is also possible to configure this on the command line, as shown below:
pyrobot -s StageSimulator -w tutorial.cfg -r Player6665
First, let's see what devices are available on the robot. Press the "View" button next to the "Robot" section of the GUI. Notice that the robot has three types of devices: position, sonar, and simulation. Open up the sonar devices folder. A robot may have many sets of a particular type of device. In this case there is only one set of sonars, known as sonar.
Whether a robot has sonars or IR sensors for range sensing, you can access range information through the range abstraction. All robots have a predefined range attribute, so you can access range information without knowing anything about the specific robot's hardware.
On a Pioneer-style robot, the range attribute is always associated with sonar. To see that this is the case, first look at the title attribute in the "View" window. Then at the command line of the GUI, type:

robot.range

You can see that robot.range is simply a reference to sonar.
The power of Pyro comes from these abstractions. You should always use the range abstraction rather than referring to sonar directly. Why? Because if you refer to sonar, then your program will only work on a robot with sonar-style range sensing. However, if you use the range abstraction, then your program can run on any robot with a range device. All range devices have a distance() method that returns the estimated distance to the nearest obstacle.
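To see why this matters, here is a self-contained sketch (not part of Pyro) in which two mock robots with different underlying hardware both expose the same range-style interface, so a single behavior works on both. The class names and readings below are invented for illustration:

```python
# Illustrative sketch: two mock robots share the range abstraction, so
# one behavior runs on both. All names and values here are made up.

class MockSensor:
    def __init__(self, d):
        self._d = d
    def distance(self):
        return self._d

class MockRobot:
    def __init__(self, readings):
        # robot.range["all"] -> list of sensors, mimicking Pyro's interface
        self.range = {"all": [MockSensor(d) for d in readings]}

def obstacle_near(robot, threshold=1.0):
    # Works for any robot exposing the range abstraction.
    return min(s.distance() for s in robot.range["all"]) < threshold

sonar_bot = MockRobot([2.5, 0.8, 3.1])  # e.g. a Pioneer-like robot
ir_bot = MockRobot([1.4, 1.9, 2.2])     # e.g. a Khepera-like robot
print(obstacle_near(sonar_bot))  # True: one reading is below 1.0
print(obstacle_near(ir_bot))     # False: all readings are at least 1.4
```

The behavior never mentions sonar or IR; it only assumes a range device with a distance() method.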
From the "View" window we can see that sonar includes 16 sensors numbered 0 through 15. So we can read one of these sensors by typing the following at the command line:

robot.range[3].distance()

which in this case returned 1.63157630574. Before we go any further, we should figure out how to interpret the value returned by this distance() method.
In Pyro, each robot has a set of customizable sensor units. Each sensor has a setting called units that specifies the units used in reporting sensor values. There are six choices of units:
METERS: The values reported by sensors are in meters.
CM: The values reported are in centimeters.
MM: The values reported are in millimeters.
SCALED: The values reported are scaled in the range -1.0..1.0
ROBOTS: The values reported are in proportion to the robot's physical diameter. Thus a distance equal to the robot's diameter is 1.
RAW: The values reported are the raw values from the sensors.
By default, the robot's range values are set to ROBOTS. Therefore the value returned above indicates that there is an obstacle approximately 1.6 robot units away. This is another important abstraction. By using robot units, rather than specific measures such as meters, we can write one program that will work in a similar way on a tiny Khepera robot as well as the much larger Pioneer robot.
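The relationship between ROBOTS units and metric units is simple arithmetic. The following sketch illustrates the idea; the 40 cm diameter is an assumed value for illustration, not any particular robot's measurement:

```python
# Sketch of how the ROBOTS unit relates to metric units.
# The 40 cm diameter is a made-up value for illustration only.

DIAMETER_CM = 40.0  # assumed robot diameter

def robots_to_cm(d_robots, diameter_cm=DIAMETER_CM):
    # A reading of 1.0 in ROBOTS units equals one robot diameter.
    return d_robots * diameter_cm

def cm_to_robots(d_cm, diameter_cm=DIAMETER_CM):
    return d_cm / diameter_cm

print(robots_to_cm(1.6))   # 64.0 cm: 1.6 robot diameters away
print(cm_to_robots(80.0))  # 2.0 robot units
```

The same program logic, written against ROBOTS units, scales automatically from a small Khepera to a large Pioneer.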
Note that if no obstacles are detected, a range sensor will return its maximum distance. This value varies from sensor to sensor; it can be seen using robot.range.getMaxvalue() and is reported in the current units.
You can change the units of the value(s) reported by setting the units attribute, as shown below:
robot.range.units = 'SCALED'
robot.range.getMaxvalue()
Next time we get a value from any of this range's sensors, it will be reported in SCALED units, i.e. it will be a value between -1.0 and 1.0. Let us try this out:
robot.range.distance() => 0.174168110034
You can also get the scaled value for just the current reading. First, we reset the default units to 'ROBOTS', and then dynamically get a reading in 'SCALED' units:
robot.range.units = 'ROBOTS'
robot.range.distance() => 1.73560416834
robot.range.distance(unit = 'SCALED') => 0.172406364997
Try another example:
robot.range.distance(unit = 'CM') => 132.269729654
That is, the object is 132.3 centimeters away.
Using "list comprehension", you can also issue a query to the robot to obtain the values of all sensors as a list. This is done as follows:

[s.distance() for s in robot.range["all"]]
==> [0.44869176209701611, 0.60781655423635583, 1.0306817554080976,
     1.6410707396430828, 1.6409799177705537, 3.356085956557429,
     4.5856911255788146, 5.5352539385124544, 2.2594415826628422,
     2.9820154908926448, 3.2250401855054704, 2.9169203443595531,
     2.7410097599697201, 0.9660035806647439, 0.59385700232428629,
     0.41781141653037679]
You will soon realize that this interface is very versatile and can be used to retrieve or change various other attributes of the robot.
Given the above information regarding the distance from sensor number 3 to an obstacle in various units, what is the diameter of the robot?
Why might range sensors return a large number if there is no obstacle? Shouldn't they return 0 if there is nothing there?
While it is good to be able to access values of various range sensors, it is also essential to know the location of all the sensors on a robot's body. So, in addition to knowing that a Pioneer has 16 sensors, it would also be useful to know where they are located and how they are numbered. This is shown below for both the Pioneer and Khepera robots:
[Figures: Pioneer Sensor Topology | Khepera Sensor Topology]
One could also ask the robot to tell you where a sensor's ray hit something. For example, robot.range.hit will return the (x, y, z) of the hit:
robot.range.hit # returns x, y, z
and the geometry of the originating ray of the sensor:
robot.range.geometry # returns x, y, z, theta in radians, arc width
Find out the range of values (max and min) reported by the range sensors of a robot of your choosing. Try them for simulated as well as real robots (if you can). Compare the values.
Named Sensor Groups
For convenience in programming, each robot class also defines named sensor groups for all of the basic sensors available on the robot. The range sensors on the Khepera are IR sensors, while on the Pioneer they are sonar. Each logical sensor group collects the sensors pointing in a particular direction and provides access to their values by name. The 15 groups and their names, common to all robots, are shown below.
all - all sensors
front-all - all front sensors
front - the very front, center sensors
front-left - front sensors on the left
front-right - front sensors on the right
left - all sensors on the left (Pioneer has 2, Khepera has 1)
left-front - on the left in the front
left-back - on the left in the back
right - all sensors on the right (Pioneer has 2, Khepera has 1)
right-front - on the right in the front
right-back - on the right in the back
back-all - all of the back sensors
back - the very back, center sensors
back-left - back sensors, on the left
back-right - back sensors, on the right
By using these names (rather than specific sensor positions) you can write behaviors that can run on different kinds of robots. Combine these abstract names with the units set to "ROBOTS" and you can almost ignore the underlying hardware.
The diagram below shows the locations of these sensor groups on a robot body.
The specific sensors that make up a particular sensor group vary depending on the type of robot being used. This is shown below:
[Figures: Pioneer Sonar Sensor Groups | Khepera IR/Light Sensor Groups | Pioneer Laser Sensor Groups]
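As a concrete illustration, a named sensor group can be thought of as a mapping from a name to a set of sensor indices. The index assignments in this sketch are invented, since the real mappings are robot-specific:

```python
# Hypothetical sketch of how named sensor groups could map to sensor
# indices. These assignments are invented for an imaginary 8-sensor
# robot; real mappings differ per robot type.

GROUPS = {
    "all":   list(range(8)),
    "front": [3, 4],
    "left":  [0],
    "right": [7],
    "back":  [5, 6],
}

readings = [1.2, 0.9, 0.7, 0.5, 0.6, 2.0, 1.8, 1.5]  # fake distances

def group_distances(name):
    # Look up a named group and return the readings of its sensors.
    return [readings[i] for i in GROUPS[name]]

print(group_distances("front"))       # [0.5, 0.6]
print(min(group_distances("front")))  # 0.5
```

Code written against group names like "front" never needs to know which physical indices they resolve to on a given robot.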
List comprehension can be used to retrieve sensor values from sensor groups. For example:
[s.distance() for s in robot.range["front"]]
You may want to perform some computation on the list of values returned. Using just Python, you can get the min or max of the values, or the average:
min([s.distance() for s in robot.range["front"]])
max([s.distance() for s in robot.range["front"]])
from pyrobot.brain import avg
avg([s.distance() for s in robot.range["front"]])
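If you are curious, an averaging helper like this can be sketched in a few lines of plain Python; this is only an illustration, not necessarily how pyrobot.brain implements avg:

```python
# Plain-Python sketch of an averaging helper, for illustration only;
# pyrobot.brain's avg may be implemented differently.

def avg(values):
    # Average of a non-empty sequence of numbers.
    return sum(values) / len(values)

distances = [0.5, 1.0, 1.5, 2.0]  # fake sensor readings
print(avg(distances))              # 1.25
print(min(distances), max(distances))  # 0.5 2.0
```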
Consider finding out the direction of the nearest obstacle around a robot. Pyro knows the angle of each sensor (often called theta, and abbreviated "th"), and you can retrieve that information using list comprehension:
[s.angle() for s in robot.range["all"]]                # in degrees
[s.angle(unit="radians") for s in robot.range["all"]]  # in radians
These commands will return the angles of the sensors in degrees and radians, respectively. But that doesn't tell you which direction has the smallest range reading. You can get that information with:
min([s.distance() for s in robot.range["all"]])
But that doesn't give you which sensor had that reading. You can get the position using the "pos" keyword, like so:
[s.pos for s in robot.range["all"]]
That just shows you the positions of each sensor associated with the keyword "all". You can get both the angle and the position:
[(s.angle(unit="radians"),s.pos) for s in robot.range["all"]]
You can then easily pull out the min or max:
min([(s.angle(unit="radians"), s.pos) for s in robot.range["all"]])
max([(s.angle(unit="radians"), s.pos) for s in robot.range["all"]])
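The same tuple trick answers the original question of which direction the nearest obstacle lies in: pair each distance with its angle, and take the min, since Python compares tuples element by element. Here is a self-contained sketch with invented readings:

```python
# Self-contained sketch: find the direction of the nearest obstacle by
# pairing each (fake) distance reading with its sensor angle. Python's
# min() compares tuples left to right, so it picks the smallest distance.

readings = [
    # (distance, angle in degrees) -- invented values
    (1.6, -90),
    (0.4, -30),
    (2.3, 0),
    (0.9, 30),
    (3.1, 90),
]

nearest = min(readings)
print(nearest)  # (0.4, -30): closest obstacle is 0.4 units away at -30 degrees
```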
The range attribute nicely generalizes the functionality of various robots that may use different sensing devices for range finding. However, it is also nice to be able to access the values of a robot's specific sensors since each type of sensor may be able to provide additional information. This tends to make things robot-specific, but sometimes that is necessary.
Regardless of the kind of robot, and hence the kind of sensor used, the way to access these values is quite uniform. The simplest command to get the value of a specific sensor takes the form:

robot.SENSOR_TYPE[INDEX][SENSOR_NUMBER].value
SENSOR_NUMBER could be a named sensor group, but could also be a specific sensor number ([0..15] for Pioneer). Thus, on a Khepera robot you can use the 'ir' SENSOR_TYPE to access the IR sensors directly, or you can use the 'light' SENSOR_TYPE to get the values of its light sensors. INDEX will be 0, unless you have more than one SENSOR_TYPE (for example, two sonar rangers).
robot.range is the only device reference that does not need to be followed by an INDEX: robot.range is a device object, whereas robot.sonar is a list of device objects (albeit typically containing just one).
SENSOR_NUMBER could also be a range (a "slice" in Pythonese) or comma-delimited set:
robot.range.distance()                      # a single distance value
[s.distance() for s in robot.range[1:5]]    # list of data for sensors 1-4
[s.distance() for s in robot.range[3,4,6]]  # list of data for sensors 3, 4, and 6
Similarly, for a Pioneer, you can access the values of its sonars by using the SENSOR_TYPE 'sonar'. For example:

robot.sonar[0][4].distance()

returns the range distance of the sonar sensor in position 4.
Where am I?
Another use of the robot object is to ask where it thinks it is. When you get these data, you don't specify '.distance()', since these are position attributes rather than range readings. So you could type:
robot.x
robot.y
robot.th
These values are similar to the ones that we saw in the simulator, but they are only where the robot 'thinks' that it is, and what absolute direction it 'thinks' that it is facing. Because there is some play in the motors, gears and wheels, any time the robot moves, it must compute (based on some math, physics, and heuristics) where it thinks it is now. This is called dead reckoning and is used quite often in robotics.
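The idea behind dead reckoning can be sketched with simple trigonometry. This toy odometry update is an illustration of the concept only, not Pyro's actual code:

```python
# Toy dead-reckoning sketch (illustration only, not Pyro's implementation).
# The robot integrates small motions to estimate its pose (x, y, th).

import math

def step(pose, distance, turn_deg):
    # Advance by `distance` along the current heading, then rotate.
    x, y, th = pose
    x += distance * math.cos(math.radians(th))
    y += distance * math.sin(math.radians(th))
    th = (th + turn_deg) % 360.0
    return (x, y, th)

pose = (0.0, 0.0, 0.0)      # start at origin, facing 0 degrees
pose = step(pose, 1.0, 90)  # forward 1 unit, then turn left 90 degrees
pose = step(pose, 1.0, 0)   # forward 1 unit
print(pose)  # approximately (1.0, 1.0, 90.0)
```

On a real robot, each estimated step carries a small error from wheel slip and gear play, and these errors accumulate, which is exactly why dead-reckoned positions drift over time.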
You could ask the robot to return max([s.distance() for s in robot.sonar]). But why would that be asking Pyro to do something silly?
Start the robot in a place in the simulator. Note the simulator's x, y, and th, and the robot's idea of x, y, and th. Now, move the robot by using the move() command, clicking on the move buttons or the "Joystick..." button under the "Robot" menu option. How good a job does the robot do of dead reckoning? In what situations does it get the most off from where it really is? Some simulations don't have any variation (sometimes called noise) between where it thinks it is, and where it really is. Others have quite a bit of variation.
How could a robot figure out where it is other than using dead reckoning? How do you keep track of where you are as you move about the world?
What's the difference between:
min([s.distance() for s in robot.range["all"]])
and
min([(s.distance(), s.angle()) for s in robot.range["all"]])
Try them both and report the differences.
You can read the bump sensors (if a robot or simulation has them) with the following:

robot.bumper[0][NUMBER].value

where NUMBER can be a value from 0 to 4 (back only): 0 is the back right, 2 is the back middle, and 4 is the back left. Currently, Stage has not yet included bump sensing in its latest version.
[s.value for s in robot.bumper["all"]] => [0, 0, 0, 0, 0]  # if none are pressed
[s.value for s in robot.bumper["all"]] => [1, 1, 1, 1, 1]  # if all are pressed
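Since the bumper values are just 0s and 1s, interpreting them is plain list processing; the readings in this sketch are invented:

```python
# Sketch: interpreting a list of bumper values (0 = clear, 1 = pressed).
# The readings below are invented for illustration.

bumpers = [0, 0, 1, 0, 0]  # e.g. back-middle bumper pressed

hit_anything = any(bumpers)                        # True if any bumper is pressed
pressed = [i for i, v in enumerate(bumpers) if v]  # indices of pressed bumpers

print(hit_anything)  # True
print(pressed)       # [2]
```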