calibrationviews Module
Calibrates a camera attached to a robot by moving it around a pattern.
Running the Example:
openrave.py --example calibrationviews
Description
The pattern is attached to the robot gripper, and the robot moves it around to gather data. The example uses visibilitymodel to determine which robot configurations make the pattern fully visible inside the camera view.
It is also possible to calibrate an environment camera with this example using:
openrave.py --example calibrationviews --scene=data/pa10calib_envcamera.env.xml --sensorrobot=ceilingcamera
Calibration
Although this example does not contain calibration code, the frames of reference are the following:
T_pattern^world and T_camera^link are unknown, while T_pattern^camera and T_link^world are known.
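These frames chain through the robot: T_pattern^world = T_link^world * T_camera^link * T_pattern^camera. A calibration routine therefore collects many pairs of T_link^world (robot forward kinematics) and T_pattern^camera (pattern detector) and solves for the two constant unknowns. The snippet below only illustrates the composition with 4x4 homogeneous matrices; the function name is hypothetical and it is not part of the example.

import numpy

def pattern_in_world(T_link_world, T_camera_link, T_pattern_camera):
    # Chain the 4x4 homogeneous transforms: world <- link <- camera <- pattern.
    # T_link_world comes from forward kinematics and T_pattern_camera from the
    # pattern detector; T_camera_link is the hand-eye unknown being estimated.
    return numpy.dot(numpy.dot(T_link_world, T_camera_link), T_pattern_camera)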
Command-line
Usage: openrave.py [options]
Views a calibration pattern from multiple locations.
Options:
-h, --help show this help message and exit
--scene=SCENE Scene file to load (default=data/pa10calib.env.xml)
--sensorname=SENSORNAME
Name of the sensor whose views to generate (default is
first sensor on robot)
--sensorrobot=SENSORROBOT
Name of the robot the sensor is attached to
(default=none)
--norandomize If set, will not randomize the bodies and robot
position in the scene.
--novisibility If set, will not perform any visibility searching.
--noshowsensor If set, will not show the sensor.
--posedist=POSEDIST An average distance between gathered poses. The
smaller the value, the more poses robot will gather
close to each other
OpenRAVE Environment Options:
--loadplugin=_LOADPLUGINS
List all plugins and the interfaces they provide.
--collision=_COLLISION
Default collision checker to use
--physics=_PHYSICS physics engine to use (default=none)
--viewer=_VIEWER viewer to use (default=qtcoin)
--server=_SERVER server to use (default=None).
--serverport=_SERVERPORT
port to load server on (default=4765).
--module=_MODULES module to load, can specify multiple modules. Two
arguments are required: "name" "args".
-l _LEVEL, --level=_LEVEL, --log_level=_LEVEL
Debug level, one of
(fatal,error,warn,info,debug,verbose,verifyplans)
--testmode if set, will run the program in a finite amount of
time and spend computation time validating results.
Used for testing
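For example, the options above can be combined; a hypothetical invocation that skips scene randomization and gathers poses more densely is:
openrave.py --example calibrationviews --norandomize --posedist=0.03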
Main Python Code
# Imports needed by this snippet; CalibrationViews is defined in this module
# (see the Class Definitions section below).
import time
from openravepy import Sensor

def main(env,options):
    "Main example code."
    env.Load(options.scene)
    robot = env.GetRobots()[0]
    sensorrobot = None if options.sensorrobot is None else env.GetRobot(options.sensorrobot)
    env.UpdatePublishedBodies()
    time.sleep(0.1) # give time for environment to update
    self = CalibrationViews(robot,sensorname=options.sensorname,sensorrobot=sensorrobot,randomize=options.randomize)
    attachedsensor = self.vmodel.attachedsensor
    if options.showsensor and attachedsensor.GetSensor() is not None and attachedsensor.GetSensor().Supports(Sensor.Type.Camera):
        attachedsensor.GetSensor().Configure(Sensor.ConfigureCommand.PowerOn)
        attachedsensor.GetSensor().Configure(Sensor.ConfigureCommand.RenderDataOn)
    while True:
        print('computing all locations, might take more than a minute...')
        self.computeAndMoveToObservations(usevisibility=options.usevisibility,posedist=options.posedist)
        if options.testmode:
            break
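main() receives an already constructed environment and a parsed options object. Outside of openrave.py it can be driven with a hand-built options object, as in the following sketch; the attribute names are the ones main() reads above, the values are illustrative, and the whole invocation is an assumption rather than part of the example.

from optparse import Values
from openravepy import Environment, RaveDestroy

# Minimal stand-in for the parsed command-line options that main() expects
# (illustrative values; not part of the example).
options = Values({'scene': 'data/pa10calib.env.xml', 'sensorrobot': None,
                  'sensorname': None, 'randomize': False, 'showsensor': True,
                  'usevisibility': True, 'posedist': 0.05, 'testmode': True})
env = Environment()
try:
    env.SetViewer('qtcoin')  # same default viewer as the command line
    main(env, options)       # main() as defined above; testmode=True runs one pass
finally:
    RaveDestroy()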
Class Definitions
-
class openravepy.examples.calibrationviews.CalibrationViews(robot, sensorname=None, sensorrobot=None, target=None, maxvelmult=None, randomize=False)
Starts a calibration sequencer using a robot and a sensor.
The minimum that needs to be specified is the robot and a sensor name. Camera sensors that do not belong to the current robot are also supported; in that case the IK is computed assuming the target is grabbed by the active manipulator of the robot.
Can use the visibility information of the target.
Parameters: sensorrobot – If specified, used to determine which robot the sensor lies on.
-
computeAndMoveToObservations(waitcond=None, maxobservations=inf, posedist=0.05, usevisibility=True, **kwargs)
Computes several configurations for the robot to move to. If usevisibility is True, will use the visibility model of the pattern to gather data.
Otherwise, given that the pattern is currently detected in the camera, moves the robot around the local neighborhood. This does not rely on the visibility information of the pattern and does not create a pattern.
-
computelocalposes(maxconeangle=0.5, maxconedist=0.15, averagedist=0.03, angledelta=0.2, **kwargs)
Computes robot poses using a cone pointing along the negative z-axis of the camera.
-
computevisibilityposes(dists=array([0.05, 0.25, 0.45, 0.65, 0.85, 1.05, 1.25, 1.45]), orientationdensity=1, num=inf)
Computes robot poses using visibility information from the target.
Samples the transformations of the camera. The camera x and y axes should always be aligned with the xy axes of the calibration pattern.
-
static gatherCalibrationData(robot, sensorname, waitcond, target=None, **kwargs)
Function to gather calibration data; relies on an outside waitcond function to return information about the calibration pattern.
-
moveToConfiguration(config, waitcond=None)
Moves the robot to a configuration.
-
moveToObservations(poses, configs, waitcond=None, maxobservations=inf, posedist=0.05)
Orders the poses with respect to distance.
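Combining computevisibilityposes with moveToObservations gives the visibility-based flow used by computeAndMoveToObservations. The sketch below assumes computevisibilityposes returns the (poses, configs) pair that moveToObservations consumes, which the signatures suggest but do not state; verify against the module source.

import numpy
from openravepy import Environment

env = Environment()
env.Load('data/pa10calib.env.xml')
# sensorname=None picks the first sensor on the robot, as with --sensorname.
calib = CalibrationViews(env.GetRobots()[0], sensorname=None)
# Assumed return value: (poses, configs), matching moveToObservations above.
poses, configs = calib.computevisibilityposes(dists=numpy.arange(0.05, 1.5, 0.2))
calib.moveToObservations(poses, configs, maxobservations=50, posedist=0.05)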
-
viewVisibleConfigurations(**kwargs)
-
openravepy.examples.calibrationviews.main(env, options)
Main example code.
-
openravepy.examples.calibrationviews.run(*args, **kwargs)
Command-line execution of the example.
Parameters: args – arguments for the script to parse; if not specified, sys.argv will be used.
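The example can also be started from a Python session. The sketch below assumes the documented args parameter accepts a list of command-line style arguments passed as a keyword:

from openravepy.examples import calibrationviews

# Hypothetical invocation, equivalent to the environment-camera command line in
# the Description section; the args keyword is assumed from the parameter
# documentation above.
calibrationviews.run(args=['--scene=data/pa10calib_envcamera.env.xml',
                           '--sensorrobot=ceilingcamera'])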