Become the ultimate master of bin picking with the new Bin Picking Studio 1.4.0!

By Andrea Ferkova || March 23, 2020

Our team of experts has spent months pushing automated bin picking into completely new realms – making the impossible real. Are you ready to enter the future of smart automation with market-leading technology?

We present Bin Picking Studio 1.4.0 – with new features, upgrades, and enhancements.

Collision checking

Collision checking algorithms were completely rewritten and enhanced with the following new abilities: 

  • BPS now computes collisions with the grasped part
  • Collision checking with other objects localized in the bin – the localization can detect objects even if they are not entirely visible in the scene. The entire model is rendered into the scene and used as a collision object.
  • Gripper collision sensitivity – when picking with a two-finger gripper, the robot sometimes collides with neighboring objects. To address this, users can now define the gripper sensitivity, i.e. what percentage of the gripper volume may come into collision with the scanned point cloud or with any currently detected objects.
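To illustrate what such a sensitivity threshold means, here is a minimal, hypothetical Python sketch (not Photoneo's implementation; function names and the brute-force search are our own): a grasp is accepted only if at most a given fraction of sampled gripper points lies within a tolerance of the scanned point cloud.

```python
import math

def colliding_fraction(gripper_points, scan_points, tol=0.002):
    """Fraction of sampled gripper volume points lying within `tol`
    meters of any scanned point (brute-force nearest neighbour)."""
    if not gripper_points:
        return 0.0
    hits = 0
    for g in gripper_points:
        if any(math.dist(g, s) <= tol for s in scan_points):
            hits += 1
    return hits / len(gripper_points)

def grasp_allowed(gripper_points, scan_points, sensitivity=0.05):
    """Accept a grasp candidate if at most `sensitivity` (e.g. 5 %)
    of the gripper samples collide with the point cloud."""
    return colliding_fraction(gripper_points, scan_points) <= sensitivity
```

A real system would use a spatial index or signed distance field instead of the quadratic scan shown here; the sketch only captures the percentage-based acceptance rule.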

Speed up

Speed is a high priority for every customer; therefore, we made the following speed optimizations:

  • The waypoint computation is 100 times faster for most of our robots thanks to the brand new inverse kinematics calculation algorithm
  • Connections to cameras are faster
  • The deployment startup time was reduced by 50%


Localization SDK 1.3.0

Together with the new version of Bin Picking Studio, we are also releasing Localization SDK 1.3.0. Besides general stabilization and speed improvements, users can look forward to the following features:

  • The Localization GUI now allows the definition of a bounding box where the detection will take place, leading to higher speed and increased safety
  • A segmented image is now available, which is crucial for proper configuration of the localization settings
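Conceptually, restricting detection to a bounding box amounts to cropping the point cloud before localization runs. A minimal sketch, with an assumed axis-aligned box (the function name is our own, not part of the SDK):

```python
def crop_to_box(points, box_min, box_max):
    """Keep only the points that fall inside the axis-aligned
    detection box; everything outside is ignored by localization."""
    return [p for p in points
            if all(lo <= c <= hi for c, lo, hi in zip(p, box_min, box_max))]
```

Fewer points to match against means faster detection, and parts outside the defined working volume can never produce a pick.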

Scanner on the robot

The possibility to mount a scanner on the robot, a so-called hand-eye approach, was frequently requested by our partners and customers. This allows the user to use a smaller scanner for bigger bins and thus get a higher resolution and better details. Another significant benefit is provided by the option of variable viewpoints, which comes very handy in case one needs to take a closer look at the corners of the bin. Attach the scanner to the end of the robotic arm or gripper, calibrate them with a brand new marker pattern calibration and enjoy the benefits of scanning from variable viewpoints. 
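The geometry behind the hand-eye approach is a chain of homogeneous transforms: the object pose in robot-base coordinates is the current flange pose, composed with the hand-eye calibration (camera relative to flange), composed with the detection in camera coordinates. A minimal sketch with assumed names, using plain 4x4 row-major matrices:

```python
def matmul4(a, b):
    """Multiply two 4x4 homogeneous transforms given as row-major lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def object_in_base(t_base_flange, t_flange_cam, t_cam_obj):
    """Chain the current flange pose, the hand-eye calibration result,
    and the camera-frame detection into a robot-base object pose."""
    return matmul4(matmul4(t_base_flange, t_flange_cam), t_cam_obj)

def translation(x, y, z):
    """Pure translation as a 4x4 homogeneous transform."""
    return [[1, 0, 0, x], [0, 1, 0, y], [0, 0, 1, z], [0, 0, 0, 1]]
```

The marker pattern calibration mentioned above is what determines the fixed camera-to-flange transform in this chain; the flange pose changes with every scan position.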


Debugging

We pushed debugging to another level. Besides marking the localized parts with color, we now also display complete trajectories in 3D space. The user can click on the detected objects and inspect:

  • computed grasping positions 
  • picking path stages
  • trajectories

You can replay the robot’s planned motion, position by position.

Environment builder

  • Added support for drawing simple collision objects directly in the GUI
  • Added support for resizing and scaling the environment objects
  • The robot is now displayed in the environment configuration GUI
  • Connected and calibrated scanners are now automatically displayed in the environment
  • Connected vision systems can be triggered and the point cloud will be displayed, enabling a quick verification of the position of objects in the scanner’s view
  • It has never been easier to set up the scanners’ field of view thanks to the possibility to display their scanning volumes

[Screenshots: environment builder, environment collision checking, environment correction with the scan]


Calibration

Aligning the scanning volume with the robot’s working volume is crucial for successful bin picking, and we will continue to improve the calibration pipeline. Besides improved user-friendliness, users can look forward to the following features:

  • Added a pre-configured calibration ball; the system now remembers its diameter
  • Possibility to visually verify the calibration results in the embedded visualizer
  • Support for hand-eye calibration

Robot motion description

Smooth movement of the robotic arm is a basic requirement of any customer. We therefore made a few tweaks to the path stage configuration:

  • The approach and de-approach waypoints are now mandatory
  • Motions planned between approach and de-approach path stages are planned as linear motions
  • The maximum number of waypoints is 11
  • The sampling step in linear trajectories can be configured in User settings
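A configurable sampling step simply controls how densely the linear segments between approach and de-approach stages are discretized into waypoints. A minimal sketch of that idea (our own illustration, not BPS code):

```python
import math

def sample_linear(start, goal, step=0.005):
    """Sample waypoints along a straight Cartesian segment at a fixed
    step (in meters), always including both endpoints."""
    length = math.dist(start, goal)
    n = max(1, math.ceil(length / step))
    return [tuple(s + (g - s) * i / n for s, g in zip(start, goal))
            for i in range(n + 1)]
```

A smaller step yields a smoother, more faithful linear motion at the cost of more waypoints to send to the robot.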


Jogging

In the previous version, only joint jogging was available in the visualization. Users can now also jog in reference to either the coordinate system of the robot or the coordinate system of the tool. All movements respect the kinematics of the robot according to its datasheet and take joint limits into consideration.
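The difference between the two jogging modes is just the frame in which the Cartesian increment is expressed: a tool-frame jog must be rotated by the current tool orientation before it becomes a base-frame displacement. A minimal sketch with assumed names:

```python
import math

def jog_delta_in_base(delta_tool, tool_rotation):
    """Rotate a Cartesian jog increment expressed in the tool frame
    into the robot-base frame (tool_rotation is a 3x3 row-major matrix)."""
    return tuple(sum(tool_rotation[i][j] * delta_tool[j] for j in range(3))
                 for i in range(3))

def rot_z(angle):
    """Rotation about the Z axis by `angle` radians."""
    c, s = math.cos(angle), math.sin(angle)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]
```

With the tool rotated 90° about Z, a "move along tool X" jog produces a motion along base Y, which is exactly why tool-frame jogging feels more natural when the gripper is tilted.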

Robot modules

Robot modules were also updated to improve error handling and connection reliability. In addition, new, more complete examples were added for every brand.

Convenience features

Besides all the features described above, we made a few smart enhancements across the entire system. These include:

  • Solution attachments to keep all data in one place. Users can attach additional files, such as robot program backups or additional CAD files, into their solutions. The files will be embedded directly into PBCF solution exports.
  • Possibility to choose between a tile view and a table view in the sections for robots, grippers, and picked objects
  • New user roles and a user access management section to assign different access rights to different users of the BPS.
  • Welcome page for the initial, quick setup of:
    • the installation timezone, 
    • customer’s accounts 
  • If you have a picking preference, you can now let the system know by setting your favorite gripping point and assigning it a high, medium, or low priority
  • Troubleshooting of network connections is now easier due to new, descriptive error handling


Picking parameters

Set new, handy parameters to make picking more efficient:

  • Prevent the robot from picking partially occluded parts, which often cause blocking
  • Specify the direction in which the parts are to be picked. You can now configure the picking priority vector in the robot space.
  • Disable or enable a collision with the picked part, depending on the object user handle
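A picking priority vector can be thought of as a ranking rule: candidate grasp positions are scored by their projection onto the configured direction, and the part furthest along that direction is picked first. A minimal sketch of that idea (our own illustration; the function name is hypothetical):

```python
def rank_by_priority(grasp_points, priority):
    """Sort candidate grasp positions so that parts furthest along
    the configured priority direction are picked first."""
    def score(point):
        # Projection of the grasp position onto the priority vector.
        return sum(c * v for c, v in zip(point, priority))
    return sorted(grasp_points, key=score, reverse=True)
```

For example, a priority vector of (0, 0, 1) in robot space makes the system prefer the topmost parts in the bin.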

Documentation extensions

For easier understanding, the documentation is full of pictures and descriptions supporting the text. It has been extended with new sections, mainly related to path planning, the environment builder, the calibration process, and user roles and permissions.

About the Author

Andrea Ferkova
Sr. PR Specialist

Andrea Ferkova is senior public relations specialist at Photoneo and writer of technological articles on smart automation powered by robotic vision and intelligence. She has a master’s from the University of Vienna.
