How PAUL the Robot Tracks Its Own Movements Using Cameras and LEDs | HackerNoon


Authors:

(1) Jorge Francisco García-Samartín, Centro de Automática y Robótica (UPM-CSIC), Universidad Politécnica de Madrid — Consejo Superior de Investigaciones Científicas, José Gutiérrez Abascal 2, 28006 Madrid, Spain ([email protected]);

(2) Adrian Rieker, Centro de Automática y Robótica (UPM-CSIC), Universidad Politécnica de Madrid — Consejo Superior de Investigaciones Científicas, José Gutiérrez Abascal 2, 28006 Madrid, Spain;

(3) Antonio Barrientos, Centro de Automática y Robótica (UPM-CSIC), Universidad Politécnica de Madrid — Consejo Superior de Investigaciones Científicas, José Gutiérrez Abascal 2, 28006 Madrid, Spain.

Table of Links

Abstract and 1 Introduction

2 Related Works

2.1 Pneumatic Actuation

2.2 Pneumatic Arms

2.3 Control of Soft Robots

3 PAUL: Design and Manufacturing

3.1 Robot Design

3.2 Material Selection

3.3 Manufacturing

3.4 Actuation Bank

4 Data Acquisition and Open-Loop Control

4.1 Hardware Setup

4.2 Vision Capture System

4.3 Dataset Generation: Table-Based Models

4.4 Open-Loop Control

5 Results

5.1 Final PAUL version

5.2 Workspace Analysis

5.3 Performance of the Table-Based Models

5.4 Bending Experiments

5.5 Weight Carrying Experiments

6 Conclusions

Funding Information

A. Conducted Experiments and References

4 Data Acquisition and Open-Loop Control

4.1 Hardware Setup

To give the manipulator a solid and stable mounting, and to allow reliable, repeatable capture of the positions and orientations of its end, the metal structure shown in Figure 10 was built. It is a cube made of steel profiles with methacrylate sheets on its walls. The pneumatic bench, the power supply and the microcontroller were placed on top of the structure.

The data acquisition system must measure, whenever required, the position and orientation of the end of the robot so that they can be related to the inflation times of each bladder, from which an open-loop model of PAUL can be built. Three elements serve this purpose: the cameras, the calibration grid and the trihedron.
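
Concretely, each measurement pairs the commanded inflation times with the captured end pose. A minimal sketch of one such record is shown below; the field names and layout are hypothetical, since the paper's actual table format is introduced later in Section 4.3.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class PoseSample:
    """One dataset entry relating bladder inflation times to the measured end pose.

    Illustrative structure only; the table-based models of Section 4.3
    define their own format.
    """
    inflation_times_ms: Tuple[float, ...]    # one inflation time per bladder
    position_mm: Tuple[float, float, float]  # end position in the fixed frame
    orientation: Tuple[float, float, float]  # end orientation (e.g. a rotation vector)
```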

Figure 10. PAUL and its working environment. The manipulator can be seen, with three segments, inside the box that contains it, on top of which the pneumatic bench and the power source have been placed. At the end of PAUL is placed the trihedron that allows the positions to be captured with the cameras seen in the foreground. Source: authors.

Two Spedal AF926H USB cameras, with 1920 × 1080 px resolution, an 80° field of view and a frame rate of 60 fps, are used to capture the images. They are mounted on two tripods external to the robot's structure and are calibrated with a checkerboard of 11 × 8 squares of 20 mm each, shown in Figure 11a.
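
The paper does not state which software performs this calibration; as a generic illustration, the sketch below calibrates one camera from checkerboard images with OpenCV, using the 11 × 8 squares (hence 10 × 7 inner corners) of 20 mm described above. The image folder name is hypothetical.

```python
import glob
import cv2
import numpy as np

# Inner-corner count of an 11 x 8 squares checkerboard, 20 mm per square.
PATTERN = (10, 7)
SQUARE_MM = 20.0

# 3D coordinates of the corners in the checkerboard's own plane (Z = 0).
objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE_MM

obj_points, img_points = [], []
for path in glob.glob("calib/*.png"):          # hypothetical folder of calibration shots
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, PATTERN)
    if found:
        corners = cv2.cornerSubPix(
            gray, corners, (11, 11), (-1, -1),
            (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
        obj_points.append(objp)
        img_points.append(corners)

# Intrinsics and distortion for one camera; the same procedure is run per camera.
rms, K, dist, _, _ = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
print("RMS reprojection error:", rms)
```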

The vision beacon, in turn, must be recognisable in space so that the position and orientation of the mobile frame can be determined with respect to the fixed frame. The trihedron, shown in Figure 11b, consists of three spheres, 3D-printed in PLA, each with an LED embedded inside. Thanks to these, the brightness of the spheres can be varied in software, keeping the system working correctly when the workplace, environment or lighting conditions change.

The central rod, which moves the luminous spheres away from the base of the robot's end, makes it possible for the spheres to remain visible to the cameras in every pose the robot can adopt. If the spheres were instead attached directly to the end of the robot, there would be numerous poses in which the position could not be determined, as the spheres would be hidden by the robot itself.

Figure 11. Elements of the vision system. (a) Calibration grid. (b) Beacon to allow the capture of position and orientation. Source: authors.
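
How the luminous spheres are segmented in the camera images is not detailed here; the following sketch shows one plausible approach, brightness thresholding followed by blob centroids with OpenCV, purely as an illustration and not as the authors' implementation.

```python
import cv2

def sphere_centroids(frame_bgr, min_area=30):
    """Return image centroids of bright blobs (candidate LED spheres).

    Illustrative only: the LED-lit spheres are assumed to be the
    brightest regions in view.
    """
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 220, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    centroids = []
    for c in contours:
        if cv2.contourArea(c) < min_area:   # discard small specular highlights
            continue
        m = cv2.moments(c)
        centroids.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return centroids
```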

4.2 Vision Capture System

Because real-world coordinates are independent of the camera, applying Equation (1) to both cameras and isolating the vector r_k in the two equations gives:

The system of equations (2) can be solved using the least squares method:
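
As a generic illustration of this step, the sketch below computes the least-squares solution of an overdetermined linear system A x = b with NumPy; A and b stand in for the constraints stacked from the two cameras and are hypothetical names, not the paper's notation.

```python
import numpy as np

def solve_least_squares(A, b):
    """Least-squares solution of the overdetermined system A x ~= b."""
    x, residuals, rank, _ = np.linalg.lstsq(A, b, rcond=None)
    return x

# Toy usage: 6 stacked constraint rows (e.g. from two cameras), 3 unknowns.
A = np.random.rand(6, 3)
b = np.random.rand(6)
print(solve_least_squares(A, b))
```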

Rodrigues' rotation formula is then used to obtain the orientation, with respect to the real-world frame, in the form of a rotation matrix:

where I denotes the 3 × 3 identity matrix.
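
For reference, Rodrigues' formula maps a rotation vector (unit axis k times angle θ) to the rotation matrix R = I + sin(θ) K + (1 − cos(θ)) K², where K is the skew-symmetric matrix of k and I is the identity named above. A small self-contained sketch of that conversion:

```python
import numpy as np

def rodrigues_matrix(rot_vec):
    """Rotation matrix from a rotation vector (axis * angle) via Rodrigues' formula."""
    theta = np.linalg.norm(rot_vec)
    if theta < 1e-12:
        return np.eye(3)                     # I: identity matrix of size 3
    k = rot_vec / theta                      # unit rotation axis
    K = np.array([[0, -k[2], k[1]],
                  [k[2], 0, -k[0]],
                  [-k[1], k[0], 0]])         # skew-symmetric matrix of k
    return np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)

# The same conversion is available in OpenCV as cv2.Rodrigues(rot_vec)[0].
```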
