Haptic Sampling Robot

Traditionally, environmental sampling is done by sending researchers into the field to collect samples by hand. This can be dangerous for the sampler, risks contaminating the sampled environment, and can disturb vulnerable habitats.

The goal of our project was to design and build a robotic arm that used gesture-based controls and haptic feedback to conduct remote environmental sampling. It could be mounted onto a rover and sent out into the field to mitigate the aforementioned risks and provide a more immersive, intuitive experience than existing solutions.

The robot comprised several subsystems that we developed independently before integrating them: mechanical components, controls, and haptics.

Mechanical Design

We designed the robotic arm as a parallel kinematic chain of aluminum linkages, with the motors mounted in the base, as shown in the model on the right. This minimized the moment on each joint, and therefore the required torque, allowing us to use smaller, cheaper motors. This not only increased our allowable payload but also cut the cost of larger motors and linkages.

I went through several iterations of gripper designs, referencing existing solutions on the market that used servo motors. However, we planned to use stepper motors, since they deliver higher torque and more reliable position control, and they kept the controls consistent across all degrees of freedom of the arm. I therefore had to consider more robust designs that could support the weight of these motors.

 

I settled on a rack-and-pinion design that translated the rotational motion of the stepper motor into linear actuation of the grip plates. This design best secured our payload through the stability of the parallel plates and the high torque transmission from the stepper motor to the rack, which other designs did not allow.

CAD Model of Robotic Arm

CAD Model of Rack and Pinion Gripper

Motor Controls and Degrees of Freedom of the Robotic Arm

Full System Controls in use during Capstone Day Demo

Control System

The motion of the user's hand was tracked using two Pixy2 cameras that followed the x-, y-, and z-coordinates of distinct color blocks mounted on a wearable glove. The robot carried five stepper motors: one in the base for rotation, two above it for vertical and horizontal reach, and two in the end effector for wrist twist and grip, as shown in the figure to the left. We developed an algorithm in MATLAB that calculated the necessary rotation angle for each motor based on the tracked position of the user's hand in space.
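The core of that calculation can be sketched as a simple two-link inverse-kinematics solve. The link lengths, joint conventions, and function name below are illustrative assumptions, not the arm's actual geometry:

```python
import math

# Hypothetical link lengths in mm; the real arm's dimensions differed.
L1, L2 = 200.0, 200.0

def hand_to_joint_angles(x, y, z):
    """Map a tracked hand position (mm) to base, shoulder, and elbow
    angles (degrees) using a planar two-link IK solution."""
    base = math.degrees(math.atan2(y, x))   # base rotation toward the target
    r = math.hypot(x, y)                    # horizontal reach in the base plane
    d = math.hypot(r, z)                    # straight-line distance to the hand
    d = min(d, L1 + L2 - 1e-9)              # clamp to the reachable workspace
    # Law of cosines gives the elbow angle, then the shoulder angle.
    elbow = math.degrees(math.acos((L1**2 + L2**2 - d**2) / (2 * L1 * L2)))
    shoulder = math.degrees(math.atan2(z, r)
                            + math.acos((L1**2 + d**2 - L2**2) / (2 * L1 * d)))
    return base, shoulder, elbow
```

A fully extended reach along one axis, for instance, yields a straight elbow (180 degrees) with the base and shoulder at zero.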

 

The Pixy2 cameras transmitted their coordinate outputs to a Raspberry Pi, which ran a Python script implementing the algorithm. We wrapped the calculated angle values into a single string and sent it over serial communication to the Arduino running the motor controllers. We found that this split was the best way to reduce the system's latency, since the Arduino was not powerful enough to run the script and send motor commands on its own.
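Packing the angles into one framed message keeps the Arduino's parsing simple: it reads until a terminator and splits on a delimiter. The framing characters, precision, port name, and baud rate below are assumptions for illustration, not our exact protocol:

```python
def pack_angles(angles):
    """Format joint angles as '<a1,a2,...>\n' so the Arduino can read
    one complete command per line and split it on commas."""
    return "<" + ",".join(f"{a:.1f}" for a in angles) + ">\n"

# On the Raspberry Pi, the string would then be written to the serial
# port with pyserial, e.g.:
#   import serial
#   port = serial.Serial("/dev/ttyACM0", 115200)
#   port.write(pack_angles([45.0, 30.0, 60.0, 10.0, 0.0]).encode())
```

Sending one string per control cycle, rather than five separate motor commands, is what kept the serial link from becoming the bottleneck.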

We were able to simplify the control system by using closed-loop stepper motor controllers that automatically checked and corrected any missed steps during each cycle of operation.

Haptic System

The haptic system consisted of two parts: the force sensor on the gripper plates and the silicone bladders mounted on the user's fingers.

The system was coded so that, depending on the level of force applied to the gripper, the bladder would inflate by a corresponding amount, translating the feeling of pressure to the user. This could be especially useful in research applications where samples are delicate and must be handled with care.

An air pump and solenoid valve were mounted on the user's hand and wired out to a board next to the user. Using relays to toggle the state of these components, I coded the force thresholds for each inflation level in Arduino software. We ran into electromagnetic interference from the unshielded force-sensor wire and the magnets in the stepper motors, but mitigated it by adding delays and checks in the code that ignore the noise.
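The threshold-and-check logic can be sketched as follows (in Python for readability; the real version ran on the Arduino). The threshold values, inflation levels, and consecutive-sample count are illustrative assumptions:

```python
# (raw sensor reading threshold, inflation level), highest first.
THRESHOLDS = [(800, 3), (500, 2), (200, 1)]
CONFIRM_SAMPLES = 3  # readings a new level must persist before it is applied

def force_to_level(reading):
    """Map a raw force-sensor reading to a discrete inflation level."""
    for threshold, level in THRESHOLDS:
        if reading >= threshold:
            return level
    return 0

class Debouncer:
    """Commit a new inflation level only after it persists for several
    consecutive samples, so a single noise spike from EMI cannot
    toggle the pump or valve relays."""
    def __init__(self):
        self.level = 0        # currently applied level
        self._candidate = 0   # level waiting to be confirmed
        self._count = 0       # consecutive samples at the candidate level

    def update(self, reading):
        new = force_to_level(reading)
        if new == self.level:
            self._count = 0                  # reading agrees; reset counter
        elif new == self._candidate:
            self._count += 1
            if self._count >= CONFIRM_SAMPLES:
                self.level = new             # persistent change: commit it
                self._count = 0
        else:
            self._candidate = new            # new candidate; start counting
            self._count = 1
        return self.level
```

With this scheme, a one-sample spike from interference leaves the relays untouched, while a genuine sustained grip force still changes the inflation level within a few sensor readings.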

The bladders were cast in 3D-printed molds and glued together using Sil-Poxy. We used 30-durometer silicone for its flexibility, but the bladders were still quite delicate, and we had to be careful not to pop them during initial testing with the air pumps.

We also created a demo board so our system could be felt by everyone at Capstone Day without having to put on the haptic glove, as shown in the image to the right.

Haptic Glove with Arm-mounted Pump System

Functional Force Sensor and Haptic Bladder

Haptic Demo Board on Capstone Day

Results

After a successful systems-integration phase, we achieved motion in five degrees of freedom and accurately translated the user's hand position to the robotic arm's end effector to within 5 millimeters in all Cartesian directions. The full range of motion of the system is shown in the video below.

For our next steps, we plan to continue work on the prototype and have begun writing a paper on our findings, targeting publication in Spring 2024.

bottom of page