tingeypa

[ROBOTICS] Delta Robot Arm driven by AI Camera

I am new here so hello all! I am a software engineer with a long history in Lego and Mindstorms.

Recently I have been working on an entry for an AI competition which I thought may be of interest.  The competition was to make something related to industrial automation that used a DepthAI camera (OAK-D-Lite) combined with Lego.  I have been wanting to build a delta robot for a while, and here's my submission.  It uses a Raspberry Pi BuildHAT to control three Large Angular motors driving the arms and one Technic motor for the gripper pump.  The BuildHAT is a great way to control Lego motors, and it's cheap and open source.  The DepthAI camera (which is an amazing bit of kit!) detects objects and sends their locations to the robot for pickup.  You can do lots more with OAK-D cameras, but I am just doing basic object location.
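To turn a detected object location into arm movements, a delta robot needs inverse kinematics: given a target (x, y, z) for the gripper, compute the three shoulder angles. Here is a sketch of the standard delta IK derivation in Python. This is an illustration only, not the code from the build, and all the geometry defaults (base/effector triangle sides, arm lengths) are made-up placeholder values, not measurements from this robot:

```python
import math

def _arm_angle(x0, y0, z0, f, e, rf, re):
    """Shoulder angle (degrees) for one arm, solved in its own YZ plane.
    f: base triangle side, e: effector triangle side,
    rf: upper arm length, re: lower (parallelogram) arm length."""
    y1 = -f / (2 * math.sqrt(3))          # shoulder joint offset from centre
    y0 = y0 - e / (2 * math.sqrt(3))      # shift target to the effector edge
    # The elbow lies on the line z = a + b*y intersected with the
    # circle of radius rf around the shoulder joint.
    a = (x0 * x0 + y0 * y0 + z0 * z0 + rf * rf - re * re - y1 * y1) / (2 * z0)
    b = (y1 - y0) / z0
    d = -(a + b * y1) ** 2 + rf * (b * b * rf + rf)  # discriminant
    if d < 0:
        raise ValueError("target point is unreachable")
    yj = (y1 - a * b - math.sqrt(d)) / (b * b + 1)
    zj = a + b * yj
    theta = math.degrees(math.atan(-zj / (y1 - yj)))
    return theta + 180 if yj > y1 else theta

def delta_ik(x, y, z, f=140.0, e=60.0, rf=120.0, re=240.0):
    """Return the three shoulder angles for gripper position (x, y, z);
    the arms are identical, just rotated 120 degrees apart."""
    c, s = -0.5, math.sqrt(3) / 2         # cos/sin of 120 degrees
    return (
        _arm_angle(x, y, z, f, e, rf, re),
        _arm_angle(x * c + y * s, y * c - x * s, z, f, e, rf, re),
        _arm_angle(x * c - y * s, y * c + x * s, z, f, e, rf, re),
    )
```

Each of the three angles would then be sent to one of the Large Angular motors. A sanity check: a target on the central axis, e.g. `delta_ik(0, 0, -200)`, gives the same angle for all three arms by symmetry.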

There are quite a few other entries in the competition that are worth a look.  Search #oakdlitecontest, or you can see the Top10 here - https://form.jotform.com/221086334784156

 


The contest is very interesting. I wish I had known about it during the submission phase — a good excuse to try vision AI and combine it with LEGO.  I'll check out the rest and cast my vote.  Good luck with your submission!


Very nice! I can imagine that programming that would be quite challenging.

One question I have is how you're able to adapt the Lego servo wires to the Raspberry Pi? I was under the impression that that plug type was proprietary for Lego, and not something I'd expect to see for a Pi.

57 minutes ago, 2GodBDGlory said:

Very nice! I can imagine that programming that would be quite challenging.

One question I have is how you're able to adapt the Lego servo wires to the Raspberry Pi? I was under the impression that that plug type was proprietary for Lego, and not something I'd expect to see for a Pi.

Thank you!  The Raspberry Pi Foundation recently created the BuildHAT, which can control four Lego motors/sensors using LPF2 connectors.  It has APIs in Python and C# and appears to be supported by Lego.  The LEGO Education SPIKE Prime Expansion Set has a new PCB panel designed specifically to mount a Pi (or similar board).

The robot programming was not too hard, as both DepthAI (the OAK-D API) and the BuildHAT have Python APIs.  That said, I had to learn Python.


11 hours ago, tingeypa said:

Thank you!  The Raspberry Pi Foundation recently created the BuildHAT, which can control four Lego motors/sensors using LPF2 connectors.  It has APIs in Python and C# and appears to be supported by Lego.  The LEGO Education SPIKE Prime Expansion Set has a new PCB panel designed specifically to mount a Pi (or similar board).

The robot programming was not too hard, as both DepthAI (the OAK-D API) and the BuildHAT have Python APIs.  That said, I had to learn Python.

Thanks for pointing me to that information! It's pretty intriguing that Lego is actually partnering with the Raspberry Pi people to make a lower-cost (higher skill-level, I assume) alternative to Mindstorms.

8 hours ago, 2GodBDGlory said:

Thanks for pointing me to that information! It's pretty intriguing that Lego is actually partnering with the Raspberry Pi people to make a lower-cost (higher skill-level, I assume) alternative to Mindstorms.

You guys should hang out in the TrainTech forum from time to time. Probably mostly boring (as judged from a train head = me) - but when it comes to any kind of automation - this is a serious resource.

RPi is nice, but PyBricks runs >on< the TLG hubs as well - and you can revert to using a web interface. And then there is Legoino ... get a $10 ESP32 and do what you were never even dreaming of. And use your cell phone for what cell phones are for.

And again: All this is basically for some sort of autonomous operation rather than remote control. As discussed elsewhere.

Best
Thorsten

On 4/23/2022 at 10:07 AM, Toastie said:

You guys should hang out in the TrainTech forum from time to time. Probably mostly boring (as judged from a train head = me) - but when it comes to any kind of automation - this is a serious resource.

RPi is nice, but PyBricks runs >on< the TLG hubs as well - and you can revert to using a web interface. And then there is Legoino ... get a $10 ESP32 and do what you were never even dreaming of. And use your cell phone for what cell phones are for.

And again: All this is basically for some sort of autonomous operation rather than remote control. As discussed elsewhere.

Best
Thorsten

Thanks Thorsten.  Those sound like good alternatives.  I will take a look.  

The Pi 4 was good for this application for a few reasons.  It can connect to the AI camera directly over USB 3.  The screens showing the camera output can be connected to the Pi, or it can be controlled via VNC Viewer, which avoids having many cables going to the robot.  Lastly, for this sort of robot you need to coordinate the three motors to move in sync, ideally starting and ending each movement together at controlled speeds.  If they don't move in sync, the gripper may move through a path that bumps into things.  These are slightly unusual requirements, but they could be achieved with the Pi + BuildHAT.
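The "move in sync" requirement above comes down to simple arithmetic: the motor with the longest move runs at full speed, and the other two are scaled down in proportion to their travel so all three start and finish together. A minimal sketch of that idea — the helper name is made up, and the BuildHAT call mentioned in the comment is how the result would plausibly be used, not a quote from the actual build:

```python
def sync_speeds(deltas, max_speed=50):
    """Per-motor signed speeds so that motors moving different distances
    (deltas, in degrees) start and finish at the same time.
    The motor with the longest move runs at max_speed; the others
    are scaled down in proportion to their share of that travel."""
    longest = max(abs(d) for d in deltas)
    if longest == 0:
        return [0.0] * len(deltas)
    return [max_speed * d / longest for d in deltas]

# Each (delta, speed) pair would then go to one arm motor, e.g. with the
# BuildHAT Python library:
#   motor.run_for_degrees(delta, speed=speed, blocking=False)
```

For example, `sync_speeds([90, -45, 30])` gives `[50.0, -25.0, 16.7]` (approximately): the second motor runs at half speed and reversed, the third at a third of full speed, so all three arrive together.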


Thanks!  I have lots of pics of a small set of pieces, but they are not the ones in your batches.  I think your project is a great idea!  I was considering just making a fully automated scanner for my competition submission, but I had already spent time on the arms.   The oak-d cameras would be easy to automate together with a rotating/rocking platform, or a rotating platform with an arm to arc the camera over the subject. You could automate scanning, uploading, and running the model training.  You could even use OpenCV or other AI models on the camera to detect the location of pieces (i.e. define the bounding box) to assist in a detection model (rather than classification).
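The bounding-box idea in that last sentence can be sketched without any camera at all: once OpenCV thresholding (or a segmentation model) has produced a binary mask of "piece" pixels, the box is just the extent of the nonzero pixels. A minimal NumPy illustration — the function name is made up:

```python
import numpy as np

def bounding_box(mask):
    """Tightest (x, y, w, h) box around the nonzero pixels of a 2-D mask,
    or None if the mask is empty.  Boxes like this are the labels a
    detection model (rather than a classifier) is trained against."""
    ys, xs = np.nonzero(mask)             # row/column indices of set pixels
    if xs.size == 0:
        return None
    return (int(xs.min()), int(ys.min()),
            int(xs.max() - xs.min() + 1),
            int(ys.max() - ys.min() + 1))
```

For instance, a mask with a blob covering rows 2–4 and columns 3–6 yields `(3, 2, 4, 3)`.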


Rebrickable is not my project! :-D

I thought you could share your images so that people who are interested could create videos from them and upload them to Rebrickable, which would help improve the part detector.

