
ninoguba

Everything posted by ninoguba

  1. Sharing here my latest MOC of WALL-E powered by LEGO Control+ hubs and motors and controlled by LEGO Powered Up app. This is actually a recolor mod of my original b-model MOC for set 42100. All play features remain the same as seen in this demo: Full details, build instructions, and programming guide available on Rebrickable. WALL-E - https://rebrickable.com/mocs/MOC-70839/gubsters/wall-e-technic-rc-powered-by-lego-powered-up-control/ WALL-L (Liebherr-Class) - https://rebrickable.com/mocs/MOC-68302/gubsters/wall-e-phantom-b-model-for-liebherr-excavator-42100/ Let me know if you have any questions or any feedback. Appreciate it, thanks!
  2. Yes, I recall being able to intercept events from the accelerometer input, but I haven't gone as far as interpreting them for control of the arm. I'll check if I kept some notes and share the event codes with you.
  3. Good to know Pybricks is viable! I'm finding it a test of patience to boot up two EV3s running ev3dev. I'm also looking to see if hub-to-hub communication will be better on Pybricks than with RPyC. With my current implementation using RPyC, the slave has an inherent delay, and for some reason resetting the "zero" positions of the motors doesn't work reliably. I recommend you go get yourself a PS4 controller; it has enough joysticks and buttons to control a robot arm for sure! I also found out that you can "read" all the inputs from the joypad, touchpad, and accelerometer on the PS4 controller at the same time. Just imagine the control you can have over your robot arm with the combination of these inputs in your hand. =) Check this out and follow the comments on GitHub: https://www.ev3dev.org/projects/2018/09/02/PS4Explor3r/ Good find and thanks for sharing! I'll be trying this library out when I can get back to working on my robot arm! In the meantime, I've only been able to make some tweaks to my build: fewer rare parts at the base, a fixed clicking issue at the waist, and some body panel color changes. (Photos: with the gripper end-effector / with no end-effector)
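As a rough illustration, here's how reading those PS4 inputs on ev3dev can look with the python-evdev library (the library PS4Explor3r builds on). The device path, the choice of axis code, and the scaling helper are my own assumptions, not code from that project:

```python
# Hypothetical sketch of reading PS4 controller input on ev3dev with the
# python-evdev library. The event device path and which axis maps to which
# stick vary per system -- check yours with `evtest` first.
try:
    from evdev import InputDevice, ecodes  # present on an ev3dev brick
except ImportError:                        # allows testing the math off-robot
    InputDevice = ecodes = None

def scale_stick(raw, dead_zone=10):
    """Map a raw 0..255 stick reading to a -100..100 motor speed,
    with a small dead zone around the 128 center position."""
    centered = raw - 128
    if abs(centered) < dead_zone:
        return 0
    return round(centered * 100 / 128)

def control_loop(device_path="/dev/input/event4"):
    """Read stick events forever and print the speed they map to."""
    pad = InputDevice(device_path)          # the path is an assumption
    for event in pad.read_loop():           # blocks, yielding input events
        if event.type == ecodes.EV_ABS and event.code == ecodes.ABS_Y:
            print("left stick ->", scale_stick(event.value))
```

On the brick you would feed `scale_stick()`'s output straight into a motor's speed setpoint instead of printing it.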
  4. My latest creation for the Mindstorms Robot Inventor set is a model of a Formula 1 race car. I've integrated the color sensor facing downward to track a line: And the distance sensor facing forward in the front to detect obstacles: But rather than simply implementing the usual line following, which works for small ordinary mobile robots, I'm envisioning something geared more toward race car models. I believe this video explains the concept of racing lines concisely, and this is what I want my race car model to follow. So how? I'm thinking we would still use a black line on a white surface to draw the outline of our desired race track. The idea is that the bot will first follow this line and "store" the coordinates to build a virtual model of the track. The assumption is that these coordinates form the median/center of the race track, and that the boundaries extend 10-15 cm from the center on either side. Once the virtual track is modeled, the actual race can begin: the bot will analyze the track and determine the optimal racing line to follow to yield the best lap times. Obstacle avoidance for stationary obstacles or other race cars on the track will add more complexity and surely add to the fun of all this! This concept is in some ways very similar to Anki Overdrive's AI-driven race cars. Who else here wants to take on this coding challenge?
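The "store the coordinates" step could be sketched as simple dead-reckoning odometry while the bot follows the line. This is a minimal, hypothetical sketch: the track width value is made up, and the per-step wheel readings would come from real motor encoder deltas on the hub:

```python
import math

def update_pose(x, y, heading, d_left, d_right, track_width):
    """Dead-reckoning update for a differential-drive robot.
    d_left/d_right are wheel travel distances since the last update (cm);
    track_width is the wheel separation (cm). Returns the new pose."""
    d_center = (d_left + d_right) / 2
    d_theta = (d_right - d_left) / track_width
    # advance along the average heading over this step
    x += d_center * math.cos(heading + d_theta / 2)
    y += d_center * math.sin(heading + d_theta / 2)
    return x, y, heading + d_theta

def record_track(wheel_readings, track_width=11.2):
    """Accumulate (x, y) waypoints while the bot follows the guide line.
    wheel_readings is an iterable of (d_left, d_right) increments."""
    x = y = heading = 0.0
    waypoints = [(x, y)]
    for d_left, d_right in wheel_readings:
        x, y, heading = update_pose(x, y, heading, d_left, d_right, track_width)
        waypoints.append((x, y))
    return waypoints
```

The recorded waypoint list is the virtual track model; the racing-line optimizer would then operate on it offline.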
  5. Glad to know of fellow 6DoF arm builders! I've actually stumbled on the concept of "Gradient Descent" for implementing inverse kinematics. But same as before, it is still making my head spin trying to understand and implement it in my code. Here's the link in case it helps you on your projects too: https://www.marginallyclever.com/2020/04/gradient-descent-inverse-kinematics-for-6dof-robot-arms/ Thanks for sharing your code! I'll take a look one of these days...
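For anyone curious, here's a minimal sketch of the gradient-descent idea from that article, applied to a planar arm with a numerical (finite-difference) gradient. The link lengths, learning rate, and starting pose are arbitrary assumptions, and a real 6DoF arm would do this in 3D:

```python
import math

def forward(angles, lengths):
    """End-effector (x, y) of a planar arm given joint angles in radians."""
    x = y = total = 0.0
    for a, l in zip(angles, lengths):
        total += a
        x += l * math.cos(total)
        y += l * math.sin(total)
    return x, y

def gradient_descent_ik(target, lengths, steps=5000, lr=0.001, eps=1e-5):
    """Nudge each joint in the direction that reduces the squared distance
    between the end-effector and the target."""
    angles = [0.3] * len(lengths)  # arbitrary slightly-bent start pose

    def cost(a):
        x, y = forward(a, lengths)
        return (x - target[0]) ** 2 + (y - target[1]) ** 2

    for _ in range(steps):
        for i in range(len(angles)):
            base = cost(angles)
            angles[i] += eps
            grad = (cost(angles) - base) / eps  # finite-difference slope
            angles[i] -= eps                    # undo the probe
            angles[i] -= lr * grad              # descend
    return angles
```

It is slow but very easy to follow, which is the appeal of the approach in that article: no Jacobian algebra, just "wiggle each joint and keep what helps."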
  6. Glad you found the thread and I'm looking forward to your updates as well. Would you be interested in collaborating on the programming? I envision something that can be adapted to any model of LEGO EV3 robot arm, perhaps by parameterizing the number of joints, the dimensions, etc., and the type of end-effector used. If you don't mind sharing your code on GitHub, I'm interested in comparing it with what I have already. Something I haven't shared publicly before is what got me started with making my robot arm. A year or so ago, I had an idea for an entry to the LEGO Mindstorms + Alexa Contest that would combine my fondness for LEGO, Nerf, and Alexa. That idea was a robotic arm with a target for Nerf blasters on its end-effector. The movement of the target would be randomized so that it's quite a challenge for target practice, kinda like skeet shooting. Alexa would be used for voice commands and sound effects. That idea proved hard to complete in the limited time before the contest deadline, so I pivoted to a simpler entry here - https://www.hackster.io/nino-guba/lego-target-practice-powered-by-mindstorms-ev3-and-alexa-fd1af8. Then at the beginning of the year, due to the pandemic quarantine, I had the chance to get back to working on my robotic arm. I initially planned to attach the ultrasonic sensor at the end-effector to detect when a Nerf Rival round passed through the target. I also thought of having a touch sensor activate when a physical target on the end-effector is hit. I went through several iterations of the arm, starting from 3 degrees of freedom, to 4, to 5, and eventually to 6, thanks to Akiyuki's wrist design for inspiration. At each iteration, I was also able to improve the range of motion and strength of the most critical joints - the shoulder and elbow. Lastly, I was able to motorize the end-effector using another medium EV3 motor.
Most builds I've seen, like yours, use pneumatics at the end-effector, which I don't have. My robot arm in its current form models industrial robot arms such as the KUKA Agilus. I think it's cool that I was able to build such a MOC! At some point I would still like to complete my original application for the project, but that will really require finishing the inverse kinematics implementation, which is also making my head spin. =) Good luck to us in completing our projects! Anyone else working on a 6DoF robot arm is welcome to post their progress here.
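The parameterization idea could start as small as this - a hypothetical sketch where each joint carries its port, gear ratio, and limits, so the same control code can drive arms with different geometries. All names and numbers here are made up for illustration:

```python
from dataclasses import dataclass

@dataclass
class Joint:
    """One arm joint; every name and number here is illustrative."""
    name: str
    motor_port: str      # e.g. "outA" on ev3dev
    gear_ratio: float    # motor degrees per joint degree
    min_deg: float
    max_deg: float

    def motor_target(self, joint_deg):
        """Clamp the requested joint angle and convert it to motor degrees."""
        clamped = max(self.min_deg, min(self.max_deg, joint_deg))
        return clamped * self.gear_ratio

@dataclass
class ArmConfig:
    joints: list
    end_effector: str    # e.g. "gripper", "pneumatic", "nerf-target"

SIX_DOF = ArmConfig(
    joints=[
        Joint("base",     "outA", 9.0,  -90.0, 90.0),
        Joint("shoulder", "outB", 15.0, -45.0, 100.0),
        # ...remaining joints would follow the same pattern
    ],
    end_effector="gripper",
)
```

With a config object like this, the motion code only ever talks about joint angles, and each builder fills in their own gear ratios and limits.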
  7. I think it would be better if the color dial and the distance sensor were placed on the opposite side, so you can easily operate them with your other hand.
  8. Looks like you and I are on a LEGO Ideas Contest submission spree. =) Best of luck!
  9. Thanks for this! I could definitely sit down and do the math. But I was leaning more toward making use of existing libraries so I won't have to reinvent the wheel. I actually already found this - https://github.com/nnadeau/pybotics - which I may be able to use with my program running on ev3dev. I wonder if anyone here has used it before...
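For context, a library like pybotics builds the robot model from Denavit-Hartenberg parameters; underneath, forward kinematics is just a chain of 4x4 link transforms. A generic sketch of that math (not pybotics' actual API):

```python
import math

def dh_transform(theta, d, a, alpha):
    """Standard Denavit-Hartenberg link transform as a 4x4 nested list."""
    ct, st = math.cos(theta), math.sin(theta)
    ca, sa = math.cos(alpha), math.sin(alpha)
    return [
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ]

def forward_kinematics(dh_params):
    """Chain the per-link transforms base-to-tip; the last column of the
    result holds the end-effector position."""
    T = [[float(i == j) for j in range(4)] for i in range(4)]  # identity
    for params in dh_params:
        A = dh_transform(*params)
        T = [[sum(T[i][k] * A[k][j] for k in range(4)) for j in range(4)]
             for i in range(4)]
    return T
```

A library mostly saves you from writing (and debugging) exactly this kind of matrix bookkeeping, plus it adds calibration and inverse solvers on top.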
  10. Thanks. I've seen that, and it actually helped me with the wrist mechanism of my arm, build-wise. Unfortunately, only sparse details were shared about how he programmed his arm, beyond this statement in his 2015 blog post. It is actually trivial to hardcode predetermined movements into your program - for example, moving the arm from a fixed starting point to a fixed destination to do something. The real challenge on the programming side is implementing inverse kinematics and RRT path planning, so it will be a breeze to make the arm do more concerted movements like complex pick-and-place routines. This is why I'm hoping to get a collaboration started and open-source the result.
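To make the RRT part concrete, here's a minimal 2D sketch of the algorithm on an obstacle-free plane: sample a random point, find the nearest tree node, and grow one step toward the sample. A real arm planner would sample in joint space and check collisions at each step; the bounds, step size, and goal bias here are arbitrary choices:

```python
import math
import random

def rrt(start, goal, step=1.0, goal_radius=1.0, max_iters=5000, seed=42):
    """Minimal 2D RRT. Returns the path from start to goal, or None."""
    rng = random.Random(seed)
    nodes = [start]
    parents = {start: None}
    for _ in range(max_iters):
        # 10% goal bias: occasionally steer straight at the goal
        if rng.random() < 0.1:
            sample = goal
        else:
            sample = (rng.uniform(0.0, 20.0), rng.uniform(0.0, 20.0))
        near = min(nodes, key=lambda n: math.dist(n, sample))
        d = math.dist(near, sample)
        if d == 0:
            continue
        # grow the tree one fixed step from `near` toward `sample`
        new = (near[0] + step * (sample[0] - near[0]) / d,
               near[1] + step * (sample[1] - near[1]) / d)
        nodes.append(new)
        parents[new] = near
        if math.dist(new, goal) <= goal_radius:
            path, node = [], new
            while node is not None:      # walk parent links back to start
                path.append(node)
                node = parents[node]
            return path[::-1]
    return None
```

Swapping the 2D sampling for joint-angle sampling plus a collision check is what turns this toy into an arm planner - that's where the collaboration effort would go.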
  11. I found this for using Matlab with python. Haven't tried it yet though. https://www.mathworks.com/products/matlab/matlab-and-python.html
  12. If you want your build to react to music, you can use Alexa with your EV3. See here for detailed instructions on how it can be done. https://www.hackster.io/alexagadgets/lego-mindstorms-voice-challenge-mission-2-599072
  13. Hello, First topic post here. =) So I've built a 6DoF robot arm with a motorized end-effector and have so far only done basic programming of the joint movements using Python (ev3dev). I have looked at implementations of inverse kinematics and RRT path planning on other robot platforms and would like to implement them for my arm. Has anyone here done this before and would like to share how? I'm looking for collaborators on the programming side if there is any interest. Here's the link to my very basic code so far: https://github.com/ninoguba/ev3-robotic-arm Thanks, Nino
  14. I understand; however, my post was really in response to a recent question about the robot arm Akiyuki used for his GBC. I apologize if it was somewhat off-topic. And by the way, I really appreciate all the great info you've collected here in this mega thread. Kudos!
  15. Hi! I'm new here and just stumbled on this great thread thanks to watching Beyond The Brick's video featuring Akiyuki's GBC modules. I'll certainly be coming here often for knowledge and inspiration for building GBC modules for my next projects.