Jim

[EV3DEV] Venturing into the World of EV3Dev and Python


Since you are into MQTT, you might like this project: David Lechner made a MicroPython program for Jason's telegraph and printer, using test.mosquitto.org (or any other broker you want), so that several EV3 telegraphs can talk to each other all over the world.

I added two bash scripts to allow using a text console if you don't have a telegraph.

I still have my telegraph assembled somewhere (used at a PLUG exhibition some months ago; kids like to test Morse code on it), so if you decide to give it a try just send me a note.
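If you want to poke at the broker side without any hardware, a Morse encoder plus a publish call is about all a client needs. A minimal sketch; the topic name and the paho-mqtt usage in the comments are assumptions for illustration, not necessarily what David's program expects:

```python
# Morse encoder for feeding an EV3 telegraph over MQTT - a sketch.

MORSE = {
    "A": ".-", "B": "-...", "C": "-.-.", "D": "-..", "E": ".",
    "F": "..-.", "G": "--.", "H": "....", "I": "..", "J": ".---",
    "K": "-.-", "L": ".-..", "M": "--", "N": "-.", "O": "---",
    "P": ".--.", "Q": "--.-", "R": ".-.", "S": "...", "T": "-",
    "U": "..-", "V": "...-", "W": ".--", "X": "-..-", "Y": "-.--",
    "Z": "--..", " ": "/",
}

def encode(text):
    """Encode text as Morse, letters separated by spaces, words by '/'."""
    return " ".join(MORSE[c] for c in text.upper() if c in MORSE)

print(encode("SOS"))  # ... --- ...

# Publishing would then be (requires the paho-mqtt package; topic is made up):
#   import paho.mqtt.client as mqtt
#   client = mqtt.Client()
#   client.connect("test.mosquitto.org", 1883)
#   client.publish("ev3telegraph/out", encode("hello world"))
```

Anything published to the broker would then reach every telegraph subscribed to that topic.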

 


Sweet! I will definitely try it out. Not sure when, but I will look into it.

So much interesting stuff happening in this forum :wub:

Shall I split the MQTT discussion into its own topic?

7 hours ago, MajorAlvega said:

It was a great idea to emancipate Robotics from Technic. Not many active users yet, but at least one seems very happy :)

I am a very happy camper. Basically, I created my own Helpdesk :laugh:

I sure hope more people will find this forum in the future. And maybe it will get a boost when a new Mindstorms set is released.


Oh, the new MINDSTORMS with an 8-core Raspberry Pi inside, 1 TB of RAM and Bluetooth 5 and 5G :D  :D :D

Everybody is talking about it, but I wouldn't put much hope on it yet... probably Technic will get revamped first, with some new Powered Up devices, and then we might start dreaming of the successor of the EV3.


Yup hehe. And in the meantime I will have my BrickPis and robot fully functional, and I probably won't want to switch to a new system :laugh:

So I am not in a hurry. Let's wait a year or two :wink:


An 8-core RPi inside might be useful. I have this crazy idea of making our train layout's windmills follow people as they walk by. I am not sure if I should let the EV3 do the image processing, or have a Raspberry Pi do the imaging work and tell the EV3 where to aim. Which version of OpenCV works with ev3dev?

 


Good luck @Jim! As I said, I've only used Event Hubs in Azure, but I'm happy to help if I can.

I know it's a bit frustrating starting with MQTT, but I guarantee this will be better in the long run than using WebSockets. It's how all modern IoT applications are architected: because you don't need a constant connection, it can handle dropouts more easily.

If I had the hardware I'd give it a bash myself (not that I'm suggesting I'd do it any faster than you, but two of us working together would surely be faster), but I'm totally skint this month, so I can't buy anything.

7 hours ago, dr_spock said:

An 8-core RPi inside might be useful. I have this crazy idea of making our train layout's windmills follow people as they walk by. I am not sure if I should let the EV3 do the image processing, or have a Raspberry Pi do the imaging work and tell the EV3 where to aim. Which version of OpenCV works with ev3dev?

 

The current version, I believe. But don't try to do the image processing on the EV3; use an RPi. Some 2 years ago I installed the (then) current release of OpenCV and did some face recognition... it took 4 minutes on the EV3 (or more, I'm not sure now).
To follow people you need to offload the processing from the EV3, e.g. to a Pixycam, a smartphone, or "the cloud". Or forget imaging altogether and use something else, like sound or temperature or even an iBeacon.
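The division of labour described above (the Pi finds the face, the EV3 aims the windmill) only needs one number to cross the wire. A minimal sketch of that mapping; the function name and the field-of-view value are assumptions for illustration:

```python
def aim_angle(face_x, frame_width, fov_degrees=60.0):
    """Map a face's horizontal pixel position to an aiming angle.

    Returns degrees off-center: negative = left, positive = right.
    fov_degrees is the camera's horizontal field of view (assumed 60 here).
    """
    offset = (face_x - frame_width / 2) / (frame_width / 2)  # -1.0 .. +1.0
    return offset * (fov_degrees / 2)

# A face dead-center in a 640-px-wide frame needs no correction:
print(aim_angle(320, 640))   # 0.0
# A face at the right edge maps to half the field of view:
print(aim_angle(640, 640))   # 30.0
```

On the Pi, OpenCV's face detector would supply face_x; the resulting angle can then be shipped to the EV3 over MQTT or a plain socket and fed to a motor position command.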

20 minutes ago, MajorAlvega said:

The current version, I believe. But don't try to do the image processing on the EV3; use an RPi. Some 2 years ago I installed the (then) current release of OpenCV and did some face recognition... it took 4 minutes on the EV3 (or more, I'm not sure now).
To follow people you need to offload the processing from the EV3, e.g. to a Pixycam, a smartphone, or "the cloud". Or forget imaging altogether and use something else, like sound or temperature or even an iBeacon.

Does face recognition work well with a Raspberry Pi 3 and a camera like this one?

52 minutes ago, Basiliscus said:

Good luck @Jim! As I said, I've only used Event Hubs in Azure, but I'm happy to help if I can.

I know it's a bit frustrating starting with MQTT, but I guarantee this will be better in the long run than using WebSockets. It's how all modern IoT applications are architected: because you don't need a constant connection, it can handle dropouts more easily.

If I had the hardware I'd give it a bash myself (not that I'm suggesting I'd do it any faster than you, but two of us working together would surely be faster), but I'm totally skint this month, so I can't buy anything.

You are already helping me a lot by giving me pointers like Azure Event Hubs, MQTT, etc.

You don't have any hardware for ev3dev? Or do you have a Raspberry Pi?

I would love to work on something together. Maybe @MajorAlvega can join us. I will be very busy with reviews during the next couple of weeks, so I won't have much spare time, but maybe we can work something out for next year. A Raspberry Pi won't break the bank.


I can't follow the link, but a few years ago the Raspberry Pi magazine had a project for a security lock with an RPi and the RPi camera, opening the lock via face recognition in real time.

A regular USB webcam will work fine with the RPi; you don't need much bandwidth or resolution for face recognition. Something like a Logitech C170/C270.


It's a Raspberry Pi 3 Model B+ Camera Kit 5MP Focal Adjustable Night Version Camera+Holder +IR Light +FFC Cable for Raspberry Pi Zero W


So like this?

Yes, 5 MP is more than enough for facial recognition, and the CSI connection has lots of bandwidth (though you really just need a single shot for facial recognition, unless you want to track someone in real time).

Usually night-vision cameras aren't good for daylight pictures, but for AI it really doesn't matter.


@Jim: Yeah it's just earlier this year I bought a new car, got our bathroom done and our garden landscaped. But this is the last month of austerity, and starting in January 2019 I'll have disposable income again.

I don't have an EV3 brick or a Raspberry Pi, although I have considered buying the latter. I'm not much of a gears and Technic guy so the EV3 doesn't immediately appeal, especially given its price, but maybe one day.

50 minutes ago, Basiliscus said:

@Jim: Yeah it's just earlier this year I bought a new car, got our bathroom done and our garden landscaped. But this is the last month of austerity, and starting in January 2019 I'll have disposable income again.

I don't have an EV3 brick or a Raspberry Pi, although I have considered buying the latter. I'm not much of a gears and Technic guy so the EV3 doesn't immediately appeal, especially given its price, but maybe one day.

Sounds like you have done all right in 2018 :wink: 

For most of these projects, a Raspberry Pi will suffice. It would be great if you get a Pi so we can do some testing.

3 hours ago, Jim said:

Sounds like you have done all right in 2018 :wink: 

For most of these projects, a Raspberry Pi will suffice. It would be great if you get a Pi so we can do some testing.

Thanks!

It's been on my to-do list for a while now, so I'll get one in January and then at least I'll have one of the components required to keep going with this project.

It might spur me on to finish my photo categoriser too!

Edited by Basiliscus


I use a $5 Pi cam on my Pi. I got it to use with OctoPrint on the Pi to monitor my 3D printer, and I 3D-printed a Pi camera holder for it. You could probably tether your DSLR too.


 

18 hours ago, MajorAlvega said:

The current version, I believe. But don't try to do the image processing on the EV3; use an RPi. Some 2 years ago I installed the (then) current release of OpenCV and did some face recognition... it took 4 minutes on the EV3 (or more, I'm not sure now).
To follow people you need to offload the processing from the EV3, e.g. to a Pixycam, a smartphone, or "the cloud". Or forget imaging altogether and use something else, like sound or temperature or even an iBeacon.

Thanks for the tips. I'll offload the processing from the EV3. OpenCV 4 compiled on my Pi after adding a USB hard drive for more swap space, but when I tried import cv in Python, it couldn't find the OpenCV module. More stuff to debug...
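A likely cause, for what it's worth: OpenCV 3/4 installs its Python bindings as cv2 (there is no plain cv module any more), and a self-compiled build can land in one interpreter's site-packages while "python" launches another. A stdlib-only way to see what the running interpreter can actually find:

```python
import sys
import importlib.util

# Which interpreter is actually running? A build installed for python3
# is invisible to python2's "python" on many Pi images.
print(sys.executable, sys.version_info[:2])

# OpenCV's module is named cv2, not cv.
spec = importlib.util.find_spec("cv2")
if spec is None:
    print("cv2 is not on sys.path; compare the build's install prefix with:")
    for path in sys.path:
        print(" ", path)
else:
    print("cv2 found at", spec.origin)
```

If find_spec comes back empty, the fix is usually pointing the build's CMAKE install prefix (or a symlink) at the right site-packages directory.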

Thermal imaging and tracking human shapes sounds like a Predator movie. Maybe I can also turn it into an anti-squirrel water cannon to keep them from eating my pear tree.  :devil_laugh:

 

15 hours ago, Basiliscus said:

It's been on my to-do list for a while now, so I'll get one in January and then at least I'll have one of the components required to keep going with this project.

It might spur me on to finish my photo categoriser too!

Sounds like a plan! :sweet:


I haven't tried it yet, but OpenCV is supported in Python 3 now. Agreed on not attempting to run OpenCV on an EV3... it is painfully slow :( Offloading to a Pi 3, laptop, etc. is well worth it.


What is the best way to synchronize two motors? I am now using MoveTank, which does work, but I see some unexpected behavior when I test it.

Motors running at 50% power: it looks like the motors are speeding up and slowing down, and it takes some time for them to synchronize (make sure to turn on the sound).

Motors running at 100% power: at 100% things seem to run more smoothly.

 


That is weird... it is like they are fighting each other. If you remove the big black gear connecting them, does that make any difference (just as an experiment)?

MoveTank is the way to go though; it will start the motors as close together as possible.

1 hour ago, dwalton76 said:

That is weird... it is like they are fighting each other. If you remove the big black gear connecting them, does that make any difference (just as an experiment)?

That is indeed what it looks like. Good point, I can test without the turntable.

I am not sure whether I will keep this gearing, btw. Maybe I will switch back to smaller gears, since the final robot will be quite heavy. But that is a different discussion.

I will do some testing tonight.


Thanks for the tip. I made the most obvious mistake: only one of the motors was turning *apply face to palm*

Yesterday wasn't my brightest day :(

Is it possible to change the polarity of one of the motors, since they are countering each other's movement now?


This works like a charm:

mb = LargeMotor("spi0.1:MF")
mc = LargeMotor("spi0.1:MG")

mb.on(speed=-50, block=False)
mc.on(speed=50, block=False)

But I am wondering if I can use MoveTank with one reversed polarity motor.
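It should work with MoveTank too, for what it's worth: in python-ev3dev2 each motor exposes a polarity attribute ('normal' / 'inversed'), and MoveTank.on() takes independent left and right speeds, so either approach inverts one side. A sketch; the port addresses are taken from the snippet above, and the hardware calls are left as comments so the sign logic stands on its own:

```python
# Two motors meshed into the same turntable gear must spin in opposite
# directions to drive it the same way, so one side's speed is negated.

def opposed_speeds(speed):
    """Speeds for two motors facing each other on one turntable."""
    return speed, -speed

left, right = opposed_speeds(50)
print(left, right)  # 50 -50

# On the brick (assuming python-ev3dev2's MoveTank API):
#   from ev3dev2.motor import MoveTank
#   tank = MoveTank("spi0.1:MF", "spi0.1:MG")
#   tank.on(left_speed=left, right_speed=right)
#
# Alternatively, invert one motor once and then use equal speeds everywhere:
#   mb.polarity = "inversed"
```

The polarity route is nicer if the mechanical inversion is permanent, since the rest of the code can then treat both motors identically.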

