Everything posted by MajorAlvega
-
Boost pairing trouble
MajorAlvega replied to Manor's topic in LEGO Technic, Mindstorms, Model Team and Scale Modeling
Hi. The BLE protocol used by BOOST and Powered Up doesn't use the pairing mechanism of conventional Bluetooth. Just in case, close the Bluetooth menu and disable and re-enable Bluetooth on your device before starting the LEGO App.
-
Not for ev3dev specifically. There are several BASIC packages in the Debian repositories. At least one (bwbasic) is still available for the architecture of the EV3 ('armel', not the modern 'armhf' like the Raspberry Pi), so you can run bwbasic on the EV3. But since there are no bindings for the EV3 hardware, you cannot make direct use of things like motors or sensors - you will need to make system calls to external commands (like bash or python scripts). Probably not what you are interested in. If you really need BASIC, look for ev3basic.
-
Great to watch your progress. About the Wi-Fi dongles, I mostly use Enermax and the ASUS N10 Nano. There are a few more - most (but not all) dongles that work with the Raspberry Pi's native OS (Raspbian) will also work. A few based on a specific Realtek chipset have some long-run problems, but recent kernel updates have made things better.
-
Maybe... but I still need to decide what this POC is about :) Perhaps just "[WIP] Automated Imperial Conveyex Transport" is enough for now? That's the only thing I'm certain of - automation. It will have MINDSTORMS, Powered Up, most probably Power Functions, probably RFID... a big mess for sure.
- 6 replies
- Tagged with: mindstorms, poweredup (and 2 more)
-
Great - welcome to the Dark Side ;)

About SD cards, there are some differences, but they are not so important unless your programs read and write a lot on the card, or use so much memory that they need a big swap file on disk (the default swap file is in RAM, compressed, to speed things up). David Lechner made a small benchmark and I also tested 2 or 3 of the cards available here in Portugal, in line with his findings.

The available bindings for ev3dev, as far as I know, do not support daisy chaining. The python documentation suggests RPyC and I've used it two or three times; it's powerful but requires python on all sides, so for my needs I prefer to use mosquitto (MQTT) to pass data to/from EV3, Arduino, laptop, Android... It's a light message-queuing protocol that was recently rediscovered in IoT projects. I wrote a small tutorial for the ev3dev site, and if you search YouTube for ev3dev and MQTT you will find a few examples. MQTT is asynchronous, so it might not suit your needs if you require fine synchronization. But you can use other libraries and try UDP... or open an issue in ev3dev, someone more expert will help.

Another note for python... the armel 1-core CPU and just 64 MB of RAM are bad for python initialization times - some programs can take 15 to 30 seconds to start before you can use them. David Lechner also ported a micropython version, less complete than full python but much lighter and faster... have a look.
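To illustrate the plain-UDP alternative mentioned above, here is a minimal stdlib sketch with both ends on localhost. On a real setup the EV3 and the laptop would each run one side, with the EV3's IP address in place of 127.0.0.1; the port number and message format here are arbitrary examples.

```python
# Minimal UDP round trip: a receiver and a sender exchanging one datagram
# on localhost. No MQTT broker needed, but also no delivery guarantees.
import socket

def udp_round_trip(message: bytes, port: int = 50007) -> bytes:
    """Send one datagram to a local receiver and return what arrived."""
    receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    receiver.bind(("127.0.0.1", port))
    receiver.settimeout(2.0)          # don't hang forever if the packet is lost

    sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sender.sendto(message, ("127.0.0.1", port))

    data, _addr = receiver.recvfrom(1024)
    sender.close()
    receiver.close()
    return data

print(udp_round_trip(b"motor:run"))  # prints b'motor:run'
```

Unlike MQTT there is no broker and no queuing, so this only suits cases where losing the odd message is acceptable.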
-
This is a new project I just started. Like most of my projects it will probably never be completed, but it will be a good showcase of ideas for other projects. I usually don't do MOCs - I just like to automate things and add some unusual effects.

For the last two years I've been collecting monorail parts with the idea of making something monorail-based, with a few 4DBrix parts to control my budget and also to let me control the movement of the train. When LEGO launched the 75217 Imperial Conveyex Transport I had the idea of using it as a monorail, but adding the new Powered Up hub so I can control it with a MINDSTORMS EV3.

So this is the first proof of concept: the EV3 picks up the Conveyex and makes it move until an ultrasonic sensor detects it:

To connect the monorail motor to the PUp hub I modded a PUp lights cable, soldering an old 9V plug to the power pins (1 and 2) of the cable:

This makes the monorail motor be seen as a Light by the Powered Up (and BOOST) hub, so it only works with apps that accept Lights - but for me it doesn't matter, since I intend to use my own bash/python scripts. For manual control and my kids' pleasure it also works fine with the PUp train handset:

Roadmap:
- add a micro-motor to the Conveyex to also control the linked chain track
- use EV3 motors (or perhaps 4DBrix motors) to control the monorail Y's
- sell a kidney to get a few more monorail parts
- integrate the monorail circuit in a Star Wars diorama, with the help of a PLUG mate who is an expert in the Star Wars universe

The MINDSTORMS EV3 is running ev3dev (linux). This allows me to use a Bluetooth USB dongle to talk BLE with the Powered Up hub. It will also allow me to use some gadgets like NFC sensors to detect the train position, or the IRLink to control some Power Functions automations (like the AT-ST head).

Questions and suggestions are welcome.
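For the curious, driving a "Light"-type device from a script boils down to writing a short LEGO Wireless Protocol message to the hub over BLE. The sketch below builds such a message; the byte values follow my reading of the published LWP 3.0 documentation, so treat them as assumptions to verify against the spec before relying on them.

```python
# Sketch of an LWP 3.0 "Port Output Command" that sets the power of a device
# the hub sees as a light/simple motor. The resulting bytes would be written
# to the hub's LWP characteristic with any BLE tool (e.g. gatttool or the
# python bleak library). Constants are from the LWP 3.0 docs - verify them.

def port_output_set_power(port: int, power: int) -> bytes:
    """Build a WriteDirectModeData message setting mode-0 power (-100..100)."""
    if not -100 <= power <= 100:
        raise ValueError("power must be in -100..100")
    payload = bytes([
        0x00,          # hub id (always 0)
        0x81,          # message type: Port Output Command
        port,          # port id (0 = port A on a Powered Up hub)
        0x11,          # startup/completion: execute immediately + feedback
        0x51,          # subcommand: WriteDirectModeData
        0x00,          # mode 0
        power & 0xFF,  # signed power value as one byte
    ])
    # The first byte is the total message length, including itself.
    return bytes([len(payload) + 1]) + payload

print(port_output_set_power(0, 100).hex())  # prints 0800810011510064
```

Sending `port_output_set_power(0, 0)` would stop the motor again; negative values reverse it.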
-
Thanks! There were two LEGOcams:
https://www.bricklink.com/v2/catalog/catalogitem.page?P=x86&idColor=14#T=C&C=14
https://www.bricklink.com/v2/catalog/catalogitem.page?P=x86px1#T=C

They were electrically the same; just the color of the top cover and the "focus" mechanism were different. The first was part of the MINDSTORMS RCX Vision Command set:

And the second one was part of LEGO Studios (DACTA Education division) and also the Steven Spielberg MovieMaker set:

You might have problems using them on current Windows, but they still work with linux, including the Raspberry Pi. Just 2 notes:
- the EV3 USB port is 1.x, so the bandwidth is not enough for full webcam capabilities. I'm not sure now, but I think the best you can get from it is 320x160 or so.
- image quality is bad - you need good light (I used Power Functions lights) and also need to focus manually to get something useful
-
This is a robot I made almost two years ago as a proof of concept after a talk with a friend. He wanted to use a Raspberry Pi to extract data from invoices written on paper, and I searched for a few OCR tools for him to use - he took a photo with his mobile phone and the tool extracted the text. We also used Google Docs for the same purpose. Then I realized that if we could do it with a RPi we could also do it with an EV3:

The robot waits for the user to press the touch sensor, then turns the Power Functions lights ON over the cartridge and uses an old LEGOcam to take a photo of the inner part. Common linux tools are used: fswebcam to take the photo, ImageMagick and textcleaner to improve the image, and tesseract-ocr to get the text. In between it says a few things with espeak to keep us from falling asleep, because the ARM processor of the EV3 is slow.

There is a USB hub because I was using Wi-Fi and the LEGOcam at the same time, but it isn't really needed if you choose to use Bluetooth (or no communications at all). Being linux, you can use other webcams instead of the LEGOcam with its long and thick USB cable (like the Logitech C170), as long as the kernel recognizes them (most UVC-based ones work).
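The capture-clean-OCR-speak chain described above can be sketched as a list of shell commands driven from python. This is not the author's actual script: the flags are typical ones for each tool (check the man pages) and the file names are placeholders.

```python
# Sketch of the OCR pipeline: each entry is an argument list ready for
# subprocess.run(). Flags are common defaults, not verified against the
# original project; file names are placeholders.
def ocr_pipeline(raw="cartridge.jpg", clean="cartridge.png", out="cartridge"):
    """Return the commands: capture, clean up, OCR, then talk to the user."""
    return [
        ["fswebcam", "-r", "320x240", "--no-banner", raw],             # take the photo
        ["convert", raw, "-colorspace", "Gray", "-normalize", clean],  # ImageMagick clean-up
        ["tesseract", clean, out],                                     # writes <out>.txt
        ["espeak", "Done reading the cartridge."],                     # keep us awake
    ]

# On the EV3 one would then run:
#   import subprocess
#   for cmd in ocr_pipeline():
#       subprocess.run(cmd, check=True)
```

Keeping the pipeline as data makes it easy to swap in textcleaner or extra ImageMagick passes between the capture and OCR steps.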
-
ev3dev as the OS, undoubtedly.

I am not a programmer - I am an electronics engineer with a background in robotics, and I had to program at university, but I really don't program on a regular basis. But I'm a good googler and can tweak my scripts to do what I need them to do, even if ugly and/or inefficient.

I started using ev3dev because it is Debian-based, like Raspbian (the OS for the Raspberry Pi) and Ubuntu (the OS I use on my job's computer). So I can test almost everything on my laptop, transfer it to ev3dev or the Pi, and with a few minor adjustments it works. Also, ev3dev is a full OS - if you have some gadget or protocol you want to use with your robot/automated MOC, you can probably use it with ev3dev. Fingerprint reader? NFC reader? Webcam? Printer? Chromecast? Bluetooth speaker? Google Cloud API? Bluetooth BLE devices like the SBrick, the WeDo 2.0, the BOOST, Powered Up? Arduino? IoT? Been there, done that (OK, I might be bragging... a bit :D )

Honestly, I wish the next MINDSTORMS would have just a few more MHz and something like ev3dev inside... it would be an awesome platform.
-
I think so, but I never used it... I have used leJOS 2 or 3 times (with Eclipse) but I have a long-lasting problem with Java since the 90's that never really went away :) There is a Facebook group for discussing leJOS and ev3dev-lang-java; Juan Antonio Breña Moral is active there and he is responsible for ev3dev-lang-java. And there is also a Facebook group for ev3dev, but it's not so active - the github site is much more active.
-
That's the part about bringing leJOS into ev3dev :) The idea is implementing a Java runtime environment on top of the linux operating system, like leJOS does. But leJOS uses a very light and limited version of a linux system (like the EV3 itself does when you are using only the native firmware). No problem with that... unless you want to use a device or a library not supported by leJOS. Also, leJOS uses a rather old JRE version, while this project tries to use a much more recent one (last month they announced OpenJDK 12).
https://ev3dev-lang-java.github.io/docs/support/about/ev3dev-java.html

One of the interesting things in this project is the inclusion of OpenCV (the Open Source Computer Vision Library). Since ev3dev natively supports the old LEGO USB camera (and also modern USB webcams, with some limitations), you can use Java on your EV3 robot to recognize objects with just LEGO components.

Of course you can use a webcam and OpenCV without Java: just install pure ev3dev, then install OpenCV and use the language of your choice like python or C... or even just bash scripting if you are already an average linux user. Almost everything that you can do with a Raspberry Pi or a linux laptop is possible on the EV3 (and if you have a BrickPi then everything really is possible, since the CPU is much more powerful and the better architecture means many more libraries/packages are already available).
-
Hi Jim. Could you please add the ev3dev website (https://www.ev3dev.org/) to the Index?

ev3dev is a linux distribution initially built for the EV3 but now also supporting the Raspberry Pi, the BeagleBone and some 3rd-party MINDSTORMS-compatible platforms like the BrickPi. ev3dev is also much more than 'just' an operating system - it includes several projects like 'lego-linux-drivers', which allows any (modern) linux system to use LEGO devices (for instance, using the WeDo USB Hub on a laptop), and also some language bindings that allow using python, java, C or javascript with MINDSTORMS hardware. There is even a project that brings leJOS to it, allowing leJOS users to escape from some limitations of the main leJOS project.

I'm not a member of the ev3dev team but I use it for some projects, and I'll be happy to answer (or at least forward) any questions.
-
Fresh new Energizer AA alkaline batteries. ev3dev reports:

cat /sys/class/power_supply/lego-ev3-battery/voltage_now
9177466

That value is in microvolts, so about 9.18 V. When the micro-motor is running, my voltmeter reads 9.02 V. The voltage drop is really small, but the micro-motor only uses a few mA, so I would expect just a little more than 8 Volt with a medium EV3 servo motor and a moderate load.
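Since the linux power_supply sysfs class reports `voltage_now` in microvolts, a tiny helper makes the reading usable in a script. Reading the path obviously only works on an EV3 running ev3dev; the conversion itself is plain arithmetic.

```python
# Convert the microvolt value that ev3dev exposes in sysfs into volts.
# The path is the one quoted above and only exists on an EV3 running ev3dev.
BATTERY = "/sys/class/power_supply/lego-ev3-battery/voltage_now"

def microvolts_to_volts(uv: int) -> float:
    """sysfs power_supply values are in microvolts."""
    return uv / 1_000_000

def read_battery_volts(path: str = BATTERY) -> float:
    """Read the battery voltage from sysfs (EV3/ev3dev only)."""
    with open(path) as f:
        return microvolts_to_volts(int(f.read().strip()))

print(round(microvolts_to_volts(9177466), 2))  # prints 9.18
```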
-
I am not sure... the last page of the main hardware schematics has "Battery Positive" directly connected to "VCC9VBAT" and, after that, through R245 (a 1.1A/16V protection component, a resettable fuse?) to VCC9V. And the motor drivers (LB1836M, not LB1936 as posted above) use these voltages (VS1 and VS2). The "5V Buck Boost 2A" part is the TPS40210DRCR circuit; it converts VCC9V into 5V.

But now I got curious and decided to measure. With the EV3 LiPo, I get 6.9V (cheap chinese multimeter) with the old 9V micro-motor connected to OUT A through a NXT/EV3-to-RCX cable adapter. I don't have fresh AA batteries to test, so I will have to wait a day or two.
-
Sure! I have two different Pololu Maestros:
https://www.pololu.com/category/102/maestro-usb-servo-controllers

They have several models; I have a 24-channel and a 6-channel. I have really only used the 24-ch, and never all the channels - I don't have enough RC servos for that (but I'm collecting some chinese small servos with a LEGO form factor, the Kittenbot Geek Servo). You can see both (Maestro and Kittenbot) in action here:

The Maestro is USB-based, but it works like a COM port and accepts simple messages. There is at least one program for Windows and another for Linux, but I simply use my own (linux) bash or python scripts; they work seamlessly on my laptop, my Raspberry Pi and my EV3 (ev3dev).

Now for the EV3 output port... no, I didn't measure. I know that it uses an h-bridge circuit but I didn't find the reference; I just found this post from Dexter where they state they used a SN754410NE on the first version of the BrickPi (https://www.dexterindustries.com/howto/lego-mindstorms-motors-with-raspberry-pi-brickpi-0-1/). But the SN754410NE isn't an efficient circuit: if I read the datasheet correctly, it typically drops 1 volt on each transistor, so with a 7.4 Volt LiPo you would get just 5 Volt - that's terrible. I have the idea (probably wrong) that both the EV3 and the Power Functions IR receiver share the same output circuit, the LB1936, which has a lower saturation voltage.
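The "simple messages" the Maestro accepts over its virtual COM port are, as I read the Maestro user's guide, the "compact protocol": a 0x84 command byte, the channel, and the target pulse width in quarter-microseconds split into two 7-bit bytes. The sketch below just builds those bytes; writing them to the serial port (e.g. with pyserial) is what actually moves the servo.

```python
# Sketch of the Pololu Maestro compact-protocol "set target" command, per my
# reading of the Maestro user's guide - verify against the official docs.
def maestro_set_target(channel: int, pulse_us: float) -> bytes:
    """Build the 4-byte command to move one servo channel to pulse_us."""
    target = int(round(pulse_us * 4))  # Maestro targets are in 0.25 us units
    return bytes([
        0x84,                 # compact-protocol "set target" command
        channel,              # servo channel number
        target & 0x7F,        # low 7 bits of the target
        (target >> 7) & 0x7F, # high 7 bits of the target
    ])

print(maestro_set_target(0, 1500).hex())  # prints 8400702e

# On a real setup (requires pyserial and the Maestro's serial device):
#   import serial
#   with serial.Serial("/dev/ttyACM0") as port:
#       port.write(maestro_set_target(0, 1500))  # center the servo on channel 0
```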
-
Nice. Since I use ev3dev I prefer to use a USB servo controller, so I don't need to use an EV3 OUT port, and if/when I'm crazy enough I can use more than 8 RC servos. But I keep looking at this Mindsensors device - one of these days I might get one.

Just a note: the EV3 servos use more than 5 V. The output ports can give almost the full battery voltage, so around 7V with the EV3 LiPo battery and around 9V with a good pack of 6 alkaline AA batteries.
-
A little off-topic, but leJOS is not the only option for a programming environment. ev3basic has a good IDE, and if you are running ev3dev (a linux distro for the EV3) you can use Microsoft Visual Studio Code as an IDE for several languages - python is probably the best supported - or use your favorite IDE like PyCharm and then transfer the program to the EV3 through SSH.
-
[EV3] Mindstorms Daisy Chaining
MajorAlvega replied to Jim's topic in LEGO Technic, Mindstorms, Model Team and Scale Modeling
Try the MINDSTORMS EV3 Facebook group; there are a few guys there with a background in big models and daisy chaining who can help you. Unfortunately I don't use EV3-G, so I can't help you.