Search the Community

Showing results for tags 'ev3dev'.



Found 8 results

  1. I'm busy automating a LEGO railway crossing and have written the code below. The code works fine, but motor D only starts once motor A has finished. Is there a way to start both motors at the same time?

     #!/usr/bin/env pybricks-micropython
     from pybricks.hubs import EV3Brick
     from pybricks.ev3devices import (Motor, TouchSensor, ColorSensor,
                                      InfraredSensor, UltrasonicSensor, GyroSensor)
     from pybricks.parameters import Port, Stop, Direction, Button, Color
     from pybricks.tools import wait, StopWatch, DataLog
     from pybricks.robotics import DriveBase
     from pybricks.media.ev3dev import SoundFile, ImageFile

     # Initialize the EV3 Brick.
     ev3 = EV3Brick()
     sensor_1 = UltrasonicSensor(Port.S1)
     motor_A = Motor(Port.A)
     motor_D = Motor(Port.D)

     # Parameters
     speed = 6*165*10        # [deg/s]
     rotation_angle = 24*90  # gear ratio 24:1
     dis_track_1 = 65        # [mm]
     t1 = 10*1000            # [ms]

     while True:
         if sensor_1.distance() < dis_track_1:
             # Close railway crossing.
             ev3.light.on(Color.RED)
             motor_A.run_angle(speed=speed, rotation_angle=-rotation_angle, then=Stop.HOLD, wait=True)
             motor_D.run_angle(speed=speed, rotation_angle=-rotation_angle, then=Stop.HOLD, wait=True)
             # Wait 10 seconds.
             wait(t1)
             # Open railway crossing.
             motor_A.run_angle(speed=speed, rotation_angle=rotation_angle, then=Stop.HOLD, wait=True)
             motor_D.run_angle(speed=speed, rotation_angle=rotation_angle, then=Stop.HOLD, wait=True)
             ev3.light.on(Color.GREEN)
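     A minimal sketch of one possible fix, assuming the same Pybricks setup as above: run_angle() accepts wait=False, which makes the call return immediately, so the next command can start motor D while motor A is still moving. Since both motors run the same angle at the same speed they finish together; control.done() can be polled if an explicit check is wanted.

         # Close the crossing: start motor A without blocking, then motor D.
         motor_A.run_angle(speed, -rotation_angle, then=Stop.HOLD, wait=False)
         motor_D.run_angle(speed, -rotation_angle, then=Stop.HOLD, wait=True)
         # Optionally make sure motor A has also finished before continuing.
         while not motor_A.control.done():
             wait(10)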
  2. Did you know that any standard EV3 brick is capable of controlling the LEDs separately? And that the display can show 4 shades of gray? No, this extra functionality is not available via the standard programming environment that LEGO provides. But if you use low-level programming (I used ev3dev in combination with C++), you can. See the example below. You can read my article at our blog here: https://siouxnetontrack.wordpress.com/2020/04/25/lego-mindstorms-ev3-with-an-image-with-4-different-shades-of-gray/ Or have a look at the YouTube videos. Enjoy. Hans
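     For a rough idea of the low-level approach (the post itself used C++), here is a minimal Python sketch. It assumes an ev3dev-stretch system, where the brick-status LEDs are exposed through sysfs; the LED names vary between ev3dev releases, so check /sys/class/leds on your own brick.

         import os

         LED_DIR = "/sys/class/leds"

         def set_led(name, brightness):
             # name is e.g. "led0:red:brick-status"; brightness is 0-255.
             with open(os.path.join(LED_DIR, name, "brightness"), "w") as f:
                 f.write(str(brightness))

         # Mix red and green independently on each side of the brick.
         set_led("led0:red:brick-status", 255)    # left LED: full red...
         set_led("led0:green:brick-status", 60)   # ...plus a little green -> orange
         set_led("led1:red:brick-status", 0)
         set_led("led1:green:brick-status", 255)  # right LED: pure green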
  3. Hello, I have assembled a LEGO robot arm; my project also uses a Kinect from an Xbox. The code processes the position of my hand and sends it to the EV3 control unit. How can I make the motor lift the arm by a certain number of degrees? Say I set y > 0: my packet is sent and the robot raises its arm to a certain height; then I lower my hand, y becomes < 0, and the arm goes down too. The code I tried is attached, but it turns the motor endlessly. - client (my PC) - server (EV3)
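     The endless turning usually happens when the move command is re-issued on every loop iteration. A hedged sketch of one way around it, assuming python-ev3dev2 on the EV3 side and that the server receives the y value from the Kinect client (motor class, port and angle are placeholders): remember the arm's state and only move on a sign transition.

         from ev3dev2.motor import MediumMotor, OUTPUT_A, SpeedPercent

         arm = MediumMotor(OUTPUT_A)  # adjust class/port to your build
         arm_raised = False

         def on_new_y(y):
             # Call this for each y received from the client.
             global arm_raised
             if y > 0 and not arm_raised:
                 arm.on_for_degrees(SpeedPercent(30), 90)   # raise by 90 degrees
                 arm_raised = True
             elif y < 0 and arm_raised:
                 arm.on_for_degrees(SpeedPercent(30), -90)  # lower back down
                 arm_raised = False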
  4. Techster14

    HELP!!!

    I get that this isn't the right forum to ask this under, but I need help with this idea. I did some research on how to use a webcam with the EV3 and ev3dev Python, but couldn't find the camera model I'm using. My camera is literally called "USB 2.0 PC Camera", if that helps, and I have no clue about anything else. It works just fine; the only issue is: how do I use any kind of USB camera with ev3dev Python?
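    A minimal sketch of the usual approach, assuming the camera is a standard UVC device (most generic "USB 2.0 PC Camera" sticks are, though that isn't guaranteed for a specific model) that ev3dev's kernel exposes as /dev/video0, and that the fswebcam package has been installed with apt-get. The file and device names are placeholders.

        import subprocess

        def grab_frame(path="frame.jpg", device="/dev/video0"):
            # Shell out to fswebcam to capture a single frame as a JPEG.
            subprocess.run(["fswebcam", "-d", device, "--no-banner", path], check=True)

        grab_frame()

    If nothing shows up under /dev/video*, the camera is probably not UVC-compliant and would need a model-specific driver.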
  5. This is a new project I just started. Like most of my projects it will probably never be completed, but it will be a good showcase of ideas for other projects. I usually don't do MOCs; I just like to automate things and add some unusual effects. For the last two years I've been collecting monorail parts with the idea of making something monorail-based, with a few 4DBrix parts to control my budget and also to let me control the movement of the train. When LEGO launched the LEGO 75217 Imperial Conveyex Transport I had the idea of using it as a monorail, but adding the new Powered Up hub so I can control it with a MINDSTORMS EV3. So this is the first proof of concept: the EV3 picks the Conveyex and makes it move until an ultrasonic sensor detects it. To connect the monorail motor to the PUp hub I modded a PUp lights cable, soldering an old 9V plug to the power pins (1 and 2) of the cable. This makes the monorail motor appear as a Light to the Powered Up (and BOOST) hub, so it only works with apps that accept Lights, but that doesn't matter to me since I intend to use my own bash/python scripts (see the sketch below). For manual control, and for my kids' pleasure, it also works fine with the PUp train handset.

     Roadmap:
     - add a micro-motor to the Conveyex to also control the linked chain track
     - use EV3 motors (or perhaps 4DBrix motors) to control the monorail Y's
     - sell a kidney to get a few more monorail parts
     - integrate the monorail circuit into a Star Wars diorama, with the help of a PLUG mate who is an expert in the Star Wars universe

     The MINDSTORMS EV3 is running ev3dev (Linux). This allows me to use a Bluetooth USB dongle to talk BLE with the Powered Up hub. It will also let me use some gadgets like NFC sensors to detect the train position, or an IRLink to control some Power Functions automations (like the AT-ST head). Questions and suggestions are welcome.
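     A hedged sketch of the "talk BLE with the Powered Up hub" part, assuming the bluepy package on ev3dev and the message layout from LEGO's published Wireless Protocol 3.0; the hub address is a placeholder. Because the modded cable makes the monorail motor look like a Light, it is driven with a brightness value.

         from bluepy import btle

         # LWP3 characteristic UUID (LEGO Wireless Protocol 3.0).
         LWP3_CHAR = "00001624-1212-efde-1623-785feabcd123"

         hub = btle.Peripheral("00:16:53:AA:BB:CC")  # placeholder hub address
         char = hub.getCharacteristics(uuid=LWP3_CHAR)[0]

         def set_brightness(port, pct):
             # Port Output Command (0x81) + WriteDirectModeData (0x51), mode 0:
             # for a Light this sets brightness 0-100, i.e. the motor's power here.
             msg = bytes([0x08, 0x00, 0x81, port, 0x11, 0x51, 0x00, pct])
             char.write(msg)

         set_brightness(0x00, 50)  # run the "light" (monorail motor) on port A at 50%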
  6. This was one of those crazy ideas we get when discussing things with other LUG fellows. A guy from PLUG challenged me to show a LEGO robot that translates conversation, much like the C-3PO protocol droid from Star Wars. As usual, he wasn't really expecting it could be possible with LEGO. I only had a couple of hours, so I decided to copy a Raspberry Pi approach of using "the Cloud". Google offers a one-year free trial, so I registered and tried a few examples on my Ubuntu laptop; it's amazing what one can do with just a few curl commands! I wrote a sort of short tutorial. It is now obsolete, but it helps show the way for further attempts. I used a USB microphone because I assumed there were no LEGO microphones. But there is one: the LEGOCam (MINDSTORMS RCX Vision) has an embedded mic. The sound quality isn't great, but it might (big might) be usable, and that would make a 100% LEGO hardware solution. At the time I had problems with initial authentication: it took almost a minute before I could start using the Google services, and it needed to be renewed after an hour. The EV3 CPU is slow, but that's probably not the only reason, so one of these days I might try it again. Of course, using a Raspberry Pi and a BrickPi would give much better results.
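     For reference, a sketch of the "few curl commands" idea, translated to Python against today's v1 REST endpoint. It assumes an API key and a short FLAC recording from the microphone; the original tutorial used OAuth access tokens instead, which is why authentication expired after an hour. Key and file names are placeholders.

         import base64, json, urllib.request

         API_KEY = "YOUR_API_KEY"  # placeholder
         URL = "https://speech.googleapis.com/v1/speech:recognize?key=" + API_KEY

         # Base64-encode a short recording captured from the USB microphone.
         with open("utterance.flac", "rb") as f:
             audio = base64.b64encode(f.read()).decode("ascii")

         body = json.dumps({
             "config": {"encoding": "FLAC", "sampleRateHertz": 16000,
                        "languageCode": "en-US"},
             "audio": {"content": audio},
         }).encode("utf-8")

         req = urllib.request.Request(URL, data=body,
                                      headers={"Content-Type": "application/json"})
         print(json.load(urllib.request.urlopen(req)))  # transcript as JSON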
  7. Hi again. Another project I showed live last weekend at PLUG's BRInCKa 2016 - a LEGO Laser Harp: It uses a MINDSTORMS EV3 to read the light intensity on 8 color sensors. Each color sensor has a 1 mW red laser pointing at it, so my instrument has 8 "strings" or "chords" (I use two 3-to-1 input multiplexers to reach 8 sensors). All 8 lasers are controlled from one EV3 output. I started with 8 LEGO Power Functions LEDs and it worked fine... in the dark. But at a live show room I knew it would be impossible to use the LEDs (except at very, very short distances), so I opted for lasers. The EV3 runs ev3dev, a Linux distro for the EV3. A Python script controls the lasers and reads the sensors, sending their state to a Linux laptop where another Python script plays the notes on a software MIDI synth (the EV3 with ev3dev can play MIDI, but it isn't powerful enough for polyphonic sound, so I had to use this client-server configuration; a sketch of the client side follows). This also lets the "instrument" scale out, so I can add more EV3s and more "strings". I don't know how to play, so during the exhibition my wife, when present, played some 8-note children's tunes for the public. I have no live video, but I have this one from home, still with LEDs and just 7 chords: The sound still needs some improvements (I'm not controlling note length), and hopefully in a later version I will read hand distance to control note amplitude. I will also use some kind of Human Interface Device to change the MIDI soundfont instrument on the fly, so the artist can change from a harp to a piano or a drum whenever he/she wants (I'm planning to use the LEGO Dimensions ToyPad, since I can already read NFC tags with it on the EV3). Some technical details (and code) at my blog.
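     A rough sketch of the EV3 side of that client-server setup, assuming python-ev3dev2 on the brick and a laptop listening on a TCP port. One sensor is shown where the real harp multiplexes 8; the laptop address and the "beam broken" threshold are placeholders.

         import socket, time
         from ev3dev2.sensor import INPUT_1
         from ev3dev2.sensor.lego import ColorSensor

         sensor = ColorSensor(INPUT_1)                      # one "string" of the harp
         sock = socket.create_connection(("192.168.1.10", 5005))  # laptop running the synth

         THRESHOLD = 20   # reflected-light % below which the laser beam counts as broken
         was_broken = False
         while True:
             broken = sensor.reflected_light_intensity < THRESHOLD
             if broken != was_broken:
                 sock.sendall(b"1\n" if broken else b"0\n")  # note on / note off
                 was_broken = broken
             time.sleep(0.01)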