Search the Community

Showing results for tags 'python'.



Found 16 results

  1. Hey everyone, I'm really excited to share this one with you. It's a Lego Technic swerve drive! This is the first time I've truly combined software and Lego. The swerve drive sports 2 EV3s equipped with EV3Python V2, along with 8 motors. One EV3 controls steering and the other controls driving. Both are sent instructions from my computer (or yours, because the code is public), which runs a custom Python SSH client. The swerve drive can be controlled by a joystick (support for which is written into the SSH client) or it can run autonomously and detect objects (the 2 EV3s communicate through my computer - another SSH connection - and send each other instructions). The swerve drive is built from 4 identical modules connected together. It is super robust and runs at a 1:1 gear ratio, but there are 2 places (for each module) where the gear ratio can be changed. If there's one drawback, it's that this thing eats battery like crazy. Each EV3 has 6 batteries, and since each powers 4 motors, the battery level drops very quickly to unusable levels. This is most likely caused by the weight of the model, but other optimisations could be made too. OK, I'll stop talking. There's a document with further explanation here. Here's the video! (The code is in the description of the video.) Pictures: That's all from me. This project was a blast, so expect more (like, lots more) EV3 soon. Thanks for reading! C&C welcome. BbBT Code is here: https://github.com/AbhiJain16/swerve-drive-ev3python
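    For readers wondering what this kind of PC-side SSH control can look like, here is a minimal sketch (not the author's published code): it assumes Paramiko on the computer, an ev3dev-style Linux shell on each brick with the default robot/maker login, and a hypothetical helper script motor_cmd.py on the brick.

# Sketch: sending drive/steer commands to two EV3 bricks over SSH.
# The host addresses and the remote helper /home/robot/motor_cmd.py
# ("<port> <speed>") are assumed placeholders.
import paramiko

EV3_HOSTS = {"drive": "192.168.0.11", "steer": "192.168.0.12"}  # assumed addresses

def open_ssh(host, user="robot", password="maker"):
    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(host, username=user, password=password)
    return client

def send_command(client, port, speed):
    # Runs the hypothetical helper on the brick; stdout is returned for logging.
    _, stdout, _ = client.exec_command(
        "python3 /home/robot/motor_cmd.py {} {}".format(port, speed)
    )
    return stdout.read().decode()

if __name__ == "__main__":
    drive = open_ssh(EV3_HOSTS["drive"])
    steer = open_ssh(EV3_HOSTS["steer"])
    try:
        send_command(steer, "A", 200)   # point the modules
        send_command(drive, "A", 500)   # drive forward
    finally:
        drive.close()
        steer.close()
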
  2. This tool includes all possible unique orientations of right triangles, plus a portion of scalene triangles that aren't necessarily right-angled, found by elimination and by verifying that each candidate is scalene. All sides are whole numbers. I didn't add a button to change the range from the current 1-10, but it's in the first couple of lines of code, so you could change it yourself if you wanted to. Code: https://gist.github.com/mcneds/d6bd36ac808970184bb0c6295acc902c Demonstration: https://youtu.be/TkhLsFSIIME
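    As a rough illustration of the kind of enumeration described above (a sketch only, not the linked gist; the side range 1-10 matches the post), integer-sided triangles can be generated and classified like this:

# Sketch: enumerate integer-sided triangles with sides in 1..10 and classify
# them as right-angled and/or scalene (all sides different).
from itertools import combinations_with_replacement

def classify(limit=10):
    results = []
    for a, b, c in combinations_with_replacement(range(1, limit + 1), 3):
        if a + b <= c:                      # must satisfy the triangle inequality
            continue
        right = (a * a + b * b == c * c)    # Pythagorean check (c is the largest side)
        scalene = len({a, b, c}) == 3
        if right or scalene:
            results.append((a, b, c, right, scalene))
    return results

if __name__ == "__main__":
    for a, b, c, right, scalene in classify():
        kind = "right" if right else "scalene"
        print(f"{a}-{b}-{c}: {kind}")
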
  3. Hi all, I've been working on a Lego 51515 robot that uses computer vision to detect an object and then pick it up. Here is a video: The robot works as follows:
    1) Objects are detected by a free TensorFlow Android app, available on Google Play (ArdObjectTracker)
    2) The detection data is sent from the mobile via Bluetooth Classic and picked up on the Arduino board by an HC-05 module
    3) The object detection data is then processed and converted to robot movement instructions, which are then sent to the Lego hub via the BluetoothLE module (HM-10)
    4) Finally, the Lego hub (running a Python program) receives the instructions via a BluetoothLE notification and then moves to pick up the object
    I have shared the code and more details on this Github page: https://github.com/osullivp/Lego51515BLEArduinoTensorFlow Hope you all find this helpful! Regards, Paul.
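    A rough sketch of the kind of conversion step 3 describes, from a detection to a movement instruction (the field names cx/area and the command strings are hypothetical placeholders, independent of the Bluetooth transport used in the build above):

# Sketch: turn an object-detection bounding box into a simple movement command.
# cx is the horizontal centre of the box (0..1) and area is a size proxy;
# both, and the command strings, are assumed placeholders.

def detection_to_command(cx, area, centre_band=0.1, close_area=0.25):
    """Return a command string for the robot based on where the object is."""
    if area >= close_area:
        return "GRAB"            # object fills enough of the frame: pick it up
    if cx < 0.5 - centre_band:
        return "LEFT"            # object is left of centre: turn left
    if cx > 0.5 + centre_band:
        return "RIGHT"           # object is right of centre: turn right
    return "FORWARD"             # roughly centred: drive towards it

if __name__ == "__main__":
    print(detection_to_command(cx=0.7, area=0.05))   # -> RIGHT
    print(detection_to_command(cx=0.5, area=0.30))   # -> GRAB
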
  4. The IDE complains about the REM-style comments. This was translated from C++ via BASIC and I cannot test it myself, so let me know if you get it working.

''' bluatigro 14 nov 2021
    ANN try, based on:
    http://code.activestate.com/recipes/578148-simple-back-propagation-neural-network-in-python-s/
'''
import random, math

# constants
NI = 2   # number of inputs
NH = 2   # number of hidden cells
NO = 1   # number of outputs

# cells (activations)
ai = [0.0] * NI
ah = [0.0] * NH
ao = [0.0] * NO

# wished (target) output
wish = [0.0] * NO

# weights and previous weight changes (for the momentum term)
wih = []   # input  -> hidden weights
ci = []    # last change of input  -> hidden weights
for i in range(NI):
    wih.append([random.random() for h in range(NH)])
    ci.append([0.0 for h in range(NH)])

who = []   # hidden -> output weights
co = []    # last change of hidden -> output weights
for h in range(NH):
    who.append([random.random() for o in range(NO)])
    co.append([0.0 for o in range(NO)])

od = [0.0] * NO   # output deltas
hd = [0.0] * NH   # hidden deltas

# input/output training data: XOR function
patterns = 4
pin = [[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
pout = [0.0, 1.0, 1.0, 0.0]

def tanh(x):
    return (1 - math.exp(-2 * x)) / (1 + math.exp(-2 * x))

def dsigmoid(x):
    # derivative of tanh, written in terms of the activation value x
    return 1 - x * x

def calc(p):
    # forward pass for pattern p
    for i in range(NI):
        ai[i] = pin[p][i]
    for h in range(NH):
        som = 0.0
        for i in range(NI):
            som += ai[i] * wih[i][h]
        ah[h] = tanh(som / NI)
    for o in range(NO):
        som = 0.0
        for h in range(NH):
            som += ah[h] * who[h][o]
        ao[o] = tanh(som / NH)

def backprop(n, m):
    ''' n = learning rate, m = momentum
        see http://www.youtube.com/watch?v=aVId8KMsdUU&feature=BFa&list=LLldMCkmXl4j9_v0HeKdNcRA
        dE/dw[j][k] = (t[k] - ao[k]) * s'( SUM( w[j][k]*ah[j] ) ) * ah[j]
    '''
    # output deltas and total output error
    totfout = 0.0
    for k in range(NO):
        od[k] = dsigmoid(ao[k]) * (wish[k] - ao[k])
        totfout += abs(ao[k] - wish[k])
    # update hidden -> output weights (with momentum)
    for j in range(NH):
        for k in range(NO):
            # od[k] * ah[j] is the full derivative of dError/dweight[j][k]
            c = od[k] * ah[j]
            who[j][k] += n * c + m * co[j][k]
            co[j][k] = c
    # hidden deltas
    for j in range(NH):
        fout = 0.0
        for k in range(NO):
            fout += od[k] * who[j][k]
        hd[j] = fout * dsigmoid(ah[j])
    # update input -> hidden weights (with momentum)
    for i in range(NI):
        for j in range(NH):
            c = hd[j] * ai[i]
            wih[i][j] += n * c + m * ci[i][j]
            ci[i][j] = c
    return totfout / 2

for e in range(10000):
    fout = 0.0
    for p in range(patterns):
        calc(p)                   # forward pass fills the input, hidden and output cells
        for o in range(NO):
            wish[o] = pout[p]     # fill the wished output
        fout += backprop(0.5, 0.5)
    print('generation', e, 'error', fout)
  5. Hey, I purchased the 60198 City cargo train set and I run my train either with the remote or with the Powered Up app on my smartphone. For me, Powered Up has much more potential when used with Pybricks, so I tried to use my City/train hub with Pybricks to write some programs to run my train. Unfortunately, I got stuck on installing the Pybricks firmware (bootloader). I followed the guidelines from their webpage: I held the green button until the light started blinking purple and then clicked the "Install Pybricks Firmware" button on the Pybricks Code webpage opened in my browser. The light on the hub blinks red/green/blue and progress is shown in % on the Pybricks Code webpage; after it reaches 100% an error message occurs: "Hub took too long to respond", and the hub LED keeps blinking red/green/blue indefinitely - you need to take out the batteries to stop it. On the pybricks/support GitHub I found a whole topic about that exact error message, and there is probably a solution to the problem there, but I am a beginner at Python programming so I don't know how to apply it. Should I write some kind of input file to get around this Bluetooth (or other) issue? They posted a code file there, but I really don't know how to use it. Here is the link to that support page: https://github.com/pybricks/support/issues/256 Any help welcome :) Regards
  6. My current project uses "run_to_position(0, direction='shortest path')" to set two motors to a known position before the rest of the code runs. However, this doesn't always work. If I run another of my creations, usually by mistake, before running my current project, the motors can end up in some odd position, and running code that begins with "run_to_position(0, direction='shortest path')" does not correct the situation. I can correct this either by switching the hub off and back on and selecting the correct programme, or by running the Scratch version of Charlie provided by Lego. The Scratch code contains a "calibration block". I'm attempting to simulate that block by running "run_to_position(0, direction='shortest path')", but it seems that the block contains more than run_to_position, and that I need a piece of Python code that is the equivalent of the "calibration block". Can anyone offer any suggestions?
  7. Hello LEGO fans, I found a playlist on YouTube - search for "python for 51515" - and you can see some Python for the 51515. Greetings
  8. Hi, this question could most probably be answered by David Lechner. In my EV3 project, I was trying to increase the speed of the robot gripper movements. It is controlled by a medium EV3 motor and set up as follows: motorGrip = Motor(Port.B, positive_direction=Direction.COUNTERCLOCKWISE, gears=[1,24]) Increasing the speed value in the statement motor[m].run_target(speed,...) does not help once a certain speed level is reached. From the documentation I understand that this threshold is set by the statement motor.control.limits(). So I tried this:
    print (motor['Grip'].control.limits())
    motorTestRunTarget ('Grip', doReset=False, waitTime=1000, safetyFactor=1)
    motor['Grip'].control.limits (35, 67, 100)
    motorTestRunTarget ('Grip', doReset=False, waitTime=1000, safetyFactor=1)
    sys.exit ()
    The first line prints these values: (33, 67, 100). The second line executes fine, but the third one, where I try to increase (only slightly) the speed limit, raises an error:
    File "/home/robot/Client/main.py", line 111, in <module>
    OSError: [Errno 1] EPERM:
    Is there anything wrong in the way I try to use the control settings? And another question: how is the internal speed limit set? Does it depend on the motor used (medium, large) or on the specified gear ratios, and is it based on a safety factor? When I tried to calculate the physical speed limit from the known revolutions per second (260 for the medium motor), I got a value larger than the one read in the above piece of program.
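    One possible angle on the EPERM (a sketch only, based on the assumption that Pybricks refuses changes to control.limits() while a run command is still active on that motor): make sure the motor is stopped before changing the limits, for example:

# Sketch: change the speed limit only while the motor's controller is idle.
# Assumes the EPERM comes from calling limits() while a command is running;
# the values mirror those in the post.
from pybricks.parameters import Port, Direction
from pybricks.ev3devices import Motor

motorGrip = Motor(Port.B, positive_direction=Direction.COUNTERCLOCKWISE,
                  gears=[1, 24])

motorGrip.stop()                         # make sure no command is active
print(motorGrip.control.limits())        # e.g. (33, 67, 100)
motorGrip.control.limits(35, 67, 100)    # new limits should now be accepted
motorGrip.run_target(35, 90)             # run to 90 degrees at the higher speed
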
  9. I'm busy automating a LEGO railway crossing and have therefore written the code below. This code works fine, but motor D only starts when motor A has finished. Are there possibilities to start both motors at the same time?

#!/usr/bin/env pybricks-micropython
from pybricks.hubs import EV3Brick
from pybricks.ev3devices import (Motor, TouchSensor, ColorSensor,
                                 InfraredSensor, UltrasonicSensor, GyroSensor)
from pybricks.parameters import Port, Stop, Direction, Button, Color
from pybricks.tools import wait, StopWatch, DataLog
from pybricks.robotics import DriveBase
from pybricks.media.ev3dev import SoundFile, ImageFile

# Initialize the EV3 Brick.
ev3 = EV3Brick()
sensor_1 = UltrasonicSensor(Port.S1)
motor_A = Motor(Port.A)
motor_D = Motor(Port.D)

# parameters
speed = 6*165*10          # [deg/s]
rotation_angle = 24*90    # gear ratio 24:1
dis_track_1 = 65          # [mm]
t1 = 10*1000              # 10 s, in [ms] for wait()

while True:
    if sensor_1.distance() < dis_track_1:
        # close railway crossing
        ev3.light.on(Color.RED)
        motor_A.run_angle(speed=speed, rotation_angle=-rotation_angle, then=Stop.HOLD, wait=True)
        motor_D.run_angle(speed=speed, rotation_angle=-rotation_angle, then=Stop.HOLD, wait=True)
        # wait 10 seconds
        wait(t1)
        # open railway crossing
        motor_A.run_angle(speed=speed, rotation_angle=rotation_angle, then=Stop.HOLD, wait=True)
        motor_D.run_angle(speed=speed, rotation_angle=rotation_angle, then=Stop.HOLD, wait=True)
        ev3.light.on(Color.GREEN)
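    A possible way to start both motors together (a sketch, not a tested fix: it relies on the wait=False option of run_angle in Pybricks, which makes the first call return immediately so the second command can start while the first is still running):

#!/usr/bin/env pybricks-micropython
# Sketch: run both crossing motors at the same time by making the first
# run_angle call non-blocking (wait=False) and waiting on the second one.
from pybricks.ev3devices import Motor
from pybricks.parameters import Port, Stop

motor_A = Motor(Port.A)
motor_D = Motor(Port.D)

speed = 6 * 165 * 10        # [deg/s], as in the program above
rotation_angle = 24 * 90    # gear ratio 24:1, as above

# close the barriers: A starts and returns immediately, D starts right after,
# and the program waits until D (and, with identical settings, A) is done
motor_A.run_angle(speed, -rotation_angle, then=Stop.HOLD, wait=False)
motor_D.run_angle(speed, -rotation_angle, then=Stop.HOLD, wait=True)
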
  10. Hello, I have a LEGO robot arm assembly, and my project also uses a Kinect from an Xbox. The code processes the position of my hand and sends it to the EV3 control unit. How can I implement lifting the motor by a certain number of degrees? Let's say I set y > 0: my packet is sent and the robot raises its arm to a certain height; then I lower my arm, y becomes < 0, and the arm goes down too. The code through which I tried to do this is attached, but in this case it turns the motor endlessly. - client (my PC) - server (EV3)
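    One common fix for a motor that keeps turning (a sketch only, since the attached client/server code is not shown here; move_arm() and read_y() are hypothetical placeholders for the real EV3 motor call and the Kinect/network input) is to act only on a sign change of y, not on every received packet:

# Sketch: edge-triggered arm control. The motor is commanded once per sign
# change of y instead of on every incoming value, so it cannot run forever.
# move_arm() and read_y are hypothetical placeholders.

LIFT_DEGREES = 90   # assumed travel per gesture

def move_arm(degrees):
    print("moving arm by", degrees, "degrees")   # replace with the real motor call

def control_loop(read_y, steps=None):
    arm_up = False                       # remembered state of the arm
    n = 0
    while steps is None or n < steps:
        y = read_y()                     # latest hand height from the Kinect
        if y > 0 and not arm_up:
            move_arm(+LIFT_DEGREES)      # raise only on the transition down -> up
            arm_up = True
        elif y < 0 and arm_up:
            move_arm(-LIFT_DEGREES)      # lower only on the transition up -> down
            arm_up = False
        n += 1

if __name__ == "__main__":
    from itertools import cycle
    fake = cycle([0.3, 0.4, -0.2, -0.5, 0.6]).__next__   # simulated hand heights
    control_loop(fake, steps=5)
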
  11. I've been working on computer-generated Lego landscapes and I thought I'd share my progress here. This landscape was generated in Python 3.7, which created an LDraw file that was then rendered in Studio 2.0 - hope you like the images
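    For anyone curious how a script can emit geometry that Studio and other LDraw tools will open, here is a minimal sketch (not the author's generator; it assumes the standard LDraw type-1 line format, part 3005.dat for a 1 x 1 brick, 20 LDU per stud and 24 LDU per brick height, with a placeholder terrain function):

# Sketch: write a small height-field of 1x1 bricks as an LDraw .ldr file.
# Each type-1 line is: 1 <colour> x y z  a b c d e f g h i  <part file>
# with an identity rotation matrix. In LDraw, "up" is the negative Y direction.
import math

STUD = 20        # LDU per stud
BRICK_H = 24     # LDU per brick height
GREEN = 2        # LDraw colour code for green

def write_landscape(path, size=16):
    with open(path, "w") as f:
        f.write("0 Generated landscape sketch\n")
        for x in range(size):
            for z in range(size):
                # simple rolling height in bricks (placeholder terrain function)
                h = int(2 + 1.5 * math.sin(x / 3.0) * math.cos(z / 3.0) + 1.5)
                for level in range(h):
                    f.write("1 {c} {x} {y} {z} 1 0 0 0 1 0 0 0 1 3005.dat\n".format(
                        c=GREEN, x=x * STUD, y=-level * BRICK_H, z=z * STUD))

if __name__ == "__main__":
    write_landscape("landscape.ldr")
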
  12. Techster14

    HELP!!!

    I get that this isn't the right forum to ask this under, but I need help with this idea. I did some research and looked for how to use a webcam with the EV3 and ev3dev Python, but couldn't seem to find the camera model I'm using. My camera is literally called "USB 2.0 PC Camera", if that helps, and I have no clue about anything else. It works just fine; the only issue is: how do I use any kind of USB camera with ev3dev Python?
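    A generic starting point that does not depend on the exact camera model (a sketch, assuming the camera is a standard UVC webcam that ev3dev exposes as /dev/video0 and that OpenCV is installed on the brick):

# Sketch: grab frames from a generic USB (UVC) webcam with OpenCV.
# Assumes the camera shows up as /dev/video0 on ev3dev and that the
# cv2 package is installed on the brick.
import cv2

cap = cv2.VideoCapture(0)            # 0 -> first video device (/dev/video0)
if not cap.isOpened():
    raise RuntimeError("Could not open the USB camera")

for i in range(10):                  # grab a few frames as a quick test
    ok, frame = cap.read()
    if not ok:
        print("frame", i, "not captured")
        continue
    print("frame", i, "shape:", frame.shape)
    cv2.imwrite("frame_{}.png".format(i), frame)

cap.release()
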
  13. Hi there, I've been working on some code to allow the conversion of textured 3D models into coloured .ldr files (with brick optimisation - it creates the .ldr model with 10 different types of brick). I've put my Python code and instructions on how to use it on GitHub; you can get all the details from my blog... https://cultofthebrick.blogspot.com/2019...ithub.html Hope you like it
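    A tiny illustration of the colour side of such a conversion (a sketch only, not the GitHub code: the palette of LDraw colour codes and approximate RGB values below is an assumed example subset) is mapping each texture colour to the nearest available brick colour:

# Sketch: map an arbitrary texture colour to the nearest colour in a small
# brick palette. The palette (LDraw code -> RGB) is an assumed example subset.
PALETTE = {
    0:  (27, 42, 52),      # black
    1:  (13, 105, 171),    # blue
    2:  (40, 127, 70),     # green
    4:  (196, 40, 27),     # red
    14: (245, 205, 47),    # yellow
    15: (242, 243, 242),   # white
}

def nearest_brick_colour(rgb):
    """Return the LDraw colour code whose RGB value is closest (squared distance)."""
    r, g, b = rgb
    return min(PALETTE,
               key=lambda code: (PALETTE[code][0] - r) ** 2
                              + (PALETTE[code][1] - g) ** 2
                              + (PALETTE[code][2] - b) ** 2)

if __name__ == "__main__":
    print(nearest_brick_colour((250, 240, 230)))   # -> 15 (white)
    print(nearest_brick_colour((200, 30, 30)))     # -> 4 (red)
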
  14. A recreation of a popular design using 688 Stormtrooper minifigures. It uses a mosaic function and a Lego crowd generation system that I created in Python - hope you like it.
  15. ClassicLook

    Monty Python Black Knight

    Hi everyone, let me share with you my latest creation. I designed it for my brother for his birthday - he's a big Monty Python fan. It's the Black Knight from the Monty Python and the Holy Grail movie. The knight is a simple black figure (with a black head) with a custom printed torso. I've placed it into an IKEA Ribba picture frame. Hope you like it!
  16. Update: I rewrote the script in C++ and built two binary versions of the tool, one for Windows and one for Linux. I have not been able to test them extensively and they do have bugs, which I hope to fix soon. I could not attach them to this post as the file size is too large. The Windows version was tested on a fresh install of Windows 7 Professional (I don't have anything more recent), the Linux version on a fresh install of Ubuntu 16.04. technicAngles_Windows technicAngles_Linux
    Original post: Hey all, hope this is the right place to post this; if not, moderators please move it to wherever it should go. Also, long post alert! I was recently looking for a way to recreate an angle (within a certain range) using Technic bricks. After some twiddling around, stacking lots of Technic bricks and rotating another brick fixed to the stack with a pin to find where the holes would line up, I realised a) that this was not very precise (I did find one illegal connection that seemed to work perfectly) and b) that it might take forever. What to do? Automate! I quickly hacked a small Python script together that would calculate all possible Pythagorean triangles made by stacking bricks - done. Or was I? Well, for my original purpose I was. But then I looked at the list of angles the script had returned and realised that I was having trouble imagining what a certain angle/triangle actually looked like. This is where ambition took over. I started to add a graphical user interface which would draw the triangles calculated by my script - this took a lot longer than the mere calculation script. A LOT longer. Anyway, it works nicely now. Some screenshots (click on them to view a larger version). Now, I don't know if this tool would be of any use to anyone (maybe everyone knows these angles by heart), but I thought I'd share it anyway. The catch is that it requires Python and Qt4, and anyone who'd like to use the script should have these installed; while I can and will give every bit of information and help concerning my script, I simply do not have the time to help people set up Python, Qt4 and PyQt on their computers. Anyone who has these prerequisites fulfilled should be able to download the attached files and run it without a problem. Alright then, some pointers on how to use it:
    • Checkbox "with half offsets": when checked, the calculations will include the Technic brick 1 x 2 with 2 holes, which basically means half a hole of offset.
    • Checkbox "with multiples": when checked, the same angle may appear multiple times, as doubling the length and height of a triangle results in the same angle.
    • Fields "Max. length" and "Max. width": when left empty, the script uses a maximum length of 15 holes (a 16-stud brick); use any other value you like - the larger, the more different angles you will get.
    When you've made your choices, click "Calculate" or press "ENTER". The script will generate a list of triangles and display it in the table. Click on any entry in the table to see a graphical representation of the triangle on the right side of the window. The currently displayed triangle can be saved as an image (.png) by clicking on, yes you guessed it, "Save Image". The image will be saved in the same location as the script resides. That's about it. I have tested this on Ubuntu Linux 14.04 using Python 2.7.6 and Qt4, and on Windows 7 using Python 2.7.5 and Qt4.
    If you think you've found a bug, please email me (the address is in the source code or the About window) with a description and a way to reproduce it, and I'll see what I can do. The reason I chose Python/Qt is that it is platform independent and the script should run on any operating system that supports the two - also because I'm lazy and already have both installed. At the same time, I realise that it is not for everyone because of the need to have them both installed. Here's hoping that at least one person finds this useful. Feedback is always welcome, of course. Cheers technic-angles.zip
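    The core of the calculation such a script performs can be sketched in a few lines (an illustration only, not the attached script: it enumerates hole counts whose hypotenuse is also a whole number of holes, i.e. Pythagorean triples, and reports the resulting angle):

# Sketch: find brick-hole triangles whose hypotenuse is a whole number of
# holes (Pythagorean triples) and report the angle they produce. Python 3.8+.
import math

def technic_angles(max_length=15, max_height=15):
    """Yield (length, height, hypotenuse, angle_deg) for exact hole alignments."""
    for length in range(1, max_length + 1):
        for height in range(1, max_height + 1):
            hyp_sq = length * length + height * height
            hyp = math.isqrt(hyp_sq)
            if hyp * hyp == hyp_sq:                     # hypotenuse lines up with a hole
                angle = math.degrees(math.atan2(height, length))
                yield length, height, hyp, angle

if __name__ == "__main__":
    for length, height, hyp, angle in technic_angles():
        print(f"{length} x {height} (hyp {hyp}): {angle:.2f} deg")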