Lyichir Posted July 1, 2024 Posted July 1, 2024 1 hour ago, evank said: By saying "those with actual talent do not want or need generative AI," you're insulting me and my team. And by saying that anything created with the help of AI is "a cheap, dubiously-ethical shortcut," you're just being incredibly closed-minded. Word processors didn't destroy writing, CAD didn't destroy drafting, and Photoshop didn't destroy art. I'm getting tired of arguing this. The creators of many of these corporate AI products have been practically bragging about how they can replace workers in these sorts of creative industries, so forgive me for not believing the comparison to things like word processors, photoshop or CAD (tortured comparisons because again, those things you describe gave creators more control over their craft instead of actively taking it away from them like generative AI does). You seem to be completely suckered by the AI hype cycle so I doubt I can convince you that, at least in this case and most likely in many others, these are solutions in search of a problem. Quote
evank Posted July 1, 2024 Posted July 1, 2024 It sounds like you don't know how to disagree without insulting someone. I am not "suckered by" AI. It is a modern tool that's here to stay. People can learn to use it, or lose their jobs to those who will. I choose the former. Quote
Mylenium Posted July 1, 2024 Posted July 1, 2024 10 hours ago, Toastie said: Yeah, true, of course. But how about having experienced humans looking at all the trash? Define exorbitant - maybe your AI should not crank out trash all the time. The current ratio is something like 10% to 90% and that's IMO not feasible in the long term. Sorting out those 90% trash would consume much more resources than producing the actual result. Even ChatGPT or those image-generating A.I.s are pretty bad; they just filter out a lot of stuff already. LEGO, being a quantized system with tons of extra constraints, would probably be even worse. Such things tend to be a lot tougher to crack from a mere math point of view. Again, I remain skeptical we'll see anything really usable soon that reaches the level of even a mediocre MOC design. Mylenium Quote
Artanis I Posted July 1, 2024 Posted July 1, 2024 On 6/28/2024 at 11:53 AM, icm said: I wonder if the OP is a bot:/ Same! No other posts, maybe trying to scope out the industry to see if there's a job vacancy for up-and-coming young bots, haha Quote
Mylenium Posted July 1, 2024 Posted July 1, 2024 (edited) 9 hours ago, Lyichir said: And that is where AI does not (and arguably CAN not) exceed human creativity, because ultimately all it is attempting to do is IMITATE human creativity, and (so far) generally doing it remarkably poorly. Yes/ No/ Perhaps. I'm decidedly undecided on the matter. As I wrote, the unexpected results an A.I. may produce can be inspiring. At the same time I do agree that so far we do not have a genuinely "creative" A.I. It's all based on statistical models and algorithms, and there's a lot of built-in bias due to the training data already being biased and often not as comprehensive as it should be. 9 hours ago, Lyichir said: Expecting an AI to come up with an artistic design, even in the concept stages, is a misaimed pursuit because a neural network is incapable of knowing what works aesthetically and what doesn't, let alone why certain things work and certain things don't. You can learn artsy stuff at art school or uni. Color theory, image composition and design principles are all long explored and written down. If a human can be taught these things, then so can an A.I. It just doesn't say anything about how good an artist you are. You can study these things for years and be a mediocre designer, and at the same time you may never have studied at all and still be a great artist merely based on your intuition and experience. That is the real distinction. That said, not everything needs to be "art", especially in a commercial context. A lot of design work is just implementing basic ideas and then chewing through it. LEGO could just as well intentionally hire certain mid-tier designers that may never produce something outstanding, but are just right for this type of work. Why then should an A.I. not producing Mona Lisa-level "art" every time be a hindrance to using it? 9 hours ago, Lyichir said: And ultimately, it's trying to solve a problem that doesn't exist. Human concept artists already exist and for the most part, those with actual skill and talent do not want or need generative AI to automate away the work that they both specialize in and genuinely enjoy (especially given how unethical all major art-based GPTs are when it comes to plagiarizing training data). If I do hear Lego designers praising AI for concept art generation, I will indeed be disappointed, not because I was wrong about its use case or lack thereof, but because it would represent a cheap, dubiously-ethical shortcut for something Lego has proven perfectly capable of doing with human talent for decades upon decades. You get too hung up on some specifics, and ultimately I feel you have a fundamental misunderstanding about how even those concept artists work, or for that matter a considerable part of the creative industries. You are also wrongly assuming that creatives are just dying to take every shitty job just so they can tell the world how great they are. I can think of lots of stuff I would gladly turn over to an A.I. if it freed me up to focus on my other work that's really important to me. At least I don't enjoy fixing botched wedding photos or other such nonsense. And even if you wanted to just talk about "concept artists" in the strictest meaning of the word - why should it matter? You can throw away 300 hand-drawn sketches, you can throw away 300 A.I.-generated drawings, and you can throw away hundreds of steps in between based on a hybrid approach of drawing, Photoshop work and A.I.
None of that is a statement of quality or artfulness; it just means that none of those sketches were the right ones for a given job. All the same, even a human artist might not be upset if an A.I. wins, because it could still mean he gets to work on the concept and refine it until it is actually ready for production. 8 hours ago, Lyichir said: I'm getting tired of arguing this. The creators of many of these corporate AI products have been practically bragging about how they can replace workers in these sorts of creative industries, so forgive me for not believing the comparison to things like word processors, photoshop or CAD (tortured comparisons because again, those things you describe gave creators more control over their craft instead of actively taking it away from them like generative AI does). See above. Not every task is worth chasing. And it's kinda funny to even mention Photoshop as a positive. Back in the day traditional artists also thought it would be the end of the world, but ultimately things have merely changed and the process has been democratized. Yes, there will be people on the losing side of A.I. and it will suck for them, but it's not the end of the world. It's also a wrong assumption that people who never had an interest in being creative will suddenly swarm to A.I. and steal other people's jobs. Likewise, those corporate boneheads raving about A.I. will soon enough realize that they still need humans. Things will just be different and people will have different jobs than they have now. Mylenium Edited July 1, 2024 by Mylenium Fixed typos Quote
Toastie Posted July 1, 2024 Posted July 1, 2024 2 hours ago, Mylenium said: The current ratio is something like 10% to 90% I have no clue what the ratio is - that may very well be; I believe "current" is the key word here. 2 hours ago, Mylenium said: we'll see anything really usable soon that reaches the level of even a mediocre MOC design. Me too - but again, there is a key word here: "soon". I don't care about the time frame. AI is a "process" that started some 70 years ago. Call it "hype" or whatever, disruptive processes always begin slowly, like exponential functions do. Here is what the "chief data officer for the LEGO Group" had to say about TLG and AI back in December 2022 (which is a long time ago in the data industry): "It [AI] could be about helping our molding machines work more effectively, or it could be about more effective customer engagement, or it could be about just creating fantastic online building experiences to help kids play together when they’re using the physical product." (Cited from the article referenced below) Here is the entire article; it is an interview-style piece: https://www.mckinsey.com/capabilities/quantumblack/our-insights/how-lego-plays-with-data-an-interview-with-chief-data-officer-orlando-machado Best, Thorsten Quote
MAB Posted July 1, 2024 Posted July 1, 2024 On 6/30/2024 at 8:58 AM, Mylenium said: But how many actual models are out there, including LEGO's competitors and MOCs? Arguably an infinite number, but at the same time not enough. A considerable part won't even be available digitally. And even if you assume there would be enough digital models to train an A.I., you'd have to vet them beforehand or else the old "Garbage in, garbage out." bites you in the butt. Unless someone already has been working on this for the last three years or so I don't expect any results soon. This is where scoring functions that automatically determine whether a model is good come into play. They are the difficult bit, but if they can come up with an algorithm to score models based on whatever criteria they want to score on, then the AI model can train itself. It designs new models, scores them, and keeps the high-scoring ones to learn from. Then this undergoes many iterations, always learning from the high-scoring results. Even though they start from a small number, they can generate millions to learn from, and increasingly they learn from high-scoring models to generate other high-scoring models. That is why getting the scoring algorithm right is important, as the AI model learns from the designs it itself generates. This is how AI works in some fields with relatively sparse initial data but reasonably well defined targets. Quote
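To make the generate-score-keep loop MAB describes concrete, here is a minimal Python sketch. Everything in it is a stand-in: the list-of-numbers "design", the mutate step and the score function are hypothetical placeholders rather than anything TLG or any AI vendor actually uses - the point is only the shape of the loop: generate candidates, score them, keep the best, repeat.

```python
import random

def score(design):
    """Placeholder scoring function -- the hard part MAB describes.
    A real one would have to encode stability, part count, aesthetics, etc."""
    return sum(design) / len(design)  # stand-in metric: higher numbers "win"

def mutate(design):
    """Generate a new candidate by perturbing an existing high scorer."""
    child = design.copy()
    child[random.randrange(len(child))] = random.random()
    return child

# Seed with a small random population (MAB's "small number" of starting designs).
population = [[random.random() for _ in range(10)] for _ in range(20)]

for generation in range(100):
    # Generate many new candidates from the current survivors...
    candidates = population + [mutate(random.choice(population)) for _ in range(50)]
    # ...score them all, and keep only the top scorers to learn from next round.
    population = sorted(candidates, key=score, reverse=True)[:20]

best = max(population, key=score)
print(f"Best stand-in score after 100 generations: {score(best):.3f}")
```

As MAB notes, the loop itself is trivial; the whole difficulty hides inside score(), which for LEGO would somehow have to capture stability, buildability and aesthetics.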
evank Posted July 1, 2024 Posted July 1, 2024 This is the original poster: they just joined June 25, this is their only post, and they're certainly not a young blonde woman named Kerry Ball: https://www.eurobricks.com/forum/index.php?/profile/207202-kerryball/. Here in our little Lego world, we've been pranked. I don't know if the account owner is a real person or a bot, but clearly it's not legitimate. That being said, the discussion is fascinating! Quote
Mylenium Posted July 1, 2024 Posted July 1, 2024 17 minutes ago, MAB said: They are the difficult bit, but if they can come up with an algorithm to score models based on whatever criteria they want to score on, then the AI model can train itself. The criteria themselves should be simple enough, but I doubt you would be able to e.g. develop a whole static structural analysis A.I. on top of your actual model-building A.I. I could see it working if "A.I. as a service" becomes a thing and Autodesk or another CAD provider offers such tools, but otherwise you could probably spend years just finishing the foundations before you even get to train your A.I., much less get any results out of it. Sorry to be so pessimistic, but I see complication stacked on complication, and that doesn't make it easier to be convinced of the benefits of an A.I. for LEGO designs. ;-) Mylenium 6 minutes ago, evank said: That being said, the discussion is fascinating! Exactly. At this point I don't care if the OP is just spam seeding and pre-registering accounts, if it's a researcher anonymously collecting opinions or some commercial player having fun with our views on A.I. in the hopes of one day selling us something. Mylenium Quote
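To illustrate how far a simple built-in check is from the full structural-analysis A.I. Mylenium doubts is feasible, here is a deliberately naive toy heuristic in Python (purely hypothetical): it only asks whether each brick rests on the ground or directly on a brick one layer below it, and ignores clutch power, leverage, overhangs and everything a real static solver would model.

```python
def is_supported(brick, placed):
    """A brick counts as supported if it sits on the ground (z == 0) or its
    footprint overlaps at least one brick exactly one layer below it."""
    x, y, z, w, d = brick  # position (x, y, z) and footprint (w, d), in studs/layers
    if z == 0:
        return True
    for ox, oy, oz, ow, od in placed:
        if oz == z - 1 and x < ox + ow and ox < x + w and y < oy + od and oy < y + d:
            return True
    return False

def naive_stability_score(model):
    """Fraction of bricks with direct support below them -- ignores leverage,
    clutch power and everything a real structural solver would care about."""
    return sum(is_supported(brick, model) for brick in model) / len(model)

# Two stacked 2x4 bricks: the upper one overlaps the lower one, so the score is 1.0.
print(naive_stability_score([(0, 0, 0, 4, 2), (1, 0, 1, 4, 2)]))
```

Even this toy version already needs a geometric representation of the whole model; anything closer to real physics multiplies the work, which is exactly the complication-stacked-on-complication problem described above.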
Lion King Posted July 1, 2024 Posted July 1, 2024 19 hours ago, evank said: It sounds like you don't know how to disagree without insulting someone. I am not "suckered by" AI. It is a modern tool that's here to stay. People can learn to use it, or lose their jobs to those who will. I choose the former. Eventually you will lose your job when AI takes over… Quote
Clone OPatra Posted July 2, 2024 Posted July 2, 2024 The OP does seem like a bot, but the discussion is legit. Carry on. Quote
Mylenium Posted July 2, 2024 Posted July 2, 2024 8 hours ago, Lion King said: Eventually you will lose your job when AI takes over… A.I. doesn't "take over" nor is it going to kill every job. It's one of those weird misconceptions people seem to have. Since @evank apparently is some sort of tech doc/ scientific writer, often dealing with the latest shiz that hasn't been written down before, how would any algorithm even know what to write? It still needs the human input a.k.a. in Newspeak a "prompt writer". The A.I. would merely be an assistant to find nice words, check grammar and spelling, format the text nicely, generate a synopsis/ abstract, do the indexing and whatnot. A large part of A.I. is going to work this way and will require some guidance. Maybe not an awful lot and maybe not all the time, but every now and then there's going to be some sort of intervention akin to how you have team meetings in a company to steer direction. Mylenium Quote
Lion King Posted July 2, 2024 Posted July 2, 2024 2 hours ago, Mylenium said: A.I. doesn't "take over" nor is it going to kill every job. It's one of those weird misconceptions people seem to have. Since @evank apparently is some sort of tech doc/ scientific writer, often dealing with the latest shiz that hasn't been written down before, how would any algorithm even know what to write? It still needs the human input a.k.a. in Newspeak a "prompt writer". The A.I. would merely be an assistant to find nice words, check grammar and spelling, format the text nicely, generate a synopsis/ abstract, do the indexing and whatnot. A large part of A.I. is going to work this way and will require some guidance. Maybe not an awful lot and maybe not all the time, but every now and then there's going to be some sort of intervention akin to how you have team meetings in a company to steer direction. Mylenium Oh, I was just being sarcastic to evank… Sure, AI is a good tool as of now, but I’m not a huge fan of people depending on AI heavily. AI is only going to become more prevalent in the future. Quote
DelQuinn Posted July 2, 2024 Posted July 2, 2024 I'm not going to weigh in too much more on this issue. Don't worry, it's not because of the comments I've received. This is a very complicated issue and I do see and agree with both sides of the argument. I am not good with words and have a hard time expressing what I intend to. There are dangers with AI, but in a lot of ways those dangers come from plagiarism and from putting the wrong people out of jobs. I am sure that at some point TLG will determine how to use AI in their creations, but I can also see AI destroying and shutting down stuff like the Lego Ideas line, a platform that I am active on and enjoy participating in. (And yes, I know about the complaints, plagiarism and issues with that platform as well, without the help of AI getting into the mix.) I just want to sum things up with a quote that one of my coworkers heard a while ago. "AI originally promised us that it would do the everyday chores like cleaning and laundry and repetitive tasks so that I could have more time to do art and be creative. Instead, we have AI doing the art while we are forced to do the cooking and cleaning and laundry." Yes, AI is definitely here to stay; we need to find a way to keep it balanced in our lives. Quote
evank Posted July 2, 2024 Posted July 2, 2024 4 hours ago, Lion King said: Oh I ws just sarcastic to evank… Sure, AI is a good tool as of but I’m not a huge fan of people depending on AI heavily. AI is on the rise in the future. Who said anything about depending on it heavily? I'm the one who said it's just another tool, like every other technology tool that we already have. 2 hours ago, DelQuinn said: "AI originally promised us that it would do the everyday chores like cleaning and laundry and repetitive tasks so that I could have more time to do art and be creative. Instead, we have AI doing the art while we are forced to do the cooking and cleaning and laundry." LOL! Quote
Space Coyote Posted July 3, 2024 Posted July 3, 2024 On 7/1/2024 at 10:23 AM, evank said: That being said, the discussion is fascinating! You seem like a level headed AI supporter so I'd be interested to hear what your thoughts are about the issue of how AI learned to be creative enough to be used as an art generator - namely, the training data. Some folks feel this is close to if not flat out theft. The way I understand it this training data was fed to AI without the consent of the people who actually created the art. Do you feel like it's stealing, or does the end result justify the means? It just feels a bit unscrupulous to me, but I don't know all the details so I'd be interested to hear what you have to say! Quote
Mylenium Posted July 3, 2024 Posted July 3, 2024 3 hours ago, Space Coyote said: You seem like a level headed AI supporter so I'd be interested to hear what your thoughts are about the issue of how AI learned to be creative enough to be used as an art generator - namely, the training data. Some folks feel this is close to if not flat out theft. The way I understand it this training data was fed to AI without the consent of the people who actually created the art. Do you feel like it's stealing, or does the end result justify the means? It just feels a bit unscrupulous to me, but I don't know all the details so I'd be interested to hear what you have to say! I'm not @evank, apparently, but I can give you my take: Yes, it's theft, morally and legally. Most people never agreed to have their images scanned for that purpose, and just because you already have certain data doesn't mean you can do whatever you want with it. A lot of that is a clear violation of the GDPR and other rules here in Europe. The companies just seem to get away with it because they're too big and have all the resources in the world to fight this legally, while on the other side authorities and lawmakers are working way too slowly. The more important part, though, is that those companies don't pay any of the creators, or they pay scraps, while at the same time making billions off other people's work. It's nothing new, however. Facebook and Google refusing to pay publishers for showing news articles, just because they would have to pay a few pennies, is another example of this. Another aspect to consider is that the financial damage is not just the immediate loss of compensation for something you have already created; there are further prospective losses due to people just using the generative A.I. instead of coming directly back to the original creator. I think if the getting-paid-for-your-work part in particular were handled better, creators would not be as pissed... Case in point: it's a problem of the abusive practices of those controlling the A.I., not that A.I. exists per se. Mylenium Quote
Space Coyote Posted July 3, 2024 Posted July 3, 2024 41 minutes ago, Mylenium said: I'm not @evank, apparently, but I can give you my take: Yes, it's theft, morally and legally. Most people never agreed to have their images scanned for that purpose, and just because you already have certain data doesn't mean you can do whatever you want with it. A lot of that is a clear violation of the GDPR and other rules here in Europe. The companies just seem to get away with it because they're too big and have all the resources in the world to fight this legally, while on the other side authorities and lawmakers are working way too slowly. The more important part, though, is that those companies don't pay any of the creators, or they pay scraps, while at the same time making billions off other people's work. It's nothing new, however. Facebook and Google refusing to pay publishers for showing news articles, just because they would have to pay a few pennies, is another example of this. Another aspect to consider is that the financial damage is not just the immediate loss of compensation for something you have already created; there are further prospective losses due to people just using the generative A.I. instead of coming directly back to the original creator. I think if the getting-paid-for-your-work part in particular were handled better, creators would not be as pissed... Case in point: it's a problem of the abusive practices of those controlling the A.I., not that A.I. exists per se. Mylenium Thank you for the clarification about the legality of the way this training data was collected. That's the part that makes me the most uneasy. This data was forcibly seized and then used to create a lucrative product. This does already happen a lot, but the scale of it feels intrusive and bereft of any care for the value of art. I would hate to see thousands of folks' beautiful MOCs gobbled up and turned into sets down the line via a similar method. I think AI could be OK for maybe optimizing part usage, but the data can't come from the entirety of the community it's drawing from without their consent. I'm trying to be open-minded and positive about the value AI can bring to Lego and more, but it's hard not to be a little creeped out by the way it was developed. Quote
ukbajadave Posted July 3, 2024 Posted July 3, 2024 (edited) I reckon this thread will end up as a data set for AI.  This seems to be an increasing trend on a number of forums I frequent. A poster with zero post count trying hard to appear like a human (re: choice of username, avatar etc.) pops up and starts a thread with a generic style of question (Who's the best Star Wars ship?) and never posts again as the discussion goes on. WAKE UP SHEEPLE!! Edited July 3, 2024 by ukbajadave Typo, I'm only human :) Quote
Toastie Posted July 3, 2024 Posted July 3, 2024 11 hours ago, ukbajadave said: WAKE UP SHEEPLE!! Well, you know what? I don't care whether a random bot, a random AI (whoohoo), or simply Google randomly pops in here with an - as it appears - relevant question, at least to some here. I certainly don't learn anything from the OP's question - but I do from educated replies, discussions, ideas, fears, projections, conclusions - from the (I guess) human forum members. Whether an AI learns from this thread - so be it! Google essentially learns from every click I make. Do I care? No, I just use Google. Should they use me, well, that is the game, isn't it - nothing comes for free. And when it seems feasible, I use AI. Use as in: To my research group's benefit. I simply like to engage in discussions relevant to what is actually happening now or may (or not) happen in the future. On a "level" I don't find that often anywhere else in this virtual social world; others may do and also know better places. I don't. Well, I do happen to have other means for such discussions as well - in human worlds. In conclusion: Why should I want to wake up? This is such a conscious and alert discussion among humans, thanks to the AI's question. Think about it ... Best, Thorsten Quote
Classic_Spaceman Posted July 4, 2024 Posted July 4, 2024 21 hours ago, ukbajadave said: I reckon this thread will end up as a data set for AI. How can we get AI to train on the DC Superheroes 2024 discussion thread? 😈  Quote
evank Posted July 5, 2024 Posted July 5, 2024 (edited) On 7/2/2024 at 10:18 PM, Space Coyote said: You seem like a level headed AI supporter so I'd be interested to hear what your thoughts are about the issue of how AI learned to be creative enough to be used as an art generator - namely, the training data. Some folks feel this is close to if not flat out theft. The way I understand it this training data was fed to AI without the consent of the people who actually created the art. Do you feel like it's stealing, or does the end result justify the means. It just feels a bit unscrupulous to me, but I don't know all the details so I'd be interested to hear what you have to say!  It's complicated. One point of view is, it's theft. People profited from other people's work without permission. As someone who creates content for a living, I understand this perspective. (Although my employer is a state university, so everything I create at work is owned by the taxpayers.) Another point of view is, once you put something on a non-firewalled web page, then it's public. Advocates would claim 'fair use'. Still another perspective is, technology advancements stop for nobody, so the world needed to know what generative AI can do, just like animal cloning etc. My prediction is it'll end up like streaming media: programs like Napster showed us the possibilities, and then grown-ups entered the picture and made it legal and respectable. OpenAI (and all the others) will strike licensing deals for training data, and then they'll probably insert advertisements and/or subscriptions so they can pay for the licenses. Otherwise, all the generative AI services will start using synthetic (AI-generated) data, which according to researchers isn't nearly as good as human-created data. We might end up with two-tiered systems. Free services give you ads and/or results based on only synthetic data. Paid services give you the good stuff. By 2034, to quote Springsteen, "Someday we'll look back on this, and it will all seem funny." Edited July 5, 2024 by evank Quote
Space Coyote Posted July 5, 2024 Posted July 5, 2024 16 minutes ago, evank said: It's complicated. One point of view is, it's theft. People profited from other people's work without permission. As someone who creates content for a living, I understand this perspective. (Although my employer is a state university, so everything I create at work is owned by the taxpayers.) Another point of view is, once you put something on a non-firewalled web page, then it's public. Advocates would claim 'fair use'. Still another perspective is, technology advancements stop for nobody, so the world needed to know what generative AI can do, just like animal cloning etc. My prediction is it'll end up like streaming media: programs like Napster showed us the possibilities, and then grown-ups entered the picture and made it legal and respectable. OpenAI (and all the others) will strike licensing deals for training data, and then they'll probably insert advertisements and/or subscriptions so they can pay for the licenses. Otherwise, all the generative AI services will start using synthetic (AI-generated) data, which according to researchers isn't nearly as good as human-created data. We might end up with two-tiered systems. Free services give you ads and/or results based on only synthetic data. Paid services give you the good stuff. By 2034, to quote Springsteen, "Someday we'll look back on this, and it will all seem funny." Thanks for the informative reply! You make a good point about fair use, and you are also right about the need for better infrastructure to protect the individual's privacy and rights with regard to the data that's collected. Even a EULA-type notice on sites and apps that AI uses for data training would help. Quote
Mylenium Posted July 5, 2024 Posted July 5, 2024 4 hours ago, evank said: Advocates would claim 'fair use'. That in itself is problematic, though, since here in Europe in most countries there is no generalized fair use clause. Not trying to shoot down your argument and I do get your point, it's just that it's even more complicated. And as recent events show with A.I. providers not even respecting robots.txt rules, one could argue there's a clear malicious intent behind it... Mylenium Quote
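For anyone wondering what "respecting robots.txt" actually involves: it is a purely voluntary convention. A well-behaved crawler fetches the site's robots.txt and checks it before downloading pages, roughly like the Python standard-library sketch below (the site URL and crawler name are made-up placeholders); nothing technically stops a crawler that simply skips this step, which is why ignoring it reads as intent rather than accident.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical crawler name and site -- both are placeholders.
robots = RobotFileParser()
robots.set_url("https://www.example.com/robots.txt")
robots.read()  # download and parse the site's robots.txt

url = "https://www.example.com/gallery/moc-12345"
if robots.can_fetch("ExampleAIBot", url):
    print("robots.txt permits fetching", url)
else:
    print("robots.txt asks crawlers not to fetch", url)
```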