Kostq
Posted May 20, 2020 (edited)

Hi,

I've started using Stud.io and have even made some instructions with it. I'm a total noob at rendering and am VERY happy with how this program handles it, except for the render times. I've got a GTX 1060 3GB that is utilized at only about 10% or a bit more while rendering with the PHOTOREALISTIC option. I'm wondering how POV-Ray can churn out the same render settings (1920x1080, max details on everything, floor shadows) in 2 minutes at 100% load, spanking the sh!t out of my 3770 CPU, while Photorealistic on the GTX 1060 takes 10 to 12 minutes for the same settings at 10-12% GPU usage.

I also like doing animations with the falling-bricks effect, and for a 20-30 second video at 30 fps I have to wait a whole day while watching the GPU sit underutilized. I can even play Hearthstone during the render without hiccups, though it's very odd that I'm able to do that at all...

Do you have any advice on this?

P.S. I'm actually looking forward to tinkering with the renderer's settings files; I might find something there >.>

Edited May 20, 2020 by GTS
Mylenium
Posted May 23, 2020

On 5/20/2020 at 4:10 PM, GTS said: GTX 1060

Could be something very specific to CUDA. You may need to manually install an older version and enable the compatibility settings. Fiddling with the renderer's settings is unlikely to improve anything; the old "either it works or it doesn't" adage applies. CUDA and OpenCL are built to handle this automatically, so if there's an issue, either something is misconfigured or the app wasn't compiled with the relevant features in mind. You could download NVIDIA's profiling tools to at least see which features the renderer is actually requesting and where the probing may fail, and then fiddle with the details from there.

Mylenium
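Before reaching for the full profiling tools, it can help to confirm what the GPU is actually doing while a render runs. Below is a minimal polling sketch (not from this thread) that uses the standard `nvidia-smi` command shipped with the NVIDIA driver; the function names, polling interval, and duration are illustrative choices, not anything Stud.io-specific:

```python
import subprocess
import time

def parse_util(csv_line):
    # Turn a line like "12 %" (nvidia-smi CSV output) into an int percentage.
    return int(csv_line.strip().rstrip('%').strip())

def poll_gpu(seconds=60, interval=2):
    # Print GPU utilization every `interval` seconds; run this in a
    # terminal while Stud.io's PHOTOREALISTIC render is in progress.
    for _ in range(seconds // interval):
        out = subprocess.check_output(
            ["nvidia-smi",
             "--query-gpu=utilization.gpu",
             "--format=csv,noheader"],
            text=True)
        print(f"GPU load: {parse_util(out)} %")
        time.sleep(interval)
```

If the readout stays around 10-12% for the whole render, the bottleneck is on the software side (the renderer isn't issuing enough GPU work), which is consistent with being able to play Hearthstone at the same time.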