Everything posted by mattie

  1. Is there any difference between selecting Emit From Polygon Area and Emit From Polygons? I mean... isn't emitting from all polygons the same as emitting from the polygon area? The only difference I can imagine is along the edge shared by two polygons, where no particles would be released... so maybe if you're getting a blocky appearance when particles emit with Emit From Polygons selected, you can try Polygon Area? Any thoughts?
  2. Alright everyone, so I figured it out... for everyone's future edification, here's the solution; all those Blender crazy-pants people commenting in the link above send you on goose chases. When using the latest hardware, such as a 3090, it simply requires the latest Cycles 4D download... the kernel issue was happening because the Cycles plugin folder didn't have the file it needed to run the real-time renderer (i.e. kernel_sm_86.cubin). You simply need to install the latest Cycles 4D experimental build to resolve it, as it includes the kernel_sm_86.cubin fix... I had clipped my wings when I chose to use the cycles_4d_509 build (I think it was) when I should have been using the latest, the cycles4d_541 build, because I am running a 3090. And fixed; a quick way to verify the kernel file is actually there is sketched below. Funny though, I had thought that I could simply copy the kernel files from here... https://archlinux.org/packages/community/x86_64/blender/ ...and add the files it needed to the Cycles plugin folder, which could actually work in theory, but there were lines missing in the code when I tried it. So I ended up reinstalling the latest Cycles plugin. Feeling like a silly pants in hindsight.
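     As a rough illustration of that check (a minimal sketch, assuming nvidia-smi is on PATH and a driver new enough to support the compute_cap query field; the plugin path below is hypothetical, so point it at your actual Cycles 4D install):

     import subprocess
     from pathlib import Path

     # Ask the driver what compute capability the GPU reports, e.g. "8.6" on a 3090.
     cap = subprocess.check_output(
         ["nvidia-smi", "--query-gpu=compute_cap", "--format=csv,noheader"],
         text=True,
     ).strip()
     sm = "sm_" + cap.replace(".", "")  # "8.6" -> "sm_86"

     # Hypothetical install location; adjust to wherever your Cycles 4D kernels live.
     kernel_dir = Path(r"C:\Program Files\Maxon Cinema 4D\plugins\Cycles4D\kernels")

     cubin = kernel_dir / f"kernel_{sm}.cubin"
     if cubin.exists():
         print(f"{cubin.name} found; the precompiled GPU kernel should load.")
     else:
         print(f"{cubin.name} missing; install a Cycles 4D build that ships it.")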
  3. Note that the top line in my previous post included "3090" because I copied and pasted the sentence from my Google search, and I am running a 3090. The error message itself is simply: cuda binary kernel for this graphics card compute capability (8.6) not found
  4. cuda binary kernel for this graphics card compute capability (8.6) not found 3090. I have checked almost every other forum, and the Blender threads I've seen have not been helpful. I've checked these sources: CPU+GPU rendering issues with RTX-3090 and AMD 3970X - Support / Technical Support - Blender Artists Community, but this is no help... I tried running the real-time render preview in OptiX mode, but when loading up kernels the error message showed: failed loading pre-compiled cuda kernel lib/kernel_sm_86.cubin. Opening Blender's app and checking its preferences under OpenCL, it says no compatible GPUs found for path tracing; Cycles will render on the CPU. A driver-level check of what the card reports is sketched below. Any ideas are appreciated because, in essence, I think I am running renders without a GPU until I fix this.
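     To confirm what the driver itself reports, independent of any render plugin, here is a minimal sketch using the public CUDA driver API via ctypes (the attribute IDs 75 and 76 are the compute-capability major/minor enums from cuda.h; the library name differs by OS):

     import ctypes

     # nvcuda.dll on Windows; use ctypes.CDLL("libcuda.so.1") on Linux.
     cuda = ctypes.CDLL("nvcuda.dll")
     cuda.cuInit(0)

     count = ctypes.c_int()
     cuda.cuDeviceGetCount(ctypes.byref(count))
     print(f"CUDA devices visible: {count.value}")

     major, minor = ctypes.c_int(), ctypes.c_int()
     for dev in range(count.value):
         # 75 = CU_DEVICE_ATTRIBUTE_COMPUTE_CAPABILITY_MAJOR, 76 = ..._MINOR
         cuda.cuDeviceGetAttribute(ctypes.byref(major), 75, dev)
         cuda.cuDeviceGetAttribute(ctypes.byref(minor), 76, dev)
         print(f"device {dev}: compute capability {major.value}.{minor.value}")

     If this prints 8.6 but the renderer only ships older cubins, that is exactly the mismatch the error message describes.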
  5. @Smolak and @Cerbera, that corrected the issue. The Wacom tablet Area setting had defaulted to Full despite my having defined a portion. So, resolved. Thanks for your help!
  6. What is Windows scaling? But hey, nice thought. I do have the tablet area set, but I noticed that it defaulted back to Full after I made the changes, for some reason... once I'm finished with what I'm working on, I'll check and see if it corrects itself 🙂 It is set to Touch Devices, with Graphics Tablet checked and Hi-Res Tablet checked; PSR Cursor, Tools Cursors, and Mouse Move Activation are all unchecked.
  7. It is set to Touch Devices, with Graphics Tablet checked and Hi-Res Tablet checked. PSR Cursor, Tools Cursors, and Mouse Move Activation are all unchecked.
  8. Components: are they placed within the mesh of the character's polygon object, or on top of it?
  9. Using a Wacom Intuos pen... Modeling with Line Cut, when I am moving from defining one cut point to the next, the preview line between the orange click point and my mouse position is offset... Is this an error I'm facing, or is this the way it's supposed to run? (Also, in the image below, my mouse is represented by the big black dot.)
  10. Whoa, I would totally wait before changing stuff like the above... I mean, that just worries me. You don't want to start making software-driven changes to a rig, especially when you have a single isolated issue of C4D not working properly. It would help if you provided the motherboard, power supply, RAM, and CPU you have... but rather than me looking at it for you, I would just say to check your motherboard's compatibility with the 3-GPU build, and again against the power supply, etc. Describe what happens when it crashes in more detail. Does it crash upon startup, or is it something that happens when you begin rendering? I would also double-check the NVIDIA app and make sure everything is running correctly there.
  11. Brilliant, thanks for clarifying all these things; super helpful. Off topic: I did figure out why I was getting the real-time render preview error. I'm running a ROG Zenith II Extreme Alpha, and apparently I had placed the card in the wrong slot, because the lane configurations for bandwidth to the CPU are x16/x8/x8, x16/x16/x16, or x16/x8/x16/x8. I switched to the x16 PCIe slot, since I'm running a single GPU, and it corrected the issue as well as increased performance; a quick way to confirm the link width is sketched below. 🙂 Anyhoo, thanks again.
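     For anyone wanting to double-check the slot without opening the case, a minimal sketch (these pcie.link query fields exist in standard nvidia-smi builds, though some cards downtrain the link at idle to save power, so check while under load):

     import subprocess

     # Ask the driver for the current and maximum PCIe link width and generation.
     fields = "pcie.link.width.current,pcie.link.width.max,pcie.link.gen.current"
     out = subprocess.check_output(
         ["nvidia-smi", f"--query-gpu={fields}", "--format=csv,noheader"],
         text=True,
     ).strip()

     width_now, width_max, gen = [v.strip() for v in out.split(",")]
     print(f"PCIe link: x{width_now} of x{width_max}, gen {gen}")
     if width_now != width_max:
         print("Not at full link width; the card may be in the wrong slot.")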
  12. Curious, what was the verdict on the RAM as the culprit? I think RAM issues will restart the machine rather than power it down. Power supply and GPU drivers all sounded like good avenues to me when others commented on this. Perhaps even a bad software install, if I were running down the list, but you said that you tried R20 and R23; it would be weird if the install were the cause on two separate occasions. Are there other high-performance programs (like hobbyist mentioned) that you run on the computer where you can reproduce the issue? I know that at certain points renders can seriously ramp up their demands if the scene has simulated material or a large point count. If your temps were stable before it shuts down, perhaps it could still be the CPU auto-protecting itself from high temps. I don't have experience with how quickly auto-triggered CPU shutdowns respond, so I don't know. But I'm just trying to run through it how I might see it. Good luck.
  13. I would recommend making your build specific to the render settings you are planning to use.
  14. Well, your answer is reassuring. Thanks @srek, but let me ask... in an ideal state, say the CPU is running at 100% throughout the entirety of the process: is that bad for the CPU over time? This ideal state includes proper airflow. (Note: my CPU temp while rendering is near 80-85 °C.)
  15. Under Performance => Auto-Detect Threads will max out the thread count. Since I have that checked in the render settings and also had the thread count specified in Preferences, which one defines the render rule?
  16. When rendering, is it optimal to have a CPU utilization rate of 100%? I notice that my CPU will spike to 100% and then drop to around 10%, and it continues this way until the render is done. But I noticed that setting a custom thread count can increase CPU utilization compared to leaving it unchecked (just an observation). I thought I read that you could decrease the thread count in the interface preferences to help the machine save CPU for ancillary tasks while rendering, but I noticed it does this *see attachment* to the CPU utilization. I am trying to figure out if running a CPU at 100% is ideal; if it is not, what render settings would be recommended to put a threshold on maximum rendering performance and allow for longevity of your CPU? (The reserve-a-few-threads idea is sketched below.)
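     To illustrate the reserve-a-few-threads idea in general terms, a minimal sketch of the concept, not C4D's actual scheduler (the reserve count of 2 and the render_tile stand-in are arbitrary assumptions):

     import os
     from multiprocessing import Pool

     # Total hardware threads the OS reports.
     total = os.cpu_count() or 1

     # Leave a couple of threads free for the UI and background tasks.
     # The "2" here is an arbitrary choice, not a C4D recommendation.
     RESERVED = 2
     workers = max(1, total - RESERVED)

     def render_tile(tile_id: int) -> str:
         # Stand-in for real per-tile render work.
         return f"tile {tile_id} done"

     if __name__ == "__main__":
         with Pool(processes=workers) as pool:
             for result in pool.map(render_tile, range(32)):
                 print(result)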
  17. No, I was looking for the server-esque EPYC chip. I chose to go with the Obsidian Series 1000D, and I'm planning on using the extra room to have a single-server render farm in the machine. I went with the Corsair 280mm H115i; happy I did. Off topic here, but do you know how CPUs are meant to perform optimally? Does running a CPU at 100% mean you've selected the correct components to max out CPU performance, or is it better to taper down the thread count to lower overall CPU utilization? Yeah, I went with 4x32GB 3600; it's not even getting touched by my workflow. I have one drive dedicated to programs and another for recent projects, and a 5TB for long-term storage. I went with the 3090 despite some info about it not performing well... there is only one issue I have seen, where it cannot render the real-time preview with Cycles. I've been using the CPU to render in real time, but I would like to know why I get this kernel error when using the GPU as the renderer. Yeah, I went with the EVGA SuperNOVA T2, and I think it should cover me if I end up doubling up on GPUs in the future. I went with the 1600 just so I don't need to worry about it.
  18. Gotta appreciate the hardware. Definitely running with Cycles, but I have to say that Redshift is a great step forward; wrapping the next build around Redshift would make a great rig. Appreciate the article, I'll read it next. Good question about Redshift's hardware guidance not recommending AMD; I'll post a reply in the future if I come across something.
  19. So I have mixed feelings about the render farms, maybe for two reasons. First, you're uploading the project content to them, which, depending on the client, may be an issue; for some projects I think a farm is good, but not for all. Also, certain programs, when up-to-date versions are used, have difficulty using the plugin for whichever farm you choose.
  20. Perhaps the Gigabyte GeForce RTX 3080 10GB AORUS Xtreme video card would prove a good alternative to the 3090...
  21. Sorry for the confusion. I've been reading about processors too long.