
Leaderboard

Popular Content

Showing content with the highest reputation on 07/22/2022 in all areas

  1. Nodal workflows are more difficult to conceptualize. As for me, I need to watch out for approaching nodes without first thinking through the core steps. Having written (and re-written) some rather extensive software programs for work (Unix shell based with C), I can tell you that the first pass is just inefficient spaghetti code if you start without crafting a logic diagram. I would imagine the same applies to nodes. So I resonate with your comment on complexity. Interestingly enough, Redshift has implemented the Standard Material node, which is brilliant. Nodal trees are behind it, but the front end echoes the channel system, which is a lot easier to wrap your head around.

On the Houdini side, I started to look at Igor's posted node diagrams. Very pleased to say that I can understand "some" of the logic of what is going on (still a far way off from "all"). That is probably one of the big benefits of nodal workflows: you can see the approach taken. I look at some of Cerbera's or Vector's (or pretty much anyone else's) masterful meshes and you really can't figure out how they got there by starting with a primitive. You do learn what good polygonal modeling looks like, but you have no idea how they did it. Not so with studying nodal workflows in Houdini. They are teaching opportunities.

I think that is a big downside to C4D nodes, which may be corrected with capsules. The nodal commands embrace mathematical functions more than everyday 3D functions: normalize, decompose, cross products, vectors-to-matrix... and that was just to create a "look-at" function for animation. Honestly, I can't learn from that.

Dave
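[Editor's note] For anyone curious what that node chain amounts to, here is a minimal NumPy sketch of a look-at construction: the same normalize / cross-product / vectors-to-matrix sequence, just in code. The function name and the up-vector convention are assumptions for illustration, not C4D's actual node names.

```python
import numpy as np

def look_at_matrix(position, target, up=(0.0, 1.0, 0.0)):
    """Build a 3x3 rotation matrix whose Z axis points from
    `position` toward `target` (left-handed, Y-up convention)."""
    forward = np.asarray(target, float) - np.asarray(position, float)
    forward /= np.linalg.norm(forward)          # normalize the aim direction
    right = np.cross(np.asarray(up, float), forward)
    right /= np.linalg.norm(right)              # normalize the side axis
    true_up = np.cross(forward, right)          # orthogonal, already unit length
    # "vectors-to-matrix": the basis vectors become the matrix columns
    return np.column_stack((right, true_up, forward))

# Looking straight down +Z from the origin gives the identity rotation
m = look_at_matrix((0, 0, 0), (0, 0, 5))
```

Even this toy version needs two normalizations and two cross products, which is exactly the kind of math the node graph exposes.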
    2 points
  2. The node examples shown here seem somewhat laborious for little benefit. I can handle a Redshift material with ~20 nodes, but that's about it. More than that, and (for me) nodes become just too complex to maintain and troubleshoot. All I really wanted from C4D is: keep the goods, but make it 100x faster. I don't want to learn an entirely new and unfinished workflow paradigm to get a speed boost in C4D. If I wanted that, I could just hop to Houdini, where everything is ready and waiting. Packing up the node complexity into easy-to-use capsules was a smart and promising move to integrate node power into the classic workflow, but so far, I don't see capsules that I would use on a daily basis. Perhaps that is just a documentation/tutorial problem as well. tl;dr -- I just neeed speeeeed!
    2 points
  3. To make that analysis a bit more open, care to mention what the CPU and GPU rendering engines were? I would assume all things were equal in coming up with the different per-frame rendering times (e.g., same hardware, same scene, same output resolution, same anti-aliasing settings, etc.):

     CPU = 35 minutes/frame
     GPU = 10 minutes/frame
     U-Render = 10 seconds/frame

Were the GPU/CPU rendering engines biased or unbiased? Were they the same engine? The fairest comparison would be for both to be Redshift, as R26 now runs it on both CPU and GPU. But even then, skillful ray-path optimization can drastically reduce render times. To truly appreciate the 10 second/frame render times on U-Render, we need to hear more about the other render engines used in this study and how the settings across all the render engines compared. And finally, side-by-side comparisons of the finished images would be helpful. Thanks, Dave
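[Editor's note] To put those per-frame numbers in perspective, a quick back-of-envelope calculation using only the figures quoted above (the 300-frame animation length is a hypothetical, not from the post):

```python
# Per-frame render times quoted in the post, converted to seconds
times = {"CPU": 35 * 60, "GPU": 10 * 60, "U-Render": 10}

frames = 300  # hypothetical animation length for comparison
for engine, t in times.items():
    total_hours = frames * t / 3600
    speedup = times["CPU"] / t
    print(f"{engine:9s} {t:5d} s/frame  {total_hours:6.1f} h total  {speedup:5.1f}x vs CPU")
```

At these rates a 300-frame shot is 175 hours on the CPU engine versus under an hour on U-Render, which is why the comparison details matter so much.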
    2 points
  4. Lenovo workstations can support your needs. I recommend the P620 series of workstations as they support the AMD Threadripper processors. You can configure your own here. For price, cores and speed, Threadripper is hard to beat. You can have more than one GPU, and here they favor the NVIDIA RTX A family of GPUs, with RAM amounts up to 48 GB on the RTX A6000. That should meet your needs of two RTX 3090 Ti's from a memory perspective (though I'm not sure RTX 3090 Ti's can be slaved together as one with NVLink). You can also have your SSD boot drives in multiple RAID configurations, as well as SATA drives (again, multiple configurations) for pure storage, and memory up to 128 GB, but at 3200 MHz ECC. If you are not in favor of Threadrippers, the top-of-the-line Intel workstation is the P920 series, found here.

Now these are all workstations, and as such you tend to only find server-rated components (like Xeon processors, ECC memory, or the RTX A series of GPUs). But they do have prosumer and consumer machines as well, and their price point is far below HP's. I would stay away from HP: build quality only starts to appear at the workstation level and they are very pricey. Same with Dell, but they are a bit less expensive than HP.

The key to Lenovo pricing is to also order through a Perks at Work program. For example, you can get one through some car insurance programs that will net you 40% discounts. If your company only buys Lenovo, then maybe they have a Perks at Work program with Lenovo of their own (the discount rates vary from program to program).

Overall, I have been very happy with Lenovo build quality. HP's is horrible.

Dave
    1 point
  5. What I can see at HP is either a non-matching Xeon workstation or a mediocre i7. Lenovo offers the ThinkStation P620 tower. It only comes with Threadrippers, so it's good at multi-core but not as fast in single-core as the Intel; still the best choice of the lot, imo.
    1 point
  6. If that is your choice, you are a bit out of luck. Xeons make no sense anymore for CG; their single-core speed is too low. HP and Lenovo mainly build business systems, and their definition of business somehow usually does not match CG work. If anything, they target CAD, but those systems are a waste of money for Cinema and co.: Xeons and Quadro cards are expensive without extra gain. I would ignore DDR5; the speedup is not worth any compromise or additional money, imo.
    1 point
  7. Thanks anyway! Well, you know, sometimes I just need to write stuff down in order to break it down and find the solution myself. Maybe this helps someone else in the future... Laters, kws Btw: I did end up not using selection objects but rather an "InExcludeData" in the user data. Keeps my Object Manager a bit cleaner...
    1 point
  8. Better yet...he has a sense of humor: Something to try this weekend: "Hey honey! In the mood for some fine alpha channel inversion? {wink...wink...}" 😉 Dave
    1 point
  9. @Igor Can't confirm, but I do know there will be an upcoming update on COPs. My basis is that the legendary Bryan Ray was hired as a consultant for Houdini just recently. http://www.bryanray.name/wordpress/
    1 point
  10. The thing that interests me about Houdini is you get A LOT of power since it's all node-based, it's all refined, and it's working in a unified architecture. FX, modeling, animation, look dev, rendering... it's all there. Unlike Blender, Maya and C4D, you don't need add-ons/plugins to fill the gaps, which ultimately don't work in nodes anyway. It's just getting over that initial hump... nay... MOUNTAIN with Houdini.
    1 point
  11. They introduced nodes in R20, and it's nearly R27, so roughly 3-4 years already. That is definitely long enough that they should be in wide use by now. But as you can see on the latest 3D motion shows or webinars, no one is using them; maybe 1 or 2 instances? It just shows that even the presenters don't find them practical/useful.
    1 point
  12. The ol' SOPs gotcha. I really should have picked that up from your screenshot (it says Object in the corner). Glad it's sorted.
    1 point
  13. Wow... that certainly puts it into perspective. So despite how scene nodes may have improved since their "tech demo" days, you do have to ask if they are catching on with users. IMHO: Blender geometry nodes seem to be catching on much faster, so it can't be an "If I am going to learn nodes, I am learning Houdini" type of reasoning that keeps people from diving into C4D scene nodes.

So what keeps adoption rates low or slow? Is it lack of attributes? Is it too big and too complex? Is it too unstructured? Are scene nodes the 20-ton elephant in the room that is just too big for the average user to swallow: "There are 50 ways to do this simple thing and each iteration requires a minimum of 20 nodes"?

I would imagine that if there were some case studies, they would be all over the news section. But a quick search yielded nothing. It has been well over 2 years, as I think they were officially announced with R23. Fortunately, Maxon has still been investing in modeling tools, Redshift CPU, and improved cloth simulation. R26 was a much-needed update, so some respect there. But it does leave me wondering what the end state is with scene nodes. Too early to tell? Maybe.

Dave

BTW: I am starting to put my toe in the Houdini waters. Soak time is required as there is a lot there, but there is a structure and a methodology that is slowly (very slowly) becoming evident. But... and here is the difference... once people make that transition, they become passionate about Houdini. It was that excitement and enthusiasm from Igor and others that got me interested. I am definitely not seeing that same level of energy from C4D node users.
    1 point
  14. One of the 'minor' features of 19.5 which really caught my eye was the 'Kelvin wake' deformer, Kelvin wakes being the very characteristic standing wave pattern created by ships. Something I've needed to do more than once, e.g. below: this was done with displacement maps, and what a pain and effort it was. I've just had a very quick play with the new deformer and it looks fantastic, and so much easier : )
    1 point
  15. @Jeff H1 RE: This feels like Autodesk and Bifrost for the first 4 years it existed. Yea, to some degree. But to be fair, Bifrost in its first iteration was mainly for VFX simulation, so it was actually understandable that there wasn't wide adoption. But currently, it is a mainstay in Maya since it can be used for general stuff like mograph, particles, etc. C4D scene nodes, on the other hand, were supposed to be for general usage (i.e. could easily have been implemented for wide adoption) but didn't really pan out, so far.

Also, I had already used a lot of node programs before C4D, even before Houdini. And I have to say the C4D version is a bit verbose, like setting and retrieving a selection. Again, why not just improve Xpresso? It was already capable; it just needed a better UI and performance. I'm never going to rant about 1) cutting Cineversity for perpetual users or 2) leaving Xpresso behind. lol

RE: Redshift content on their live streams. On point. So many REDUNDANT INTRODUCTORY AND BASIC VIDEOS. Keyword being REDUNDANT. I'm pretty sure there are already existing videos for the features being introduced.
    1 point
  16. They are different. One is the project's framerate, used for example for dynamics simulation, the timing of keyframes, etc. The other is the render framerate, where the renderer basically "samples" the project per rendered frame. Setting them differently certainly has use cases, but I'm the wrong guy to ask about those. With that said, I'd prefer each parameter's value to be shown in the other's place as well, plus a button to copy the value to or from the other.
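[Editor's note] A minimal sketch of that "sampling" idea, not C4D's actual implementation: each rendered frame corresponds to a point in wall-clock time, and that time is looked up on the project's own timeline. The helper name and framerate values are illustrative assumptions.

```python
def project_frame(render_frame, render_fps, project_fps):
    """Project-timeline frame that a given rendered frame samples."""
    t = render_frame / render_fps        # wall-clock time of the rendered frame
    return t * project_fps               # position on the project's timeline

# Project animated at 30 fps but rendered at 25 fps: each rendered frame
# samples the project at a progressively later (fractional) project frame.
for f in range(4):
    print(f"render frame {f} -> project frame {project_frame(f, 25.0, 30.0):.1f}")
```

When the two rates match, the mapping is the identity; when they differ, the renderer lands between project frames, which is why the two settings interact with simulation and keyframe timing.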
    1 point