

pfistar

Customer
  • Posts

    74
  • Joined

  • Last visited

Profile Information

HW | SW Information

  • C4D
    R23 / R24
  • Other
    Houdini / 3ds Max
  • Renderer
    Redshift
  • OS
    Windows (+ macOS)
  • CPU
    i9-11900K
  • GPU
    RTX 3080 10GB

Recent Profile Visitors

987 profile views

pfistar's Achievements

  1. Greetings all, I just bought a new system that's been very crash-prone during Cinema 4D renders. I haven't even put it to the test in a production situation yet - I've just been running test renders of old projects to see how the system performs. So far I've tested three different scenes, one using the C4D Standard renderer and the other two using Redshift. All three scenes have crashed or frozen C4D during the rendering of a sequence. Even worse, one of the scenes crashed the whole system and Windows did an auto-restart. Basic system specs:
     Processor: Intel Core i9-11900K
     Motherboard: ASUS Prime Z590-P
     Storage: Seagate FireCuda 520 | 2TB Samsung 860 PRO
     Graphics Card(s): 1x GeForce RTX 3080 10GB (VR Ready)
     System Memory: 32GB DDR4 3200MHz
     The first action I took was to make sure all relevant software was properly updated (C4D, Redshift, NVIDIA drivers). Still crashing. I'm at a loss for how to continue troubleshooting. Are there steps I can take to work out whether it's a hardware issue or a software bug? As mentioned, these crashing scenes were created on an older machine with lesser specs, and I never had crash issues with any of them there. Preemptive thanks to any responders, NpF
  2. I was able to figure this one out - the requirements being:
     • a spline-based 'worm' animated with a Displacer deformer
     • Cloner 'legs' running along the spine of the worm - the 'legs' need to stick to the normal direction of the worm segments as it wiggles
     • a 'worm' body with a consistent round cross-section and rounded caps at each end.
     @noseman 's great Edge To Spline plugin (free!) opens up some options. Many thanks @noseman !!!
  3. Greetings all, I have a thing in mind to do, which is to:
     • run some cloned 'legs' along the length of a 'worm' which is animated by way of a Displacer deformer object
     • have the 'legs' rotate according to how the worm bends and wiggles - or, to put it another way, have the 'legs' inherit the normal vector direction of the points or polygons they are cloned onto
     • at the same time, have the body of the worm be a cylinder with capped ends. Ideally this is accomplished using the Sweep tool, so that the deformer animation can be applied to the spline object, letting the width of the 'worm' body stay consistent while it wiggles around.
     Below is a screenshot showing several setups, all of which fall short of ideal in some way or another. I've also attached an animated file which should explain things better, if anyone wants to look at it. I'll work backwards from D to A to explain each one.
     D - Here we have Spline.D animated with a Displacer object containing a Noise map. Spline.D is the template for the Cloner object set to 'Vertex' distribution mode, and that same spline is also used to generate the Sweep object. The problem here is that when the Cloner is set to 'Align Clone', the 'legs' jump around between certain frames; it looks like the spline's point vectors don't behave the same way a mesh object's point vectors would when deformation is applied with the Displacer. If I turn off 'Align Clone', the clones no longer jump around, but then they're bound to being parallel to one another, which is not the effect I want. I'm wondering if there's a way to actually see the spline's point vectors in the viewport (see the first script sketch after this post list), and also a way to make them behave in a way similar to a mesh object (?)
     C - Here I've added the Rail option to the previous Cloner setup. The Rail object is an Instance of the base spline (Spline.C) with its position slightly offset. The Rail lets me control the general direction, but the vectors, and hence the clones, still all run parallel, which again is not what I'm after. I'm wondering if there's a way to offset the instanced spline while it's animating so that its points are offset along their vector normals (like the 'Move Along Normal' modeling tool) (?)
     B - Here I've introduced a Plane object, which is simply a strip of polygons that is now used to distribute the clones (via Polygon Center distribution). The Plane object has the same amount of polygons as the spline has edge segments, and it follows the animation of the spline via a Spline Wrap deformer. Playing with the 'Up Vector' and 'Banking' settings in the Spline Wrap deformer gives me an approximation of what I'm looking for, but I'm noticing that the Sweep object and the clone distribution object (the Plane) don't quite move along the same vectors, so you get a bit of a 'sliding' effect. Not terrible, but not ideal.
     A - Finally, instead of using a spline as my distribution object, I'm having the Displacer act directly on the Plane object, and I like that effect a lot. Wondering if there's a relatively easy way to use the Plane object's geometry to generate the 'worm' body, so that it follows the animation while maintaining a consistent width / radius.
     I appreciate any feedback - preemptive thanks to all responders. NpF controlling-normal-vectors-on-spline.c4d
  4. @bentraje Thanks for the response, and for confirming that I'm not crazy 😄 I definitely share your sentiment regarding integration, as I'm sure many others do as well - Maxon acquired Redshift in early 2019. What are they waiting for? Regarding the term 'playblast', I guess I assumed it was a more universal term for 'a viewport animation rendered to a movie file' - by 'universal' I mean software-agnostic. I was never much more than an occasional Maya user, but it sounds like this is the official term Autodesk uses for viewport renders, so it's become something of an industry term in every studio I've worked at. Apologies for any confusion about that. Cheers Nik
  5. Hi @bentraje Thanks for the reply. I'm not sure I follow you - the way I have things set up, 'Shift+R' is the command that tells C4D to "invoke rendering" (essentially the 'render' button), which automatically pulls up the Picture Viewer window and shows the frame buffer as rendering happens frame by frame. 'Shift+R' will of course also invoke whatever the active render setting happens to be, as determined in the Render Settings window (Ctrl+B), and C4D will render according to those settings. Now, once I'm in the Render Settings (Ctrl+B) window, I know that switching to "Hardware/Software Render", as you say, tells C4D to render whatever's showing in the viewport for the active camera. However, what I'm saying here is that this is only true as long as the shaders on my geometry are Standard C4D shaders. If they happen to be Redshift shaders (which is what I use most of the time these days), C4D's Hardware/Software renderer will render them as solid black (screenshot #2, above), even though they have lighting/shading on them in my active viewport (screenshot #1, above). In other words, if all my scene objects have Redshift materials, my 'playblasts' will render all objects solid black, with no shading or lighting, which makes them largely useless. I've been using a clumsy workaround: I set up a separate 'playblast' sub-Take for all my cameras, and for those Takes I swap out my Redshift materials with Standard C4D materials (a rough script sketch of this material swap is included after this post list). By doing this, C4D's viewport renderer renders things just as they're seen in my viewport. This is generally not too disruptive to workflow, but it can become so if my scene happens to have a lot of materials. In any event, I'm wondering if I've missed some setting somewhere that would let me render exactly what I see in my viewports when I have Redshift materials assigned to my objects. Thanks again! NpF
  6. Hi @Igor What is a playblast?: A playblast is a quick preview that lets you make a "sketch" of your animation, providing a realistic idea of your final render result without requiring the time needed for a formal render. "Playblast" basically just means "viewport render" (or "GPU render", if you're old enough to remember a time before GPU raytrace engines were commonplace).
  7. Hi all, I'm a big fan of viewport preview playblasts (and so are most of my clients). Am I missing something, or is there simply no way to create proper viewport renders once you're using RS materials and lights? To be clear, I can create OpenGL output file sequences using the "Make Preview" command from the Animate menu, as well as from the Render Settings panel. However, neither way actually renders the lighting and shading that I see in my viewport; instead, all shaders render as solid black. What I see in my viewport when in Node Space: Redshift:
     What I see when I switch to Node Space: Standard Physical, and what I get however I try to render using the viewport's renderer:
     Anyone have any remedies? Big thanks! NpF
  8. That's it! Many thanks @bezo 🙏!
  9. Greetings all, I've hit a little snag using the MoGraph Multishader, and I'm a little unclear on how it actually works. What it seems like it should do, or is doing, is assign a particular shader to its corresponding clone by its index number. However, this doesn't seem to hold when I'm using Grid Array mode vs. Linear mode, even though the index numbers show the same range. It appears that in Grid Array mode the Multishader can only iterate on clones along the x-axis of the array (see attachments). Is there any way to override this using an Effector or some other means, so that the Multishader's index corresponds with the Cloner's index numbers and the shader properties (in this case, the diffuse color) are actually transferred? (An effector-based alternative is sketched after this post list.) Thanks ahead of time to any responders. NpF mograph-multishader.c4d
  10. @Teknow Thanks for the response! Cool solution, though my aim was ultimately to Break and Delete every Nth point, rather than transform them. I guess I should have mentioned this in the first place 😳. I do wonder, though, if there's an Xpresso-based solution that would allow your node network to select the actual points (or rather, to show the actual points selected / yellow-highlighted in your viewport - not sure what the difference is, technically). I would have thought a 'Point Selection' tag could be dragged into the Xpresso window and would have some input port that lets you feed the activated point ID numbers into it, so you could get a visual of it in your viewport, but this doesn't seem to be the case. In any event, I ended up finding a very simple but workable solution using the Point Selection tag in conjunction with a Formula Field ( mod(id;x)=0 ). File attached. (A small script version of the same idea is included after this post list.) Thanks again! -NpF select-nth-point-on-spline.c4d
  11. Hi all, I'm looking for a way to select every Nth point along a spline. The same would go for a static mesh. There seems to be a wealth of tutorials on this if we're talking about using the Formula Effector on MoGraph objects, but where are the tutorials addressing simple static objects? Guessing there might be an Xpresso solution somewhere - but I haven't figured it out yet. Any help would be massively appreciated! -NpF
  12. @Cerbera My thoughts exactly! I'm certain I would never have arrived at that solution by myself - in fact I was unaware that you could affect the Matrix's falloff so definitively using a mesh object. When replicating the setup for myself, it also occurred to me how much the 'meandering' effect of the Matrix's points depends on not one but really four factors (and these factors are somewhat interdependent):
      1. the Tracer object's spline interpolation settings (the Matrix object creates a point wherever you'd have one along a spline, so if you're not careful you can blow up your scene)
      2. the Matrix object's Step setting, i.e. how far apart the initial points are from each other
      3. the Push Apart effector's Radius and Iterations settings (if the radius is set too low for your initial point distance, the Matrix eats itself, meaning the points will decrease until there's only one left)
      4. the shape/size of the falloff geometry object, whose perimeter redirects the points inward and so increases the density of spaghetti on the frames after the perimeter is hit.
      Overall, a very cool effect/technique that requires no physics sim.
  13. Thanks @NWoolridge for making those amendments. Color-coding is definitely called for in this case, and yours is a useful tip. I appreciate your feedback! NpF
  14. @MODODO 1000 thank yous for this!!! 🙏🙏🙏 👑👑👑 May be the best solution anyone could have come up with! -NpF
  15. @MODODO This looks very promising, so I've been trying to replicate your setups, but not really succeeding: 😳 Let's take the bottom image first, as it appears to work only in 2 dimensions. My first guess is that the basic idea is that the Spline Mask object is used as the Object template to redistribute the Matrix's nodes, while you use the Matrix's parameters to animate the nodes, and the Push Apart effector keeps the nodes away from each other in a random direction and also keeps them from overlapping. On second glance, I notice that the green linework looks to be a single continuous line AND a closed loop. Does this mean the Matrix is only generating a single node, which wiggles around and finds its way back to its birthplace while the Tracer 'records' its path? I also noticed that in the first example you had a keyframe on the Matrix's 'Object' picker. Does this mean there's some kind of feedback loop going on between the initial point generated by the Matrix object and the points created by the Tracer?? If you could drop another clue or two, I'd be ever so grateful! 🙏 NpF matrix-spaghetti_01.c4d
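
Script sketches (referenced in the posts above)

For the spline point-vector question in post 3: a minimal Script Manager sketch, assuming the active object is an editable spline (Spline.D made editable, say), that prints each point's position and tangents so they can be compared frame by frame to see whether they flip while the Displacer animates. It only prints to the console - it doesn't draw anything in the viewport - and the deform-cache handling is an assumption about where the deformed points end up; adjust it if the setup differs.

    import c4d

    def main():
        # Assumption: the active object is an editable spline (e.g. Spline.D made editable).
        spline = doc.GetActiveObject()
        if spline is None or not spline.IsInstanceOf(c4d.Ospline):
            print("Select an editable spline object first.")
            return

        # If a deformer (e.g. the Displacer) is acting on the spline, the deformed
        # points normally sit in the deform cache; otherwise fall back to the spline itself.
        deformed = spline.GetDeformCache()
        if deformed is None or not deformed.IsInstanceOf(c4d.Ospline):
            deformed = spline

        points = deformed.GetAllPoints()
        for i in range(deformed.GetPointCount()):
            tangent = deformed.GetTangent(i)  # dict with 'vl' / 'vr' tangent vectors
            print(i, points[i], tangent["vl"], tangent["vr"])

    if __name__ == '__main__':
        main()

Running this on a few different frames and comparing the output is a quick way to check whether the tangents really do flip on the frames where the clones jump.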
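
For the playblast problem in posts 5 and 7: a rough Script Manager sketch of the material-swap workaround, under the assumption that any material which is not a standard C4D material is a render-engine (e.g. Redshift) material. It inserts one flat grey standard material and points every texture tag at it so the Hardware/Software renderer has something it can shade. It does not read colours out of the Redshift node graphs, does not touch the Takes system, and has no undo handling, so run it on a copy of the scene.

    import c4d

    def swap_tags(obj, preview):
        # Walk the object tree and re-point texture tags at the preview material.
        while obj:
            for tag in obj.GetTags():
                if tag.CheckType(c4d.Ttexture):
                    mat = tag.GetMaterial()
                    # Assumption: anything that isn't a standard material is a
                    # render-engine material that the viewport renderer draws black.
                    if mat is not None and not mat.IsInstanceOf(c4d.Mmaterial):
                        tag.SetMaterial(preview)
            swap_tags(obj.GetDown(), preview)
            obj = obj.GetNext()

    def main():
        preview = c4d.BaseMaterial(c4d.Mmaterial)    # plain standard material
        preview.SetName("Playblast Preview")
        preview[c4d.MATERIAL_COLOR_COLOR] = c4d.Vector(0.7)
        doc.InsertMaterial(preview)

        swap_tags(doc.GetFirstObject(), preview)
        c4d.EventAdd()

    if __name__ == '__main__':
        main()

A per-material stand-in (one grey standard material per Redshift material, perhaps keyed by name) would keep material breaks visible in the playblast; the single shared material above is just the simplest version of the idea.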
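
For the Multishader question in post 9: this sketch doesn't fix the Multishader's Grid Array behaviour, it side-steps it. A Python Effector in Full Control mode writes a per-clone colour derived from the clone index, and a MoGraph Color Shader in the material reads that colour back. The hue ramp is just an illustrative mapping, and depending on the setup the Cloner and material may still need to be configured to actually use the clone colour.

    # Python Effector script (Full Control mode), dropped into the Cloner's Effectors list.
    import c4d
    from c4d.modules import mograph as mo

    def main():
        md = mo.GeGetMoData(op)
        if md is None:
            return False

        cnt = md.GetCount()
        colors = []
        for i in range(cnt):
            # Illustrative mapping: spread a hue ramp across the clone index.
            hue = float(i) / max(cnt - 1, 1)
            colors.append(c4d.utils.HSVToRGB(c4d.Vector(hue, 1.0, 1.0)))

        md.SetArray(c4d.MODATA_COLOR, colors, False)
        return True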
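
For the every-Nth-point question in posts 10 and 11: a small Script Manager sketch that does the same job as the mod(id;x)=0 Formula Field, writing directly into the object's live point selection. STEP is an assumed value, the object has to be editable, and from the resulting selection the Break/Delete commands can be run as usual.

    import c4d

    STEP = 3  # assumed step: select every 3rd point

    def main():
        obj = doc.GetActiveObject()
        if obj is None or not obj.IsInstanceOf(c4d.Opoint):
            print("Select an editable spline or mesh first.")
            return

        sel = obj.GetPointS()              # the object's live point selection
        sel.DeselectAll()
        for i in range(0, obj.GetPointCount(), STEP):
            sel.Select(i)

        obj.Message(c4d.MSG_UPDATE)        # tell the object its data changed
        c4d.EventAdd()                     # refresh the viewport

    if __name__ == '__main__':
        main()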