
Everything posted by BoganTW
-
Looking back at greatest C4d tutorials / Instructors of the Century?
BoganTW replied to a topic in Discussions
I was going to post a Thanassis - Sexy Beast meme weeks ago when one thread or another was heated - Don Logan is back and he is not happy - but piked out and never hit the submit button.
-
Looking back at greatest C4d tutorials / Instructors of the Century?
BoganTW replied to a topic in Discussions
Everyone cited above is fantastic. A shout-out to the Maxon Training Team too, and to the ongoing 3D and Motion Show presenters. Amazingly helpful, all of them.
-
Not really, but Rick Barrett said in an online chat more than five years ago, in response to a question, that redone dynamics were eventually coming. Since it's vaguely on their radar, they may as well go the distance and implement it.
-
Is anyone going to bother using a game engine to do dynamics in C4D? Equally - are many people going to be doing their dynamics in Houdini, then bringing them over to C4D? If they're making things bounce and blow up in Houdini, wouldn't they just stay in Houdini? Maxon doesn't make Unity or Houdini, so I would have figured they'd be looking at a way of improving the dynamics in C4D rather than pointing across the road and telling users to go to another application. Sure, I can see some people saying that since Unity and Houdini are so great, why bother using C4D for dynamics or hoping for an improved solution? But doesn't that defeat the purpose of developing anything at all for C4D? I'm not sure why the existence of game engines and Houdini means that C4D shouldn't bother improving what it has.
-
Stu, one of the top guys now in charge of Maxon development, has a heavy VFX background, and not much motion graphics background at all. He used to work at ILM, and has done a ton of FX work since. This doesn't change what anyone else in the thread has said, but it possibly suggests VFX is now more likely to be on their radar. It's certainly on his. Scene nodes will be part of anything Maxon makes in future. Not sure why this means particles need to be at the end of the line.
-
Thank you. We were stuck in lockdown here in Melbourne and couldn't attend, so I had weird stuff happening like the lady who won best actress at the fest 'liked' my Instagram post about the movie shortly after it screened, and a director with a feature there suddenly followed me on Twitter. Go figure. I'm hopeful when next year's festivals come around we'll have something to show again and can attend in person. Final Cut and Resolve both appear to run happily on the new Macs (especially the former), and I'm sick of waiting for the various Adobe products to reach the same level of reliability.
-
ICM. I've been doing 3D stuff for about a year and a half. I render regularly - stills - and occasionally animations. It's nothing fancy and it is dabbling. My main preoccupation is filmmaking, and I had a film screen at the TCL Chinese Theater in LA last month, its third international fest. Yes, it's a guess that a 2021 MacBook Pro will be better than my 2010 iMac. Not many people are arguing that it won't be. My issue is that I also need a workable laptop for portable shoots and for travel, something with a fast SSD that can run Resolve and let me grade in Fusion. The new MacBooks fit this. Your experience in working for multiple companies doesn't change my requirements, which are to have something that can cover multiple bases, that will be reliable and won't crash, and that has most of its software optimised for M1 so I can get a bit of extra bang for the buck. Your suggestion for Mac users is on target as far as it goes, but it doesn't cover my need to have something that can edit 6K Blackmagic files when I'm shooting and cutting on location, or overseas. The M1 can largely do that, but being able to hit 32GB of RAM is better, and the original M1 can't do that.
-
It's trivial for what I'll ultimately be doing, as it will relate to either stills, or animations where I plan to work on something for a week or so and then render at the end of the week, at which point I won't care if a one-hour render becomes three or four.

Probably also pertinent: I'm unlikely to be doing animations or stills professionally for quite some time, if ever - 3D will be a time sink on the side - so the professional requirement to get 3D renders done on deadline won't loom as large for me as for others. Again, this is why I don't see the need to jump over to Windows and invest in multiple GPUs just to get things rolling a few times faster.

So what are we arguing about? The new MacBook Pros will be more powerful than an Air. Is this a problem? This is true, but the last time I did video editing I found Windows asking for updates to be multiple times more annoying than any of the other impediments you've quoted. A 2021 MacBook Pro will be faster than my iMac from 2010, so hmmmm and emoji all you want. Again, I don't see the problem.

Cool. If you respond to this post, please just do a fat paragraph and I'll do the same. Quoting multiple sentences in staccato fashion has its uses, and I've done it myself in the past, but this week I'm not in the mood.
-
I already know all that. Yet somehow people still managed to work in Cinema 4D a handful of years ago when things were a bit slower. Go figure. Now they're a bit faster. You somehow seem determined to bring the peak workload of a professional artist cranking out multiple shots under a deadline into a thread about pretty decent laptops that can now run C4D very nicely. Are they really going to be buying one of these machines to do their work on?

You mention the example of someone doing look dev with lots of iterations. Is a professional 3D artist going to be doing his look dev on a 14-inch MacBook? I thought he'd be seeking out a bigger screen for a start. So then maybe that artist buys a big screen and connects it to his laptop. That would be better. But with that bigger screen he'd be doing this at a desk or workbench, so he'd really be better off getting a desktop, maybe with all those GPUs you mentioned, as he could then update the parts and replace stuff if needed. That would make sense. But that setup is not a laptop. Does the person want to buy a laptop, or do they not? There's a reason people buy them sometimes, and it isn't because they want a desktop. It's because they want a laptop. If you're buying a laptop and doing laptop stuff with it, it's not a huge help to know that a desktop can be faster. Call the news networks: a desktop machine sitting on your desk can do some things a portable MacBook you chuck in your bag can't. Who would have thought?

The question may be, though: if someone gave you an M1 Max laptop with R25 (or Blender or whatever), would you get any work done? Or would you be pulling your hair out because that particular system isn't as fast as your rig with three expensive GPUs attached? Maybe you'd be doing the latter. Shrug. I'll be coming from a 2010 iMac with a set-up that did CPU rendering only. When I'm using the laptop, I'm not sure I'll be bothered that a desktop with multiple expensive GPUs attached is faster. From the sound of it you might not be able to do it, but plenty of artists manage to remember their goals despite being interrupted sometimes. They even managed it several years ago, when the interruptions were longer.

Nothing personal, but there are already heaps of videos appearing on YouTube with people doing C4D scenes on the new MacBooks and expressing happiness at how fast it all is. One guy also appeared to test the very same card you mentioned earlier and noted that yeah, it was a bit faster - yet for some reason he doesn't seem to care that much either. Again - shrug. Did anyone ever get any work done before that card was on the market? I'm going to go out on a limb and say that they did.

To an extent though, while your posts remain informative and thoughtful, this thread is generally about what it will be like to use C4D on the new laptops, and I understand you won't be using C4D on them, as you presumably won't be using C4D at all in the future. So I'm not sure why you're hugely concerned about how C4D will run on these laptops. Will someone who is using C4D on one of them be happy? Probably.
-
Wow, twenty-two seconds. If he saves that much time ten times a day, that's a bit over three and a half minutes daily, or a bit under half an hour each week. Or if he does it forty times a day, that's around 15 minutes a day saved, giving him a bit longer to go make a sandwich or something. If the test scene is a still and he's going to be cranking out long animations, the comparison makes more sense, as the savings at 24fps add up. Which raises the question of whether he's the sort of guy who absolutely has to get that client animation done by 5pm this afternoon, or whether he's working on a project where he can just let it render during dinner while he walks outside to enjoy the fresh air and the sound of kids playing in the local park. I can see how one of those two could be a PC guy and the other an Apple guy, TBH.

I'm not sure the aforementioned stats make up for being stuck in the Windows ecosystem for the next half decade with a noisy PC and its multiple GPUs cooking away on the desktop, attached to a monitor you've had to shop around for in the hope that it will be as colour-correct and nicely calibrated as the iMac ones will be come this April. And if the guy is under the gun and absolutely has to get those scenes to clients, can't he just chuck the job on a render farm rather than stockpiling 3080 Ti GPUs, which (checks Google) seem to top out at over 2,500 Aussie dollars each? Maybe he does multiple scenes each week and his life is spent cranking out one animation after another while the PC is cooking away and he's doing the things Windows users do to get the program to stop asking for updates.

The data is useful, and might be extra useful to Octane fans, but you're not really selling me on the benefits of this. It's faster, but for some reason I don't really care. Different strokes for different folks, I guess. Also, is this guy comparing a maxed-out desktop to a portable laptop? I'm not sure if he's noticed, but there are additional things laptops can bring to your day besides GPU rendering speeds.
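For what it's worth, here's that back-of-envelope maths as a quick Python sketch (purely illustrative: the 22 seconds comes from his test, the ten and forty renders a day are just my hypothetical counts, and I'm assuming he renders every day of the week):

# Back-of-envelope render-time savings (hypothetical figures, not a benchmark)
saving_s = 22  # seconds saved per render in the cited test
for renders_per_day in (10, 40):
    daily_min = saving_s * renders_per_day / 60
    weekly_min = daily_min * 7  # assuming seven rendering days a week
    print(f"{renders_per_day} renders/day: {daily_min:.1f} min/day, {weekly_min:.0f} min/week")

That spits out roughly 3.7 minutes a day (about 26 a week) at ten renders, and around 15 minutes a day at forty, which lines up with the hand-waving above.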
-
A very funny review of the new laptops from Stu Maschwitz (at Maxon). I don't understand his brief Redshift demo image, but the overall discussion is good and informative. https://prolost.com/blog/m1max
-
Filipstamate, I skipped over most of your first long paragraph as I got bored reading it. Sorry. You have this habit (I won't use the word dumb, as that would be insulting) of loosely paraphrasing or making up comments and sticking quote marks around them, then asking someone to address the comment you've chucked into quote marks as if they were the one who said it. You did it to Rick earlier too. From memory I never said subs were good for every single person anywhere and everywhere - which is what you've written and thrown quote marks around - I just said that they were perfect for me and I didn't understand all the fuss and whining. I also conceded maybe thirty pages back (or was it 25?) that I could see why it didn't suit a few people here and there. Given that I've already conceded it won't suit the occasional unfortunate soul, you'll need to rephrase your commentary into something more accurate if you want me to read it. Feel free to go back twenty or thirty pages and dig up exact quotes if you want me to respond to stuff I've said earlier; otherwise don't bother, and stop wasting everyone's time writing nonsense. Nothing personal, but if you're going to spend time constructing an argument about stuff I've said, it's probably best you address it to stuff I've said rather than stuff I haven't.
-
Ingvarai, maybe just use the R25 demo for two weeks and see if you like it? I doubt any newcomers diving into R25 are going to somehow see the older GUI and complain that the old one looks better.

Jops, you make some good points. But with marketing, maybe lots of people these days just go to YouTube and see the coverage there? I think there are a dozen or more channels that regularly cover each batch of new C4D features in depth. That's how I heard about them. Also, Jops, the Core4D spellcheck keeps changing your name to 'cops', which might come in handy if the thread gets rowdy and some of the more agitated posters need to be sent behind bars for a bit.
-
TBH I'd find it annoying if someone dug up comments I'd written nearly two months ago just to use them anew as a cudgel.
-
I see links citing last year's M1 chip at 7608 in multicore mode. In the above video the M1 Max gets 12173 - roughly a 60% uplift.
-
Interesting C4D, Blender, Redshift results here. Turn the captions on for English subtitles. https://www.youtube.com/watch?v=nJK2m4YIK4s
-
We're probably more in agreement than not. All the things you listed in that first post sound good. I'm maybe less sceptical than you of the stuff cited in that next post. That's all fine. I'm probably less bothered about having my computer remain king of the mountain for years thereafter; I just want it to work reliably and somewhat decently. A solid Mac should do both of those for me. If you do get a Mac mini, please chuck Blender on it at some point and let us know how it travels. I'm assuming it will be decent (though maybe 16GB of RAM will be an issue) but would still like to hear what you think. I don't think a dirge is needed for C4D discussions; I'd suggest maybe Hall and Oates or (if in a snappy upbeat mood) maybe Röyksopp.
-
I'd only use the tests Ritchie did as evidence of the stuff he tested. And as a video editor, the stuff he tested is of interest. I'm not interested in what Nvidia and LG are doing, so I'm not sure what an exposé of those companies has to do with what Apple is up to. Is Apple a different company, or the same company as them? I thought it was different.

You say that Apple 'of course' does the same thing with their dedicated reviewers. I'm assuming the long sentence you added in quotation marks as being said by Apple is fictional, isn't referenced anywhere, and is just your imaginative take on what might be going on behind the scenes. That's great. Fiction is fun, and it's always nice to read things people have made up. But I'm interested in real-world results and data, and both Ritchie and Stallman provide data on the tests they did in their reviews. If you come across a different test elsewhere that says they got their numbers wrong, feel free to link it. Is the M1 Max 'of course' being fast at certain encoding and decoding meant to be a bad thing? I thought being fast at those things would be good, not bad. You also note that it doesn't say much about overall performance. I would have thought it said plenty about the overall performance of those particular tasks.

Side note: I don't care about the upcoming Mac Pro, and don't care that a PC equivalent with some great new graphics card added might be as fast, or faster. I also don't care about Blender, and don't care that the use of C4D on these machines will involve both Maxon and a software subscription. None of these things bum me out. I'm assuming these new machines will be faster than my 2010 iMac, won't crash a lot, and will deliver decent performance for my needs. So regardless of the sad violin you frequently bring into these threads, I view the glass as half full and will be happy to grab an iMac in a few months if they chuck one of these chips into it.
-
And a crazier test of pro apps at the end of this one. Hopefully some dedicated C4D tests will turn up soon.
-
Opening segment covering video rendering is worthwhile.
-
From the sound of it, Maxon didn't need Apple techs to walk over and show them how to get C4D working well on M1 hardware, as the Apple and C4D guys were already in tight communication. But Blender fully tuned up on Apple Silicon does seem like a no-brainer for people to at least try once they have the hardware ready. It might click for them. I hope you do get a MacBook or Mini at some point, as you'll be in a good spot to give impressions on how it all goes on that hardware. Though (knock on wood) I should be able to offer my own impressions in a few months. I personally hope as much 3D software as possible gets fine-tuned to run nicely on Apple Silicon - not just C4D and Blender but the other big apps as well.
-
I thought the segment was designed to suggest C4D will run faster on the new hardware than it did on the old. I'm assuming this is true, unless anyone wants to argue that it will run slower. Blender users will benefit from the new Macs. C4D users who get the new hardware will also benefit. I dunno if Maxon arguing that this is the case is silly, or deceitful, or lame. Possibly it would have been sillier if Maxon had taken out a big ad telling people to go use Blender instead, which almost seems to be what you're arguing here. It would be logical for the Blender Foundation to make this statement, maybe less so for Maxon to bother doing it. I do not see how C4D running nicely on the new Macs, and Apple and Maxon celebrating this, is somehow a bad thing. C4D was the first 3D app out of the gate to be optimised for M1, and they didn't need a grant from Apple to get there or stay there.

To keep things agreeable I'll note ICM is right: if Blender is optimised nicely for these new machines, it will be a really good thing for Blender users. And I think the optimisation is more of a done deal now that Apple is stepping in. I recall that months after an earlier Blender announcement circa the M1 launch, a thread or two posted at Blender Artists was asking: has that M1 code been installed into the software yet, oh I thought Fred was going to do it, maybe Bill can do it when he gets a chance, or else we can wait a while, it'll get done soon I think, surely they won't leave it for a long while without finishing the task, guys? Etc, etc. Somehow I don't think the Maxon guys at Friedrichsdorf run things in quite the same manner, which might be why Apple thought Blender could use some extra cash and a pat on the head to help them along with the task, while Maxon was already there a year ago - which might also explain why Maxon and C4D have been invited aboard Apple presentations the past year or so.

The suggestion that this is all a slick advertising move implies that Nemetschek put their hands deep into their pockets to grab pride of place in Apple's global keynote, and TBH I wasn't aware that Apple was so strapped for cash that they were putting sections of their keynote up for sale. But if Apple thinks Blender is the future of 3D, this will be borne out if they tout Blender to the same degree in future presentations - something they haven't felt the urge to do up till now.
-
I’ll happily take that loss not to go near Windows ever again.
-
I'm still using my iMac from 2010. It's long overdue for replacement, but it's still ticking along. None of the PCs I ever had managed anything near that amount of longevity. I understand things may be better now, and that the ease of replacing parts, RAM and whatever brings additional benefits for anyone wanting to upgrade their PC. For me, though, I'm glad there are Mac options coming back that look decent. I'd guess the latest Apple silicon chips would run great with Blender, and would do OK with Houdini if SideFX ever look at getting things working on it - they might and they might not, who knows. But I wouldn't assume the various industry professionals popping up midway through the Apple video to tell us how happy they are with its performance would be doing so if the machines ran like crap. There will surely be tons of benchmarks and comparisons and stuff online in the coming weeks too.
-
It is a lot of money. The 'woooo! I have a Mac and I like it!' feeling usually comes some months later.