
C4D in the Apple event


BoganTW

Recommended Posts

On 10/22/2021 at 4:13 AM, Icecaveman said:

I was such a huge fanboy, but when the trashcans came out, I knew 100%. I feel incredible sympathy for anyone who bought those trashcans...or the $12,000 Mac "Pros" that followed.

 

Anyone defending either of those Mac Pros has flawed judgment. Those machines were out of date, with no upgradeability, one year after purchase.

 

If one does choose to go with the new M1 Mac, which has more promise, massive Apple investment in the Blender Foundation has been assured. The M1 will be tuned for Blender, so that's the good news.

 

The M1 Mac is a Blender Mac, IMO. Pair the most forward-thinking processor with the most modern 3D package. Just my opinion. Apple seems to agree with my view, judging by their recent investments.

Our academic program equipped a lab with those "trashcans" in 2014, and they were incredibly reliable performers. We have since equipped that lab with Threadripper PCs, which are great. But when COVID hit last year, we were able to loan the "trashcans" out to students, who completed their animation projects largely at home and remoted into the lab to take advantage of the Threadrippers for extra rendering potential.

 

So, I don't hold to the hatred that those machines get. They were clearly an evolutionary dead-end, but they were well-built and reliable.

 

The notion that the M1 will be "tuned for Blender" is bizarre, frankly. The opposite is true: with this investment, Blender will be tuned for the M1, the same way that Octane and Redshift are now tuned for the M1. The groundwork that Apple laid with developer support for Redshift and OTOY will now pay off in the adaptation of Cycles to the Metal framework.
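
 

As an aside, for anyone wondering what "adapting Cycles to Metal" actually involves: the renderer's GPU kernels get rewritten in Metal Shading Language and dispatched through Metal's compute pipeline. A minimal Swift sketch of that dispatch pattern follows; it is purely illustrative (the kernel just doubles an array) and is not Cycles' actual code:

```swift
import Metal

// A trivial Metal Shading Language kernel standing in for a real render kernel.
let msl = """
#include <metal_stdlib>
using namespace metal;
kernel void scale(device float *data [[buffer(0)]],
                  uint id [[thread_position_in_grid]])
{
    data[id] *= 2.0;
}
"""

let device = MTLCreateSystemDefaultDevice()!   // the system GPU (the M1 on Apple Silicon)
let library = try! device.makeLibrary(source: msl, options: nil)
let pipeline = try! device.makeComputePipelineState(function: library.makeFunction(name: "scale")!)
let queue = device.makeCommandQueue()!

// Some data for the kernel to work on.
var values = [Float](repeating: 1.0, count: 1024)
let buffer = device.makeBuffer(bytes: &values,
                               length: values.count * MemoryLayout<Float>.stride,
                               options: .storageModeShared)!

// Encode and run one compute pass.
let cmd = queue.makeCommandBuffer()!
let enc = cmd.makeComputeCommandEncoder()!
enc.setComputePipelineState(pipeline)
enc.setBuffer(buffer, offset: 0, index: 0)
// dispatchThreads needs non-uniform threadgroup support (all Apple GPUs have it).
enc.dispatchThreads(MTLSize(width: values.count, height: 1, depth: 1),
                    threadsPerThreadgroup: MTLSize(width: 64, height: 1, depth: 1))
enc.endEncoding()
cmd.commit()
cmd.waitUntilCompleted()
```

The hard part of a real port is everything that goes in that kernel slot, not the dispatch plumbing.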

 

You could actually make the case that the M1 is tuned for the workloads that Apple likes to think are the province of their Pro customers; this is evidenced by the silicon support for ML and advanced video encode/decode (including ProRes).

Link to comment
13 hours ago, NWoolridge said:

Our academic program equipped a lab with those "trashcans" in 2014, and they were incredibly reliable performers. We have since equipped that lab with Threadripper PCs, which are great. But when COVID hit last year, we were able to loan the "trashcans" out to students, who completed their animation projects largely at home and remoted into the lab to take advantage of the Threadrippers for extra rendering potential.

 

So, I don't hold to the hatred that those machines get. They were clearly an evolutionary dead-end, but they were well-built and reliable.

 

The notion that the M1 will be "tuned for Blender" is bizarre, frankly. The opposite is true: with this investment, Blender will be tuned for the M1, the same way that Octane and Redshift are now tuned for the M1. The groundwork that Apple laid with developer support for Redshift and OTOY will now pay off in the adaptation of Cycles to the Metal framework.

 

You could actually make the case that the M1 is tuned for the workloads that Apple likes to think are the province of their Pro customers; this is evidenced by the silicon support for ML and advanced video encode/decode (including ProRes).

The trashcans were a dead-end, as you state. "Reliable" and reliably lame performance...that's the trashcans! I would argue that Pro machines = ferocious performance today, upgradeable tomorrow. Want 3 GPUs? No problem. Want to replace those GPUs, and maybe the CPU, in two years with something way faster and keep rolling? No problem.

 

As for the "tuned for Blender" line...I was satirizing and playing with Maxon's marketing gimmicks, and the OP's attempt to ride Apple's coattails.

 

Let's be clear: Apple has commissioned some of their own internal technicians to get Blender humming on their hardware. That's in addition to Apple's cash investments in the Blender Foundation. If Apple is investing in other 3D vendors, I've yet to hear about it. Reality: the M1-Blender combo will hit like no Mac/3D combo in history. Whether that's enough to compete with a big PC rack is another question.

 

Call it Blender Pedal to the Apple Metal, call it synergy...whatever. If the Apple platform is your thing, Blender is going to purr like an African cat on the hunt.

 

Full disclosure: I might buy a MacBook or Mini this year, but my 3D work will continue to be PC-centric. Nvidia CUDA and RTX are still king...and C4D is still roadkill on the way to the future.

Link to comment

I definitely want to thank Maxon for their superb support of C4D and Redshift on the M chip. It makes me so glad I stuck with C4D since R6. I have a lowly M1 Mac mini with 16 GB RAM (which was dirt cheap) and the experience has been a joy. I have forgotten what a crash is like, and the viewport is so responsive! Insydium has been wonderful in the same way, but they have not come out with a native version of their Cycles renderer, so I have fully switched to Redshift. It is wonderful on the M chips! Soon I will have 32 cores of RISC chip and 64 GB RAM, and I am doing a countdown. Thank you, Maxon, for being so responsive! SREK is not too shabby either.

Link to comment
17 hours ago, Icecaveman said:

The trashcans were a dead-end, as you state. "Reliable" and reliably lame performance...that's the trashcans! I would argue that Pro machines = ferocious performance today, upgradeable tomorrow. Want 3 GPUs? No problem. Want to replace those GPUs, and maybe the CPU, in two years with something way faster and keep rolling? No problem.

 

As for the "tuned for Blender" line...I was satirizing and playing with Maxon's marketing gimmicks, and the OP's attempt to ride Apple's coattails.

 

Let's be clear: Apple has commissioned some of their own internal technicians to get Blender humming on their hardware. That's in addition to Apple's cash investments in the Blender Foundation. If Apple is investing in other 3D vendors, I've yet to hear about it. Reality: the M1-Blender combo will hit like no Mac/3D combo in history. Whether that's enough to compete with a big PC rack is another question.

 

Call it Blender Pedal to the Apple Metal, call it synergy...whatever. If the Apple platform is your thing, Blender is going to purr like an African cat on the hunt.

 

Full disclosure: I might buy a MacBook or Mini this year, but my 3D work will continue to be PC-centric. Nvidia CUDA and RTX are still king...and C4D is still roadkill on the way to the future.

From the sound of it, Maxon didn't need Apple techs to walk over and show them how to get C4D working well on M1 hardware, as the Apple and C4D guys were already in tight communication. But Blender fully tuned up on Apple Silicon does seem like a no-brainer for people to at least try once they have the hardware ready. It might click for them.

 

I hope you do get a MacBook or Mini at some point, as you'll be in a good spot to give impressions on how it all goes on that hardware. Though (knock on wood) I should be able to offer my own impressions in a few months.

 

I personally hope as much 3D software as possible gets fine-tuned to run nicely on Apple Silicon, not just C4D and Blender but the other big apps as well.

Link to comment

The sweet thing about the M1 is the performance per watt. That can lead to quieter, cooler computing and a smaller utility bill. I suspect that once testing is in, Apple will also have an attractive mid-range offering, again setting aside right-to-repair, upgradeability, and Apple's obscene pricing for RAM and storage. That, and the dearth of cool games.

 

I think the verdict is still very much out on how Apple Silicon will scale for premium performance. I've heard a number of pundits say that if Apple even comes out with a Mac Pro in the next 14 months, it will be on Intel/AMD iron, not their own chips. This, despite their announced roadmap. Scaling is a real thing, a real issue...something silicon upstarts haven't grappled with as deeply.

 

I look forward to real-world, non-cherry-picked benchmarks.

Link to comment
4 hours ago, BoganTW said:

Opening segment covering video rendering is worthwhile.

 

This guy is a known Apple shill. These are commercials.

 

The M1 has dedicated hardware for certain decoders/encoders, so it will of course be super fast at that. That doesn't say much about overall performance.
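
 

For what it's worth, anyone can check exactly which codecs get that dedicated silicon on their own machine. A minimal Swift sketch using VideoToolbox's VTIsHardwareDecodeSupported; the codec list here is just an example:

```swift
import CoreMedia
import VideoToolbox

// Ask the OS which codecs this machine can decode in dedicated hardware.
let codecs: [(name: String, type: CMVideoCodecType)] = [
    ("H.264",      kCMVideoCodecType_H264),
    ("HEVC",       kCMVideoCodecType_HEVC),
    ("ProRes 422", kCMVideoCodecType_AppleProRes422),
]

for codec in codecs {
    let hw = VTIsHardwareDecodeSupported(codec.type)
    print("\(codec.name): \(hw ? "hardware decode" : "software only")")
}
```

A "yes" there tells you about the fixed-function media block, not about general CPU or GPU performance, which is exactly the point.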

 

Video codec rendering is not 3D rendering.

Link to comment

It's tough to find objective reviews of any products online nowadays. 

 

This was exposed last year by Hardware Unboxed and others. Nvidia and LG were sending out review hardware with lists of talking points: "You will review our product, and these are the highlights you will talk about." Hardware Unboxed said no...we aren't going to follow your talking points.

 

Linus seems mostly objective but gets mushy with LG and AMD.

 

Apple of course does the same thing with their designated reviewers: "You will mention video compression speed. You will talk about our amazing display. You will fake-whine a little about the notch to sound objective (no one cares). You will hit battery life with a sense of awe and grandeur."

 

I've lost track of "Apple reviews" that have followed this exact formula.

Link to comment

I'd only use the tests Ritchie did as evidence for the stuff he tested. And as a video editor, the stuff he tested is of interest.

 

I'm not interested in what Nvidia and LG are doing, so I'm not sure what an exposé of those companies has to do with what Apple is up to. Is Apple a different company, or the same company as them? I thought it was different.

 

You say that Apple 'of course' does the same thing with their designated reviewers. I'm assuming the long sentence you added in quotation marks, presented as being said by Apple, is fictional, isn't referenced anywhere, and is just your imaginative take on what might be going on behind the scenes. That's great. Fiction is fun, and it's always nice to read things people have made up. But I'm interested in real-world results and data, and both Ritchie and Stallman provide data on the tests they did in their reviews. If you come across a different test elsewhere that says they got their numbers wrong, feel free to link it.

 

Is the M1 Max 'of course' being fast at certain encoding and decoding meant to be a bad thing? I thought it being fast at those things would be good, not bad. You also note that it doesn't say much about overall performance. I would have thought it said plenty about the overall performance of those particular tasks.

 

Side note: I don't care about the upcoming Mac Pro, and don't care that a PC equivalent with some great new graphics card added might be as fast, or faster. I also don't care about Blender, and don't care that the use of C4D on these machines will involve both Maxon and a software subscription. None of these things bum me out. I'm assuming that these new machines will be faster than my 2010 iMac, won't crash a lot, and will deliver decent performance for my needs. So regardless of the sad violin you frequently bring into these threads, I view the glass as half full and will be happy to grab an iMac in a few months if they chuck one of these chips into it.

Link to comment
4 hours ago, BoganTW said:

I'm not interested in what Nvidia and LG are doing, so I'm not sure what an exposé of those companies has to do with what Apple is up to. Is Apple a different company, or the same company as them? I thought it was different.

 

You say that Apple 'of course' does the same thing with their designated reviewers. I'm assuming the long sentence you added in quotation marks, presented as being said by Apple, is fictional, isn't referenced anywhere, and is just your imaginative take on what might be going on behind the scenes.

 

I mention other companies to be gentle to Apple, not to single them out. Almost all multi-billion-dollar companies try to control reviews, and often succeed. If you listen to Linus and Hardware Unboxed, they share real-world stories about how pre-release products sent out for review come with stipulations, and those absolutely do include talking points.

 

You think iJustine (lol) or Ritchie are going to give you anything but Apple's blessed messaging? Pretty naive. Apple even has a decades-old term for such people: "Evangelists."

 

Video compression benchmarks are cherry-picked because they are the ones most skewed to make Apple hardware look great.

 

If you like gobbling up corporate messaging and corporate media...and view it as a road to bliss...you've been well conditioned and well trained. 

 

Link to comment

One thing Macs are used heavily for is video editing. Any review that does not cover this topic in a major way would miss its target audience. Apple knows its customers and thus implements hardware acceleration for the most computing-intensive part of the job: compression.

If, for whatever reason, Apple fails to deliver on this, it will be quickly noticed and made public by disgruntled customers.

I'll just wait for Dimi (https://www.youtube.com/c/dimitriskatsafouros) to get his hands on one and give his verdict.

 

On a related note, the fact that the M1 systems have unified memory is already a huge boost for video work. Where classic systems need to transfer loads of data between RAM and VRAM to make use of the GPU computing cores, the M1 simply doesn't have to.
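
 

You can see this directly from Metal. A minimal Swift sketch, illustrative only: on Apple Silicon, MTLDevice reports hasUnifiedMemory, and a .storageModeShared buffer is a single allocation that both CPU and GPU touch, so there is no staging copy to schedule at all.

```swift
import Metal

let device = MTLCreateSystemDefaultDevice()!
print("Unified memory:", device.hasUnifiedMemory)           // true on M1 machines

// One allocation, visible to both CPU and GPU; no RAM-to-VRAM upload step.
var frame = [Float](repeating: 0, count: 1920 * 1080 * 4)   // dummy RGBA float frame
let shared = device.makeBuffer(bytes: &frame,
                               length: frame.count * MemoryLayout<Float>.stride,
                               options: .storageModeShared)!

// CPU-side writes land directly in memory the GPU will read.
shared.contents().assumingMemoryBound(to: Float.self)[0] = 1.0

// On a discrete-GPU Mac the same data would live in a .storageModeManaged
// buffer, with didModifyRange(_:) calls (or blit passes) to keep the RAM
// and VRAM copies in sync.
```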

Link to comment