PC for C4D high poly scenes, simulations/dynamics and GPU rendering

EJ_

Tech Intern
Joined
May 22, 2021
Messages
2
Reaction score
0
Points
1
I’ve been debating what to put into a new PC build for C4D, one I’d like to keep for a very long time and upgrade in the future as needed. I also use After Effects, and I’d like to get into Unreal Engine at some point.

I was a Mac guy my entire career until about 5 years ago, when I got my current PC, which was basically an off-the-shelf machine, so I didn’t have to think much about what went into it.

The kind of work I’ve been doing lately is arch-viz-style 3D explainer videos that have me hunting down random models on TurboSquid, etc., and I don’t have time to optimize them. For example, if I find a nicely detailed car model, that one object alone could be over 1 million polys, which really makes the viewport chug. I also do a fair amount of simulations with X-Particles and dynamics like soft bodies, and I have trouble iterating and art-directing the effects because I can’t see them in real time. The renderer I use the most is Octane, but I’ve also been messing with Redshift a little.

I’d like to stay under $10k if possible. My main sticking point right now is whether I go with a Ryzen 9 5950X or a 32-core Threadripper (and whether it should be the TR Pro version).

My build looks like this right now:

Corsair Obsidian Series 750D Airflow Edition
1600W PSU
Motherboard TBD

Ryzen 9 5950X or 32-core Threadripper (Pro?)
128GB RAM
2x RTX 3090 (I might start with one and make sure the build has capacity to add a second later; could that be a sticking point for the 5950X?)
2x 2TB M.2 drives
1x 4TB SSD

A lot of what I’ve read says C4D mostly needs a single high-clock-speed processor. Is that true for everything that isn’t CPU rendering? Someone told me that high-poly scenes and simulations benefit from multiple cores, but I haven’t had luck verifying that.

If I do end up going the 5950X route: I believe I saw an article (which I still need to track down) saying AMD may be releasing an updated version of it soon. Has anyone heard that?

Thanks for reading and any advice offered!
 
Jerry James

Hardware Nerd @ CGDirector
Staff member
Joined
Jun 19, 2020
Messages
767
Reaction score
141
Points
43
If you're doing any sort of active workstation work in the viewport, I'd recommend the Ryzen 9 5950X over Threadripper; its higher clock speeds make for a much smoother experience. And yes, simulations (especially physics-based ones like fluids) do scale quite well with core counts.

Here are a few tests we did with X-Particles - https://www.cgdirector.com/x-particles-benchmark-results/
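To see intuitively why simulations scale with cores: in most particle solvers, each particle's update within a timestep depends only on its own state, so the work splits cleanly across cores. Here's a toy Python sketch of that idea (all names here are made up for illustration; real solvers like X-Particles handle threading internally, in native code):

```python
import os
from concurrent.futures import ProcessPoolExecutor

DT = 0.01   # timestep, seconds
G = -9.81   # gravity, m/s^2

def step_chunk(chunk):
    """Advance a list of (position, velocity) pairs by one Euler step.
    Each particle is independent, so chunks can run on separate cores."""
    return [(p + v * DT, v + G * DT) for p, v in chunk]

def step_parallel(particles, workers=None):
    """Split the particle list across worker processes, reassemble in order."""
    workers = workers or os.cpu_count() or 1
    size = max(1, len(particles) // workers)
    chunks = [particles[i:i + size] for i in range(0, len(particles), size)]
    out = []
    with ProcessPoolExecutor(max_workers=workers) as pool:
        # Executor.map yields results in submission order, so particle
        # ordering is preserved after reassembly.
        for result in pool.map(step_chunk, chunks):
            out.extend(result)
    return out

if __name__ == "__main__":
    particles = [(0.0, float(i)) for i in range(10_000)]
    # Same result serial or parallel; only the wall-clock time differs.
    assert step_parallel(particles, workers=4) == step_chunk(particles)
```

In a real solver the per-step math is far heavier than this, which is why X-Particles-style simulations keep gaining from extra cores, while the largely single-threaded scene-graph/viewport work does not.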

As for the new stepping of AMD processors, I wouldn't wait. It's most likely a minor refinement that won't bring any noticeable performance improvements; it's more likely something that helps them bring down operational costs or increase efficiency.
 
EJ_

I’m thinking the 5950X is the way to go. Do you know what the best motherboard would be for dual 3090 GPUs and 2x M.2 drives, and roughly how much of a performance hit I’ll take vs. a single 3090?

Also, are there any downsides to not having 2 of the same GPU, if I went with one 3090 and one 3080?

thanks!
 
Jerry James

I’m thinking the 5950X is the way to go. Do you know what the best motherboard would be for dual 3090 GPUs and 2x M.2 drives, and roughly how much of a performance hit I’ll take vs. a single 3090?
For dual GPUs, I'd recommend the ASUS Pro WS X570-ACE motherboard. For professional workstations with multiple GPUs, that's my go-to recommendation.

Also, are there any downsides to not having 2 of the same GPU, if I went with one 3090 and one 3080?
Nope, no downsides :) GPU render engines like Octane and Redshift can use mismatched cards side by side without issue; each card simply contributes as much rendering work as it can.
 