Best PC Upgrade for Daz?

I don't know a lot about computer hardware. When I had my current desktop built, I asked for something powerful, and let a local shop put one together for me. Here are my current specs:

Nvidia GeForce GTX 1060 6GB
Processor: Intel Core i7-8700K CPU @ 3.70GHz
Installed memory: 16 GB

Daz 4.12 can get a little slow sometimes. I make Iray images as well as animations (usually 60 frames give or take). I also use Photoshop. I've been wanting to improve my PC or take a little strain off it. Based on these specs, where should I start? Or is there something I'm leaving out all together that would be helpful?

Thank you.


Comments

  • fastbike1 Posts: 4,077

    Not enough CPU RAM, not enough GPU VRAM. "Something powerful" isn't specific enough for a reputable shop to go on. If they didn't ask what you wanted to do with it, I wouldn't go back to them. If you gave them a specific price limit, then it's on you.

    32GB of RAM and 8GB of VRAM should be the minimum. An RTX 2070 Super or 2080 Super would be a significant improvement. Keep your 1060 to run the monitor unless you can recoup significant cost by selling it.

  • DefaultName Posts: 388
    fastbike1 said:

    Not enough CPU RAM, not enough GPU VRAM. "Something powerful" isn't specific enough for a reputable shop to go on. If they didn't ask what you wanted to do with it, I wouldn't go back to them. If you gave them a specific price limit, then it's on you.

    32GB of RAM and 8GB of VRAM should be the minimum. An RTX 2070 Super or 2080 Super would be a significant improvement. Keep your 1060 to run the monitor unless you can recoup significant cost by selling it.

    I don't have many options where I live. I've been pretty happy so far; it's lasted two years, made hundreds of renders and animations, and is still performing relatively well.

    That being said, what do you think I should upgrade first? Or more specifically, what should I add to my existing setup? Can I add another 16GB of RAM and have 16+16 for 32? Or does it not work that way?

    Thank you.

  • kenshaw011267 Posts: 3,805

    Yeah, I don't build systems to vague specs like "powerful."

    You need to find someone who will ask what you want to do with the system and then build to those needs.

    In your specific circumstance, the CPU and system RAM are fine (16GB is more than enough).

    Your major issue is the 1060. As suggested, the 2070 Super is a beast of an 8GB card and will let you get more scenes to render on the GPU, and the ones that do will render faster.

  • DefaultName Posts: 388

    kenshaw011267 said:

    Yeah, I don't build systems to vague specs like "powerful."

    You need to find someone who will ask what you want to do with the system and then build to those needs.

    In your specific circumstance, the CPU and system RAM are fine (16GB is more than enough).

    Your major issue is the 1060. As suggested, the 2070 Super is a beast of an 8GB card and will let you get more scenes to render on the GPU, and the ones that do will render faster.

    Okay, thanks. Now, I've noticed the RTX 2060 6GB is about half the price of the 2070. Can I simply add a 2060 to my existing 1060 for an improvement? Or would it not really matter?

  • Lord Theros Posts: 64

    The 2060 would only give you some improvement in Iray render speed; since it has the same amount of memory, you wouldn't be able to do bigger scenes. IMHO, upgrading from a 1060 6GB to a 2060 isn't worth it.

    I had a 1060 and upgraded to a 2070 8GB; it let me do slightly bigger scenes with much faster render times. (Remember to check that your PSU has enough juice to power the new card.)

    As for the rest of your rig, if it's working well now I would keep it; maybe upgrade to 32 or 64GB of RAM, or just go for a full system upgrade (mobo, CPU, memory, and PSU).

    But if you're rendering in 3Delight, that's a completely different setup.
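    To put numbers on the PSU check: here is a rough headroom estimate. The component names and wattages below are illustrative assumptions, not measured values, so check your actual parts' spec sheets and your PSU label.

```python
# Rough PSU headroom estimate. The wattages are illustrative ballpark
# figures (TDP / board power), not measurements from a real system.
components = {
    "i7-8700K (TDP)": 95,           # real-world peaks can run higher
    "RTX 2070 (board power)": 175,
    "board, RAM, drives, fans": 75,
}
total_draw = sum(components.values())

# Keep sustained load under ~70% of the PSU's rating for headroom.
recommended_psu = total_draw / 0.7
print(f"Estimated draw: {total_draw} W")
print(f"Suggested PSU rating: {recommended_psu:.0f} W or more")
```

    By that math, a quality unit in the 550-650 W range leaves comfortable margin for a single 2070-class card; a second card would push the total up accordingly.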

  • kenshaw011267 Posts: 3,805

    Yeah, I don't build systems to vague specs like "powerful."

    You need to find someone who will ask what you want to do with the system and then build to those needs.

    In your specific circumstance, the CPU and system RAM are fine (16GB is more than enough).

    Your major issue is the 1060. As suggested, the 2070 Super is a beast of an 8GB card and will let you get more scenes to render on the GPU, and the ones that do will render faster.

    Okay, thanks. Now, I've noticed the RTX 2060 6GB is about half the price of the 2070. Can I simply add a 2060 to my existing 1060 for an improvement? Or would it not really matter?

    If you buy, I'd at least get the 2060 Super, which has 8GB; you'll get a lot from those 2 extra GB. But yes, if your PSU can handle a second card, adding a 2060 (get the EVGA 2060 KO, as it has 2080-level CUDA performance) will make your renders faster, at least for the scenes that fit on the cards.
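    One caveat when mixing cards: Iray loads the entire scene into each participating GPU's own memory, so VRAM does not pool across cards (short of NVLink pooling on supported cards), and a card whose VRAM is too small simply drops out of that render. A small sketch of that rule; the card list and scene sizes are made-up examples:

```python
# Iray copies the whole scene to every participating GPU, so each card
# needs enough VRAM on its own; a card that can't hold the scene falls
# back out of the render rather than contributing partial memory.
gpus = {"RTX 2060 Super": 8, "GTX 1060": 6}  # VRAM in GB (examples)

def cards_that_can_render(gpus, scene_gb):
    """Return the cards whose VRAM can hold the whole scene."""
    return [name for name, vram in gpus.items() if vram >= scene_gb]

print(cards_that_can_render(gpus, 5))  # both cards join: extra speed
print(cards_that_can_render(gpus, 7))  # only the 8 GB card renders this
```

    So a second card speeds up scenes that fit on it, but it never lets you build a scene bigger than the largest single card can hold.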

  • DefaultName Posts: 388

    The 2060 would only give you some improvement in Iray render speed; since it has the same amount of memory, you wouldn't be able to do bigger scenes. IMHO, upgrading from a 1060 6GB to a 2060 isn't worth it.

    I had a 1060 and upgraded to a 2070 8GB; it let me do slightly bigger scenes with much faster render times. (Remember to check that your PSU has enough juice to power the new card.)

    As for the rest of your rig, if it's working well now I would keep it; maybe upgrade to 32 or 64GB of RAM, or just go for a full system upgrade (mobo, CPU, memory, and PSU).

    But if you're rendering in 3Delight, that's a completely different setup.

    Thank you! I had a couple of questions:

    1) Is there any RAM you'd recommend? It seems to be debatable, but I think I should try to upgrade my RAM too.

    2) I typically use Iray, but I've also been using LineRender9000 with 3DL. Is there a significant difference for that setup?

    If you buy, I'd at least get the 2060 Super, which has 8GB; you'll get a lot from those 2 extra GB. But yes, if your PSU can handle a second card, adding a 2060 (get the EVGA 2060 KO, as it has 2080-level CUDA performance) will make your renders faster, at least for the scenes that fit on the cards.

    Great, thanks! I'm a little indecisive and really watching my budget, but perhaps I can spring for the 2080 or 2070. I'll need to call around, as these things cost more in Australia than they do in the US.

  • LenioTG Posts: 2,118
    edited May 2020

    I have 32GB of RAM, and sometimes I wish I had more, but it's fine.

    I've recently upgraded to a 2TB SSD to install all of my products there, and I'd do that 1000 times again, but the GPU is more important.

    The RTX 2060 Super is a great value. But keep in mind that in a few months we could see Nvidia Ampere.

    Your CPU is fine!

    Post edited by LenioTG on
  • espindav Posts: 20

    It has been many years since I put a computer together. My last system I purchased out of the box from B&H Photo Video; it was a gaming computer. That was a little over 10 years ago. I am also looking to get a new computer, and I came across this site that helps explain what is needed for rendering: https://www.logicalincrements.com/articles/building-pc-3d-rendering-animation

    This site has a breakdown of the best type of build for your budget.

  • kenshaw011267 Posts: 3,805

    Those builds do not take DS into account; what works for Blender or Maya may not work for DS. For instance, the budget build includes a Radeon GPU and would be terrible for Iray in DS.

  • Hello,

    Sorry for my poor English.

    Tell me, if I need to decide between a Xeon workstation like:

    HP ProLiant DL160 Gen8 1U, 2 x Intel Xeon 3.10GHz, 16 cores / 32 threads, 96GB RAM

    2 x Intel Xeon E5-2665 processors, 2.40GHz (3.10GHz turbo), 16 cores / 32 threads
    4 x 24GB RAM = 96GB RAM [384GB maximum upgrade]
    2 x 1TB SAS Western Digital Enterprise Storage

    and

    Lenovo Legion C530-19ICB

    Lenovo Legion C530-19ICB PC with an Intel® Core™ i5-9400F processor up to 4.10 GHz, Coffee Lake, 16GB DDR4, 512GB SSD M.2 2280 PCIe, NVIDIA GeForce GTX 1650 4GB GDDR5

    which is better?

    If you recommend the Xeon server workstation, can you tell me how to install Windows 10 on it?

  • Hello,

    Sorry for my poor English.

    Tell me, if I need to decide between a Xeon workstation like:

    HP ProLiant DL160 Gen8 1U, 2 x Intel Xeon 3.10GHz, 16 cores / 32 threads, 96GB RAM

    2 x Intel Xeon E5-2665 processors, 2.40GHz (3.10GHz turbo), 16 cores / 32 threads
    4 x 24GB RAM = 96GB RAM [384GB maximum upgrade]
    2 x 1TB SAS Western Digital Enterprise Storage

    and

    Lenovo Legion C530-19ICB

    Lenovo Legion C530-19ICB PC with an Intel® Core™ i5-9400F processor up to 4.10 GHz, Coffee Lake, 16GB DDR4, 512GB SSD M.2 2280 PCIe, NVIDIA GeForce GTX 1650 4GB GDDR5

    which is better?

    If you recommend the Xeon server workstation, can you tell me how to install Windows 10 on it?

    What GPU does the first one have? If you are rendering in Iray, then you'd do better to put the money into an Nvidia GPU (or GPUs) with plenty of memory, assuming your scenes actually need a lot of memory in the first place.

  • RexRed Posts: 1,296
    edited November 2020

    If you want something super powerful you need to start from the bottom up.

    First, the case needs to be a full tower, not a mid tower, and the motherboard needs to be EATX, not ATX. (Windows Pro)

    https://smile.amazon.com/gp/product/B078TR5CMG/ref=ppx_yo_dt_b_asin_title_o05_s00 (I just bought this)

    You need an i9 on an X299 board (or comparable) with 44 PCIe lanes to accommodate two RTX 3090s (with NVLink), and at least 64GB of system RAM.

    1600 watt power supply

    That is "really" powerful.

    (Don't get rid of your old computer, it can be used as a streaming computer...)

    Post edited by RexRed on
  • JamesJAB Posts: 1,760
    RexRed said:

    If you want something super powerful you need to start from the bottom up.

    First, the case needs to be a full tower, not a mid tower, and the motherboard needs to be EATX, not ATX. (Windows Pro)

    https://smile.amazon.com/gp/product/B078TR5CMG/ref=ppx_yo_dt_b_asin_title_o05_s00 (I just bought this)

    You need an i9 on an X299 board (or comparable) with 44 PCIe lanes to accommodate two RTX 3090s (with NVLink), and at least 64GB of system RAM.

    1600 watt power supply

    That is "really" powerful.

    (Don't get rid of your old computer, it can be used as a streaming computer...)

    First off, with RTX 3090 GPUs in a system, there is no reason to go CPU-heavy on your build.
    You should be looking at something more like a Ryzen 5 3600 and a PCIe 4.0 motherboard as your base. With 2 NVLinked 24GB cards, I would go with at least 96GB of RAM.

    Now, before you go on about the number of PCIe lanes, keep in mind that each PCIe 4.0 lane is double the speed of PCIe 3.0. This means that two PCIe 4.0 RTX 3090 cards @ 8x have the same bandwidth as the same cards running @ 16x on an X399 board. (Also, PCIe bandwidth only affects the initial render scene load time, not the actual render speed or NVLink speed.)
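    The lane math works out like this (the per-lane throughput figures are the usual approximate post-encoding numbers):

```python
# Approximate usable throughput per lane, after encoding overhead:
# PCIe 3.0 ~ 0.985 GB/s per lane, PCIe 4.0 ~ 1.969 GB/s per lane.
GBPS_PER_LANE = {"3.0": 0.985, "4.0": 1.969}

def link_bandwidth(gen, lanes):
    """Total one-direction link bandwidth in GB/s."""
    return GBPS_PER_LANE[gen] * lanes

print(f"PCIe 4.0 x8:  {link_bandwidth('4.0', 8):.2f} GB/s")
print(f"PCIe 3.0 x16: {link_bandwidth('3.0', 16):.2f} GB/s")
# ~15.75 GB/s either way: running a card at 8x on a Gen4 board costs
# nothing relative to a full 16x slot on a Gen3 board.
```
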

  • RexRed Posts: 1,296
    edited November 2020
    JamesJAB said:
    RexRed said:

    If you want something super powerful you need to start from the bottom up.

    First, the case needs to be a full tower, not a mid tower, and the motherboard needs to be EATX, not ATX. (Windows Pro)

    https://smile.amazon.com/gp/product/B078TR5CMG/ref=ppx_yo_dt_b_asin_title_o05_s00 (I just bought this)

    You need an i9 on an X299 board (or comparable) with 44 PCIe lanes to accommodate two RTX 3090s (with NVLink), and at least 64GB of system RAM.

    1600 watt power supply

    That is "really" powerful.

    (Don't get rid of your old computer, it can be used as a streaming computer...)

    First off, with RTX 3090 GPUs in a system, there is no reason to go CPU-heavy on your build.
    You should be looking at something more like a Ryzen 5 3600 and a PCIe 4.0 motherboard as your base. With 2 NVLinked 24GB cards, I would go with at least 96GB of RAM.

    Now, before you go on about the number of PCIe lanes, keep in mind that each PCIe 4.0 lane is double the speed of PCIe 3.0. This means that two PCIe 4.0 RTX 3090 cards @ 8x have the same bandwidth as the same cards running @ 16x on an X399 board. (Also, PCIe bandwidth only affects the initial render scene load time, not the actual render speed or NVLink speed.)

    Please list an ATX or EATX PCIe 4.0 motherboard that has a slot layout for two RTX 3090s with NVLink 4-slot SLI spacing.

    I could not find one... 

    The one I listed was $150, does the same thing, and you get a fast processor to boot.

    While I am rendering I can still use my PC for other things (like making music) because it has a good processor too.

    I also make music with my PC, and I work with Premiere and After Effects; my music software does not use my graphics processors much, it relies heavily on my CPU.

    When you are talking about 100 tracks with a myriad of real-time effects while mixing and mastering simultaneously, I am glad I did not skimp on the processor.

    This is from Google and Adobe search today.

    Which processor is best for After Effects?

    What CPU is best for After Effects? Currently, the CPUs we most often recommend for After Effects is the Intel Core i9 10900K 10 Core, followed closely by the 3rd generation AMD Ryzen 9 3900X or Ryzen 7 3800X processors.

    Daz renders often need a home like a video or a slideshow and not all video editors and effects rely solely on the GPU.

    This is also from Google

    Currently, the disadvantages of AMD CPUs are lower per-core performance compared to competing Intel chips, and better game support for Intel CPUs.

    I don't know about anyone else, but I am also a gamer, and I don't mind paying more for better performance.

    I also own an AMD PC with a Ryzen 7 CPU and it is in a PCIE 4.0 motherboard but it is for streaming not for my workstation.

    What you said about the graphics cards not needing 16 lanes on PCIe 4.0 boards is very interesting, and once 4.0 boards become more mainstream it is definitely something to consider.

    I personally recommend getting a good processor also.

    About 15 years ago I took one of Intel's promotional 2D art pieces and remade it in a 3D program (Bryce) and sent it to them. They sent me back a free motherboard and CPU... They said I, "brought their artwork to life".

    I have been a fan ever since.

    Post edited by RexRed on
  • https://www.newegg.com/p/pl?d=x570

    As to getting CPU recommendations from random Google searches? Just don't. 

    For the uninformed, the best processors, price to performance, are Ryzen CPUs. Unless AMD has totally blown it, the Ryzen 5000s that release Thursday (and will fit in the X570 motherboards with the 3- and 4-slot spacing needed for 3090s if you want NVLink) will not just be even better multi-threaded than the current Ryzen chips; they will be significantly faster single-threaded than any Intel chip on the market.

    We'll know for sure Thursday, but if the current reported performance numbers bear out, Intel is dead on the desktop until they come out with something new or drastically slash prices.

  • RexRed Posts: 1,296
    edited November 2020

    https://www.newegg.com/p/pl?d=x570

    As to getting CPU recommendations from random Google searches? Just don't. 

    For the uninformed, the best processors, price to performance, are Ryzen CPUs. Unless AMD has totally blown it, the Ryzen 5000s that release Thursday (and will fit in the X570 motherboards with the 3- and 4-slot spacing needed for 3090s if you want NVLink) will not just be even better multi-threaded than the current Ryzen chips; they will be significantly faster single-threaded than any Intel chip on the market.

    We'll know for sure Thursday, but if the current reported performance numbers bear out, Intel is dead on the desktop until they come out with something new or drastically slash prices.

    Kenshaw, PC Gamer has reviewed the processors, and according to their results Intel has the fastest CPU "for gaming" on the market in 2020.

    This is also because the Intel chipsets are optimized for Nvidia.

    https://www.pcgamer.com/best-cpu-for-gaming/

    I am assuming this thread is about the best upgrade not the most affordable.

    Given that AMD chipsets and processors are optimized for AMD graphics cards, and that Intel chipsets are optimized for Nvidia and Iray:

    It seems Intel also has the edge when it comes to rendering in Daz.

    I am not sure of the logic of buying a motherboard that has a chipset that is not optimized for the graphics card you plan to run in it. 

    The next Gen AMD processor will even further entrench them in proprietary graphics and their driver support is terrible.

    Oh and a 64 Core Threadripper is $4000.00, not what I would call competitively priced.

    Maybe PC Gamer is, "uninformed". :)

    Post edited by RexRed on
  • LOL.

    Ryzen 5000 releases Thursday. 

    From the article you clearly did not read

    "Before we go any further, it's worth noting that AMD is about to release a slew of new CPUs using its Zen 3 architecture. It has already announced a 19% IPC improvement, which should see a serious uptick in gaming performance. It could finally mean that Intel loses it's gaming crown to AMD, which will obviously shake up this list quite a bit. The new Ryzen 5950X, Ryzen 5900X, Ryzen 5800X, and Ryzen 5700X will launch on November 5. "

  • RexRed Posts: 1,296
    edited November 2020

    LOL.

    Ryzen 5000 releases Thursday. 

    From the article you clearly did not read

    "Before we go any further, it's worth noting that AMD is about to release a slew of new CPUs using its Zen 3 architecture. It has already announced a 19% IPC improvement, which should see a serious uptick in gaming performance. It could finally mean that Intel loses it's gaming crown to AMD, which will obviously shake up this list quite a bit. The new Ryzen 5950X, Ryzen 5900X, Ryzen 5800X, and Ryzen 5700X will launch on November 5. "

    Yes, AMD is about to release more proprietary chips and chipsets with bad drivers that are not optimized to work with Nvidia graphics cards.

    Are you going to rush out and buy an AMD graphics card?

    I didn't think so.

    So Intel does have "the crown", till Nov. 6th, maybe, and even if AMD takes the crown it will still not be optimum for Daz and IRAY.

    :)

    Post edited by RexRed on
  • RexRed said:

    LOL.

    Ryzen 5000 releases Thursday. 

    From the article you clearly did not read

    "Before we go any further, it's worth noting that AMD is about to release a slew of new CPUs using its Zen 3 architecture. It has already announced a 19% IPC improvement, which should see a serious uptick in gaming performance. It could finally mean that Intel loses it's gaming crown to AMD, which will obviously shake up this list quite a bit. The new Ryzen 5950X, Ryzen 5900X, Ryzen 5800X, and Ryzen 5700X will launch on November 5. "

    Yes, AMD is about to release more proprietary chips and chipsets with bad drivers that are not optimized to work with Nvidia graphics cards.

    Are you going to rush out and buy an AMD graphics card?

    I didn't think so.

    So Intel does have "the crown", till Nov. 6th, maybe, and even if AMD takes the crown it will still not be optimum for Daz and IRAY.

    :)

    There is no performance penalty for using Nvidia GPUs with AMD CPUs. Stop spreading disinformation.

  • RexRed Posts: 1,296
    RexRed said:

    LOL.

    Ryzen 5000 releases Thursday. 

    From the article you clearly did not read

    "Before we go any further, it's worth noting that AMD is about to release a slew of new CPUs using its Zen 3 architecture. It has already announced a 19% IPC improvement, which should see a serious uptick in gaming performance. It could finally mean that Intel loses it's gaming crown to AMD, which will obviously shake up this list quite a bit. The new Ryzen 5950X, Ryzen 5900X, Ryzen 5800X, and Ryzen 5700X will launch on November 5. "

    Yes, AMD is about to release more proprietary chips and chipsets with bad drivers that are not optimized to work with Nvidia graphics cards.

    Are you going to rush out and buy an AMD graphics card?

    I didn't think so.

    So Intel does have "the crown", till Nov. 6th, maybe, and even if AMD takes the crown it will still not be optimum for Daz and IRAY.

    :)

    There is no performance penalty for using Nvidia GPUs with AMD CPUs. Stop spreading disinformation.

     

    Then why is AMD marketing their graphics cards as working better with their CPUs?

    Please tell AMD to stop spreading disinformation... lol

  • JamesJAB Posts: 1,760

    As a long-time AMD CPU and Nvidia GPU user, I agree with kenshaw... there is no performance penalty using Nvidia on AMD platforms.

    Especially now, with the RTX 30x0 cards being PCIe 4.0... (The advertisement you are looking at claiming AMD GPUs work better was probably from before Nvidia released any PCIe 4.0 cards. Back then, the RX 5000 cards were the only PCIe 4.0 cards on the market.)

  • RexRed Posts: 1,296
    edited November 2020

    "So there was 4-13% jump in performance with the 6800xt when paired to a 5000 series CPU thanks to Smart Access Memory..."

    This jump will not be possible with Nvidia cards in an AMD system.

    Likewise, since AMD is cutting out Nvidia from the "standards" you will probably see the same synergy within an Intel and Nvidia rig...

    This is only one example of processor optimization for proprietary GPUs...

    "Taking advantage of the fact that they are the one of the big two in both the CPU and GPU market they have sought to maximize efficiencies for systems that combine both newly released AMD components on the same rig."

    Is AMD spreading disinformation? You guys can agree all you want and still be wrong...

    Windows also loads games into Nvidia's GPU RAM; is this compatible with a Radeon card?

    I recall hearing it is not compatible with AMD cards.

    These are two real examples, not "disinformation," of proprietary processor and graphics card functionality.

    Now, why would you want an Nvidia card in an AMD rig if it cannot "fully" function as it was designed?

    Be my guest if you wish... lol

    Post edited by RexRed on
  • That AMD is saying their GPUs work better with their CPUs and chipsets is not the same as saying that Nvidia GPUs have a performance penalty compared to when they are used with Intel CPUs.

    There is no implication that an Nvidia GPU will not function at 100%. You are spreading disinformation.

  • RexRed Posts: 1,296
    edited November 2020

    Incorrect, Kenshaw... these AMD GPU boost features are "processor dependent." It's exactly the opposite, and you are misleading people...

    You are painting an all rosy picture of how wonderful Nvidia cards work on an AMD system. 

    The more dear sweet AMD alienates Nvidia with their proprietary GPU/CPU technology, the more Nvidia will pair with Intel and pull the same proprietary stunts, driving people to Intel and widening the gap of incompatibility.

    Does Nvidia raytracing work on a Radeon card? No.

    That is just one example of incompatibility with hardware and software.

    Does GeForce Experience work on a Radeon card? No.

    This rift is bound to grow over time.

    Does Iray work on a Radeon card? No.

    You can bet that the 30-series cards were designed to work best on an Intel processor.

    Why would Nvidia want to help AMD sell more CPUs?

    Your assumption that Nvidia would want people crossing the aisle and installing their cards in an AMD PC is preposterous and laughable.

    The same goes for AMD and their Radeons running with an Intel CPU.

    Post edited by RexRed on
  • WTF?

    DXR is a DirectX feature. Both AMD and Nvidia have added hardware acceleration for that feature to their cards, so if a game has real-time ray tracing it will work with either brand's cards.

    GeForce Experience is Nvidia bloatware. Why would anyone care that it doesn't work with AMD, given that it just downloads drivers and takes up system resources running all the time? No one should have it installed anyway.

    No, I can and will bet that Ampere cards were not built to run better on Intel; you seem unaware that there are benchmarks proving that isn't even remotely true.

    Nvidia literally doesn't care what systems their GPUs are installed in. They have had issues with Intel far more than with AMD; when Intel stopped letting Nvidia build chipsets for Intel CPUs, that cost Nvidia literally billions of dollars. Also, there are these things called antitrust laws: if they were to try to favor Intel over AMD, they'd get sued by the US and EU governments, and they'd lose.

    And the same in reverse for Radeon and Intel. AMD can implement tech that adds features when paired with their own products, but they cannot degrade or disfavor the competition; that is a sure way to get the regulators involved.

    So again you really have no idea what you are talking about. You are spreading disinformation. 

  • JamesJAB Posts: 1,760

    Honestly, over the last decade there has been a large performance advantage to using Nvidia GPUs, so the last thing AMD would want is gamers running away from Ryzen CPUs because of proprietary features that reduce Nvidia performance.

    Adding extra features that boost performance for Radeon GPUs paired with Ryzen processors does not in any way reduce GeForce card performance. The only thing it achieves is a performance penalty when using Radeon GPUs with Intel processors, because the Intel CPU can't access the extra feature.

    Although, flipping to the other side of the page...
    There are 3 gaming ray-tracing standards that you need to be aware of:
    Nvidia RTX (only works with RTX GPUs, and only 15 games support it as of Oct 2020)
    Microsoft DirectX Raytracing (works with both Nvidia and AMD ray-tracing cores)
    Vulkan API ray tracing (works with any device and OS that supports Vulkan, for example the PlayStation 5)

    Proprietary feature sets designed to boost performance are nothing new on all sides of the fence for CPUs and GPUs... (RTX, DLSS, SLI, CrossFire, MMX, 3DNow!, HyperTransport, QuickPath Interconnect, Hyper-Threading, AMD64, and the list goes on.)

  • RexRed Posts: 1,296

    Also, there are these things called antitrust laws: if they were to try to favor Intel over AMD, they'd get sued by the US and EU governments, and they'd lose.

    James wrote:

    Proprietary feature sets designed to boost performance are nothing new on all sides of the fence for CPUs and GPUs... (RTX, DLSS, SLI, CrossFire, MMX, 3DNow!, HyperTransport, QuickPath Interconnect, Hyper-Threading, AMD64, and the list goes on.)

    Comment:

    Where are the antitrust people when it comes to these proprietary boost features?

    Ken, you are misleading people, trying to mix apples with oranges and calling them the same.

    Your supposition that Nvidia wants people rushing to Ryzen is fantastical.

    Microsoft may be crossing the fence by using AMD in their new Xbox, but I don't plan on buying one, and neither does any serious PC gamer.

    AMD is competitive and their chips have become a real contender, but I still value Intel above them when it comes to my main workstation, Iray, and drivers.

    AMD is simply stealing from Intel's playbook. This only makes me more certain of who the real industry innovator and leader is.

    And, a performance penalty, isn't that the same as a proprietary boost for the competition? Apples and oranges.

  • Thank you for all the answers.

    If I had a lot of money I would buy the recommended parts, but... for the moment I have an opportunity to buy a second-hand Xeon machine:

    Model: Lenovo ThinkStation P700

    Processor: 2 x Intel Xeon E5-2650 v3 deca-core
    RAM: 64GB DDR4 ECC FB (8GB x 8 DIMMs)
    Storage: 256GB SSD
    GPU: Radeon RX 570 series with 4GB

    for 960 USD.

    I am thinking of swapping the Radeon for an Nvidia RTX 2080 with 8GB...

    And buying more memory...

    The P700 has 12 DIMM slots, so I can add another 4 x 8GB sticks.

    In the end, is that enough power for learning Daz 3D and making 3D animation?

    And for using Blender, Cinema 4D, and other software?

    My budget for the moment is 1000 USD, and in a few months I will add and change components.

    Sorry for my poor English...

    All the recommendations offered in this forum are very good, but beyond my means...

    So, can you tell me if this configuration is OK for starting out?

    I think I can add more Nvidia cards over time...

    The motherboard supports up to 3 video cards...

    BTW, for rendering, can I use different cards, or do they all have to be the same model?

    For example, can I put in an RTX 2080 and a GTX 1650 and a GTX 1070i?

    Or do all the cards need to be the same model?

  • RexRed said:

    Also, there are these things called antitrust laws: if they were to try to favor Intel over AMD, they'd get sued by the US and EU governments, and they'd lose.

    James wrote:

    Proprietary feature sets designed to boost performance are nothing new on all sides of the fence for CPUs and GPUs... (RTX, DLSS, SLI, CrossFire, MMX, 3DNow!, HyperTransport, QuickPath Interconnect, Hyper-Threading, AMD64, and the list goes on.)

    Comment:

    Where are the antitrust people when it comes to these proprietary boost features?

    Ken, you are misleading people, trying to mix apples with oranges and calling them the same.

    Your supposition that Nvidia wants people rushing to Ryzen is fantastical.

    Microsoft may be crossing the fence by using AMD in their new Xbox, but I don't plan on buying one, and neither does any serious PC gamer.

    AMD is competitive and their chips have become a real contender, but I still value Intel above them when it comes to my main workstation, Iray, and drivers.

    AMD is simply stealing from Intel's playbook. This only makes me more certain of who the real industry innovator and leader is.

    And, a performance penalty, isn't that the same as a proprietary boost for the competition? Apples and oranges.

    Microsoft isn't "crossing the fence" by putting AMD in the Xbox; the last-gen Xbox had an AMD APU as well.

    AMD has always been the industry innovator. You'd be hard-pressed to name a single major CPU advance of the last 25 years where Intel led the way. AMD was first to 64-bit (the x86 64-bit instruction set is called AMD64), first to a true dual-core CPU, etc.

    You show a clear and consistent lack of knowledge on the subject. I have no idea why you continue to even discuss the subject.

This discussion has been closed.