Single lane graphics cards?
I just thought I would mention this...
What a great day it will be when the graphics card companies can get all of the needed electronics and fans onto a card that needs only a single PCIe lane. Something as powerful as a 1080 Ti or better.
Imagine having the ability to put 4 or 5 (even 6) graphics cards in a PC instead of just 3 or 4.
Perhaps if they could break apart the graphics card and have part of it tethered and in a remote place (maybe a hard drive bay) other than right above the slot.
This change would certainly increase productivity, more than likely doubling it.
Why do they use those shortened PCIe slots? It can't cost that much more to just use full-sized slots.
I have been reading that 8x lanes can supply most graphics cards with enough bandwidth to reach their full potential...
Just thinking forward...
Comments
Rosewill Server Chassis, Server Case, Rackmount Case for Bitcoin Mining; 4U Metal Rack Mount Bitcoin Miner for 6 GPU; Solution for Building a Mining Rig (RSV-L4000B)
Search for that on Newegg.
Can something like this be used as a render farm?
https://www.newegg.com/black-rosewill-rsv-l4000b/p/N82E16811147270
https://cgcookie.com/tutorial/setting-up-a-renderfarm
This looks interesting!
Most of what you're talking about already exists, has been tried (and failed miserably), or just can't be done due to various limitations of current hardware, manufacturing standards, or the laws of physics.
Deep dive time.
There are consumer-class motherboards that currently support 8-16 GPUs, either directly or via a daughter board.
There's also the Asus B250 Mining Expert motherboard that has 19 total PCI-e slots, 18 x1 and 1 x16. https://www.asus.com/us/Motherboards/B250-MINING-EXPERT/
The problem is that these motherboards usually only support quad-core processors and 32GB of RAM, and may have other limitations that negate the GPU capacity for rendering use.
If you want a completely off-the-rails setup, look into port splitters and crypto mining gear. Potentially you could connect up to 28 normal GPUs on a 7-slot board: 1 splitter per slot, 4 GPUs per splitter.
But you'll take a performance hit both while working and while rendering.
It's kinda negligible with dual cards, but 4 starts to get a bit wonky. Multiply that out, and it could get really bad.
Now, if you're really serious about building a render farm, server-class hardware is the best option.
There are solutions that will house 8-20 GPUs in a single system, and with the right configuration you could take that up to 80 GPUs in one system.
Lambda Labs has a solution that houses 20 GPUs, Tesla T4s specifically, for ~$85K USD. https://lambdalabs.com/products/inference#tiers-triptych-anchor
There are also GPU expanders that connect via Thunderbolt, USB, or a HIC (host interface card) and support between 1 and 16 GPUs, depending on the model.
Some of these can be daisy chained, or multiples connected to a single computer.
As to the idea of cutting the GPU size down, that already exists: they're called mezzanine cards, and they're about 1/4 the size of a 'normal' GPU.
They are usually only available in specialized servers, blades mostly, but there are some options that might, and I stress might, work in a consumer-class computer via a carrier card that is x16 type.
A few links for example and further research.
Nvlink/Nvswitch:https://www.nvidia.com/en-us/data-center/nvlink/
6Gpu mobo:https://www.ebay.com/itm/Onda-D1800-BTC-Mining-Motherboard-4gb-Ram/184255888888?hash=item2ae681bdf8:g:nnYAAOSwa8NemHIU
8 gpu mobo:https://www.ebay.com/itm/Colorful-Mining-Motherboard-8-PCI-E-slots-for-Intel-LGA1151-C-B250A-BTC-YV20-/274122655444?_trksid=p2385738.m4383.l4275.c10
Mezzanine card:https://www.ebay.com/itm/HP-NVIDIA-TESLA-V100-SXM2-VOLTA-GPU-ACCELERATOR-16GB-HBM2-GPGPU-Q2N66A/143583243593?hash=item216e3a6d49:g:y4gAAOSwBp1emjM-
https://www.ebay.com/c/504467850
Mezzanine card carrier(upto 4 gpus in a single card):https://www.ebay.com/itm/HPE-NVIDIA-Mezzanine-Graphics-Adapter-Multi-GPU-Carrier-w-2x-Tesla-M6-MXM-GPUs/283433429727?epid=10004318755&hash=item41fdf2eadf:g:YyYAAOSwA05cnVRW
4gpus, one card:https://www.ebay.com/itm/HP-nVidia-Tesla-M10-Quad-32Gb-Q0J62A-900-22405-0300-030-868798-001-870046-001/233390813227?hash=item36572d302b:g:jl0AAOSwt0FdwI63
x1-x16 adapter:https://www.ebay.com/itm/PCIe-risers-card-PCI-e-X1-TO-X16-Adapter-with-60cm-USB3-0-And-molex-Power-cable/263467503850?epid=1129965821&hash=item3d57e310ea:g:8BUAAOSwMgdXxuUi
x1-x16cable:https://www.ebay.com/itm/PCI-Express-Riser-Card-x1-to-x16-Adapter-Power-Supply-for-Graphics-Card/173732354260?hash=item28734158d4:g:fRQAAOSw8TJcLXZ-
x1-4x16:https://www.ebay.com/itm/PCI-E-X1-TO-4PCI-E-X16-Expansion-Card-Switch-Multiplier-Riser-Adapter-Card/303352335270?hash=item46a1354ba6&_trkparms=ispr%3D1&enc=AQAEAAADUFpzeZgtNKdmHwLLcyYSd60fMjB0KZ%2FQZswOuHII5R6DRQVNLQZsHTBLC9cvISkcWo14iV0KMY%2FbMxUVk%2F5sG%2BBDy9P%2BH4LJQWgNFy%2FI%2Fx5x70SNSI6%2FtVDabBFFlxLLmgTiUprr5MUINnaKsytC1UsJ9xpEqpN2SbHNCVx5ffToJmcECkcOiPUnzXZdF2x7J6NBHo%2FR%2FKASeiI3LFfC%2FfjsMnL2LyJ7EStTEjaXrfRAugGQeIuCps8RfeyKERGW5sTGNHDaQN1gHkqWgezpMyMghPcMF64KDQF%2Bhc5SAvNoD1oqu6vEgXDpOn%2FETl%2BgveFi56w24nKWqmZ6e9SJ%2BLwmdgRN7dfDgibYopb3NQaHOSRuaPkMmgBxnNX81g3ll2tlmZ5BTiPM%2B3jAw9KnixRlC0EzzazUjWU6nyWYqevntl7dNszPFmdyAqdxwNpPFcM7nxBdxqrLuUC4WE5pdBITePw4fjrHLnm%2F1fMeRcmZIwnAn9hTUi7XoFfMbSrZa%2Bg3DIXAQ%2B7c4eU1gxR%2BqUO0LYHiLLKdt4T3WHTZt3UrzUSiM8t3JRboOV5MLvl0mfPmV0A07qlSFCL%2F90WogggfaJWofxUUG%2Fng5hJTrdt6LlysS1UWlF%2B%2BMEKbaisH7dQcf%2BEopJ5GaEgZtAZp4FfIin77egSZe%2BkGLf0zL8EUV7Pw5fK9gCs3z8woPlRJWpVyBCCYaNzB9PFQNiyCaFGbJRJqxvfBI7uW8P%2Fj2fgEycuff7vGRp%2Fivc%2BQbht8gvrTi0givSX%2Fad8NYwkLU8UXKpz%2BrPxawBA3mEwHJ7%2Ft3bmNRkhi%2FPs8YIdGmKUexCm%2BDBOs0N88UM%2BT9DmfkFqdXmlhThomAlZBaa4bEstCrOic74m%2FGJU1chfIW%2F5Rp6ElYqZ5xd8rbzuNtVwBZ91tcn88gpcPlg5Zxz8H1E5KLlYibY%2BDB27MUO1tokRX4kSQ46NbeviRXuv9l%2FtrilaiVvZsEalI0wVEel9%2Bu0BlqVoxuHQUV%2FYyN%2FYHJFAwXiB5kWJ%2BS4m%2FV0SInUHnAwOmBJG5s8DgEyAn78bl8Wa872hP%2BzExDnWW61OeKiHIR5Vu1JOrzKfvLW806zpj5hBeqjjhIBO1OtveP%2FVCawQp&checksum=3033523352701f5017a4775146b19ddfca70d4aeb501
port splitter:https://www.ebay.com/itm/4-port-PCI-E-x1-Transfer-adapter-to-Powered-x16-Riser-Adapter-Card-USB-3-0/124113311296?hash=item1ce5bad640:g:G-sAAOSwe7peYE7T
Here's my usual caveat.
I don't endorse any particular product, service, or manufacturer. Links are provided for informational purposes only, and I don't always have direct knowledge of performance or other variables that may arise from the use of said products.
Caveat emptor(buyer beware).
Hope that helps. If there's more information needed, I'll post up what I can.
OH WOW! I did not expect all of this awesome information!
THIS is exactly what I needed!
Thank you very much for taking the time to compile all of this great information. What a resource!
Thanks DrunkMonkeyProductions! Absolutely amazing!
RR
Before you rush out to build a mining rig to render iRay, you need to keep some things in mind.
iRay requires one CPU thread for every active GPU, so those mining motherboards will be very limited: the B250 ones support CPUs with no more than 8 threads, and the ones with 1800 in the description/name have a Celeron J1800 chip soldered onto the board, and that chip has 2 threads.
Also, powering bunches of GPUs will require multiple power supplies. You'll need to learn how to get a PSU to run when not plugged into a motherboard; not impossible, but you should be prepared. Also, the older and cheaper GPUs will draw more power than newer ones, so while your upfront cost might be lower, you'll pay in your electric bill.
Further, the limit on any card's ability to help render a scene is its VRAM, so getting lots of 4GB cards or the like will be of very limited utility.
At some point it just gets cheaper to go with new HW and not run so many GPU's.
Just a few clarifications, corrections, and additional information here.
It's not thread, it's core. The terms are not interchangeable.
The conventional wisdom is 1 core / 2 threads (for hyperthreaded CPUs) per GPU.
In my personal experience it's way more complicated than that.
From some initial testing I've done, I don't recommend less than 2 cores (HT or not) per GPU, minimum, with 2 additional cores for system overhead.
1 GPU = 4 cores.
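If you want to size a CPU against that rule, it's trivial to express. To be clear, this is just my rule of thumb from testing, not an official iRay requirement:

```python
def min_cores(num_gpus, cores_per_gpu=2, system_overhead=2):
    """Minimum CPU cores for a render box, per the rule of thumb
    above: 2 cores per GPU plus 2 for system overhead."""
    return num_gpus * cores_per_gpu + system_overhead

print(min_cores(1))  # 1 GPU = 4 cores
print(min_cores(6))  # a 6-GPU rig wants at least 14 cores
```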
I came up with this from some practical testing I did with old Core 2 CPUs, a couple of servers, and the HEDT systems I've been testing this on.
In a Core 2 Quad system with a single render GPU, in addition to the system GPU, the CPU would max out 2 cores during rendering on an x1-x16 adapter. The extender was used due to a physical system limitation, and I happen to have a box full of them.
On my servers, it gets more interesting.
I won't go into too much detail, but by cutting the cores down to one via Set Affinity in Task Manager, the render time increased anywhere from 2-5x, depending on other variables.
When running multiple GPUs, the GPUs appear to simply switch which one is rendering at a particular point, although the render times did decrease slightly, a 15% drop. With changes in render settings, specifically the update interval options, there could be a further drop in render time of up to 30%.
Not ideal, but it'll do in a pinch.
The best option I found was running 4 cores with dual GPUs. This resulted in render times comparable to not having affinity set.
Unfortunately this testing was done via a port splitter, so that may be causing some additional overhead I haven't accounted for yet.
Once I get a bit more testing done, and some other hardware in for this purpose, I'll probably start a whole thread about it.
I actually have one of those B250 boards on a shelf. That project kinda got backburnered when I had to upgrade the file server.
-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Regarding power.
It's actually pretty simple to run a secondary power supply, or more, external to a system, and as I pointed out, there are options specifically for this purpose that don't require additional power supplies.
For most consumer systems, you can mount up to 6 GPUs on a power supply before you need to go to more specialized solutions.
You may need to invest in some extension cables or converters, Molex or SATA to 6/8-pin, to power the mess.
I'd recommend at least a 1200-1500W supply, fully modular, for internal mounting.
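To ballpark the wattage yourself, add up the GPU TDPs, allow something for the rest of the system, and leave headroom. The 300W overhead and 25% headroom figures below are my assumptions, not gospel:

```python
def psu_watts(gpu_tdps, system_overhead=300, headroom=0.25):
    """Rough PSU sizing: sum of GPU TDPs plus a CPU/board/drives
    allowance, plus 25% headroom since real draw spikes above TDP."""
    return (sum(gpu_tdps) + system_overhead) * (1 + headroom)

print(psu_watts([120] * 6))  # six 120W cards -> 1275.0, so a 1200-1500W unit
print(psu_watts([160] * 4))  # four 160W cards -> 1175.0
```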
For external mounted GPUS, there's a few options that are pretty simple.
Note: Some of these options may include risk of personal injury if done improperly. Research these options thoroughly before attempting them. Use proper tools and safety precautions where applicable.
Note: Not all systems use standard 24-pin connectors; HP and Dell are notorious for this, so these options may not work with their power supplies. Be sure to check the pinout diagram for the particular system you are connecting to and compare it to the 'standard' diagram before purchasing/attempting.
If you want to use a consumer power supply, all you need is a paper clip to jumper two pins to turn the supply on.
I recommend a supply with a physical on/off switch rather than relying on the paper clip to turn it on and off.
If the particular supply doesn't have an on/off switch, use a power strip, or physically unplug it, before attempting to jumper connections.
As an alternative, and to simplify the power-on process, you can use splitter cables or "jump starters".
Splitter cable:https://www.ebay.com/itm/24-PIN-20-4-Dual-PSU-Multiple-Power-Supply-Splitter-Adapter-30cm-Cable-Cord-WL/174245147924?hash=item2891d1f514:g:TAEAAOSwvXdeUzzL
"jump starter":https://www.ebay.com/itm/ADD2PSU-Dual-Triple-Power-Supply-Adapter-Connector-PSU-Board-for-Miner-BTC-ETH/292167063042?_trkparms=aid%3D1110004%26algo%3DSPLICE.COMP%26ao%3D1%26asc%3D20200220094952%26meid%3Da5ac4520b81546c2a303856d47629d7c%26pid%3D100008%26rk%3D12%26rkt%3D12%26sd%3D174245147924%26itm%3D292167063042%26pmt%3D1%26noa%3D0%26pg%3D2047675%26algv%3Ddefault%26brand%3DUnbranded&_trksid=p2047675.c100008.m2219
For the more "specialized" (i.e. normal for crypto) setups, there are two main components: a server power supply, usually Dell or HP, and a breakout board.
The breakout boards come in a few different styles, but work the same.
They attach to a power supply, provide 8-12 power connections, usually 6-pin, and have a power switch on the board.
Breakout board:https://www.ebay.com/itm/For-HP-PSU-GPU-Mining-Ethereum-ZEC-ZCASH-ETH-DPS-1200FB-A-1600W-Breakout-Board/113766160534?hash=item1a7cfdd896&_trkparms=ispr%3D1&enc=AQAEAAADYHAEYRlXFw2oPJySHd7lP3FCEKofMC2U%2BbX3DpCskUQ2BrTMoe2PBlEI0SfE2eShsG6vet8D05C%2BitPcMloFpkEKOxJGM%2Fbq2NNLat%2B4uaMbI359OMsgjSd4UIIRa%2Ff3av493dN18rYuQMcoG0DAcSrj0z1VOZElNwNE4V6Klj6YWMQiOzs1Xv85xnzkGGZIZWZhiL0SiYANrwoondbaguYCarkrXRg5m2U%2F2fbril5qDMNnn3qBzPixwQLXUDl3w4bPTlJDoIkrWTjib3TpNBlUGyVJ8lYpwK%2BZdIAM66JEGEZfAYr0e2h%2BU%2FP5n3qKGdlNoVFxIPEulSAO78n1b4%2FO%2FCtRHREFJH688TyR6RseFhmNAkLfyO6aimj5gPGfOgHsdOXaF66ZFwtaFYnLZ4uwWt9VUfVoMQ49vrlnyzZKjIQczRPsnwVh0LVxAZapmbUPF4FPl5wANS%2F2m0%2BnIHh5xweZvu26a50NKgI%2B1F0vZj5o8TZXClAYL9pyNumtTW6s8XvodqIOR%2Bla9eEtgR8NPcUFzzm32GAU9m4Kh7nBrSGT%2FFw6mnyvacFtBtu%2FL%2BCYJoysnEuD%2Btsxe9km3hUNSqCj3cTPubz6UF7E0M0dbfqChMqhyK3MkCNHUR5v09qke1SdGMiaPV3bVzc8V5ZEmfsWNzNJBq3P%2BFSPq45rDqim3GqxeGhUSlE%2FDIpni6bGo%2BmIlXOFQtkbmMEneRUo%2FmTOOFn%2FYZLZQmWil0jfs8peilG0cak8syNz92Ym6dTyvRosegj58mRAu2l%2B2NHzBjit9%2Ff6NGwJOOWa%2BR9o6rqnBL0YZs4YTufSmdk6%2B4SkCRV%2BOCZq4O9YaOW2bMpeUQxsfL2oXzccGpAsb0lz1ArgyI2kGRHnrCaVlXKmYfNR%2FL%2BTmNwuK4pDeIvam7Ksf74oDmf2R%2F%2FB3JZkZ3BRgmKbG5rTHNQlmGCASJOV%2BaCSBXRFJpe%2B8gnMsck0eDlzN7WZNucY13UJsq6YCm%2BKpaywqZ2Pwhd13lvmcN%2Fd4nIwUtkpv2%2Fxa5DYaon8FOFTPDdcMOvKBTNCWPyP5LnrWxt1UoL4%2BxxYN9qAexhdltsY7NWVWUvjSOW53gy%2BAqyeHRo5cUKf4Xji1IxONdHJc7Gey8kwdYwCiqtc2BCbdg%3D%3D&checksum=11376616053404f5005bd3f64a4baf84f4035f5b65d4
Take note in the item's description that these are compatible with specific power supplies.
When selecting a supply, be sure it's compatible with the breakout board first.
Selecting a power supply.
When considering which supply to get, you need to consider the GPU power (TDP), the adapter board(s) you intend to use, and future expandability.
You'll also need to consider the power connection type to the adapter board.
Some adapter boards use Molex (4-pin), some use SATA (15-pin), some use 6-pin (standard GPU), and some have all three (my preference).
Be sure to select/purchase the correct cables and adapters as necessary for your chosen combination.
Or take the easy route and buy a kit with everything you need.
Breakoutboard+PS+Cable kit:https://www.ebay.com/itm/HP-750W-PSU-w-Breakout-Board-5-Cables-HSTNS-PL18-Server-Grade/123886627884?hash=item1cd837ec2c:g:I3cAAOSwp1BdYvnk
---------------------------------------------------------------
Electric supply considerations
As an aside regarding power delivery, you also need to consider your wall power.
In the U.S. (I can't speak to other countries), typical residential wiring can be insufficient for the power needs of a render farm.
So be sure to check the amperage of the breakers on the circuits you intend to connect to, verify what other outlets/lights/appliances/etc. are on said circuits, and check the wiring gauge where you can.
If you don't feel comfortable with that, or local ordinances prevent it, you need to engage the services of an electrician.
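As a rough illustration of why this matters (assumed numbers, not electrical advice): a US branch circuit shouldn't carry more than about 80% of its breaker rating as a continuous load, which doesn't leave room for very many GPUs on one circuit:

```python
def circuit_budget_watts(breaker_amps, volts=120, derate=0.8):
    """Continuous-load budget for a branch circuit, using the
    common 80% derating for continuous loads."""
    return breaker_amps * volts * derate

for amps in (15, 20):
    budget = circuit_budget_watts(amps)
    # assume ~250W per GPU at the wall plus ~300W for the host system
    gpus = int((budget - 300) // 250)
    print(f"{amps}A circuit: {budget:.0f}W continuous, room for ~{gpus} GPUs")
```

So a standard 15A bedroom circuit tops out around 4 render GPUs plus the host, before counting anything else plugged into it.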
---------------------------------------------
Gpu selection for rendering and render farms.
Kenshaw brings up a couple of valid points regarding VRAM and power usage.
There is, however, a bit more to the equation I'll delve into here.
For the purposes of this discussion I'm going to use spec TDP, a rate of $0.20 USD per kWh (to keep the math simple), and eBay, Newegg, Taobao, and AliExpress for purchase prices; I'll be using averages.
Benchmark assumptions will be based on results in the benchmark thread here in the forums, Octane render, Migenius, as well as others.
RTX 2060 (non-Super) vs P106-100 vs GTX 1060 (6GB) vs K20c
Cost:
RTX 2060 (non-Super): $300 USD (Newegg/eBay; oh yeah, the used Buy It Nows are the same as new)
P106-100: $100 (eBay), $60-80 (Taobao)
GTX 1060 (6GB): $250 (Newegg), $150 (eBay)
K20c: $60 (eBay), $600-2500 (Newegg)
TDP:
RTX 2060 (non-Super): 160W
GTX 1060 (6GB): 120W
P106-100: 120W
K20c: 225W
For performance:
1x RTX 2060 = 2x 1060/P106 = 4x K20c
TDP at equivalent performance:
160W vs 240W vs 900W
Cost:
$300 (2060) vs $300 (1060 x2) vs $200 (P106-100 x2) vs $240 (K20c x4)
Operating cost based on equivalent performance:
RTX 2060: ~6 hrs per kWh
GTX 1060/P106-100: ~4 hrs per kWh
K20c (x4): ~1 hr per kWh
For a 24-hour period @ $0.20 per kWh:
RTX 2060: ~$0.80
GTX 1060/P106: ~$1.20
K20c: ~$4.30 (900W for 24 hrs is 21.6 kWh)
Hours of operation until combined purchase and operating cost equals that of the RTX 2060 (non-Super):
GTX 1060: eliminated, since two of them cost the same as a single RTX 2060 but cost more to run.
P106-100: ~250 days (~6,000 hrs), based on the $100 difference in price and the ~$0.40/day difference in operating cost.
K20c: ~17 days (~410 hrs), based on the $60 difference in price and the ~$3.50/day difference in operating cost.
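If you want to rerun or extend this comparison with your own cards, the arithmetic is easy to script. The prices and TDPs below are the rough figures from above, so treat the output as ballpark:

```python
RATE = 0.20  # assumed $ per kWh

# (purchase price $, total watts) for performance-equivalent setups
configs = {
    "RTX 2060":    (300, 160),
    "2x GTX 1060": (300, 240),
    "2x P106-100": (200, 240),
    "4x K20c":     (240, 900),
}

def daily_cost(watts, rate=RATE):
    """Electricity cost for 24 hours of rendering."""
    return watts / 1000 * 24 * rate

base_price, base_watts = configs["RTX 2060"]
for name, (price, watts) in configs.items():
    extra = daily_cost(watts) - daily_cost(base_watts)  # extra $/day vs the 2060
    saved = base_price - price                          # purchase savings vs the 2060
    note = ""
    if extra > 0 and saved > 0:
        # days until the extra electricity eats the purchase savings
        note = f", cheaper overall for only ~{saved / extra:.0f} days"
    print(f"{name}: ${daily_cost(watts):.2f}/day{note}")
```

Swap in your own local kWh rate and card prices; the break-even point moves around a lot with electricity cost.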
OK, so based on this information, the K20c is a bad investment.
GTX 1060s (6GB) are just as bad.
The P106-100 is the best price-to-performance, especially if you can get them for less than $100 USD, but due to its lack of video-out ports its flexibility is limited.
Note: The purchase cost for the dual P106 setup does go up if you require a secondary power supply, breakout board, and port extenders (~$45 USD total), or if you need to upgrade to a heavier (higher-wattage) power supply for in-system mounting.
A bit of real world
The P106-100 I own pulls about 75W on average (81W is the max I've ever seen mine pull), and the K20c 115W (125 max).
In performance the P106-100 is comparable to 2 K20cs for about 1/4 the power usage.
------------------------------------------------------------
Vram
Once you get past the 6GB (or 8GB) GPUs, things start getting way more complicated and the trade-offs become more relevant.
For example, the highest-capacity card on the Nvidia market is the Quadro RTX 8000, with 48GB of VRAM at a cost of $5,200.00 USD.
The fastest render card for iRay is the Quadro GV100, which only has 32GB of HBM2 VRAM and a cost of $10,000.00 USD used.
For consumer cards the RTX 2080 Ti is about the best, with performance on par with the RTX 8000 but roughly 1/4 the VRAM (11GB vs 48GB) at about 1/4 the price ($1,100.00 vs $5,200.00 USD).
I'm not considering the Titan RTX a 'consumer' card, as it's a bit of an oddity with 24GB of VRAM and a cost of $2,500 USD (Newegg).
For comparison with older cards: a Tesla K40 (12GB) costs right at $100 USD and performs at about 1/4 of an RTX 2080 Ti.
A Tesla M40, meanwhile, has 24GB of VRAM, double the 2080 Ti's, with 1/3 of its performance, for about $300 USD used.
The Tesla M40 also clocks in at around 1/4 the performance of the Titan RTX at the same VRAM capacity.
--------------------------------------------
Supporting computer system
Once you start getting into multiple and higher-capacity (VRAM) GPUs, you've got to put some bucks into your system, assuming you're on a potato like me (lol).
You'll need higher-core-count CPUs (2 cores per GPU + 2 for the system, minimum; more preferably) and way more RAM.
Now, I don't know who came up with the rule of 2x system RAM to GPU VRAM, that is, 6GB GPU = 12GB system, but I'd like to have a word with them.
This is probably one of the most malarkey suggestions I've ever seen.
On an average day I can easily hit 40GB of system RAM and still fit my scene in a 5-6GB GPU.
And that's on default render settings with no scene optimizer.
By contrast, I can use less than 8GB of system RAM and totally exceed the 6GB GPU.
Scene composition matters, folks.
So, my rule of thumb: 16GB minimum, 32GB recommended, and as much as you think you'll need, can afford, and can fit in your system.
If you start looking into 24+GB GPUs, I'd suggest more than 128GB of system RAM, just to be on the safe side.
------------------------------------------------
Final notes
Don't limit yourself to only considering 'gaming' GPUs.
Look into Tesla and Quadro cards as well.
If you don't feel comfortable with, or don't want to put up with the noise from, externally mounted GPUs, look into single-wide GPUs.
They'll cost more than their double-wide counterparts, but you can fit more of them in a standard computer case, depending on the PCI-e slots available on your mobo.
Always be sure to check your case's size before purchasing any GPU. Make sure it fits first if you'll be mounting it internally.
Always check your system/motherboard manual for slot step-down. x16/0/0, x16/x8/x8, x8/x4/x4 is the general nomenclature, indicating how much bandwidth each slot drops to as more adapter cards are installed. And this includes sound cards, capture cards, RAID cards, etc.
Also check whether your motherboard uses a PLX chip for some of the slots; this can have a bit of an impact on performance.
If your system lists more PCI-e lanes than the CPU supports, it's probably routing through a PLX chip.
Peace, folks.
I can't thank you enough DrunkMonkeyProd for your exhaustive response.
I am using your post as reference for many things!
Please feel free to expound on as much as you like here.
Your thoughts and experiences are very valuable for myself and those looking to build render farms and needing such info as you have supplied.
I will be coming back here a lot to absorb in the info you have supplied.
Thanks again for taking the time to help out here!
The links to hardware are especially helpful!
Amazing, simply amazing!
I bought this today to go with a third Nvidia 1080ti that I ordered also.
https://www.ebay.com/itm/PCI-Riser-Card-Extender-Flexible-Extension-Cable-Ribbon/351680465361
I bought Founders Edition-style 1080 Ti cards (all three), made by Asus (and Nvidia), that have only one fan, which pushes the air out the back of the card.
But I think the cards are too big, and the last slot will not leave enough room for the middle card to draw in air.
Luckily I have a full-sized tower with extra card slots that extend beyond the base of my ATX motherboard.
I am hoping this will be long enough and that it will clear the case under the card too.
I am also hoping the connections on this extender are good enough to handle high-speed throughput. The eBay page description doesn't say anything about the speed capacity of the cable...
There were more expensive ones... I may not even need this if the card fits and the middle card can still somehow draw air.
Just by coincidence, I can offer some information on both of your concerns :) Like you, I've got 3 blowers crammed extremely close together. I've also got a fourth one on a PCIe riser, and I have had zero problems with any of them, even rendering for more than 24 hours straight.
The system designer at System76 told me that for multi-GPU setups, blowers are essential, and you already saw why: they don't recirculate hot air, they vent it outside the case, and those fans are flush with the card and work with no clearance.
I think additional proof that they work well is that the air coming out the back of my system, which when rendering sounds like an F-16 idling on the tarmac, is EXTREMELY hot.