Licensing Agreement | Terms of Service | Privacy Policy | EULA
© 2025 Daz Productions Inc. All Rights Reserved.
Comments
Thanks for the info, will start researching coolers now.
If you want to use LuxRender, you would do yourself a huge disservice by going with Nvidia. If you don't want to do any GPU-based rendering, then I would also recommend an AMD card if price is an issue.
I recently installed a Noctua NH-D14 CPU cooler in my system and it keeps the freshly installed i7 4770 idling at 29°C, hitting the mid-to-high 50s at full sustained load. Considering we are having a very warm summer, I'm very happy.
You make a good point about the newness of SSD. I think I might get a smaller SSD now and then get a larger one down the road.
Newness of SSDs? What, then, about the newness of the 500 GB platters in 2 TB HDDs? Most of the components you chose for your new PC didn't exist 18 months ago.
I have had the first-generation Intel 160 GB SSD for more than 4 years now. No burnt cells so far. In fact the efficiency of an SSD depends mostly on its controller. The second generation of controllers, which equips all the newest SSDs, runs nearly flawlessly.
Samsung has a 250 GB EVO SSD, guaranteed for 3 years, that costs about £120. The same drive in a 120 GB version costs about £85 (checked on Amazon UK).
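For comparison, the per-gigabyte cost of those two options works out roughly like this (a back-of-envelope check using the prices quoted above):

```python
# Rough price-per-GB comparison of the two Samsung EVO options
# mentioned above (Amazon UK prices as quoted in the post).
options = {
    "EVO 250 GB": (120.0, 250),  # (price in GBP, capacity in GB)
    "EVO 120 GB": (85.0, 120),
}

for name, (price, capacity) in options.items():
    print(f"{name}: £{price / capacity:.2f} per GB")
```

The 250 GB drive comes out at roughly £0.48/GB versus £0.71/GB for the 120 GB version, so the bigger drive is noticeably cheaper per gigabyte.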
Edit: No further advice from me. I have the feeling we are confusing you :)
Probably. But all hardware will stop functioning someday. Is that a reason to never buy a computer?
> I recently installed a Noctua NH-D14 CPU cooler in my system and it keeps the freshly installed i7 4770 idling at 29°C, hitting the mid-to-high 50s at full sustained load. Considering we are having a very warm summer, I'm very happy.
Considering the badly written code of a single piece of software, would it be clever to deny yourself the use of the others? It seems the TS is going to build an entry-level graphics PC; why should this be turned into an office/gaming one?
The Noctua is an excellent choice ($105 here; isn't that a bit TOO much for 4 cores at 84 W?), but for my 3930K (6 cores, 130 W) @ 4.25 GHz, a dual-fan Scythe cooler at half the Noctua's price keeps it at 32-34°C idle and 54-56°C at full load under LinX running for 4-8 hours.
Well, it helps to give all the options. It really doesn't help much to railroad someone asking for advice down one path. Just saying "go buy Nvidia" might seem like great advice, but there are a lot of people out there who don't care about GPU rendering, or who don't want to spend all the cash on something like Octane. Who knows. Just sharing the info.
The Noctua is a bit of overkill here. I wasn't sure what I was going to build. I was thinking 8 cores originally, but decided to stick with i7. It doesn't help me much but it would probably be well suited to moderate overclocking though. I got mine for $60 so I'm pretty happy.
I built a 32 GB system last spring with an AMD FirePro video card. It's good, but I can also use my notebook. The nice thing about the tower is that I specified it was for rendering, so they built in the appropriate heat sink. I can render for a week and cold air still comes out the vent. jimzombie, I suspect you run LuxRender non-GPU; I do too.
That is why some people prefer GPU rendering over CPU: renders take minutes or hours, not weeks, when a deadline is close.
If I'm not mistaken, Thea Render has a working demo to try it with CPU and GPU; moreover, they are going to release the next version of the renderer this December (according to their promises) with a hybrid render mode.
Hope so, but it's been flaky. I also use 3Delight from time to time, and depending on the lights it too can take time. With Lux you can line up the renders and let them rip while you sleep or are at work...
Actually only the GTX 780 and Titan are worthwhile for LuxRender, which is the only thing speaking for AMD. As far as I know, 3ds Max iray, V-Ray RT, Octane, Blender Cycles, and all the others are better with CUDA than OpenCL. It is a choice to make before buying, knowing the consequence: limit yourself to one application, or have the option to use them all. But as I said to anikad, he doesn't have to buy the graphics card now. He can use the integrated graphics to begin with and decide later when he is ready.
@Anikad: there is already a Cooler Master V8 in the config. You can buy anything else; it's your purse.
@ Anikad. I'm more of a Gigabyte motherboard fan but other than that it looks like a good setup.
@ Jimmy. I'm running Windows 8 64-bit. I surely hope that Microsoft in their supposed wisdom has not put that same limitation on Win 8, since I have 32 gigs of DDR3 RAM.
I don't think anybody should bother with that. I'm pretty sure most SSDs will die for other reasons. The speed-up from caching files is a good use; one shouldn't refrain from it. Having a more responsive system lets you work quicker, do more (or better) things, and have time for something else. There is a test predicting the lifespan of Samsung's TLC-based SSD here: http://us.hardware.info/reviews/4178/hardwareinfo-tests-lifespan-of-samsung-ssd-840-250gb-tlc-ssd-updated-with-final-conclusion
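As a rough illustration of why write endurance rarely matters in practice, here is a back-of-envelope lifespan estimate. The P/E cycle count and daily write volume below are illustrative assumptions, not figures from the linked test:

```python
# Back-of-envelope SSD lifespan estimate from write endurance.
# Assumed figures: ~1000 program/erase cycles for TLC NAND and
# 20 GB of host writes per day -- both are illustrative guesses.
capacity_gb = 250
pe_cycles = 1000          # assumed TLC endurance per cell
writes_per_day_gb = 20    # assumed typical desktop workload

total_writes_gb = capacity_gb * pe_cycles            # total write budget
lifespan_years = total_writes_gb / writes_per_day_gb / 365

print(f"Total endurance: ~{total_writes_gb / 1000:.0f} TB of writes")
print(f"Estimated lifespan: ~{lifespan_years:.0f} years")
```

Even with these conservative guesses the drive's write budget lasts decades, which supports the point that most SSDs will die of something else first.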
I don't like the idea of self-penalizing just to keep the SSD alive for more than 10 (100?) years. And take into account that TLC is not as good as MLC or SLC NAND; those will last even longer. So the philosophy is: buy a cheap one for your OS, and buy a new one if it ever dies.
For content I'll wait a bit to see how things evolve. There is a factor to take into account for content data: is it possible to recover data from an SSD once it has died? So far the answer is no, as far as I know, although we expected the opposite. So I'll keep trusting classic HDDs to store data.
I mostly agree with you.
Didn't want to go too technical.
TLC-based SSDs are a no-go. I recommend SLC-based SSDs... for those who can afford them.
I never had an SSD die, but it can happen. Less frequently than with mechanical drives, but it can happen.
Disabling the system cache on an SSD is maybe not necessary, but you must also consider that the system cache is the same size as the installed RAM. The same goes for the hibernation file. On some systems this can become crucial: on a system disk, Windows 8 will by default create a cache file, a hibernation file, and a restore file; together these cost roughly twice the installed RAM. 16 GB of RAM thus means more or less 30 gigs. On a 120 GB SSD (= 114.4 gigs) those files can rapidly become a problem. Moving the system cache to an HDD can save some gigs.

I never tell people how to remove the hibernation file, and it's up to them to decide how many restore points they want to keep (many don't even know when a restore point is created). I don't think removing the system cache from an SSD is self-penalizing: I didn't measure any difference with or without it, and it is just a few clicks away. On large SSDs (>= 500 GB) there is no need to move the system cache: most motherboards will only allow 32 GB of RAM.
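To put numbers on the space pressure described above, a quick sketch using the post's own rule of thumb (pagefile and hibernation file each roughly equal to installed RAM; restore points come on top):

```python
# How much of a small SSD the default Windows system files can eat,
# using the rough rule of thumb above: pagefile ~ RAM size and
# hibernation file ~ RAM size.
ram_gb = 16
usable_gb = 114.4        # formatted capacity of a 120 GB SSD (figure from the post)

pagefile_gb = ram_gb     # system cache file roughly equals installed RAM
hiberfil_gb = ram_gb     # hibernation file also roughly equals RAM
remaining = usable_gb - pagefile_gb - hiberfil_gb

print(f"System files: ~{pagefile_gb + hiberfil_gb} GB of {usable_gb} GB")
print(f"Left for Windows and applications: ~{remaining:.1f} GB")
```

With 16 GB of RAM, roughly 32 GB of a 120 GB drive disappears before you install anything, which is why moving those files to an HDD can be worthwhile on small SSDs.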
Anyway, I agree on one essential point: data goes on an HDD, for several reasons, recovering being one of them.
I'd add that non-system software, and especially games, don't need to be on the SSD. Start times may be somewhat better, but most of them won't run faster.
In Windows 8, MSFT raised the memory limit of the standard version to 128 GB. The basic W8 (Core) version supports one physical CPU. In W7 Home Premium, it's 16 GB; you need W7 Ultimate or Pro to go beyond 16 GB in W7. You need the Pro version of either to support two physical CPUs (dual-socket systems). See MSFT memory limits.
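The edition limits mentioned above, collected into a quick lookup. The Windows 8 and Home Premium figures come from the post; the 192 GB caps for W7 Professional/Ultimate are my addition (the post only says "beyond 16 GB"), so treat them as an assumption and check MSFT's memory-limits page:

```python
# Physical memory limits by Windows edition, per the post above.
# The 192 GB entries are an assumption (the post only says W7
# Pro/Ultimate go "beyond 16 GB"); see MSFT's memory-limits page.
memory_limit_gb = {
    "Windows 8 (Core)": 128,
    "Windows 7 Home Premium": 16,
    "Windows 7 Professional": 192,
    "Windows 7 Ultimate": 192,
}

def fits(edition: str, ram_gb: int) -> bool:
    """Return True if this much RAM is usable on the given edition."""
    return ram_gb <= memory_limit_gb[edition]

print(fits("Windows 7 Home Premium", 32))  # 32 GB exceeds the 16 GB cap
print(fits("Windows 8 (Core)", 32))        # fine on Windows 8
```

This is the check behind the "32 gigs fully accessible" question: 32 GB is over the W7 Home Premium cap but well under the W8 limit.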
As far as GPU rendering goes, with Lux it's still beta and pure GPU rendering is "experimental". For routine rendering in DS or Poser you do not need tons of GPU, but getting a decent video card is still a good idea. You do not need a high-end video card if you are not rendering on the GPU. See the Lux Wiki.
A client whose hair I was coloring last night is involved with a company that tests and recovers data for its living. SSD drives: when they die, they die, according to her; after so many writes the drive can't do anything more and all that's on it is lost. She says the industry is looking into rectifying this issue, as it's the only way SSD drives are really going to take off. I'd stick with the industry standard, conventional SATA hard drives, for the time being.
Thanks for that info. I'm glad my 32 gigs of RAM are fully accessible. YAY
> Actually only the GTX 780 and Titan are worthwhile for LuxRender, which is the only thing speaking for AMD. As far as I know, 3ds Max iray, V-Ray RT, Octane, Blender Cycles, and all the others are better with CUDA than OpenCL. It is a choice to make before buying, knowing the consequence: limit yourself to one application, or have the option to use them all. But as I said to anikad, he doesn't have to buy the graphics card now. He can use the integrated graphics to begin with and decide later when he is ready.
This is true, but seeing as this is a hobbyist rig, I would think most of those applications are out of reach. I know for a fact that LightWave Layout tends to get more FPS with AMD cards (pro or otherwise), which pretty much clinched it for me. I wasn't saying AMD was the key; I was merely pointing out that Nvidia is not the final word unless you want to spend extra for CUDA-based rendering. Not sure why I'm getting grief about this.
In fact, the most suitable choice depends on the software being used. Without that information, you can only advise what would work in all cases.
For accessible GPU rendering, I'll say I have a copy of Gelato and MachStudio Pro 2 (it was free), and Blender Cycles is free too.
Arion and Octane are affordable for a hobbyist and work with CUDA only, and I'm sure there are some others.
For the moment OpenCL is not at the level of CUDA. The new OpenCL 2.0 specification came out this year, and you'll have to wait before getting something well implemented. The new spec is meant to bring OpenCL to the level of the CUDA framework and will enhance hybrid rendering.
But you're right, it seems AMD delivers better viewport performance with gaming cards in some apps; in my view, though, that is not enough to recommend AMD except in specific cases (LuxRender). If the main goal is to model and you want good viewport performance at an affordable price, I'd recommend a FirePro V3800 or V3900, for example. But you won't have very good GPU rendering capability. Anyway, you can't choose any card without some tradeoff unless you pay the price.
Most of the hobbyists out there I talk to don't have the money in their pocket for a high-end GTX and a license for Arion or Octane etc.
BUT, that's not really the point. Like I said in my original post, AMD is the only way to go for LuxRender. That was it. I should have said for GPU-based Lux, but it slipped my mind. I don't know how many times I've seen people all excited to try out LuxRender with their shiny new GTX card. Most of these people I meet online, so I can only imagine how their faces crumple with despair when I tell them they've got the wrong card. That said, a lot of these people probably want to use the program in ways where CPU rendering would be more fitting. Believe it or not, there are a good number of hobbyists out there who buy their card just for the sake of LuxRender.
I wouldn't recommend a pro card for a hobbyist unless their computer was never going to have games installed on it. That said, I came very close to picking up a W5000.
Although I don't do a whole lot with LuxRender, I went a different route. For < $200 I bought a used Linux server with 8 cores and 16 GB of RAM. If I want to render in Lux, I export the scene from DS to a folder mounted by Linux, and run the render remotely on the Linux box. It runs maybe 5-6x faster than my puny dual-core Athlon. No, it's not faster than a GPU, but for a hobbyist, I agree it makes no sense to buy a pro-level card just for GPU cores you don't use often. The other advantage of an external render box is that I can also render Bryce on the same box. Besides, I can still run other things on the desktop while the "compute farm" is grinding away. Of course, with a server in the room, I need to put on the industrial hearing protection. :lol: Now, I need to figure out how to actually USE all those other render engines rather than just letting 3Delight inside DS do all the heavy lifting. :red: