Is there any point buying a MacBook Pro?

Is anyone using a MacBook Pro for rendering?
Yep. All my renders are done on a Mid 2014 15" MacBook Pro (16GB RAM, 2.8GHz Core i7).
It has an Nvidia graphics card -- the newer MBPs don't -- but the card only has 2GB VRAM, so DAZ Studio just ignores it and uses CPU rendering. As a result, renders tend to run fairly slowly -- for my final versions, which I typically render at 3600 x 2400, I leave the render running overnight.
I wouldn't necessarily recommend it as a rendering platform (although it's great for Carrara), but it works fine if you don't mind waiting. If you're not wedded to Apple, but you have to use a laptop, you might find this thread about laptops with GPUs interesting.
Apple has newer MacBook Pros, of course, which are probably a bit faster than my old machine, but will still be bound by CPU rendering. However, Apple now supports external GPUs on High Sierra via Thunderbolt 3 (I think it's also supported on Thunderbolt 2, but it's trickier). So ... theoretically ... you might be able to buy a Sonnet eGFX breakaway box and load it up with a powerful Nvidia card and do your Iray renders that way. I don't know of anyone who's tried it; I also don't know of any reason why it wouldn't work.
But it's a pricey way to get what you might get less expensively by building your own (relatively) cheap rendering PC.
I thought as much. Alienware it is then, by the look of things.
Thanks.
I'm a Mac-head, but AFAIK currently there's nothing in the Mac lineup with a good GPU for Iray rendering, so I myself gave up and went with a PC.
If you do go with Alienware, make sure you get the most VRAM you can afford in the GPU. If you end up with too little VRAM, DAZ will have to render with CPU only, and then you'll be CPU-bound, and rendering times will be as bad as or worse than the MacBook Pro.
Do you think 8GB of VRAM will be any good?
Not as much faster as you'd think. I didn't notice a big jump in speed when I upgraded from a near-top-of-the-line 2012 MBP to a midrange 2016 (touchbar) model. This was unlike the 2008-2012 upgrade, which was like night and day.
8 Gigs is probably good. It depends on the size of your scene geometry, which has to be loaded completely into VRAM for Iray to use the GPU. If more VRAM is available and affordable, I'd buy it, but if 8 is the best you can do (which I think it was for the graphics card I ended up buying), you'll still be able to use the GPU on a lot of scenes.
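If you want to sanity-check how much of your card a scene actually uses, any machine with an Nvidia card and the standard driver tools can report VRAM usage from the command line. A minimal sketch, assuming the nvidia-smi utility that ships with Nvidia's drivers is on your PATH:

```
# Report per-GPU memory usage every 2 seconds while loading/rendering a scene.
# If 'memory.used' approaches 'memory.total', Iray will drop back to the CPU.
nvidia-smi --query-gpu=name,memory.used,memory.total --format=csv -l 2
```

Leave that running in a second window while you load a scene and start a render, and you can see how close you are to the limit.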
But could you make use of the onboard Nvidia card if you wanted to, for renders that don't require more than the two gigs?
I have a memory of reading somewhere that DAZ Studio won't make use of GPU cards with 2GB or less of video memory. I can't confirm this, though. DAZ's own notes say that 2GB is sufficient for "a single character with a medium-sized environment". However, the OS also wants some of that VRAM for its own purposes, so you typically have quite a lot less than 2GB available to you, and DAZ Studio probably takes one look at what's left, gives up and falls back to CPU rendering.
I've never seen DAZ Studio successfully use the GPU to render on my MacBook Pro.
I used Iray in the early stages on a 1GB card - it would take a single nude figure (most of the time) or some scenes (most of the Stonemason sets I tried, that weren't sprawling urban settings).
Interesting, so there's a good chance 4GB will work?
I mentioned it as I have seen one for sale: a MacBook Pro Retina display 15" Core i7 2.3GHz, 16GB, 512GB, A-grade, dual GPU with a GT 750M.
I have an Alienware laptop but I’m sending it back.
4GB should work, but again, you'll be limited. DAZ say "4gb can handle around 4 characters and a scene" -- I suspect that with hair, clothing etc. and increasingly huge image maps (plus the OS's own demands) you might expect to be able to work on scenes with 2 or maybe 3 characters.
Are you sure this machine has 4GB? My own laptop is a top-end mid-2014 MacBook Pro 2.8GHz Core i7. It's fitted with a GT750M, but it only has 2GB of VRAM, and my understanding -- which may be wrong -- was that this represented the most VRAM available in any Nvidia-powered MacBook Pro. (With the 2015 series, Apple switched to using AMD Radeons, which are no good to you).
I could be totally wrong, but double-check before you buy. If this is a secondhand machine, you can ask the seller to use 'About this Mac' and read off the 'Graphics' line. My own reads:
NVIDIA GeForce GT 750M 2048 MB
Intel Iris Pro 1536 MB
(the Intel Iris is the integrated graphics card, the GeForce is the discrete graphics card).
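If you can get to a Terminal on the machine, the same information is available from the command line via macOS's built-in system_profiler tool -- a quick one-liner:

```
# List each graphics chipset and its VRAM as reported by macOS
system_profiler SPDisplaysDataType | grep -E "Chipset Model|VRAM"
```

That prints one line per GPU, so you can confirm the discrete card's VRAM before you hand over any money.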
When I eventually replace this 2014 machine, I might get one of the newer MBPs (it's my work machine, so I need a fairly beefy laptop) and see if it's possible to pair the thing with a GPU in an external Sonnet box. Failing that, I guess I'll just need a barebones PC of some kind. Although with the crypto miners driving the price of GPUs through the roof, anything in this line at all looks like a big investment.
You know what, you are absolutely right.
Sod it, I'm just going to buy a new MBP, but looking at Apple's site it says they are from 2015, which confuses me.
Is there much difference between the i5 and i7, I wonder? 2.3GHz and 2.5GHz, with Turbo Boost up to 3.6GHz.
I can always get the breakout Thunderbolt box at a later date, as currently my renders are only small with one character; it's more important for me to learn and fiddle around at present.
Apple releases new MacBook Pros about once a year, so the most recent models are from 2017. The 2016 and 2017 models are significantly different from the ones that went before, with USB-C connectors and (on the higher-end models) the new 'Touch Bar'. I'm not well up on the differences between i5 and i7, but if your budget will stretch to it, I'd aim for a faster i7.
You might want to look at Apple's refurbished MacBook Pros. Refurbs are machines that were sent out, found to be defective in some way, and sent back to be fixed at the factory. Apple then sells them off at a reduced price because they're technically not 'new' any more. However, they're rigorously tested -- the testing standards are at least equal to the tests applied to new machines, and possibly even more strict -- and they're covered by a full Apple warranty (and you can get AppleCare for them as well, extending the warranty out to 3 years). Most importantly, they're usually about 15% cheaper than brand-new machines, which can be a saving of hundreds of dollars.
Your mileage may vary, but I've bought refurb devices from Apple in the past and never regretted it.
Good idea. The Alienware I bought was a refurb and I saved £700, so I know what you mean.
Is there any sort of external GPU device that could be plugged into a laptop? I use Mac Pro towers with Nvidia cards powered by an external power supply unit.
A company called Sonnet makes 'breakaway boxes', which are external boxes with a power-supply and slots for PCI cards that can take a high-end graphics card. These will connect to a modern Mac with Thunderbolt 3 ports, running High Sierra, allowing the Mac to use the card as an external GPU. Apple actually ships the Sonnet box as part of their 'Augmented Reality kit' for developers.
So that would probably be the way to go. The Amazon page for the box in question lists the supported platforms, and you'll notice that it doesn't mention NVIDIA support on Mac. A review at 9to5Mac.com says that NVIDIA isn't currently supported, but that there are third-party workarounds. I'm still using a Thunderbolt 2 Mac, so I'm not in a position to test this out. And while I haven't yet heard of anyone successfully doing DAZ Iray renders on such a setup, I don't know why it wouldn't work.
I notice that one user on the eGPU forum claims that they were able to use a GTX980/Sonnet e550 from a 2013 MacBook via Apple's Thunderbolt 2 to Thunderbolt 3 adaptor, so maybe there's hope for us old-school users too.
EDIT: OK, here's a user who says that they are successfully using their Sonnet/GTX for DAZ Studio Iray renders. One interesting thing is that they also say "... daz manipulation significantly slower than without egpu ..." which is a bit worrying. However, they are using a TB2-to-TB3 adaptor, and I think an eGPU over Thunderbolt 2 is meant to be noticeably slower than over Thunderbolt 3. And it's possible that working in the Studio UI involves lots of back-and-forth across that slower bus, while rendering just involves Studio handing everything off to the graphics card and saying "Here, get on with this."
From what I've read, Thunderbolt 2 is half the speed of Thunderbolt 3, which is why it will seem a lot slower.
The Daz UI shouldn't need a fast graphics card to run; my old Windows laptop ran it just fine with onboard Intel graphics. This is why my Alienware laptop is terrible on battery while using Daz: the Nvidia software kicks in the GTX 1080 just for setting things up before rendering, rather than using the onboard graphics. Unfortunately you can't choose the GPU manually if you have the UHD screen version I have (you can in the cheaper FHD models), so that's a major design flaw, and hence why I'm returning the laptop and going the Mac route -- two hours on battery is torture for just setting up poses etc. lol.
I will add an external GPU at a later date; at a 2k-plus price point I expect professional standards.
My guess would be that it depends on whether they're driving the monitor off the eGPU or the built-in graphics. I use a 5K iMac and am considering this as an option for Iray rendering; in that case, I'd just be using the eGPU for rendering. I could see Iray preview slowing down if it's using the eGPU, as it has to transfer the details to the card over a slower interface -- even Thunderbolt 3 is an x4 PCIe interface rather than x16.
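Back-of-the-envelope, using nominal link rates rather than measured throughput (Thunderbolt 2 is 20 Gbit/s, Thunderbolt 3 is 40 Gbit/s, and an internal PCIe 3.0 x16 slot is roughly 126 Gbit/s), uploading a hypothetical 2GB scene works out like this:

```
# Nominal rates: TB2 ~ 2.5 GB/s, TB3 ~ 5 GB/s, internal PCIe 3.0 x16 ~ 15.8 GB/s
echo "scale=2; 2 / 2.5" | bc   # ~0.80 s to push 2 GB over Thunderbolt 2
echo "scale=2; 2 / 5.0" | bc   # ~0.40 s over Thunderbolt 3
```

So one-off scene uploads are cheap either way; it's the constant back-and-forth of an interactive preview that would feel the narrower bus.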
Yes, I'm wondering if you would need to use a second monitor, but I'm assuming you wouldn't have to.
How do you turn off Iray preview, by the way?
Okay, I wanted to try this eGFX Breakout Box with an nVidia card pretty badly. This weekend I ordered it with a 1050 Ti (all I could afford, quite frankly). Since I only have Thunderbolt 2, I needed a new cable and a TB2-to-TB3 adapter; with cables, the cost ran about $600. I had both high hopes and low expectations. Some background info:
I have a 15" Mid-2012 MacBook Pro with a 2.6GHz i7, 16GB of RAM, a 2TB SSD and a GeForce 650M with 1GB of VRAM. I also have an external monitor set up so I can have two screens working at the same time.
I love Daz Studio and wanted to have the 650M do wonders with its CUDA cores, but alas, it was not to be. I don't think the chip is new enough to work with the Iray code, as my renders with it take the same time as the CPU. I have one simple image that I used that took my MacBook about 3 hours to render on CPU. I even re-tested it today.
So I decided to do some real-world tests. I first tested Elder Scrolls Online. Imagine my surprise when my 650M beat the 1050 Ti. I was getting 60+ fps standing in a room and a solid 24-30 running around. Apple has done some good work here, as has Bethesda on ESO, to make OpenGL much smoother. The 1050 Ti was getting almost 60 fps but wouldn't hold it, and 12-18 running around.
This was disappointing, but it was my first test, and my real reason for this setup is Daz Studio, so I fired it up and tested the Interactive window first. While there is some improvement -- the 1050 Ti was able to render a bit faster and was smoother moving the object around in the scene -- it wasn't mind-blowing. Up to now I was thinking perhaps I needed a better card, and the costs would be prohibitive at this point. Still, I went ahead and did a full render, just the 1050 Ti.
I was blown away. The CPU and the 650M would take 3 hours. The 1050 Ti took 16 minutes!!! That's over a 10x improvement!!!
Here is the log info:
GeForce GTX 1050 ti
2018-06-19 17:51:22.686 Finished Rendering
2018-06-19 17:51:23.101 Total Rendering Time: 16 minutes 48.74 seconds
MacBook Pro 2.6ghz CPU
2018-06-19 22:39:28.016 Finished Rendering
2018-06-19 22:39:28.458 Total Rendering Time: 2 hours 56 minutes 37.48 seconds
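(For the curious, the ratio from those two log lines checks out. Converting by hand, 2h 56m 37s is about 10597 seconds and 16m 49s is about 1009 seconds, then a quick shell check:)

```
# 10597 s on the CPU vs 1009 s on the 1050 Ti
echo "scale=1; 10597 / 1009" | bc   # ~10.5x speedup
```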
So, I have to say, it's been worth the investment based solely on Daz Studio. I'll still want to upgrade the card to get more speed later, but for now, this saves me so much time it's not funny. Not thrilled with the cost, but the video card market seems to have taken off, so getting a cheap video card is no longer an option. Also, only certain cards are compatible with the eGFX. They'll also state that the nVidia is NOT compatible with the Mac, but there is a very stable installer (run through Terminal) that I had no problems with. I keep my Mac pretty healthy, and I ran all the maintenance programs and backed up before doing it.
So far, I'm happy with this. I get to keep my MacBook Pro, which I love, and I get renders in a fraction of the time.
I've included a copy of the render in case you were curious, plus a screenshot of the on-screen setup. FYI, it says my 42-inch display is 1280x720, but it's 1920x1080.
Thank you, @macsavers, for this very detailed and helpful write-up. I think you've just cost me a very large amount of money.
I have some questions ...
I assume that you're using the 'macOS-eGPU.sh' script from fr34k at egpu.io, yes? Did you find the installation straightforward, or were there gotchas that we ought to know about? Is your system generally stable with and without the eGPU after installation?
You mentioned buying a TB2-TB3 adapter and a new cable. Is the adapter the standard Apple one? Would I be right in assuming that the cable was just a TB2 cable for connecting your laptop to the adapter, and then the adapter plugs into the eGPU unit directly?
Thanks in advance for any more information you can give.
The one I used was at https://egpu.io/forums/mac-setup/script-fr34ks-macos-egpu-sh-one-script-all-solutions-fully-automated/paged/1/. The instructions were pretty thorough and I did not have an issue.
One thing I did to help ensure it went smoothly was boot into Recovery Mode and do a Disk Utility -> First Aid run to fix any small issues first. Then I ran Time Machine. I did that BEFORE I disabled SIP.
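(For anyone following along, roughly the same pre-flight checks can be run from Terminal. This is a sketch using macOS's built-in tools, not a substitute for macsavers' routine -- and note that actually disabling SIP with csrutil disable still has to be done from Recovery Mode:)

```
# Check whether System Integrity Protection is currently on
csrutil status
# Verify the boot volume (a rough CLI counterpart to Disk Utility's First Aid)
diskutil verifyVolume /
# Start a Time Machine backup and wait for it to finish
tmutil startbackup --block
```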
The terminal script worked just fine. It found all the drivers without any issues, and the system has been stable. Only two (2) issues have cropped up, neither one being a deal-breaker. The brightness buttons on F1 and F2 no longer work, and disconnecting the cable to the eGFX crashes the system. If I just turn off the external monitor, it's fine, and the dual-monitor handling in macOS works like it's supposed to; but turning off or disconnecting the eGFX -- something I could do with the internal GPU -- crashes the system. As for brightness, which matters because I keep my computer running during the night for backups, I just have the Energy Control panel darken the screen after a few minutes. That seems to work for me.
One thing I did on the hardware side was make sure the model matched what was compatible with the eGFX. I found a lot of cards in my price range, but only one was an exact match to a model number on the list, and I didn't want to take a chance. In the future, I plan on getting the card direct from nVidia, even though others might be cheaper. I don't believe we get any benefit from the third parties' tweaks, because you need their drivers to get those benefits, something the Mac and the eGFX won't be able to use. The price points aren't that far off either.
I got the eGFX from Amazon as their price was the exact same as others, but I was able to get a 4 year extended warranty for about $6. That was worth it to me.
I will most likely need to upgrade my laptop in 2019, as new macOS releases won't be available for my old MacBook Pro by then, but the eGFX allows me to buy a $1k laptop instead of a $3k laptop. While the eGFX and nVidia card have set me back a bit now, they will save me tons later.
The adapter is the Apple one; I decided it was best to use Apple products for this. The Thunderbolt 2 cable and the Thunderbolt 2 to Thunderbolt 3 adapter are both Apple-branded, purchased locally at Fry's. The Thunderbolt 2 cable attaches to the MacBook Pro and extends to the adapter, which plugs into the eGFX. There is an HDMI cable going from the nVidia card to the external monitor. My external monitor is an RCA 42" HD screen, which works really nicely. Much better than a 27". ;-)
Thank you, @macsavers.
I'm on the fence about whether to splash out a ton of cash on a system which may or may not work, but if eagerness overcomes caution, I'll let everyone know how I get on.
Amazon seem to really want me to do it: the already-discounted price for the eGFX just dropped another $20 while it was sitting in my cart overnight. ;-)
I know how you feel. I saved up over the last few months to get the eGFX. I hadn't saved for the card and was quite frankly surprised at the high cost. Apparently VR and virtual currency mining are the cause of the prices no longer dropping.
My only issue is whether I want to return the video card and save up more money for a more powerful card. I also purchased the 550 eGFX, though I could have gotten the 350 eGFX and it would have been fine. I just wanted the longer life of the 550 in case newer video cards start wanting more power in the future.
The tide may have turned as far as virtual currency mining is concerned. I'm seeing a lot more deals on GPU cards, and cards that have been out of stock for months are suddenly 'in stock' again. Prices are dropping from 'utterly insane' to 'merely slightly-overpriced'.
It's hard to predict whether this state of affairs will last. Apparently Bitcoin mining is moving increasingly to purpose-built ASIC (Application-Specific Integrated Circuit) rigs. But Ethereum, according to my friendly neighborhood crypto-weenie, is ASIC-proof (i.e. the 'proof of work' calculations required for ETH are designed so that they can't be efficiently implemented using an ASIC). I am skeptical about that on general principles, but if he's right, then there's still going to be demand for GPUs, particularly as BTC loses its lustre and speculators start looking around for the Next Big Thing.
Of course, if we're lucky, ASICs will take over crypto completely, and all the people who thought they were going to get rich off mining BTC will start unloading their 1080 Ti's at fire-sale prices. We can dream, can't we?
Ooo... do dream... I actually saw someone put a 1080 Ti on eBay for $10k. They must be out of their bloody mind. I might hold out until I see one for $200. ;-) Then I can go ice skating in Hades.
Since you can buy 1080Ti's for $800-900 from NewEgg currently (inflated, but not outrageous), $10K seems just a little overpriced.
On the other hand, you may need to wait a few years to see them hit $200, and if they do it'll be because they've been replaced by something that renders at 10 times the speed.
One more question for @macsavers: do you know if it's possible to use the breakaway box/graphics card 'headless', i.e. without an external monitor plugged into the graphics card? (Or with one plugged directly into the laptop). Or does the graphics card have to have a monitor plugged into it for the setup to work at all?