Guess ECC is not high on the list for potential NUC buyers, even if it's a Skull Canyon NUC. I think most people would rather go for better graphics than ECC.
All of the cache in Intel CPUs is ECC-protected anyway. The chance of bad data pulled from system RAM going uncorrected and actually causing a problem is vanishingly small in consumer applications.
ECC was more important when there was a small amount of system RAM, but these days with the amount of RAM available, ECC is not only less effective but less necessary.
I'm all for ECC memory in the server/mainframe space but for this application it is certainly an odd request. This is, for the most part, a laptop without a screen.
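For readers unfamiliar with what ECC actually buys you, here is a toy sketch of the idea in Python. Real DIMMs use a 72-bit SECDED code rather than this Hamming(7,4) miniature, but the principle of locating and flipping back a single bad bit is the same:

    # Toy Hamming(7,4): 4 data bits protected by 3 parity bits. Any single
    # bit flip can be located via the syndrome and corrected in place.
    def encode(d1, d2, d3, d4):
        p1 = d1 ^ d2 ^ d4
        p2 = d1 ^ d3 ^ d4
        p3 = d2 ^ d3 ^ d4
        return [p1, p2, d1, p3, d2, d3, d4]  # codeword positions 1..7

    def correct(c):
        s1 = c[0] ^ c[2] ^ c[4] ^ c[6]  # parity over positions 1,3,5,7
        s2 = c[1] ^ c[2] ^ c[5] ^ c[6]  # parity over positions 2,3,6,7
        s3 = c[3] ^ c[4] ^ c[5] ^ c[6]  # parity over positions 4,5,6,7
        syndrome = s1 + 2 * s2 + 4 * s3  # 0 = clean, else position of bad bit
        if syndrome:
            c[syndrome - 1] ^= 1
        return [c[2], c[4], c[5], c[6]]  # recovered data bits

    word = encode(1, 0, 1, 1)
    word[5] ^= 1              # simulate a single-event upset
    print(correct(word))      # -> [1, 0, 1, 1]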
>ECC was more important when there was a small amount of system RAM,
The larger the array, the larger a target it is for radiation to hit. Laptops are often used in high-altitude environments with generally less shielding than the typical desktop or even a server, so from that perspective ECC would seem like a beneficial basic feature. Then there are the supercomputers; a fun read: http://spectrum.ieee.org/computing/hardware/how-to...
@TeXWiller: "The larger the array the larger target it is for the radiation to hit."
While this is true, transistor density has improved to the point that, despite having orders of magnitude more capacity, the physical array size is actually smaller. Of course, the transistors are also more susceptible, given that the radiant energy is, relatively speaking, much greater compared to the energy stored in a transistor than it was when transistors were larger. There is also the consideration that as transistor density increases, it becomes less likely that radiation will strike the silicon (or other substrate) yet miss a transistor. So we've marginally decreased the chance that radiation will hit the substrate, but significantly increased the chance that any hit that does occur will be meaningful.
Demoing a system running a lot of virtualized machines, for example. ECC is a must-have for server workloads, and although the chance that a demo will fail due to a flipped bit is low, you would still rather not risk that at a customer's premises, right?
It would be really cool to see them make a Xeon-D based NUC oriented for home NAS/router/micro-server use, with dual M.2 for onboard RAID 0/1, Thunderbolt for external storage, a minimal GPU, ECC RAM, and dual 10GBase-T support!
100% agree. If you know how high-tech DRAM is and how easily it goes wrong (just google rowhammer...). I have already spent a huge amount of time in embedded systems tracking down RAM issues that I only noticed because of ECC, to the point that I no longer trust RAM without it. Still waiting on the first Xeon laptop without a Quadro (which just drains power and produces heat...) but with ECC. I use my computer for work, not for playing games. So if it can drive the display size used, then it is OK... I do not want a laptop with an extra GPU, but I do want the ECC.
Can someone tell me if they have been able to run 4x 4K monitors (i.e. HDMI hooked to one, mDp hooked to another and then 2 mDp hooked via adapter to the Thunderbolt 3 port)?
@JohnGalt1717: "Can someone tell me if they have been able to run 4x 4K monitors ..."
I just tried it with a slightly different setup, but it was still an HQ processor with Iris Pro 580 (i7-6870HQ). It looks like DanNeely is correct. The IGP is still capped at 3 displays even without considering 4K.
What is the expected market for this? For both my professional and personal uses, the lack of proper dGPU support is a deal-breaker - everything from games through movie and photo enhancement to machine learning and graph stores is accelerated by dGPUs - so I suspect this will appeal to only a very small subset of hardcore enthusiasts.
I had the NUC5i7, and the machine was just too loud when used in a professional manner. And it was ridiculous - they had space to put in a better cooler.
I had my doubts when the initial mockups flew around the internet. This is again another useless NUC. It's crazy whoever has final say over design on this at Intel. They just keep shooting themselves in the foot. Seriously. They are afraid of success.
Not every game requires a dGPU, as many popular MOBAs play just fine without it, and there are people who play competitively for these games on worse hardware.
This is nice as a HTPC, a thin-client, an emulation station, or a monitor/TV mounted PC for grandma or grandpa.
Understandably, this isn't a product you need, but do keep in mind that others on the market do want mini-PCs and can find a use for them.
Oh, also, the inclusion of Thunderbolt 3 means this could be a PC you can velcro to an external graphics card dock (like the Razer Core) to have a very portable LAN PC.
won't work well* The link runs at a maximum of PCIe 3.0 x4 under ideal conditions, and it is routed through the chipset rather than hanging directly off the CPU, so the performance gain won't be optimal.
It will work fine. Intel has been using the Razer external GPU chassis, and they even commented on it here in the AnandTech comments on the last article that was posted about it. DMI 3.0 still does 4GB/s, and the CPU is not transferring huge amounts of bandwidth-hungry texture data back and forth with the card.
Plus, PCIe 2.0 x4 was shown to be the tipping point between bandwidth-limited and not for single-GPU systems, I believe. 3.0 is twice that, and 2.0 x4 barely even throttled that one graphics card -> there's a decent bit of headroom available.
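For reference, the raw numbers being thrown around here are easy to sanity-check with a quick back-of-envelope calculation (assuming the standard 8b/10b encoding for PCIe 1.0/2.0 and 128b/130b for 3.0, and ignoring protocol overhead):

    # Usable link bandwidth in GB/s: transfer rate (GT/s) * encoding
    # efficiency * lane count / 8 bits per byte.
    def link_gbps(gt_per_s, efficiency, lanes):
        return gt_per_s * efficiency * lanes / 8

    print(link_gbps(2.5, 8 / 10, 16))    # PCIe 1.0 x16: 4.0 GB/s
    print(link_gbps(5.0, 8 / 10, 4))     # PCIe 2.0 x4:  2.0 GB/s
    print(link_gbps(8.0, 128 / 130, 4))  # PCIe 3.0 x4 / DMI 3.0: ~3.94 GB/s

That last figure is where the roughly-4GB/s DMI number comes from.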
Grandpa/Grandma don't need a 1k PC. They would be adequately served by the PC sticks. This would be massive overkill for Fakebook browsing. This is a product desperately searching for a niche market.
Let's see whether you think the same way when you are a grandparent, or at least old enough to be one. I am a grandfather, and I need a decent PC for my 4k gaming, among other things - which means way more than 1k at this time. (I have at least 2k in the 3 4k monitors on my desk, for one thing). I suck at FPS games these days - reflexes just aren't what they were decades ago - but I still have fun. So maybe the ageism should be put away?
Why would you spend 1000 dollars on something a 300 dollar machine does almost as well? No, this is really an odd product which can't be upgraded, which can't play current games, and which costs an arm and a leg. And why? To save a little bit of space on a desk already housing a (most likely) large monitor, large keyboard and a mouse. I could see the charm of something like this if we had easy access to MXM (or similar standard) modules. But for anyone else, far better and almost as small options seem to exist.
For $1000 one can get a laptop with the same specs which already comes with a screen and keyboard = a much better buy/value.
The only time NUCs make sense is if you need to hook one up behind the telly and use it as a streaming device; otherwise almost any other option gives you much more.
Yeah. I have never understood this "I need a high-end PC, thus I buy an expensive PC which will only be high-end for the next 12 months" while keeping it for the next three years. Either we need high-end components and then we will always need the latest high-end components or else we don't need high-end components and then it doesn't make sense to buy expensive bleeding edge stuff.
I am thinking about it, but it is too expensive right now. It needs to be $400 and should come with a Windows USB stick. Right now it is basically a laptop without a keyboard, screen or touchpad, and with an external power supply. I really like the form factor ...
I'd agree that it is overpriced by a fair margin, particularly compared to other mini-PCs on the market. Yeah, it does have the best CPU package amongst them, but you'd expect that to be mated with a good GPU solution as well. Given that the GPU solution is awful, once it's fully configured (at a retail price of ~$1000 all together) there isn't much value here.
If it had two LAN ports, it'd have the niche of being a great PfSense or router box.
I think this machine is great, if not perfect or ideal, for light to medium graphic design work, including web graphics. It is fast enough, small, and looks nice; once it's configured, most designers won't ever open the machine. It can be used with an entry-level professional monitor - plug in an external hard drive and add a great keyboard and mouse. It is not for me anyway, but I appreciate this initiative, as in the future it may become powerful enough for more demanding work. If I can dream, I wish it could have dual mobile high-end graphics and at least 32 gigs of memory, even if it gets bigger. With faster Thunderbolt it may be a hit. I will keep an eye on this form factor.
I actually visited a cartoon animation studio the other year, 80% of which was running on Intel NUCs - i5 models, I think - and everyone had a Wacom Cintiq hooked up to one, and they worked like little bees without much issue. They had more powerful machines for more demanding tasks and a render farm in the back, but most work was done on these little boxes.
The reality is, if you're not playing video games, you really don't need a dedicated GPU for the majority of tasks you do on a PC. That being said, this Skull Canyon part is interesting, yet in my opinion too overpriced to really pick up.
I am buying this as a portable computer for software development, one which can be put in a pocket and carried between office and home. I don't like laptops for software development due to the constrained keyboard and display.
Is there a reason you can't use the same display/keyboard you are using for the NUC on a laptop, and at least get the benefit of a built-in UPS? You'd also have a display and keyboard and the ability to run off the wall should you ever have an emergency, but I do understand your desire for a better keyboard/display. I feel it's a bit too expensive for me, but I can still see some viable uses and you seem to have one. In any case, if you decide to get it, let us know how it works out for you.
Looks impressive for such a small integrated GPU package. Perhaps it's too early though, as the GPU still doesn't have full HEVC 10-bit decoding for HTPC use. Don't AMD's Carrizo and the upcoming Bristol Ridge sport this?
Can you please add the Intel D54250WYKH nuc as an option to the comparative PC configuration? For people who are upgrading from the best nuc available back in the day...
I think it's time to drop the 1280x1024 gaming benchmarks. Virtually no one is going to play at such a resolution, especially not with a $1000 PC, when a 22" 1080p monitor can be bought for a hundred bucks and change.
The aspect ratio does not really matter for GPU testing, it's just the number of pixels the GPU has to compute. So performance at 720p will actually be a bit better.
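The pixel-count point is easy to quantify with a quick comparison (counts only; this ignores any aspect-ratio-dependent differences in what the game engine culls):

    # Shaded pixels per frame at the resolutions under discussion.
    base = 1280 * 1024
    for w, h in [(1280, 1024), (1280, 720), (1366, 768), (1600, 900)]:
        px = w * h
        print(f"{w}x{h}: {px:>9,} pixels ({px / base:.0%} of 1280x1024)")

720p works out to roughly 30% fewer pixels than 1280x1024, hence the slightly better performance.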
It's rather lame that Anand would post up these low-resolution benchmarks to try and make the iGPU not look like a total joke (which it is, at least at this price point).
For $1000, if it can only muster a playable framerate at a decade-old resolution standard, then this thing is overpriced.
Lots of casual gamers do play at low resolutions because they don't have the budget to stay on the high-end GPU treadmill. The real issue is that the days of doing so at 1280x1024 instead of 1366x768 are long past. This was brought up the last time gaming benchmarks were updated here, but it is even more of a glaring issue as time goes on.
1680x1050 really should be replaced with 1600x900 too. 16:9 monitors have become ubiquitous; testing at narrower aspect ratios doesn't fit real world usage anymore.
I could see a case for going wider at the upper end and slotting an ultrawide 3440x1440 test between conventional 2560x1440 and 3840x2160 gaming. Mostly because it looks like the 1080 still falls just short of being able to play at 4K without having to turn settings down in a lot of games, making 1440p ultrawide the effective max single-card resolution. (An increasingly important consideration with SLI/xFire becoming progressively less relevant due to temporal AA/post-processing techniques that play really badly with multi-GPU setups.)
Yeah, I guess my point was IF you want to test at low res, then test at a more relevant low res - 1280x720, 1366x768, 1600x900, etc. But my other point would be that those graphs look like they do now because low resolution is paired with low settings, mid resolution with mid settings, and so on. Many games these days don't really slow down that much at increased resolution, but rather with increased post-processing effects - shadows, antialiasing, DoF, you name it. Before I had my current gaming PC I used to game on a laptop with a GT555M inside, which is probably weaker than this IGP by some margin, and I ran most games at 1080p at acceptable framerates by turning the details down. In general it yielded better fps AND better looks than running non-native res and mid graphics settings. But maybe it's just me, I like pixels a lot ;)
You rebutted your own statement. Casual gamers don't buy $1k mini-PCs. Testing this at super low resolutions can only serve one purpose, which is to provide the appearance of acceptable performance.
The point with mini-PC reviews with a gaming focus is that they are spread far apart - we may be lucky to have 3 or 4 in a year.
So, it boils down to what we think is more relevant to the reader - a set of benchmark numbers that have to be presented standalone, or a set of benchmark numbers which can be compared apples to apples against some similar previous-generation systems (because, that is what we have the numbers for). We think the latter makes more sense, and that is the reason we are having these 'legacy resolutions' in the gaming benchmarks.
I completely understand why you need to present the information, I just don't think this really meets the "Skulltrail brand" expectations. Skulltrail was always an enthusiast platform designed by enthusiasts. This product looks like it fell victim to marketing requiring a certain thickness of chassis. This product waters down the skulltrail branding, though I guess skulltrail really isn't even relevant anymore. I just don't understand who this is designed for I guess.
I'd love to see a ~90W TDP version of this, with the CPU cores getting about 30W and the GPU having 60 or so allocated to it. Even a 65W TDP part would be a definite improvement for gaming, as CPU/GPU clocks could stay considerably higher when both parts of the chip are loaded.
With proper cooling it could actually compete decently with laptops carrying low-end discrete graphics. As it is, it is clear that the TDP is limiting it badly.
The question is: how is the perf/W compared to, for example, A9X GPUs or Maxwell parts? Somehow I'm not terribly impressed by Intel's GPUs, especially considering they've had their hugely superior manufacturing technology, which should help...
Isn't it kinda a no-brainer to make this thing a little bit bigger (with a little better cooling) to avoid throttling? Wouldn't just an inch taller help immensely?
Sure, but at ~55mm height it is beginning to look similar in size to a 70 mm high Mini-ITX case, which you can use to build yourself a system with similar compute power, for less than half the system cost.
So it really needs to be very flat and very compact to qualify as a niche-product. Asking twice the price for just a 20% difference in some aspect is usually very hard to sell.
So in other words, another 20 mm gives the customer a 50% haircut on price. Yup, more evidence this is a product in search of high margins and little else.
Are you kidding? That is actually a pretty damn good achievement. I mean, that card uses a Cape Verde ASIC and has about 1 TFLOP of compute power (640 shaders @ ~750 MHz).
The Iris Pro 580 has 576 FP32 cores (72 EU * 8) and runs at about 1 GHz, so it has a bit over 1.1 TFLOPs of compute - really pretty similar in terms of compute horsepower. It seems like it is performing right where it should based on its hardware config.
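The arithmetic behind those figures, for anyone who wants to check it (peak FP32 rate = shader count * clock * 2 ops per fused multiply-add; the clocks are the approximations from the comment above):

    # Peak single-precision throughput in TFLOPs.
    def tflops(shaders, clock_ghz):
        return shaders * clock_ghz * 2 / 1000

    print(tflops(640, 0.75))  # Cape Verde (R9 270M class): ~0.96
    print(tflops(576, 1.00))  # Iris Pro 580 (72 EU * 8):   ~1.15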
Well, looks like I will be waiting yet another generation. Still can't game at 1080p. I like the storage and Wi-Fi numbers, and I can picture this doing fairly well in some office environments, but are there that many people who are building home PCs (the user still has to add RAM, etc.) and who will pay that kind of premium for a device that is somewhat limited in its use case? I can't speak for anyone else of course, but there is no way my employer would fork out $1k for this, not when they can spend $400 on a cheap business laptop that takes up a similar amount of space in the office but also offers them the ability to extract additional work from me over the weekend. I love the idea, I just don't think the hardware is there yet.
Alpine Ridge has a USB 3.1 host controller integrated along with the Thunderbolt controller. Please look into our Thunderbolt 3 hands-on coverage here: http://www.anandtech.com/show/10248/thunderbolt-3-... : Note the presence of the xHCI controller in the Alpine Ridge block diagram towards the middle of the linked page.
Skylake gets much better GPU performance/watt than Broadwell did, as evidenced by the NUC with the 48 EU + 64MB eDRAM part being fed by just 23W continuous. That's a huge improvement from the 45W this beast used to take!
I think the only surprise for me was just 40% performance improvement over the 4770r. I always assumed the 4770r was bandwidth-limited, but I guess the eDRAM cache was enough to keep things fed.
But yeah, pointless product continues to be pointless. Intel charges a premium for these things because they take up more die space and require dedicated eDRAM cache to feed them...just like discrete GPUs take up more die space, and require dedicated DRAM to feed them. Where is the efficiency gain in this crap?
Oh, I just noticed the review uses 2133 DDR4, which would account for the 40% performance increase we saw. I thought for sure a "premium gaming" platform like this would ship with z170, so I didn't give the test setup a second glance.
I guess Intel cheaping-out with H170 has forever doomed this machine to mediocrity. Too bad, dropping ten bucks more on the Z170 would have allowed some much more interesting memory configurations. With DDR4 2133 we're probably castrating performance.
"Our only concern is that the cooling solution keeps the temperature of the cores too close to the junction temperature during periods of heavy CPU load."
My rMBP 15", Iris Pro only model, routinely hovers at the tJunction max at load. Is this a real concern? Or is it designed to do this?
"Tjunction Max is the maximum temperature the cores can reach before thermal throttling is activated. Thermal throttling happens when the processor exceeds the maximum temperature. The processor shuts itself off in order to prevent permanent damage. Tjunction Max (Tj Max) is also referred to as TCC Activation Temperature in certain processor datasheets."
Basically, reaching the Tjunction means the CPU is close to shutting itself off to prevent damage. That might mean there are longevity implications related to brushing up against that upper ceiling on a regular basis, but I haven't seen any statistical data regarding a meaningful sample of processors put under such conditions failing more often during their few years of useful life due to CPUs going bad just because the OEM decided to implement a cooling solution that allows the processor to wander up to the Tjunction temp when it's working hard.
I think a bigger concern might be looking into whether the rMBP in general approaches Tjunction under load, or if that's abnormal. Abnormalities might point to some sort of problem with your specific laptop. I don't know what's status quo for your hardware, so it's hard to say if that's something you should worry about.
Yeah, that's something I looked into. AnandTech's own Retina MacBook Pro 15" review only pegged them at going up to 76-ish Celsius if memory serves, but that was the older dGPU model with the 650M. From the threads I'm seeing, the Iris Pro model does regularly hover at 99-101C - I'm guessing because the GPU grunt is right beside the CPU on a single die, so heat isn't spread as widely as on the dedicated GPU model.
I don't see any reports of this model failing though, so I'd hope they tested extensively at 100 degrees and found it was fine, and so allowed the processor to keep its boost long enough to get there.
I do wish they could have just added another few mm so that the cooling was better and the CPU and GPU could stay at boost longer, and with that room they could have added some mm to the keyboard too (which I consider the absolute minimum in key travel now).
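For anyone wanting to see how close their own machine runs to Tj max, a minimal watcher sketch - this assumes a Linux box with the psutil package installed; sensor names and availability vary by platform:

    import time
    import psutil  # third-party: pip install psutil; Linux exposes the most sensors

    # Poll every temperature sensor once a second; watch how close the CPU
    # package runs to Tj max (around 100C for these mobile parts) under load.
    while True:
        for chip, entries in psutil.sensors_temperatures().items():
            for entry in entries:
                print(f"{chip}/{entry.label or 'sensor'}: {entry.current:.0f}C")
        time.sleep(1)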
Give us a 65W CPU with Iris Pro, add a couple inches in height... and use the stock retail CPU cooler. Add USB-C on the front. Use USB-C Power Delivery for power.
Except the niche NUCs fill doesn't want a NUC the size of a mini-ITX case. Not to mention Intel's stock cooler is neither the quietest nor the best cooler in existence.
Well, a couple inches of height would not put it near the average mini-ITX case size. It's impressive that these NUCs are small, but it goes a bit extreme when they use laptop chips and cooler designs meant for laptops. We could get a very small desktop without sacrificing CPU performance or acoustics/thermals.
Use 65W+ chips with Iris Pro, the full-size Intel retail heatsink, USB-C Power Delivery... no wasted space on expansion slots. One M.2 should be the only internal slot.
Thanks a lot for the review. I am thinking of getting this machine for photo editing (Lightroom) and mobile development. Would it be a good choice? I was thinking the processor/RAM/SSD are good enough to provide great performance for the next 2-3 years, and the Iris Pro can be a good GPU for driving a high-resolution monitor.
Keep in mind that you are bringing your own RAM and SSD, the kit does not include those items for the consumer. As for the iGPU providing support for high resolution, I think that will depend entirely on your workload.
Ganesh, did you get confirmation directly from Intel that the PCIe is limited on this system because it runs through the H170? From my research on ARK and elsewhere, it appears that the H170 acts as a PCI Express passthrough, with a PCIe 3.0 x16 connection to the CPU and the ability to split the configuration off into smaller widths and more ports coming off the H170. It would seem the DMI 3.0 connection is for other (non-PCI Express) peripherals. Granted, from the block diagram it is not apparent that the H170 connects to the CPU's PCIe x16 connection, but my guess is that it does.
I would just like clarification, as this is a pretty big deal.
I have confirmation from the technical marketing manager for NUC products at Intel that the communication link between the H170 and the CPU is only effectively PCIe 3.0 x4 for bandwidth purposes. It is definitely not a PCIe 3.0 x16.
H170 itself can act as a PCIe switch, but, for anything that talks to the CPU, it has to go through the DMI 3.0 lanes.
The 16 CPU lanes are entirely unused in this device. The PCH (H170 in this case) is NEVER connected by a PCIe x16 link - it is always connected via DMI 3.0 on the H, Q, B and Z platforms. DMI 3.0 has the same bandwidth as PCIe 3.0 x4. All of the stuff hanging off the H170 shares that same DMI 3.0 link.
Great to know. Can you tell me what the "Processor PCI Express Port" details under "I/O Specifications" on Intel's ARK page for the H170 are for? I thought they were for connecting to the PCI Express on the CPU, but I would be happy to learn if I am incorrect.
I'm disappointed in the lack of teardown pictures. I was at the very least expecting a look at the cpu side of the board. Is that too much to ask?
Also, considering the massive power throttling seen in your testing, and the torture-test nature of that testing, I'd love it if you could monitor clocks and temps during gaming too - I'd be interested in seeing what kind of CPU clocks this can maintain in a low-threaded gaming workload.
All of the Skull Canyon reviews online so far have been relative failures. I hate to bash this work, but a few points need to be said.
To only test this with 2133MHz memory is a shame. Intel stated that 2400MHz works without an FSB OC and that you can run up to 3000MHz with one. That would change the gaming performance tremendously, but not a single site has bothered testing it.
We just didn't learn anything that wasn't easily known from looking at this on Newegg. We knew it would be really fast for the size/power requirements. We knew it would run fairly hot under load, based on past NUCs. But we didn't know how it would react with DDR4 2400/2800/3000+.
The other problem is that there's concern expressed in the review that bidirectional 4GB/s of bandwidth isn't enough. It's been proven, if you look into it, that PCIe 1.0 x16 (4GB/s) does not bottleneck a GTX 980. Skull Canyon should be closer to 5GB/s than 4GB/s as well. This wasn't tested with a Razer Core, but I think there's a really good chance this is the fastest stock gaming CPU on the market today when paired with a discrete GPU, due to the 128MB L4. It was shown that the Broadwell 5775C already held that crown in the past.
Considering how incredibly impressive this NUC already is with its small size, low power draw, and various Thunderbolt 3 options (storage/GPU/docks), both of these points - faster DDR4, and the 128MB L4's impact with a dGPU - would make it even more impressive than it already is, and probably put it in slam-dunk territory.
I think everyone in the tech community is massively missing the mark on this one! It just hasn't been properly tested. Intel absolutely nailed this product but is failing to properly instruct reviewers on what to test. Send me a sample, forumemail123 at g mail. I'll do it right.
So, you want this product reviewed in tests that will show it in a better light, and ignore all the standard tests that give it an apples-to-apples comparison, showing that Intel has a long ways to go to provide good value to their customers in this market segment?
Ah, an AMD poverty gamer arrives. Apples to apples against what? Older NUCs? There is no other competition at this small a form factor. Certainly not from AMDone.
I'm asking for the things that all of us who have been so excited about this product have wanted to see: DDR4-3000 IGP gaming performance, and Razer Core Fury X or 980 Ti performance.
As I noted, this review told me absolutely nothing that wasn't already known through common sense. No one will buy this thing as a ho-hum NUC; there are already plenty of those. We're buying them for the size/performance combo, and going to run 3000MHz DDR4 or a Razer Core with it.
You just showed your true colors. I am anything but an "AMD poverty gamer." I need to know how this device compares to other computing devices, so I can determine if the small form factor benefit is worth the performance hit. Very few people are going to have a demand for a computer that fits a particular small form factor, and are willing to do anything to hit that size requirement. Most people just want the best value, and size is a component to that.
If we are performing tests that show best case scenario for this unit, then we'd have to do the same for every other bare-bones unit. Then there would be no true comparison, and each piece would be no better than a cnet review bought and paid for by the manufacturer, and we would be no better informed.
Nice attempt at digging yourself out of that hole: you'd be pissed to see this thing shown in a positive light because as you said supposedly, "Intel has a long ways to go to provide good value". Shows how much you know, just taking the typical fanboy stance on this thing without knowing what you're even looking at- much like this review.
The point remains: people want to see this used with varying RAM speeds. It affects the gaming performance greatly due to the IGP. Also people want to see it benched with a 980Ti / 1080 to compare to other high end gaming CPUs. There's absolutely no reason that's "best case" at all. It's just asking for a full review.
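On the RAM-speed point, the question is mostly one of bandwidth, and the theoretical peaks are simple to compute (dual-channel DDR4, 8 bytes per channel per transfer; the eDRAM complicates the real-world picture, which is exactly why testing would be interesting):

    # Theoretical dual-channel DDR4 bandwidth, shared by the CPU and iGPU.
    for mts in (2133, 2400, 2800, 3000):
        gbps = mts * 8 * 2 / 1000  # MT/s * 8 B/channel * 2 channels
        print(f"DDR4-{mts}: {gbps:.1f} GB/s")

DDR4-3000 works out to roughly 40% more theoretical bandwidth than DDR4-2133.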
Very disappointed, TBH. The i7-5775C performs on par with a GTX 750, so I thought this 6770HQ, packed with much stronger integrated graphics and double the eDRAM, would be a monster; instead it performed much weaker - it can't even match a 5675C, let alone a 5775C. I guess the power limit and TDP really limit the potential of the iGPU. Sigh. We'll probably need to wait for 7nm CPUs to have a playable 1080p integrated graphics solution.
Maybe the drivers haven't caught up. Or maybe it's heavily throttled because of heat. Seems very strange that it's not a substantial upgrade from the 5675c or 5775c. Hopefully, something else with a similar form factor will ship with the same CPU. Note the very wide difference between the two gtx 960 based units in this review.
This is a 45W TDP part, while the 5775C is a 65W TDP part. That is a substantial difference, as a larger TDP allows more leeway for the GPU than just the 20W difference would suggest.
Also, not sure why multiple commenters are talking about two GTX 960s / the same GPU when it comes to the GB-BXi5G-760 and the MAGNUS EN970. They are not the same GPU at all - the former uses the Kepler GK104-based 870M, while the latter uses the Maxwell GM204-based 970M.
I thought it would do more, to be honest. Looks like the size of the heatsink is limiting it. It's good, but with nearly double the EUs of the already-decent Iris Pro 5200 in my Haswell machine, I expected the 72 EUs to do better than this.
Why are noise levels missing from AnandTech reviews, alongside the thermals?
The product may well have a target market, which is not me, but the problem with the mini-PC reviews at Anand is that we only get one or two small measurements of fan noise, and for the Skull Canyon NUC I think that's disappointing, as you yourself mention it in the final comments:
> " We would gladly trade a modest increase in the footprint of the system for lower fan noise. That said, the fan noise is in no way comparable to the BRIX Gaming lineup. It is just that it is not as silent as the traditional NUCs."
For example, measuring noise at idle and at GPU/CPU load, and comparing noise at those points, would be useful: the thermals are very nice, but you cannot really or easily compare them from one PC to another.
My point is that with idle power at 17 watts in your test, this racehorse NUC is never silent or even quiet, but I would like to see comparisons with the other NUCs, or at least with the MSI Cubi 2 recently reviewed.
I'm also curious about the noise level of the ASUS VC65/VC65R relative to that of the Skull NUC ;) - what I mean is that the chosen form factor can be either optimal, or poorer than other 35W-45W solutions.
Will we be getting noise comparisons between PCs in a nice chart soon?
I'd say the only NUCs which make sense to buy are the cheapest options (not the one reviewed here). Good enough for office work, YouTube, Facebook and internet browsing.
For gaming or serious work, a laptop or a mini-ATX build will give the user much, much more than this overpriced, fancy NUC.
So disappointed. I was really looking forward to this product, but the PCIe/DMI situation is incredibly silly. It must be intentional, but why? Such a waste.
What is your response to jasonelmore, who said earlier (Monday, May 23, 2016) that "It will work fine. Intel has been using the Razer external GPU chassis, and they even commented on it here in the AnandTech comments on the last article that was posted about it. DMI 3.0 still does 4GB/s, and the CPU is not transferring huge amounts of bandwidth-hungry texture data back and forth with the card."?
Can undervolting achieve significantly better thermals and less CPU throttling? And if so, by how much? I want to use this under 24/7 load as a very small and light portable CPU package. Thank you!
Ganesh, when will we get high-TDP (65W and above) CPUs with Iris Pro?
I would even go further: I'd like to see Extreme Edition CPUs with Iris Pro. I hope the Core i7-7820K will also have a configuration with Iris Pro and 128MB of eDRAM.
It's time Intel to bring Iris Pro to the high end desktop chips.
Neat computer, but niche. I'll wait for the fire sale on this one. I could see uses as a dev / portable VM box with the M.2 PCIe ports (RAID striped). As a gaming machine this thing is about as useful as an A10-7870K, or even less for driver reasons. But at least it can do some low-end gaming; however, you would be much better off with an Alienware Alpha, which is still tiny, packs a real GPU, and is about half the price.
"Connecting the Thunderbolt ports on the two machines and allowing the PCs to talk to each other automatically creates a 10Gbps network adapter."
Can anyone shed some light: when TB3 can transfer 40Gbps (bundling the 4 PCIe 3.0 lanes), why do we end up with 10Gbps USB 3.1 Gen 2 speed for networking?
Well, woulda been too good at 40, but I guess I'll abuse the NUC6i7KYK as external storage (partition backup) for my Dell XPS 9550 until I see a TB3 SSD in the Samsung T3 SSD form factor. ;-)
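If anyone wants to measure what the Thunderbolt network link actually delivers, here is a crude single-stream test sketch (dedicated tools like iperf are the better choice; the host address and port here are placeholders):

    import socket, sys, time

    PORT, CHUNK, COUNT = 5001, 1 << 20, 4096  # 4 GiB total

    def receiver():
        with socket.create_server(("", PORT)) as srv:
            conn, _ = srv.accept()
            with conn:
                while conn.recv(CHUNK):  # drain until the sender closes
                    pass

    def sender(host):
        buf = bytes(CHUNK)
        with socket.create_connection((host, PORT)) as s:
            start = time.time()
            for _ in range(COUNT):
                s.sendall(buf)
            gbps = COUNT * CHUNK * 8 / (time.time() - start) / 1e9
        print(f"~{gbps:.1f} Gb/s single stream")

    # Run "python tbtest.py recv" on one box, "python tbtest.py <ip>" on the other.
    receiver() if sys.argv[1] == "recv" else sender(sys.argv[1])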
I have this NUC. I am very happy with it overall. I can't seem to get the Thunderbolt port to work, though. I bought a USB 3.0 hub that has a Type C connection. I figured I might as well put that Type C port to use and not waste an existing USB port. But, it doesn't seem to work. Should it? I had assumed the USB 3.1 aspect of it would be backwards compatible with 3.0, as has been the case in the past. Is that incorrect? TIA
Hey mysticmedia, I don't understand what you're trying to accomplish. You've got 4 USB 3.0 ports on the NUC6i7KYK. Why in heaven would you hook up a USB 3.0 hub to the TB3 port?
How are the connectors / headers supposed to be used (left back cut-out in the metal under the top plastic cover)? According to the circuit schema they are internal USB 3 and 2, NFC and LPC Debug.
utferris - Monday, May 23, 2016 - link
This can be better with ECC memory support. I just can not use any machine without ECC for work.ShieTar - Monday, May 23, 2016 - link
But a low-frequency consumer quad-core is fine? What exactly do you do at work?Gigaplex - Tuesday, May 24, 2016 - link
Sometimes reliability is more important than performance.close - Monday, May 23, 2016 - link
Guess ECC is not high on the list for potential NUC buyers, even if it's a Skull Canyon NUC. I think most people would rather go for better graphics than ECC.kgardas - Monday, May 23, 2016 - link
Indeed, the picture shows SO-DIMM ECC, but I highly doubt this is even supported since otherwise it's not Xeon nor Cxxx chipset...close - Tuesday, May 24, 2016 - link
http://ark.intel.com/search/advanced?ECCMemory=tru...tipoo - Monday, May 23, 2016 - link
What work do you want to do on a 45W mobile quad with (albeit high end) integrated graphics, that needs ECC?I wonder how that would work with the eDRAM anyways, the main memory being ECC, but not the eDRAM.
Samus - Monday, May 23, 2016 - link
All of the cache in Intel CPU's is ECC anyway. Chances of errors not being corrected by bad information pulled from system RAM is rare (16e^64) in consumer applications.ECC was more important when there was a small amount of system RAM, but these days with the amount of RAM available, ECC is not only less effective but less necessary.
I'm all for ECC memory in the server/mainframe space but for this application it is certainly an odd request. This is, for the most part, a laptop without a screen.
TeXWiller - Tuesday, May 24, 2016 - link
>ECC was more important when there was a small amount of system RAM,The larger the array the larger target it is for the radiation to hit. Laptops are often being used in high altitude environments with generally less shielding than the typical desktop or even a server so from that perspective ECC would seem beneficial basic feature. Then there are the supercomputers, a fun read: http://spectrum.ieee.org/computing/hardware/how-to...
BurntMyBacon - Tuesday, May 24, 2016 - link
@TeXWiller: "The larger the array the larger target it is for the radiation to hit."While this is true, transistor density has improved to the point that, despite having magnitudes more capacity, the physical array size is actually smaller. Of course the transistors are also more susceptible given the radiant energy is, relatively speaking, much greater compared to the energy in the transistor than it was when transistors were larger. There is also the consideration that as transistor density increases, it becomes less likely that radiation will strike the silicon (or other substrate), but miss the transistor. So we've marginally decreased the chance that radiation will hit the substrate, but significantly increased the change that any hit that does occur will be meaningful.
jann5s - Wednesday, May 25, 2016 - link
very fun read indeed, thx for the linkkgardas - Wednesday, May 25, 2016 - link
Demo a system on lot of virtualized machines. ECC is a must have for server workloads and although a chance that the demo will fail due to flip-bit is low, still man would rather not risk that at customer premises right?hubick - Monday, May 23, 2016 - link
It would be really cool to see them make a Xeon-D based NUC oriented for home NAS/router/micro-server use, with dual M.2 for onboard RAID 0/1, Thunderbolt for external storage, a minimal GPU, ECC RAM, and dual 10GBase-T support!TeXWiller - Monday, May 23, 2016 - link
That could cost an ARM, though. ;)BurntMyBacon - Tuesday, May 24, 2016 - link
I'd give up an "ARM" for that. ;')lper - Monday, May 23, 2016 - link
100% agree. If you known how high tech DRAM is and how easy it goes wrong (just google for rowhammer...). I spend already a hughe amount of time to figure out RAM issues which have ECC (as otherwise you would not notice) in embedded systems, that I do not trust RAM without ECC anymore.Still waiting on the first xeon laptop without quadro (which just drains power and produces heat...) but with ECC. I use my computer for work, not for playing games. So if it can drive the display size used, then it is ok... I do not want any laptop with an extra GPU, but I want the ECC.
Ratman6161 - Tuesday, May 24, 2016 - link
"I use my computer for work, not for playing games."Which is to say that this hardware is not being marketed at you in the first place.
JKflipflop98 - Wednesday, May 25, 2016 - link
Exactly what part of this device tells you "this is for doing serious business"? Was it the big skull on the front panel?jihe - Wednesday, May 25, 2016 - link
This can be better with free beer support. I just can not use any machine without beer for work.Ethos Evoss - Thursday, May 26, 2016 - link
I can PISS on this for that taking piss price !!!JohnGalt1717 - Monday, May 23, 2016 - link
Can someone tell me if they have been able to run 4x 4K monitors (i.e. HDMI hooked to one, mDp hooked to another and then 2 mDp hooked via adapter to the Thunderbolt 3 port)?Looking to replace my desktop but need 4 screens.
DanNeely - Monday, May 23, 2016 - link
Unfortunately while you'd have the connectivity to do so, I think Intel's IGP itself is still capped at 3 displays.BurntMyBacon - Tuesday, May 24, 2016 - link
@JohnGalt1717: "Can someone tell me if they have been able to run 4x 4K monitors ..."I just tried it with a slightly different setup, but it was still an HQ processor with Iris Pro 580 (i7-6870HQ). It looks like DanNeely is correct. The IGP is still capped at 3 displays even without considering 4K.
JohnGalt1717 - Wednesday, May 25, 2016 - link
too Bad :<xchaotic - Monday, May 23, 2016 - link
What is the expected market for this? For both my professional and personal uses, lack of proper dGPU support is a deal breaker - for anything from games, through movie and photo enhancements to machine learning and graph stores all being accelerated by DGPUs, I suspect this will have a very limited to a very small subset of hardcore enthusiastblahsaysblah - Monday, May 23, 2016 - link
I had the NUC5i7, the machine was just too loud when used in professional manner. And it was ridiculous, they had space to put a better cooler.I had my doubts when initial mockups flew around internet. This is again another useless NUC. Crazy who has final say over design on this at Intel. They just keep shooting themselves in the foot. Seriously. They are afraid of success.
JoeyJoJo123 - Monday, May 23, 2016 - link
Not every game requires a dGPU, as many popular MOBAs play just fine without it, and there are people who play competitively for these games on worse hardware.This is nice as a HTPC, a thin-client, an emulation station, or a monitor/TV mounted PC for grandma or grandpa.
Understandably, this isn't a product you need, but do keep in mind that others on the market do want mini-PCs and can find a use for them.
JoeyJoJo123 - Monday, May 23, 2016 - link
Oh, also, the inclusion of Thunderbolt 3 means this could be a PC you can velcro to a external graphics card dock (like the Razer Core) to have a very portable LAN PC.LostWander - Monday, May 23, 2016 - link
It's mentioned in the article that the external graphics dock likely won't work because the thunderbolt port isn't given enough bandwidthLostWander - Monday, May 23, 2016 - link
won't work well*The link is only at a maximum PCIe 3.0 x4 under ideal conditions and it is separated from the CPU so the performance gain won't be optimal.
jasonelmore - Monday, May 23, 2016 - link
it will work fine. Intel has been using the Razer External GPU Chassis and they even commented on it here on Anandtech Comments, on the last article that was posted about it. DMI 3.0 still does 4GB/s and the CPU is not transferring huge amounts of bandwidth hungry texture data back and forth with the CPUlmcd - Monday, May 23, 2016 - link
Plus the fact that PCIe 2.0 x4 was shown to be the tipping point between bandwidth limited and not for single GPU systems, I believe. 3.0 is twice that, and 2.0 4x barely even throttled that one graphics card -> there's a decent bit of overhead available.Cuhulin - Wednesday, May 25, 2016 - link
The only question I have about the Razer Core approach is price. Wouldn't it be better to simply buy a Razer Stealth instead?fanofanand - Monday, May 23, 2016 - link
Grandpa/Grandma don't need a 1k PC. They would be adequately served by the PC sticks. This would be massive overkill for Fakebook browsing. This is a product desperately searching for a niche market.gurok - Monday, May 23, 2016 - link
You say that, but they've incorporated a skull design specifically for grandpas/grandmas.Cuhulin - Wednesday, May 25, 2016 - link
Let's see whether you think the same way when you are a grandparent, or at least old enough to be one. I am a grandfather, and I need a decent PC for my 4k gaming, among other things - which means way more than 1k at this time. (I have at least 2k in the 3 4k monitors on my desk, for one thing). I suck at FPS games these days - reflexes just aren't what they were decades ago - but I still have fun. So maybe the ageism should be put away?Calista - Monday, May 23, 2016 - link
Why would you spend 1000 dollars on something a 300 dollar machine does almost just as well? No, this is a really an odd product which can't be upgraded, which can't play current games and which cost an arm and a leg. And why, to save a little bit of space on a desk already housing a (most likely) large monitor, large keyboard and a mouse. I could see the charm of something like this had we had easy access to MXM (or similar standard) modules. But for anyone else far better and almost as small options seem to exist.milkod2001 - Tuesday, May 24, 2016 - link
for $1000 one can get same specs laptop which comes with screen and keyboard already = much better buy/value.The only thing when NUCs make sense is if you need to hook it behind telly and use it as streaming device otherwise any other option gives you much more.
Calista - Tuesday, May 24, 2016 - link
Yeah. I have never understood this "I need a high-end PC, thus I buy an expensive PC which will only be high-end for the next 12 months" while keeping it for the next three years. Either we need high-end components and then we will always need the latest high-end components or else we don't need high-end components and then it doesn't make sense to buy expensive bleeding edge stuff.JBSZQn1LI06L8j33 - Tuesday, May 24, 2016 - link
i am thinking about it, but it is too expensive right now. it needs to be $400 and should come with windows usb. Right now it is basically laptop without keyboard, screen, touch-pad and external power supply. I really like the form factor ...Ratman6161 - Tuesday, May 24, 2016 - link
True, but for the uses this machine would be well suited for, the i7 CPU is way overkill. Grandma and Grandpa would be well served by an i3jwcalla - Monday, May 23, 2016 - link
"What is the expected market for this?"There is none. Especially at that price.
JoeyJoJo123 - Monday, May 23, 2016 - link
I'd agree that it is overpriced by a fair margin, particularly compared to other mini-PCs on the market. Yeah, it does have the best CPU package amongst them, but you'd expect that to be mated with a good GPU solution as well. Given that the GPU solution is awful once it's fully configured (at a retail price of ~$1000 all together), there isn't much of a value.If it had two LAN ports, it'd have the niche of being a great PfSense or router box.
jecs - Tuesday, May 24, 2016 - link
I think this machine is great, not perfect or ideal, for light to medium graphic design work including web graphics. It is fast enough, small, look nice, once configured most designers won't open the machine ever, it can be used with an entry level profesional monitor, plug an external hard drive and add a great keyboard and mouse. It is not for me anyway, but I appreciate this initiative as in the future it may become powerful enough for more demanding work. If I can dream I wish it could have dual mobile high-end graphics and at least 32 gigs of memory, even if it gets bigger. With faster thunderbolt may be a hit. I will keep an eye on this form factor.FMinus - Sunday, August 7, 2016 - link
I've actually visited a cartoon animation studio the other year, of which 80% was running on Intel NUCs, think it was i5, and everyone had hooked a Wacom Cintiq to it and they worked like little bees, without much issues. They had more powerful machines for more demanding tasks and a render farm in the back, but most work was done on these little boxes.The reality is, if you're not playing video games, you really don't need a dedicated GPU for the majority tasks you do on a PC. That being said, this skull canyon part is interesting, yet overpriced in my opinion to really pick up.
oasisfeng - Tuesday, May 24, 2016 - link
I am buying this as a portable computer for software development, which can be put into pocket to be carried between office and home. I don't like laptop for software development due to constrained keyboard and display.BurntMyBacon - Tuesday, May 24, 2016 - link
Is there a reason you can't use the same display/keyboard you are using for the NUC on a laptop and at least get the benefit of a built in UPS? You'd also have a display and keyboard and the ability to run off the wall should you ever have an emergency, but I do understand your desire for a better keyboard/display. I feel its a bit too expensive for me, but I can still see some viable uses and you seem to have one. In any case, if you decide to get it, let us know how it works out for you.Gadgety - Monday, May 23, 2016 - link
Looks impressive for such a small integrated GPU package. Perhaps it's too early though as the GPU still doesn't have full HEVC 10b decoding for HTPC. Doesn't AMD's Carrizo, and upcoming Bristol Ridge sport this?monstercameron - Monday, May 23, 2016 - link
Carrizo only supports 8bit hevc, stoneyridge allegedly supports 10bit.Texag2010 - Monday, May 23, 2016 - link
Can you please add the Intel D54250WYKH nuc as an option to the comparative PC configuration? For people who are upgrading from the best nuc available back in the day...Zero Day Virus - Monday, May 23, 2016 - link
Yep, same here! Would like to see how it compares and if it's worth it :)hubick - Monday, May 23, 2016 - link
It would also be interesting to see how the new BRIX like the GB-BSi7T-6500 stack up.Barilla - Monday, May 23, 2016 - link
I think it's time to drop the 1280x1024 gaming benchmarks. Virtually no one is going to play at such resolution, especially not with a 1000$ pc if a 22" 1080p monitor can be bought for a hundred bucks and change.MrSpadge - Monday, May 23, 2016 - link
If your GPU is slow you HAVE to game at such resolutions, no matter what monitor you have.TheinsanegamerN - Monday, May 23, 2016 - link
Then test at 720p. Nobody buys 5:4 monitors anymore.MrSpadge - Tuesday, May 24, 2016 - link
The aspect ratio does not really matter for GPU testing, it's just the number of pixels the GPU has to compute. So performance at 720p will actually be a bit better.cknobman - Monday, May 23, 2016 - link
Its rather lame that Anand would post up these low resolution benchmarks to try and make the iGPU not look like a total joke (which it is, at least at this price point).For $1000 if it can muster a playable framerate at a resolution outside of a decade old standard than this thing is overpriced.
DanNeely - Monday, May 23, 2016 - link
Lots of casual gamers do play at low resolutions because they don't have the budget to stay on the high end GPU treadmill. The real issue is that the days of doing so at 1280x1024 instead of 1366x768 are long past. This was brought up the last time gaming benchmarks were updated here; but is even more of a glaring issue as time goes on.DanNeely - Monday, May 23, 2016 - link
1680x1050 really should be replaced with 1600x900 too. 16:9 monitors have become ubiquitous; testing at narrower aspect ratios doesn't fit real world usage anymore.I could see a case for going wider at the upper end and slotting an ultrawide 3440x1440 test between conventional 2560x1440 and 3840x2180 gaming. Mostly because it looks like the 1080 still falls just short of being able to play at 4k without having to turn settings down in a lot of games; making 1440p ultra widescreen the effective max single card resolution. (An increasingly important consideration with SLI/xFire becoming progressively less relevant due to temporal AA/post processing techniques that play really badly with multi-GPU setups.)
Barilla - Monday, May 23, 2016 - link
Yeah, I guess my point was IF you want to test at low res, then test at a more relevant low res - 1280x720, 1366x768, 1600x900 etc. But my other point would be that those graphs looke like they look now cause low resolution is paired with low settings, mid resolution with mid settings and so on. Many games these days don't really slow down that much at increased resolution, but rather at increased postprocessing effects - shadows, antialiasing, DoF, you name it. Before I had my current gaming PC I used to game on a laptop with GT555M inside, which is probably weaker than this IGP by some margin, and I ran most games in 1080p at acceptable framerates by turnig the details down. In general it yielded better fps AND better looks than running non-native res and mid graphics settings.But maybe it's just me, I like pixels a lot ;)
fanofanand - Monday, May 23, 2016 - link
You rebutted your own statement. Casual gamers don't buy $1k mini-PC's. Testing this at super low resolutions can only serve one purpose, which is to provide the appearance of acceptable performance.ganeshts - Monday, May 23, 2016 - link
The point with mini-PC reviews with a gaming focus is that they are spread far apart - we may be lucky to have 3 or 4 in a year.So, it boils down to what we think is more relevant to the reader - a set of benchmark numbers that have to be presented standalone, or a set of benchmark numbers which can be compared apples to apples against some similar previous-generation systems (because, that is what we have the numbers for). We think the latter makes more sense, and that is the reason we are having these 'legacy resolutions' in the gaming benchmarks.
fanofanand - Monday, May 23, 2016 - link
I completely understand why you need to present the information, I just don't think this really meets the "Skulltrail brand" expectations. Skulltrail was always an enthusiast platform designed by enthusiasts. This product looks like it fell victim to marketing requiring a certain thickness of chassis. This product waters down the skulltrail branding, though I guess skulltrail really isn't even relevant anymore. I just don't understand who this is designed for I guess.FMinus - Sunday, August 7, 2016 - link
this really isn't a low budget part, they can get similar or better performance in an ATX form factor for around ~$100 to 200 less.zepi - Monday, May 23, 2016 - link
I'd love to see a ~90W TDP version of this with CPU cores getting about 30W and GPU having 60 or so allocated for it. Even 65W TDP part would be a definite improvement for gaming as CPU / GPU clocks could stay considerably higher during loading of both parts of the chip.With proper cooling it could actually compete decently with low-end discrete graphic laptops. Now it is clear that TDP is limiting it badly.
The question is: How is the perf/w compared to for example A9x GPU parts or Maxwells? Somehow I'm not terribly impressed by Intel's GPU's. Especially considering that they've had their hugely superior manufacturing technology which should help...
Osamede - Monday, May 23, 2016 - link
Power consumption measured with a 1080p display. Is this the real use case?ganeshts - Monday, May 23, 2016 - link
Why not? Not everyone has migrated to 4K yet. I am a first-world tech reviewer, and the max. res monitor that I have is only 2560x1440 :)jhoff80 - Monday, May 23, 2016 - link
Out of curiosity, will there also be an Anandtech review of the new Core M Compute Sticks as well?jaydee - Monday, May 23, 2016 - link
Isn't is kinda a no-brainer to make this thing a little big bigger (with a little better cooling), to avoid throttling? Wouldn't just an inch taller help immensely?ShieTar - Monday, May 23, 2016 - link
Sure, but at ~55mm height it is beginning to look similar in size to a 70 mm high Mini-ITX case, which you can use to build yourself a system with similar compute power, for less than half the system cost.So it really needs to be very flat and very compact to qualify as a niche-product. Asking twice the price for just a 20% difference in some aspect is usually very hard to sell.
fanofanand - Monday, May 23, 2016 - link
So in other words, another 20 mm gives the customer a 50% haircut on price. Yup, more evidence this is a product in search of high margins and little else.jwcalla - Monday, May 23, 2016 - link
You have to understand that Intel thinks its customers turn on their faucets and liquid gold comes pouring out.Shadowmaster625 - Monday, May 23, 2016 - link
So its not even as fast as an ancient R9 270M despite being two nodes ahead.... and the power advantage is only 30%. Intel has a problem it seems.ganeshts - Monday, May 23, 2016 - link
Did you consider the power budget?extide - Monday, May 23, 2016 - link
Are you kidding? That is actually a pretty damn good achievement, I mean that uses a Cape Verde ASIC, and has about 1TFLOP of compute power, (640 shaders @ ~750 Mhz)The Iris Pro 580 has 576 FP32 cores (72 EU * 8) and runs about 1Ghz, so has a bit over 1.1TFLOP compute -- so really pretty similar in terms of compute horsepower. It seems like it is performing right where it should based on it's hardware config.
fanofanand - Monday, May 23, 2016 - link
Well, looks like I will be waiting yet another generation. Still can't game on 1080p. I like the storage and Wi-Fi numbers, and I can picture this doing fairly well in some office environments, but are there that many people who are building home PC's (user still has to add RAM etc.) and who will pay that kind of a premium for a device that is somewhat limited in it's use case? I can't speak for anyone else of course, but there is no way my employer would fork out $1k for this, not when they can spend $400 on a cheap business laptop that takes up a similar amount of space in the office but also offers them the ability to extract additional work from me over the weekend. I love the idea, I just don't think the hardware is there yet.Taracta - Monday, May 23, 2016 - link
How is Alpine Ridge going to get USB 3.1 if it is not off the PCH?ganeshts - Monday, May 23, 2016 - link
Alpine Ridge has a USB 3.1 host controller integrated along with the Thunderbolt controller. Please look into our Thunderbolt 3 hands-on coverage here: http://www.anandtech.com/show/10248/thunderbolt-3-... : Note the presence of the xHCI controller in the Alpine Ridge block diagram towards the middle of the linked page.tipoo - Monday, May 23, 2016 - link
This makes me wish Intel offered GT4e with a cheaper dual-core; then it could compete with boxes like the Alienware Alpha on price/value.
tipoo - Monday, May 23, 2016 - link
Does this support FreeSync?
ragenalien - Monday, May 23, 2016 - link
Could we get a comparison between this and the Iris Pro 6200? Seems like there isn't much difference performance-wise, but there should be.
defaultluser - Monday, May 23, 2016 - link
Skylake gets much better GPU performance/watt than Broadwell did, as evidenced by the NUC with 48 EUs and 64 MB of eDRAM being fed by just 23 W continuous. That's a huge improvement from the 45 W this beast used to take! I think the only surprise for me was the mere 40% performance improvement over the 4770R. I always assumed the 4770R was bandwidth-limited, but I guess the eDRAM cache was enough to keep things fed.
But yeah, pointless product continues to be pointless. Intel charges a premium for these things because they take up more die space and require dedicated eDRAM cache to feed them...just like discrete GPUs take up more die space, and require dedicated DRAM to feed them. Where is the efficiency gain in this crap?
defaultluser - Monday, May 23, 2016 - link
Oh, I just noticed the review uses DDR4-2133, which would account for the 40% performance increase we saw. I thought for sure a "premium gaming" platform like this would ship with Z170, so I didn't give the test setup a second glance.
I guess Intel cheaping out with H170 has forever doomed this machine to mediocrity. Too bad; dropping ten bucks more on the Z170 would have allowed some much more interesting memory configurations. With DDR4-2133 we're probably castrating performance.
tipoo - Monday, May 23, 2016 - link
"Our only concern is that the cooling solution keeps the temperature of the cores too close to the junction temperature during periods of heavy CPU load."My rMBP 15", Iris Pro only model, routinely hovers at the tJunction max at load. Is this a real concern? Or is it designed to do this?
BrokenCrayons - Monday, May 23, 2016 - link
There's probably a little wiggle room built into the processor's design by its engineers, but according to Intel's site here: http://www.intel.com/content/www/us/en/support/pro...
"Tjunction Max is the maximum temperature the cores can reach before thermal throttling is activated. Thermal throttling happens when the processor exceeds the maximum temperature. The processor shuts itself off in order to prevent permanent damage. Tjunction Max (Tj Max) is also referred to as TCC Activation Temperature in certain processor datasheets."
Basically, reaching Tjunction means the CPU is close to shutting itself off to prevent damage. There might be longevity implications to brushing up against that ceiling on a regular basis, but I haven't seen any statistical data showing that a meaningful sample of processors fails more often within their few years of useful life just because the OEM implemented a cooling solution that lets the processor wander up to the Tjunction temp when it's working hard.
I think a bigger concern might be looking into whether the rMBP in general approaches Tjunction under load or whether that's abnormal. Abnormalities might point to some sort of problem with your specific laptop. I don't know what's status quo for your hardware, so it's hard to say if that's something you should worry about.
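For readers who want to check how close their own machine runs to Tj max, here is a minimal monitoring sketch. It assumes a Linux box exposing the usual sysfs thermal zones (on a rMBP under macOS you would need a tool such as Intel Power Gadget instead), and the 100 C limit is only the typical figure for these mobile parts; confirm the exact value for your SKU on ARK.

```python
import glob
import time

TJ_MAX_C = 100.0  # typical for these mobile parts; confirm for your exact SKU

def zone_temps_c():
    """Read every thermal zone the kernel exposes (values are millidegrees C)."""
    temps = []
    for path in glob.glob("/sys/class/thermal/thermal_zone*/temp"):
        with open(path) as f:
            temps.append(int(f.read().strip()) / 1000.0)
    return temps

while True:
    temps = zone_temps_c()
    if not temps:
        raise SystemExit("no sysfs thermal zones found on this system")
    hottest = max(temps)
    print(f"hottest zone: {hottest:5.1f} C "
          f"({TJ_MAX_C - hottest:5.1f} C of headroom to Tj max)")
    time.sleep(2)
```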
tipoo - Monday, May 23, 2016 - link
Yeah, that's something I looked into. AnandTech's own Retina MacBook Pro 15" review only pegged them at going up to 76-ish Celsius if memory serves, but that was the older dGPU model with the 650M. From the threads I'm seeing, the Iris Pro model does regularly hover at 99-101 C, I'm guessing because the GPU grunt is right beside the CPU on a single die, so the heat isn't spread out the way it is in the dedicated-GPU model.
I don't see any reports of this model failing though, so I'd hope they tested extensively at 100 degrees and found it was fine, and so allowed the processor to keep its boost long enough to get there.
I do wish they could have just added another few mm so that the cooling was better and the CPU and GPU could stay at boost longer, and with that room they could have added a few mm to the keyboard too (whose key travel I consider the absolute minimum acceptable now).
tipoo - Monday, May 23, 2016 - link
Plus they only tested Half-Life 2, which, being so old, probably allowed the CPU and GPU not to be at max all the time.
8steve8 - Monday, May 23, 2016 - link
Give us a 65 W CPU with Iris Pro, add a couple inches in height... and use the stock retail CPU cooler. Add USB-C in the front. Use USB-C Power Delivery for power. Done: the perfect little PC.
TheinsanegamerN - Monday, May 23, 2016 - link
Except the niche NUCs fill doesn't want a NUC the size of a Mini-ITX case. Not to mention Intel's stock cooler is neither the quietest nor the best cooler in existence.
8steve8 - Monday, May 23, 2016 - link
Well, a couple inches of extra height would not put it near the average Mini-ITX case size. It's impressive that these NUCs are small, but they go a bit extreme when they use laptop chips and cooler designs meant for laptops. We could get a very small desktop without sacrificing CPU performance or acoustics/thermals.
Use 65 W+ chips with Iris Pro, the full-size Intel retail heatsink, USB-C Power Delivery... no wasted space on expansion slots. One M.2 should be the only internal slot.
Kimo19 - Monday, May 23, 2016 - link
Thanks a lot for the review. I am thinking of getting this machine for photo editing (Lightroom) and mobile development. Would it be a good choice? I was thinking the processor/RAM/SSD are good enough to provide great performance for the next 2-3 years, and the Iris Pro can be a good GPU to drive a high-resolution monitor.
TheinsanegamerN - Monday, May 23, 2016 - link
For the price you could get something much more powerful, or something with similar power and better GPU support for cheaper than this NUC.
fanofanand - Monday, May 23, 2016 - link
Keep in mind that you are bringing your own RAM and SSD; the kit does not include those items. As for the iGPU providing support for high resolutions, I think that will depend entirely on your workload.
alpha64 - Monday, May 23, 2016 - link
Ganesh, did you get confirmation directly from Intel that PCIe is limited on this system because it runs through the H170? From my research on ARK and other places, it appears that the H170 acts as a PCI Express passthrough, with a PCI Express 3.0 x16 connection to the CPU and the ability to split the configuration off into smaller widths and more ports coming off the H170. It would seem the DMI 3.0 connection is for other (non-PCI Express) peripherals. Granted, from the block diagram it is not apparent that the H170 connects to the CPU's PCIe x16 connection, but my guess is that it does.
I would just like clarification, as this is a pretty big deal.
ganeshts - Monday, May 23, 2016 - link
I have confirmation from the technical marketing manager for NUC products at Intel that the communication link between the H170 and the CPU is only effectively PCIe 3.0 x4 for bandwidth purposes. It is definitely not PCIe 3.0 x16.
H170 itself can act as a PCIe switch, but, for anything that talks to the CPU, it has to go through the DMI 3.0 lanes.
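The "effectively PCIe 3.0 x4" figure is easy to reproduce from the published link specs: PCIe 3.0 runs at 8 GT/s per lane with 128b/130b encoding, so per-direction payload bandwidth works out as in this quick sketch:

```python
def pcie3_gb_per_s(lanes: int) -> float:
    """PCIe 3.0 payload bandwidth per direction: 8 GT/s/lane, 128b/130b encoding."""
    return 8e9 * lanes * (128 / 130) / 8 / 1e9

print(pcie3_gb_per_s(4))   # ~3.94 GB/s -- the DMI 3.0 ceiling discussed here
print(pcie3_gb_per_s(16))  # ~15.75 GB/s -- a full x16 slot, for comparison
```

Everything hanging off the PCH in this system, including both M.2 slots and the Thunderbolt controller, shares that roughly 3.94 GB/s in each direction.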
alpha64 - Monday, May 23, 2016 - link
Thanks for the clarification!
extide - Monday, May 23, 2016 - link
The 16 CPU lanes are entirely unused in this device. The PCH (H170 in this case) is NEVER connected by a PCIe x16 link -- it is always connected via DMI 3.0 on the H, Q, B and Z platforms. DMI 3.0 has the same bandwidth as PCIe 3.0 x4. All of the stuff hanging off the H170 shares that same DMI 3.0 link.
alpha64 - Monday, May 23, 2016 - link
Great to know. Can you tell me what the "Processor PCI Express Port" details under "I/O Specifications" on Intel's ARK page for the H170 are for? I thought they were for connecting to the PCI Express lanes on the CPU, but I would be happy to learn if I am incorrect.
Valantar - Monday, May 23, 2016 - link
I'm disappointed in the lack of teardown pictures. I was at the very least expecting a look at the CPU side of the board. Is that too much to ask?
Also, considering the massive power throttling seen in your testing, and the torture-test nature of that testing, I'd love it if you could monitor clocks and temps during gaming too. I'd be interested in seeing what kind of CPU clocks this can maintain in a low-threaded gaming workload.
allanmac - Monday, May 23, 2016 - link
Please run SGEMM on the HD 580 ... ASAP! :)
https://software.intel.com/en-us/articles/sgemm-fo...
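The linked Intel article covers an OpenCL SGEMM tuned for the Gen9 GPU. For anyone who just wants a rough achieved-FLOPS number to compare against the theoretical peak, a crude sketch along these lines is a starting point; note it runs on the CPU via NumPy's BLAS, so it exercises the cores rather than the Iris Pro GPU:

```python
import time
import numpy as np

n = 4096
a = np.random.rand(n, n).astype(np.float32)
b = np.random.rand(n, n).astype(np.float32)

start = time.perf_counter()
c = a @ b                      # single-precision matrix multiply (SGEMM)
elapsed = time.perf_counter() - start

# An (n x n) by (n x n) GEMM costs roughly 2*n^3 floating-point operations.
print(f"{2 * n**3 / elapsed / 1e9:.1f} GFLOPS achieved")
```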
FlyingAarvark - Monday, May 23, 2016 - link
All of the Skull Canyon reviews online so far have been relative failures. I hate to bash this work, but a few points need to be said.
Testing this with only DDR4-2133 is a shame. Intel stated that 2400 MHz works without an FSB OC and that you can run up to 3000 MHz with one. That would change the gaming performance tremendously, but not a single site has bothered testing it.
We just didn't learn anything that wasn't easily known from looking at this on Newegg. We knew it would be really fast for the size/power envelope. We knew it would run fairly hot under load, judging by past NUCs. But we didn't know how it would react to DDR4 2400/2800/3000+.
The other problem is that there's concern expressed in the review that bidirectional 4 GB/s of bandwidth isn't enough. It's been proven, if you look into it, that PCIe 1.0 x16 (4 GB/s) does not bottleneck a GTX 980, and Skull Canyon should be closer to 5 GB/s than 4 GB/s as well. This wasn't tested with a Razer Core, but I think there's a really good chance this is the fastest stock gaming CPU on the market today when paired with a discrete GPU, due to the 128 MB L4. It was shown that Broadwell 5775Cs were already holding that crown in the past.
Considering how impressive this NUC already is with its small size, low power draw, and various Thunderbolt 3 options (storage/GPU/docks), both of these points (faster DDR4, and the 128 MB L4's impact with a dGPU) would make it even more impressive and probably put it in slam-dunk territory.
I think everyone in the tech community is massively missing the mark on this one! It just hasn't been properly tested. Intel absolutely nailed this product but is failing to properly instruct reviewers on what to test. Send me a sample, forumemail123 at g mail. I'll do it right.
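For a sense of how much the requested memory speeds matter to an iGPU that shares system RAM, dual-channel DDR4 peak bandwidth is just transfer rate x 8 bytes per 64-bit channel x 2 channels. A quick sketch of the speeds mentioned above:

```python
def ddr4_dual_channel_gb_per_s(mt_per_s: int) -> float:
    """Peak bandwidth: MT/s x 8 bytes per 64-bit channel x 2 channels."""
    return mt_per_s * 1e6 * 8 * 2 / 1e9

for speed in (2133, 2400, 2800, 3000):
    print(f"DDR4-{speed}: {ddr4_dual_channel_gb_per_s(speed):.1f} GB/s")
# DDR4-2133 -> ~34 GB/s; DDR4-3000 -> ~48 GB/s, roughly 40% more for the
# GPU and CPU to share, beyond whatever the eDRAM absorbs
```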
jardows2 - Monday, May 23, 2016 - link
So, you want this product reviewed in tests that will show it in a better light, and want to ignore all the standard tests that give an apples-to-apples comparison, showing that Intel has a long way to go to provide good value to its customers in this market segment?
FlyingAarvark - Monday, May 23, 2016 - link
Ah, an AMD poverty gamer arrives. Apples to apples against what? Older NUCs? There is no other competition for this small a form factor; certainly not from AMD.
I'm asking for the things that all of us who have been so excited about this product want to see: DDR4-3000 IGP gaming performance, and Razer Core Fury X or 980 Ti performance.
As I noted, this review told me absolutely nothing that wasn't already known through common sense. No one will buy this thing as a ho-hum NUC; there are already plenty of those. We're buying them for the size/performance combo and are going to run DDR4-3000 or a Razer Core with it.
JoeyJoJo123 - Monday, May 23, 2016 - link
Nobody said anything about AMD, dude.
jardows2 - Tuesday, May 24, 2016 - link
You just showed your true colors. I am anything but an "AMD poverty gamer." I need to know how this device compares to other computing devices, so I can determine if the small-form-factor benefit is worth the performance hit. Very few people demand a computer that fits a particular small form factor and are willing to do anything to hit that size requirement. Most people just want the best value, and size is one component of that.
If we performed tests that show the best-case scenario for this unit, then we'd have to do the same for every other bare-bones unit. Then there would be no true comparison, each piece would be no better than a CNET review bought and paid for by the manufacturer, and we would be no better informed.
FlyingAarvark - Tuesday, May 24, 2016 - link
Nice attempt at digging yourself out of that hole: you'd be pissed to see this thing shown in a positive light because, as you put it, "Intel has a long way to go to provide good value". Shows how much you know, just taking the typical fanboy stance on this thing without knowing what you're even looking at, much like this review.
The point remains: people want to see this used with varying RAM speeds. It affects the gaming performance greatly due to the IGP. People also want to see it benched with a 980 Ti / 1080 to compare against other high-end gaming CPUs.
There's absolutely no reason that's "best case" at all. It's just asking for a full review.
stux - Monday, May 23, 2016 - link
I'm curious if the BIOS supports RAID 0/1 and, if so, what the performance from dual SM950s in RAID 0 is. Sounds like that'd be bumping up against the DMI bottleneck.
revanchrist - Monday, May 23, 2016 - link
Very disappointed, TBH. The i7-5775C performs on par with a GTX 750, so I thought this 6770HQ, packed with much stronger integrated graphics and double the eDRAM, would be a monster. Instead it performed much weaker; it can't even match a 5675C, let alone a 5775C. I guess the power limit and TDP really constrain the potential of the iGPU. Sigh. We'll probably need to wait for 7 nm CPUs to get a playable 1080p integrated graphics solution.
spikebike - Tuesday, May 24, 2016 - link
Maybe the drivers haven't caught up. Or maybe it's heavily throttled because of heat. Seems very strange that it's not a substantial upgrade from the 5675C or 5775C. Hopefully something else with a similar form factor will ship with the same CPU. Note the very wide difference between the two GTX 960-based units in this review.
ganeshts - Tuesday, May 24, 2016 - link
This is a 45 W TDP part, while the 5775C is a 65 W TDP part. That is a substantial difference, as a larger TDP allows more leeway for the GPU than the 20 W difference alone would suggest.
Also, I'm not sure why multiple commenters are talking about two GTX 960s / the same GPU when it comes to the GB-BXi5G-760 and the MAGNUS EN970. They are not the same GPU at all - the former uses the Kepler GK104-based 870M, while the latter uses the Maxwell GM204-based 970M.
spikebike - Monday, May 23, 2016 - link
Anyone know why the GB-BXi5G-760 is so slow? The spec looks pretty similar to the EN970 (same GPU), yet it performs radically worse in all the games.
Calista - Monday, May 23, 2016 - link
Was it not related to heavy throttling? Like *really heavy* throttling.
trane - Tuesday, May 24, 2016 - link
Pleasantly surprised by the GPU. Pretty damn good, around the same as a 750 Ti.
tipoo - Wednesday, May 25, 2016 - link
I thought it would be more, to be honest. Looks like the size of the heatsink is limiting it. It's good, but with nearly double the EUs of the already-decent Iris Pro 5200 in my Haswell machine, I expected the 72 EUs to do better than this.
potf - Tuesday, May 24, 2016 - link
Noise levels are missing from AnandTech reviews, alongside the thermals. The product may have a target audience that isn't me, but the problem with the mini-PC reviews here is that we only get one or two brief measurements of fan noise. For the Skull Canyon NUC I think that's disappointing, as you mention it in the concluding remarks:
> " We would gladly trade a modest increase in the footprint of the system for lower fan noise. That said, the fan noise is in no way comparable to the BRIX Gaming lineup. It is just that it is not as silent as the traditional NUCs."
For example, measuring noise at idle and at GPU/CPU load, and getting a noise comparison at those points, would be useful: the thermal charts are very nice, but you cannot easily compare them from one PC to another.
My point is that, with idle power at 17 watts in your test, this racehorse NUC is never silent or even quiet, but I would like to see comparisons with the other NUCs, or at least the recently reviewed MSI Cubi 2.
Also curious about the noise level of the ASUS VC65/VC65R relative to that of the Skull Canyon NUC ;) What I mean is that the chosen form factor could be optimal, or it could be poorer than other 35-45 W solutions.
Will we be getting noise comparisons between PCs in a nice chart soon?
Osamede - Tuesday, May 24, 2016 - link
I'm asking: what is the use case that Intel is targeting and marketing to? Is it 1080p?
milkod2001 - Tuesday, May 24, 2016 - link
I'd say the only NUCs which make sense to buy are the cheapest options (not the one reviewed here). Good enough for office work, YouTube, Facebook and internet browsing.
For gaming or serious work, a laptop or mini-ATX build will give the user much, much more than this overpriced, fancy NUC.
rhx123 - Tuesday, May 24, 2016 - link
So disappointed. I was really looking forward to this product, but the PCIe/DMI situation is incredibly silly. It must be intentional, but why? Such a waste.
Femton - Tuesday, May 24, 2016 - link
What is your comment to jasonelmore, who wrote earlier (Monday, May 23, 2016) that "it will work fine. Intel has been using the Razer External GPU Chassis and they even commented on it here on Anandtech Comments, on the last article that was posted about it. DMI 3.0 still does 4GB/s and the CPU is not transferring huge amounts of bandwidth hungry texture data back and forth with the CPU."?
Osamede - Tuesday, May 24, 2016 - link
A focus on noise is something that has been a weakness here for some time, and there's not much awareness of it from the folks who run this place.
SPCR is a better place to find what you are asking about. Intel is one of their sponsors, so no doubt they'll have a review up before long.
KurtKrampmeier - Tuesday, May 24, 2016 - link
Can undervolting achieve significantly better thermals and less CPU throttling? If so, by how much? I want to use this as a very small, light, portable CPU package under 24/7 load. Thank you!
Drazick - Tuesday, May 24, 2016 - link
Ganesh, when will we have a high-TDP (65 W and above) CPU with Iris Pro?
I would even go further: I'd like to see Extreme Edition CPUs with Iris Pro.
I hope the Core i7-7820K will also have a configuration with Iris Pro and 128 MB of eDRAM.
It's time for Intel to bring Iris Pro to the high-end desktop chips.
sharath.naik - Wednesday, May 25, 2016 - link
Not sure about the price. At this price, isn't it just better to buy a laptop with discrete graphics and remove the display if you do not want it?
cm2187 - Saturday, May 28, 2016 - link
Just received mine. It is quite noisy, even when idle.
Madpacket - Monday, May 30, 2016 - link
Neat computer, but niche. I'll wait for the fire sale on this one. I could see uses as a dev / portable VM box with the M.2 PCIe ports (RAID striped). As a gaming machine this thing is about as useful as an A10-7870K, or even less so for driver reasons. It can at least do some low-end gaming, but you would be much better off with an Alienware Alpha, which is still tiny, packs a real GPU, and is about half the price.
gue2212 - Saturday, June 4, 2016 - link
"Connecting the Thunderbolt ports on the two machines and allowing the PCs to talk to each other automatically creates a 10Gbps network adapter."Can anyone shed some light: When TB3 can transfer 40Gbps (bundle the 4 PCIe 3 lanes), why do we end up with 10Gbps USB 3.1 Gen2 speed for networking?
Well, woulda been too good at 40, but I guess I'll abuse the NUC6i7KYK as external storage (partition backup) for my Dell XPS 9550 until I see a TB3 SSD in the Samsung T3 SSD form factor. ;-)
mystikmedia - Thursday, June 9, 2016 - link
I have this NUC. I am very happy with it overall. I can't seem to get the Thunderbolt port to work, though. I bought a USB 3.0 hub that has a Type-C connection. I figured I might as well put that Type-C port to use and not waste an existing USB port. But it doesn't seem to work. Should it? I had assumed the USB 3.1 aspect of it would be backwards compatible with 3.0, as has been the case in the past. Is that incorrect? TIA
gue2212 - Saturday, June 18, 2016 - link
Hey mystikmedia, I don't understand what you're trying to accomplish. You've got four USB 3.0 ports on the NUC6i7KYK. Why in heaven would you hook a USB 3.0 hub up to the TB3 port?
gue2212 - Sunday, June 19, 2016 - link
How are the connectors/headers supposed to be used (left rear cut-out in the metal under the top plastic cover)? According to the schematics they are internal USB 3.0 and 2.0, NFC, and LPC debug.
BiTesterEmailer - Wednesday, July 20, 2016 - link
Interesting article as always.