Hello, I’m Guy and this is Guy, Robot. [music] Hello this time I’m gonna talk to you about
getting a new computer for myself. So I’ve been using Apple Macs since about
2005. One of my friends came over and showed me his PowerPC Mac and I thought this
thing’s cool. I used to run Linux as my main machine, I had a kind of Unix-style console
available to me, I could customize loads of stuff and I couldn’t crash the thing. I thought
“Why have I not used one of these before?” The practicalities were that I still needed
an x86 architecture because I still did Windows development. Then they brought out the Intel
architecture Macs and I thought "this is great, it's a reliable, stable computer that
also looks great!” So I bought one. Now ever since 2005 Macs have been my primary
machine. I still do development on other platforms. I've always had virtual machines so I could
do my Windows development but Mac OS has been the core operating system I’ve run.
Unfortunately, or fortunately, the time has come for me and Mac OS to part. The reason
being – this thing. This is a January 2016 model – or whatever it was, mid-2015 – Retina
MacBook Pro. This is the absolute top-spec Retina MacBook Pro. It’s got the i7, it’s
got the one terabyte SSD. It cost me over £2,000 and you know what? It’s rubbish.
[Noise of dropping computer] I’ve had a lot of Macs over the years. I’ve
had about half a dozen MacBook Pros. I’ve had a couple of Airs, I’ve had a Mac Pro desktop
(before it turned into the trashcan anyway – when it was a nice big beast of a machine)
and that is the worst thing I ever had. All this thing is good for is running its fans
at full capacity, which is a shame. As soon as I got it I did the first thing I always
do – put virtual machines on – and it was making noise within a few minutes of being in Windows. The
reason I got that one was that I had an iMac (January 2015 – 27" Retina iMac)
and that thing was starting to frustrate me. I thought that maybe it just can’t cope with
the graphics, maybe I’ll try one of the newer ones. No. Also new laptop – terrible! So
Apple… I give in! I eventually switched to Boot Camp on it this
year, which is the first time I've switched my main operating system from being Mac OS to
Windows, and I coped fine, Windows is nice these days. Sorry – don't hate! And then
I realised that it was still making noise and the fans were crazy and I just thought
I can’t be bothering with this so screw you Apple – I am going back and building my own
machine! It’s been a long time since I’ve actually
built my own PC for my home use. The last time I built a PC at home was probably 2002
and that is a long time ago! I’ve built servers since then for business that I’ve
been involved with, in fact I’ve got a couple that I built last year and details are in
my blog and they were fun to build but I haven’t built and specced-up a desktop computer and
things have changed a lot in the last 10 years. So that's what I'm getting it for. What do I
actually need? Well for me – gaming isn't the priority, which
means that almost every single piece of information on the internet these days isn’t relevant
because everyone likes gaming machines. As long as I can drive Visual Studio, word-processing
and Chrome on three monitors (non-4k) I’m good, as far as graphics are concerned.
What I do is I run a lot of virtual machines. At some points I might have six or seven virtual
machines running on my system. Just for doing development testing and then I could have
another three or four as part of architecture within my network that I want to run on there.
So virtual machines are important and a lot of cores are therefore important and a lot
of RAM is important, because virtual machines, more than CPUs, will eat RAM for breakfast
and as a developer I can easily get through 32 gig of RAM these days. I’ve been working
on a trading platform recently that on my MacBook Pro was maxing out all of the cores
and eating up all the RAM within seconds of trying to process all the data I was putting
through. So I need something beefy, and going back to doing
robotics and especially artificial intelligence, I need a lot of cores, so the number of CPU
cores for me was more important than individual core speed and that throws up some interesting
choices. So my best friend recently built a 6700K Skylake and it was really good, it
boots in seconds, it's lightning fast, but it's only got four cores – plus hyperthreads
– eight logical processors – not enough for me, so I started looking at the higher
end – kind of the 6800s – and they're just crazy prices. If I want to get one of the 6950Xs
I'm paying £1,500 just for a CPU, and sure, I can overclock it, but it's still not got that
many cores. You know I’ve got eight cores then, 16 threads, but that’s what I used to
have in 2008 with my Mac Pro and that doesn’t feel like I’ve moved on. So I started looking
at Intel Xeons and I decided that was a way for me to go because not only can I have a
lot of cores, they can be slower cores which means it saves me money rather than trying
to have a bazillion three-gigahertz or four-gigahertz cores, which I don't need, and also
I can have dual processors which means twice as many cores.
So I started speccing up various things and what I ended up going for was a little
bit crazy. I ended up selecting the Intel 2660 V4 Xeon which is 2GHz, turbo boost to 3-ish,
3.2GHz with 14 cores and hyper threading. So that means that on each processor you have
got 28 threads that can run simultaneously. Across my entire system with two processors
that means I will have 56 different cores running, which is insane. The most I had before
was 8 cores in a Xeon workstation – I've had that a couple of times – and somehow this
build got a little bit crazy and I've ended up going for the dual 2660 V4s at 2 GHz.
I was going to pick a 2620, which is pretty much the same – I think it's 2.1GHz but
with 8 cores so I would’ve had 16 cores total across 2 CPUs, 32 threads with hyper
threading, and they come in at about £350 per CPU. However, when I was doing some research
I found some engineering samples of Intel Xeons on eBay. Now this was a bit of a gamble
because they might not work so I thought “Hey, I’m gonna try it – if they work, brilliant!”
It cost me about the same as the 2620s as-new, the difference being I get a lot more cores
for the money. So I decided, right – I’m gonna go with the
dual 14 core processors and I’m going to have 56 logical cores because… why not?
I specced-up the rest of the build then, I decided to go with the Arctic Freezer i32
CO coolers. Now I was going to go with the Noctuas because I've used the Noctuas
whenever I've done server builds in the past and they are exceptionally quiet. They're
also exceptionally expensive, and when you are spending over a hundred pounds on a CPU
fan before it gets shipped to you, you've got to really want it – it's got to be that
good. Now I've got a budget of about £1,400 to £1,800 for this, which is the money I'll
get from selling my Apple laptop and also selling my iPad because – hey,
why bother staying in ecosystem anymore? That means I’ve got to try and do this to a budget
so I thought I'm gonna go with the Arctic Freezer. The brand's not as well known but
the results and noise seem to be pretty good. We'll see how they get on later on.
Motherboard was a tricky choice. There are a lot of dual Xeon boards, however most of
them are unsurprisingly for servers because not that many people are mad enough to want
to run 56 cores at home. There are, however, a couple of workstation motherboards. There’s
a few from Gigabyte but they don't have many PCI Express slots, which pretty much takes
you down to two Asus boards. There’s the Z10PE-D8 and the Z10PE-D16 WS, both of them
WS variants for workstations. Now of those the D8 was marginally cheaper. It was about
£10 cheaper and it seems to be slightly more gaming focused, and the flip side was
the D16 came with ports for off-board management. So it’s got a little chip in it which has
its own network card, and that actually gives me full remote desktop and keyboard access
regardless of operating system. So I can even get into the BIOS and manage it remotely
– which is kind of useful. It’s the same off-board management that I’ve got on one
of my Asus server boards and they’re really good. So I decided to go with that one.
It's got a maximum of one terabyte of memory I can put in it, which should see me through
for a while, and two CPU sockets which, with the current V4 chips, means I could get a total of 44
cores and 88 threads running. That would cost me over £3,000 in CPUs though. Crazy!
Going for the hard-disk – well… no hard disk as I haven’t used hard disks in my main
computer for a few years now. So I'm going with a Samsung SSD. This time I'm going
with the 950 Pro M.2, which can top out at about 2GB per second – far beyond SATA's capabilities
– so I'm going for the M.2 connection for it. Now the Asus – I should say [Az-oos] but I'm going
to say Asus just to annoy everyone – the motherboard that comes with this has an M.2 socket onboard,
however it's only x2 PCIe, which I found out during the build as you'll see later. So I'm
also going to get a x4 PCIe adapter card for it. I'm also going to get a couple of other bits for it.
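For anyone wondering why the lane count matters: each PCIe 3.0 lane carries roughly 0.985 GB/s after 128b/130b encoding overhead (standard PCIe 3.0 figures, not from the video), so two lanes can't keep up with a drive that reads at over 2GB per second. A rough sketch of that arithmetic:

```python
# Why an x2 M.2 socket bottlenecks a fast NVMe drive: PCIe 3.0 lane arithmetic.
# 8 GT/s per lane with 128b/130b encoding leaves ~0.985 GB/s usable per lane.

def pcie3_bandwidth_gb_s(lanes: int) -> float:
    """Approximate usable one-way PCIe 3.0 bandwidth in GB/s."""
    per_lane = 8e9 * (128 / 130) / 8 / 1e9  # transfers/s -> usable bits -> bytes -> GB
    return lanes * per_lane

print(round(pcie3_bandwidth_gb_s(2), 2))  # 1.97 – caps a drive rated at ~2.5 GB/s reads
print(round(pcie3_bandwidth_gb_s(4), 2))  # 3.94 – plenty of headroom
```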
I’m going to get a GTX 960 graphics card – I don’t need anything amazing, it’s fairly cheap.
it’s got a few cores on it so hopefully the CUDA will be useful for me when I’m doing
some of my AI development and I’m gonna make this pretty so I’m going to stick this in
a Phanteks Enthoo Luxe case, and I'm probably also going to put a Creative Labs SoundBlaster
Z PCI-E card in there, topping it all off with my Dell 2150 monitor that you see behind
me. I love Dell UltraSharps and wouldn't change them for anything else. That's the one thing that,
even through being an Apple fan-boy, I've always kept – Dell monitors. All of that's going to
go alongside my Razer Chroma accessories because LEDs are cool!
So that’s what I’m gonna get and that is why I’m going to get it so let’s take a look at
the build. So it was actually pretty easy to put this
together as you can see there’s an awful lot of parts that went into it. The biggest thing
was the case – it was huge! The thing that struck me about all of them is actually how good
the quality of the packaging is these days. I mean, it's completely wasteful – there's this
awful forest in my front garden that I need to chuck out – however it does look nice when
I lay it all out at my desk.
Putting the case together was one of the nicest parts of the build. The case itself was incredibly well built
and really easy to work on. The only problem was there was a little bit of rust on the
grille on the top, however I've been in contact with Phanteks support who've already sent
me out a replacement which I got the shipping details for only a couple of days later. So
kudos to them for that.
I was a bit unsure about the fans to start
with, considering how small they are compared to the Noctuas – particularly when you look
at the box for them. That being said, they are actually fairly easy to put together. They're
a bit more faffy than Noctuas but they are still pretty hefty and look nice on there
when they’re installed and they haven’t got that horrible beige colouring about them.
The motherboard – it's huge! I mean, it's the size of many people's apartments, which
also led me to problems trying to line up all the different standoffs for installing it.
Once that was sorted though, it went in nice and easily. It seems to be really, really well
built in terms of engineering and build quality. I didn't manage to break it
during the build which is always a nice surprise. So it was fairly easy to get everything set
up and installed on that. I’ve only got the 32 gig of RAM at the moment
but it’s nice to know that I’ve got eight slots should I wish to put a terabyte of RAM
in there in the future. So putting it all together was easier than
I thought. Doing an initial power up to the thing I was fairly sure that it was not going
to work. I thought the CPUs could be broken, I thought I could have missed something because
it’s such been such a while since I’ve built something it and the boot takes so long to
come on but bang – there you are! First power on and we’re into the BIOS. It was a miracle!
Once that was done it was just a case of actually tidying up all the cables inside. The Enthoo
Luxe makes it really, really easy because you can route them all nice and neatly through
the back because there’s already cable ties attached and a load of cutouts for you to
route your cables. If it had been this easy 15 years ago my computers might have looked
somewhat different in the 1990s. I might try and put a link in the video to show what some
of my early ones looked like. At the end of it – LEDs all turned on, all
set up – I think it looks really nice. I’m really happy with the outcome of it bearing
in mind this is the first sexy build I’ve ever tried to do.
So the CPU power is the main thing in this machine for me. I’ve said before it’s not
necessarily the individual core speed but the number of cores that I want to have running
at any point in time for different tasks. So the first thing I wanted to make sure was
that the CPUs in this thing were rock-solid – especially considering they were engineering
samples. So the first thing I did was run a Cinebench test and see how well it held up. Now, actually,
it did phenomenally well at this. This is a real-time playback of a Cinebench test that
I did. This isn’t the highest-scoring one, I’ve actually had even higher scores but it
takes maybe ten seconds to run through this and coming out the other end of the cinebench
test we get crazy results. There’s barely any benchmarks that are this high for it.
This is just using CPU rendering. Now if you take a look at that, we've got a score of almost
3,000 on there. I’ve had a couple of scores peak at over 3,100 for Cinebench – which is
just nuts! The next thing was to make sure that, actually,
the CPUs were stable. Now, considering the amount of heat that can be put out by this
thing and the amount of operations, and not knowing how good the Arctic coolers were, I wanted
to put these things through some rigorous tests.
So I used Prime95 for burning-in the CPUs. I’ve only got a small sample of it here
but I actually burned this in over the course of eight hours with all of the cores running
on Prime95. Now one of the two CPUs runs at about 40 degrees when it’s not active, the
other one runs at about 48 degrees when it’s not active and they both peak about 10 degrees
higher than that. So an absolute top end heat of around about 60 degrees from one of the
two packages and about 50 degrees from the other. And that’s after hours and hours of
everything being under load and these things are rock-solid not to mention the fact, look
how many cores are in task manager! I have never seen so many cores on a server before
and just the fact that I’ve got 56 logical cores (that 7 by 8 grid of CPUs) really excites
me. These run standardly at just over 2.1GHz.
When they're all maxed out, turbo boost kicks in so that all of the cores bump themselves up
to about 2.5GHz, and this particular CPU can actually boost individual cores to over 3GHz if
there's just a few cores running. So in normal usage I actually find that most of my active
cores run at about 3.2GHz. The most important thing for me when I'm working
on my system has to be the speed of my SSD. The CPUs are obviously phenomenally important
but with high I/O database operations that I run and a lot of virtual machines it’s really
important to me to have great disk I/O. So I wanted to make sure that my main drive in
this thing was lightning fast, and with the Samsung 950 Pro M.2 512GB you can see the speed
I’m getting out of this. A sequential read of over 2 gigabytes per second from an SSD.
Two GIGABYTES a second. To get that speed on my network, I would need everything fiber
optic and top end. It’s not even possible to tax it and the write speeds are just as
bonkers. It doesn’t matter which benchmark I put this through, the speeds that come out
of that are phenomenal. However, that's only after I upgraded from using the on-board
M.2 socket. The Asus motherboard (or A-zoos motherboard rather) that I'm using actually
runs only two PCI-E lanes to the M.2 socket, so I had to get a separate PCI-E card. I
went for the StarTech one because I find that the StarTech cards are incredibly good quality.
Got one of those, popped it in, and this thing shot up immediately to over two gigabytes
a second of transfer – just phenomenal. This thing should do everything I need it to do.
So what about everything else on the system?
Well, the GPU is distinctly average. I'm not needing anything faster at the moment.
If any of the work I’m doing needs me to do CUDA development with more cores I will
probably invest in a couple of SLI cards at a future point, but not right now. The memory,
again – I haven't done any particular benchmarks for it; it's DDR4-2400. It seems pretty good
and it’s certainly survived all the burn-in tests I’ve done.
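The burn-in idea described above is simple: keep every logical core saturated with CPU-bound work for hours and watch the temperatures. Prime95 is the real tool for this; the toy Python sketch below just illustrates the shape of it (the worker ranges and chunk size are arbitrary illustration, not anything from the video):

```python
# A toy stand-in for a Prime95-style burn-in: pin every logical core with a
# CPU-bound primality workload. Prime95 is the real tool; this only
# illustrates the idea of saturating all cores at once.
import multiprocessing as mp

def is_prime(n: int) -> bool:
    """Trial-division primality test -- deliberately CPU-hungry."""
    if n < 2:
        return False
    i = 2
    while i * i <= n:
        if n % i == 0:
            return False
        i += 1
    return True

def burn(start: int, count: int) -> int:
    """Count primes in a range; keeps one core busy."""
    return sum(is_prime(n) for n in range(start, start + count))

if __name__ == "__main__":
    cores = mp.cpu_count()  # 56 on the dual 2660 V4 build
    with mp.Pool(cores) as pool:
        # One chunk per logical core; enlarge `count` for a real soak test.
        results = pool.starmap(burn, [(i * 10_000, 10_000) for i in range(cores)])
    print(f"{cores} workers, {sum(results)} primes found")
```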
But what you’re watching now is how long it takes to boot up the system. We’re currently
at about 30 seconds in and we’ve not even got anything on the screen yet. Now this is
the nature of server motherboards normally, they do take a long time to start up. But
this is a workstation motherboard and the problem with this is when I do a reboot on
it, how long it takes. I actually have time to go downstairs, put the kettle on come back
up and I haven't even got as far as having the logo on screen. This certainly isn't a
consumer motherboard; if I'd gone for a 6700 processor and a consumer-grade board, I would
have been into the operating system over a minute ago, considering how quick the CPU
is. We're at one minute eight seconds – that would be when I was at the desktop and I'd
had a minute to sit there drawing smiley faces in Paint, but we're still not even as far as
Windows starting to boot up and that is the biggest problem and the only real problem
I’ve had with this machine. Now I don’t reboot very often, so it’s not too bad but whenever
I do, I feel frustrated by it. I really wish that for a workstation board they'd invested
in making it just a little bit quicker at booting, rather than scrimping and
not bothering to modify the server variant that they had. Other than that, this thing's
phenomenal – considering we don't have fast boot, as soon as we're through the BIOS we're
actually into Windows in under 10 seconds anyway. But that initial part is SO slow and
I’ve actually tweaked this, there was more turned on to start with, this is genuinely
as good as I’ve got it so far. However, saying that I am loving this system.
It's working really well, and across all the benchmarks I've done it's performing at or exceeding
what I would have expected in every area. So this has been a really fun build for me.
It's been a long time since I got to really go to town building something that
was just for me rather than to spec it around my business needs and this did get a little
bit crazy. What started out as a conversation about whether I should go for a previous-
generation Haswell-E 6-core or a Skylake 6700 with four cores somehow ended
up with me building 56 logical cores. How that happened – I don't know. But I'm very glad
I did.
Yeah, I'm amazed this thing booted first time
and it's been rock solid. Those engineering chips seem to have worked really well. I've
put them under heavy load for stress testing, done a lot of maths testing on them and they're
consistently reliable. The cooling has worked really well – it's stayed really cool. My room
is like an oven, but the chips are cool and it's quiet. You can barely hear the thing,
compared to what I’ve been suffering with my Apple laptop for the last nine months I
am over the moon! And the hard disk is just bonkers. So this I’m really happy with, from
a workstation perspective. It performs really quickly, it deals with heavy workloads that
I chuck at it really well. It's also working really well for video rendering, which I've
never had to do before – so, hey, a good use for it now I'm starting to do this. The only thing I have a bit of a problem with
is the motherboard. Now, I’m never over the moon with any motherboards that I’ve had.
I have yet to be wowed by one in many, many computers that I’ve built and this one’s no
exception. It has a lot of ports, it's built beautifully. I also really like the on-board
diagnostics on it – fantastic – and the ability to update the BIOS separately. There are loads
of really cool features in it. The thing I don't like is how long it takes to boot up.
Now a server motherboard that takes four minutes to boot up isn’t that rare. But I still feel
that for a workstation board they could have done better. If you take a look at the Dell
Precision workstations that you can order, they boot up in pretty much standard PC boot
times, 2-3 seconds extra. You get to your desktop pretty quickly. This thing takes forever
to power on. Now I’m not often going to shut the thing down, I’ve got a bunch of network
virtual machines on there, which act as secondary domain controllers, DNS. I’m not really gonna
wanna shut the thing down. It’s gonna stay on pretty much 24/7 but whenever I do reboot
it I’m gonna be grumpy at the fact it takes seven hours to boot. And the fact that this
is the most expensive Asus motherboard for a dual Xeon workstation and it doesn't have
an x4 M.2 socket, whereas the cheaper D8 does. Now that really did frustrate me. It took
me a while to figure out why, because when I was originally speccing it
up with the D8, I looked and saw the x4 and didn't even think to check on the motherboard
spec whether there would be an x4 on the more expensive model. My mistake, but that was a bit of a
frustration. Also love the case, LEDs are cool, everything’s purple. I’m loving it.
All I need to do is sort out this mess of an office now. So I hope you found that useful and I hope
you found my build slightly interesting, compared to a lot of the gaming ones that are out there.
Don’t get me wrong, this won’t be able to play anything terribly well but I’ll have
fun with my software development. Hope you enjoyed this and thanks very much
for watching. I’d love to know what you thought about this, leave your feedback in the comments
below and if you’d like to watch more of my videos please hit subscribe. [music] Thanks for watching please check out some
of my other videos. Don’t forget to subscribe. [music]