Epic Xeon Workstation Build – 28 Cores / 56 Threads


Hello, I’m Guy and this is Guy, Robot. [music] Hello this time I’m gonna talk to you about
getting a new computer for myself. So I’ve been using Apple Macs since about
2005. One of my friends came over and showed off his PowerPC Mac and I thought this
thing’s cool. I used to run Linux as my main machine, and here I had a kind of Unix-style console
available to me, I could customize loads of stuff and I couldn’t crash the thing. I thought
“Why have I not used one of these before?” The practicalities were that I still needed
an x86 architecture because I still did Windows development. Then they brought out the Intel
architecture Macs and I thought “this is great, it’s a reliable, stable computer that
also looks great!” So I bought one. Now ever since 2005 Macs have been my primary
machine. I still do development on other platforms. I’ve always had virtual machines so I could
do my Windows development but Mac OS has been the core operating system I’ve run.
Unfortunately, or fortunately, the time has come for me and Mac OS to part. The reason
being – this thing. This is a January 2016 model – well, the mid-2015 model – Retina
MacBook Pro. This is the absolute top-spec Retina MacBook Pro. It’s got the i7, it’s
got the one terabyte SSD. It cost me over £2,000 and you know what? It’s rubbish.
[Noise of dropping computer] I’ve had a lot of Macs over the years. I’ve
had about half a dozen MacBook Pros. I’ve had a couple of Airs, I’ve had a Mac Pro desktop
(before it turned into the trashcan anyway – when it was a nice big beast of a machine)
and that is the worst thing I ever had. All this thing is good for is running its fans
at full capacity, which is a shame. As soon as I got it I did the first thing I always
do – put virtual machines on it – and it was making noise within a few minutes of being in Windows, and
the reason I got that one was that I had an iMac (January 2015 – 27” retina iMac)
and that thing was starting to frustrate me. I thought that maybe it just can’t cope with
the graphics, maybe I’ll try one of the newer ones. No. Also the new laptop – terrible! So
Apple… I give in! I eventually switched to Boot Camp on it this
year, which is the first time I switched my main operating system from Mac OS to
Windows, and I coped fine – Windows is nice these days. Sorry, don’t hate! And then
I realised that it was still making noise and the fans were crazy and I just thought
I can’t be bothering with this so screw you Apple – I am going back and building my own
machine! It’s been a long time since I’ve actually
built my own PC for my home use. The last time I built a PC at home was probably 2002
and that is a long time ago! I’ve built servers since then for business that I’ve
been involved with, in fact I’ve got a couple that I built last year and details are in
my blog and they were fun to build but I haven’t built and specced-up a desktop computer and
things have changed a lot in the last 10 years. So that’s why I’m getting it – but what do I
actually need? Well for me – gaming isn’t the priority which
means that almost every single piece of information on the internet these days isn’t relevant
because everyone likes gaming machines. As long as I can drive Visual Studio, word-processing
and Chrome on three monitors (non-4k) I’m good, as far as graphics are concerned.
What I do is I run a lot of virtual machines. At some points I might have six or seven virtual
machines running on my system. Just for doing development testing and then I could have
another three or four as part of architecture within my network that I want to run on there.
So virtual machines are important and a lot of cores are therefore important and a lot
of RAM is important, because virtual machines, more than CPUs, will eat RAM for breakfast
and as a developer I can easily get through 32 gig of RAM these days. I’ve been working
on a trading platform recently that on my MacBook Pro was maxing out all of the cores
and eating up all the RAM within seconds of trying to process all the data I was putting
through. So I need something beefy, and going back to doing
robotics and especially artificial intelligence, I need a lot of cores, so the number of CPU
cores for me was more important than individual core speed and that throws up some interesting
choices. So my best friend recently built a 6700K Skylake and it was really good, it
boots in seconds it’s lightning fast but it’s only got four cores – plus hyperthreads
– eight logical processors – not enough for me, so I started looking at the higher
end – the 6800s and up – and they’re just crazy prices. If I want to get one of the 6950Xs
I’m paying £1,500 just for a CPU; sure, I can overclock it, but it still hasn’t got that
many cores. You know, I’ve got eight cores then, 16 threads, but that’s what I used to
have in 2008 with my Mac Pro and that doesn’t feel like I’ve moved on. So I started looking
at Intel Xeons and I decided that was a way for me to go because not only can I have a
lot of cores, they can be slower cores which means it saves me money rather than trying
to have a bazillion three-gigahertz or four-gigahertz cores, which I don’t need, and also
I can have dual processors which means twice as many cores.
So I started speccing up various things and what I ended up going for was a little
bit crazy. I ended up selecting the Intel 2660 V4 Xeon, which is 2GHz with turbo boost to 3-ish,
3.2GHz with 14 cores and hyper threading. So that means that on each processor you have
got 28 threads that can run simultaneously. Across my entire system with two processors
that means I will have 56 logical cores running, which is insane. The most I had before
was 8 cores in a Xeon workstation – I’ve had that a couple of times – and somehow this
build got a little bit crazy and I’ve ended up going for the dual 2660 V4s at 2 GHz.
I was going to pick a 2620, which is pretty much the same – I think it’s 2.1GHz but
with 8 cores so I would’ve had 16 cores total across 2 CPUs, 32 threads with hyper
threading, and they come in at about £350 per CPU. However, when I was doing some research
I found some engineering samples of Intel Xeons on eBay. Now this was a bit of a gamble
because they might not work so I thought “Hey, I’m gonna try it – if they work, brilliant!”
It cost me about the same as the 2620s as-new, the difference being I get a lot more cores
for the money. So I decided, right – I’m gonna go with the
dual 14 core processors and I’m going to have 56 logical cores because… why not?
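As a sanity check, the core arithmetic here is just sockets times cores times hyperthreads; here's a minimal Python sketch (the 56 figure assumes both 2660 V4s are installed and Hyper-Threading is enabled):

```python
import os

# Dual E5-2660 v4 build: 2 sockets x 14 cores x 2 hyperthreads
SOCKETS = 2
CORES_PER_CPU = 14
THREADS_PER_CORE = 2

logical = SOCKETS * CORES_PER_CPU * THREADS_PER_CORE
print(logical)  # 56

# On the machine itself the OS reports the same figure:
# os.cpu_count() would return 56 on this build.
```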
I specced-up the rest of the build then, I decided to go with the Arctic Freezer i32
CO coolers. Now I was going to go with the Noctuas because I’ve used Noctuas
whenever I’ve done server builds in the past and they are exceptionally quiet. They’re
also exceptionally expensive and when you are spending over a hundred pounds on a CPU
and fan before it gets shipped to you, you’ve got to really want it, it’s got to be that
good. Now I’m trying to keep to a budget of about £1,400 to £1,800 for this, which is the money I’ll
get from selling my Apple laptop and also selling my iPad because – hey,
why bother staying in the ecosystem anymore? That means I’ve got to try and do this to a budget,
so I thought I’m gonna go with the Arctic Freezer. The brand’s not as well-known but
the results and noise seem to be pretty good. We’ll see how they get on later on.
Motherboard was a tricky choice. There are a lot of dual Xeon boards, however most of
them are unsurprisingly for servers because not that many people are mad enough to want
to run 56 cores at home. There are, however, a couple of workstation motherboards. There’s
a few from Gigabyte but they don’t have many PCI Express slots, which pretty much takes
you down to two Asus boards. There’s the Z10PE-D8 and the Z10PE-D16 WS, both of them
WS variants for workstations. Now of those the D8 was marginally cheaper. It was about
£10 cheaper and it seems to be slightly more gaming focused, and on the flip side
the D16 came with ports for off-board management. So it’s got a little chip in it which has
its own network card, and that actually gives me full remote desktop and keyboard access
regardless of the operating system. So I can even get into the BIOS and manage it remotely
– which is kind of useful. It’s the same off-board management that I’ve got on one
of my Asus server boards and they’re really good. So I decided to go with that one.
It’s got a maximum of one terabyte of memory I can put in it, which should see me through
for a while, and two CPU sockets which, with the current V4s, means I could get a total of 44
cores and 88 threads running. That would cost me over £3,000 in CPUs though. Crazy!
Going for the hard-disk – well… no hard disk as I haven’t used hard disks in my main
computer for a few years now. So I’m going with a Samsung SSD. This time I’m going
with the 950 Pro M.2, which can top out at about 2GB per second – far beyond SATA’s capabilities –
so I’m going for the M.2 connection for it. Now the Asus – I should say [Az-oos] but I’m going
to say Asus just to annoy everyone – the motherboard has an M.2 socket onboard,
however it’s only x2 PCI-E, which I found out during the build as you’ll see later. So I’m
also going to get an x4 PCI-E adapter card for it. I’m also going to get a couple of bits for it.
I’m going to get a GTX 960 graphics card – I don’t need anything amazing, it’s fairly cheap,
and it’s got a few cores on it, so hopefully the CUDA will be useful for me when I’m doing
some of my AI development. I’m gonna make this pretty, so I’m going to stick this in
a Phanteks Enthoo Luxe case, and I’m probably also going to put a Creative Labs SoundBlaster
Z PCI-E card in there, topping it all off with my Dell 2150 monitor that you see behind
me. I love Dell UltraSharps and wouldn’t change them for anything else. The one thing is that,
even through being an Apple fan-boy, I’ve always kept Dell monitors. All of that’s going to
sit alongside my Razer Chroma accessories because LEDs are cool!
So that’s what I’m gonna get and that is why I’m going to get it so let’s take a look at
the build. So it was actually pretty easy to put this
together, though as you can see there’s an awful lot of parts that went into it. The biggest thing
was the case – it was huge! The thing that struck me about all of them is actually how good
the quality of the packaging is these days. I mean, it’s completely wasteful – there’s this
awful forest’s worth in my front garden that I need to chuck out – however it does look nice when
I lay it all out at my desk. Working with the case throughout the whole
thing was a really nice part of the build. The case itself was incredibly well built
and really easy to work on. The only problem was there was a little bit of rust on the
grid on the top however I’ve been in contact with Phanteks support who’ve already sent
me out a replacement which I got the shipping details for only a couple of days later. So
kudos to them for that. I was a bit unsure about the fans to start
with, considering how small they are compared to the Noctuas – particularly when you look
at the box for them – but that being said, they are actually fairly easy to put together. They’re
a bit more faffy than the Noctuas, but they are still pretty hefty and look nice on there
when they’re installed and they haven’t got that horrible beige colouring about them.
The motherboard – it’s huge! I mean, it’s the size of many people’s apartments. Which
also led me to problems trying to line up all the different posts for installing it.
Once that was sorted though, it went in nice and easily. It seems to be really, really well
built in terms of engineering and build quality. I didn’t manage to break it
during the build which is always a nice surprise. So it was fairly easy to get everything set
up and installed on that. I’ve only got the 32 gig of RAM at the moment
but it’s nice to know that I’ve got eight slots should I wish to put a terabyte of RAM
in there in the future. So putting it all together was easier than
I thought. Doing an initial power up to the thing I was fairly sure that it was not going
to work. I thought the CPUs could be broken, I thought I could have missed something because
it’s such been such a while since I’ve built something it and the boot takes so long to
come on but bang – there you are! First power on and we’re into the BIOS. It was a miracle!
Once that was done it was just a case of actually tidying up all the cables inside. The Enthoo
Luxe makes it really, really easy because you can route them all nice and neatly through
the back because there’s already cable ties attached and a load of cutouts for you to
route your cables. If it had been this easy 15 years ago my computers might have looked
somewhat different in the 1990s. I might try and put a link in the video to show what some
of my early ones looked like. At the end of it – LEDs all turned on, all
set up – I think it looks really nice. I’m really happy with the outcome of it bearing
in mind this is the first sexy build I’ve ever tried to do.
So the CPU power is the main thing in this machine for me. I’ve said before it’s not
necessarily the individual core speed but the number of cores that I want to have running
at any point in time for different tasks. So the first thing I wanted to make sure was
that the CPU and this thing was rock-solid – especially considering it was the engineering
spec. So first thing I did was run a Cinebench test and see how well it held up. Now, actually,
it did phenomenally well at this. This is a real-time playback of a Cinebench test that
I did. This isn’t the highest-scoring one, I’ve actually had even higher scores but it
takes maybe ten seconds to run through, and coming out the other end of the Cinebench
test we get crazy results. There’s barely any benchmarks that are this high for it.
This is just using CPU rendering. Now if you take a look at that we’ve got a score of almost
3,000 on there. I’ve had a couple of scores peak at over 3,100 for Cinebench – which is
just nuts! The next thing was to make sure that, actually,
the CPUs were stable. Now, considering the amount of heat that can be put out by this
thing and the amount of operations, and not knowing how good the Arctic coolers were, I wanted
to put these things through some rigorous tests.
So I used Prime95 for burning-in the CPUs. I’ve only got a small sample of it here
but I actually burned this in over the course of eight hours with all of the cores running
on Prime95. Now one of the two CPUs runs at about 40 degrees when it’s not active, the
other one runs at about 48 degrees when it’s not active and they both peak about 10 degrees
higher than that. So an absolute top end heat of around about 60 degrees from one of the
two packages and about 50 degrees from the other. And that’s after hours and hours of
everything being under load, and these things are rock-solid. Not to mention the fact – look
how many cores are in task manager! I have never seen so many cores on a server before
and just the fact that I’ve got 56 logical cores (that 7 by 8 grid of CPUs) really excites
me. These run as standard at just over 2.1GHz.
When they’re all maxed out, turbo boost kicks in so that all of the cores push themselves up
to about 2.5GHz, and this particular CPU can actually boost individual cores to over 3GHz if
there’s just a few cores running. So in normal usage I actually find that most of my active
cores run at about 3.2GHz.
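For anyone without Prime95 to hand, the same idea of the burn-in above – pinning every logical core with sustained arithmetic – can be roughed out in a few lines of Python. This is a crude sketch only; Prime95’s torture tests are far more thorough:

```python
import multiprocessing as mp
import time

def burn(seconds: float) -> float:
    """Spin one process flat-out on floating-point work for `seconds`."""
    deadline = time.monotonic() + seconds
    x = 1.0001
    while time.monotonic() < deadline:
        x = x * x % 1e6 + 1.0001  # arbitrary bounded FP churn
    return x

if __name__ == "__main__":
    n = mp.cpu_count()              # 56 logical cores on this build
    with mp.Pool(n) as pool:
        pool.map(burn, [2.0] * n)   # 2 s here; a real burn-in runs for hours
    print(f"stressed {n} logical cores")
```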
on my system has to be the speed of my SSD. The CPUs are obviously phenomenally important
but with high I/O database operations that I run and a lot of virtual machines it’s really
important to me to have great disk I/O. So I want to make sure that my main drive in
this thing was lightning fast, and with the Samsung 950 Pro M.2 512GB you can see the speed
I’m getting out of this. A sequential read of over 2 gigabytes per second from an SSD.
Two GIGABYTES a second. To get that speed on my network, I would need everything fiber
optic and top end. It’s not even possible to tax it and the write speeds are just as
bonkers. It doesn’t matter which benchmark I put this through, the speeds that come out
of that are phenomenal. However, that’s only after I upgraded from using the on-board
M.2 socket. The Asus motherboard (or A-zoos motherboard rather) that I’m using actually
runs only x2 PCI-E lanes to the M.2 socket, so I had to get a separate PCI-E card. I
went for the StarTech one because I find that the StarTech cards are incredibly good quality.
Got one of those, popped it in and this thing shot up immediately to over two gigabytes
a second of transfer, just phenomenal. This thing should do everything I need it to do. So what about everything else on the system?
Well, the GPU is distinctly average. I don’t need anything faster at the moment.
If any of the work I’m doing needs me to do CUDA development with more cores I will
probably invest in a couple of SLI cards at a future point, but not right now. The memory
again, I haven’t done any particular benchmarks for it – it’s DDR4-2400. It seems pretty good
and it’s certainly survived all the burn-in tests I’ve done.
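If you ever wanted a very rough feel for the memory, a few lines of Python will give you a ballpark figure for bandwidth. This is illustrative only – a single thread won’t come close to saturating quad-channel DDR4, and dedicated tools measure this properly:

```python
import time

# Very rough single-threaded memory-bandwidth probe:
# time one full copy of a 256 MiB buffer.
SIZE = 256 * 1024 * 1024
src = bytearray(SIZE)

start = time.perf_counter()
dst = bytes(src)              # one complete pass through memory
elapsed = time.perf_counter() - start

# Count both the read and the write traffic of the copy.
print(f"~{2 * SIZE / elapsed / 1e9:.1f} GB/s single-threaded copy")
```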
But what you’re watching now is how long it takes to boot up the system. We’re currently
at about 30 seconds in and we’ve not even got anything on the screen yet. Now this is
the nature of server motherboards normally, they do take a long time to start up. But
this is a workstation motherboard, and the problem is how long it takes whenever I do a reboot on
it. I actually have time to go downstairs, put the kettle on, come back
up and I haven’t even got as far as having the logo on screen. This certainly isn’t a
consumer motherboard, if I’d gone for a 6700 processor, a consumer-grade board, I would
have been into the operating system over a minute ago, considering how quick the CPU
is. We’re at one minute eight seconds – on a consumer board that would be when I was at the desktop and I’d
had a minute to sit there drawing smiley faces in Paint – but we’re still not even as far as
Windows starting to boot up and that is the biggest problem and the only real problem
I’ve had with this machine. Now I don’t reboot very often, so it’s not too bad but whenever
I do, I feel frustrated by it. I really wish that for a workstation board they’d invested
in making it just a little bit quicker to boot, rather than scrimping and
not bothering to modify the server variant that they had. Other than that, this thing’s
phenomenal – even without fast boot, as soon as we’re through the BIOS we’re
actually into Windows in under 10 seconds anyway. But that initial part is SO slow, and
I’ve actually tweaked this – there was more turned on to start with – this is genuinely
as good as I’ve got it so far. However, saying that I am loving this system.
It’s working really well, and across all the benchmarks I’ve done it’s performing at or exceeding
what I would have expected in every area. So this has been a really fun build for me.
It’s been a long time since I got to really go to town building something that
was just for me rather than to spec it around my business needs, and this did get a little
bit crazy. What started out as a conversation about whether I should go for a previous-generation
Haswell-E 6-core or a Skylake 6700 with four cores suddenly ended
up in me building a 56-logical-core machine. How that happened – I don’t know. But I’m very glad
I did. Yeah I’m amazed this thing booted first time
and it’s been rock solid. Those engineering chips seem to have worked really well. I’ve
put them under heavy load for stress testing, done a lot of maths testing on them, and they’re
consistently reliable. The cooling has worked really well – it’s stayed really cool; my room
is like an oven, but the chips are cool and it’s quiet. You can barely hear the thing;
compared to what I’ve been suffering with my Apple laptop for the last nine months, I
am over the moon! And the hard disk is just bonkers. So this I’m really happy with, from
a workstation perspective. It performs really quickly, it deals with heavy workloads that
I chuck at it really well. It’s also working really well for video rendering, which I’ve
never had to do before, so, hey, good use of it for starting to do this. The only thing I have a bit of a problem with
is the motherboard. Now, I’m never over the moon with any motherboards that I’ve had.
I have yet to be wowed by one in many, many computers that I’ve built and this one’s no
exception. It has a lot of ports, it’s built beautifully. I also really like the on board
diagnostics on it – fantastic – and the ability to update the BIOS separately. There are loads
of really cool features in it. The thing I don’t like is, how long it takes to boot up.
Now a server motherboard that takes four minutes to boot up isn’t that rare. But I still feel
that for a workstation board they could have done better. If you take a look at the Dell
Precision workstations that you can order, they boot up in pretty much standard PC boot
times, 2-3 seconds extra. You get to your desktop pretty quickly. This thing takes forever
to power on. Now I’m not often going to shut the thing down, I’ve got a bunch of network
virtual machines on there, which act as secondary domain controllers, DNS. I’m not really gonna
wanna shut the thing down. It’s gonna stay on pretty much 24/7 but whenever I do reboot
it I’m gonna be grumpy at the fact it takes seven hours to boot. And the fact that this
is the most expensive Asus motherboard for a dual Xeon workstation and it doesn’t have
an x4 M.2 socket, whereas the cheaper D8 does. Now that really did frustrate me. It took
me a while to figure out why, because when I was originally speccing it
up with the D8 I looked and saw the x4 and I didn’t even think to check on the motherboard
spec whether there would be an x4 on the more expensive model. My mistake, but that was a bit of a
frustration. Also love the case, LEDs are cool, everything’s purple. I’m loving it.
All I need to do is sort out this mess of an office now. So I hope you found that useful and I hope
you found my build slightly interesting, compared to a lot of the gaming ones that are out there.
Don’t get me wrong, this won’t be able to play anything terribly well but I’ll have
fun with my software development. Hope you enjoyed this and thanks very much
for watching. I’d love to know what you thought about this, leave your feedback in the comments
below and if you’d like to watch more of my videos please hit subscribe. [music] Thanks for watching please check out some
of my other videos. Don’t forget to subscribe. [music]

61 thoughts on “Epic Xeon Workstation Build – 28 Cores / 56 Threads”

  1. When I got my two Intel Xeon E5-2697 v3 chips for my build (14 cores each) I wondered the same thing… A friend of mine that knows a thing or two said the slow start up times are due to the bios structuring and recognizing all these cores in not one, but two processors. Also, the ratio of cores to ram makes a difference too. If this is not correct, please let me know by replying to this post. Hope this helps… (I'd suggest you check the wattage usage on your system during your typical load times, such as your virtual'n… The board and chips can be damaged by overwhelming the power supply if it's not able to keep up the demand with a stable current. Maybe you don't need more than 750 watts due to your little use of the video card)

  2. Just gotta ask, are you running the ASMB8-iKVM module? My Z9PE-D16 is slow as F! with the AMSB module plugged in, without it, it POST's as quickly as any other motherboard.

  3. This is the exact system I want to build but I expected a bit higher on cinebench with the single chip score of 1774 (heavy scaling) Maybe you can crank up the Bclk? Thanks for the info on the M2, that was a surprise to me. Also should have gotten a little better graphics cause you well never know if you get into the mood of a game of GTA5 (Ha ha ha).

  4. Nice rig! BTW, I was almost certain that you would run FreeBSD!
    http://420.thrashbarg.net/im_a_pc_mac_linux_bsd_tronguy_hippie.jpg

  5. Hi,
    I bought the same motherboard, the thing is that i put a pair of xeon e5 2683 v3, 64gb ram crucial .. i put all together and when i turn it on its take maybe 5 min to get windows.. after i installing windows i install the mother drivers and the pc never get to windows even in safe mode the pc restart.. could you give me some advice?
    Thanks

  6. I used to use a Macbook Pro (late 14 model, i7 512gb SSD) and even running Photoshop for editing made it run pretty hot. I was batch processing 20mp RAW files for 360 photography. Coupled with sluggish performance for web development (my IDE's were pretty slow to scan, hint and work with) I ended up using Sublime Text. Which I still do on my Air. But I'm making a purchase for a Dual Xeon setup which will be mixed usage – fractal rendering, photography and linux development. Great video explaining everything too.

  7. have you thought about adding 4 ram stick per cpu?. I've read that your config with only 1 ram stick per cpu really limits your performance.

  8. I was wondering, could you use the Ai Overclocker in the Bios? I could only get it up to Level 2. I cant get windows to load or even in to the bios under level 3.

  9. Hey there. Thanks for the great video. It's very helpful. Tempted to take the leap and get a couple ES CPUs. Where you able to buy 2 ES CPUs with matching stepping numbers from Ebay? Thanks again!

  10. Sample units do not include boost mode. So frequency will be always 2.0Ghz. If it is retail unit it will have boost frequency 3.2Ghz. That is massive amount more power. Maybe over 9000! So i do not know do you really have sample units or not if you see 3.2Ghz. http://www.cpu-world.com/sspec/QK/QK8Z.html

  11. That case wont fit my Intel workstation board w2600cr. Had to get a Cooler Master Cosmos II. I'm running 2x E5 Xeon 2680's V1

  12. Hi guys, can I use Win10 Home for dual CPU system? I've z10pe d8 2x xeon 2683 v3 but I can't make them run together. In BIOS I see both processors as well in Windows device manager (58 threads) but in Task manager or CPUZ is only 1 CPU (28 threads). Single CPU is working fine. What I'm doing wrong? I appreciate any help that you can provide

  13. Hello Guy, Iam getting ready to buy the asus mobo although imam fearing of the loooong boot times. I too will use the nvme ssd and i am planning to use v4 xeons and 2400 ddr4 ram. Ive read some horror stories regarding this board so i would like to know your opinion. My runner up board would be supermicros X10DAX but i am unsure wheather that one would boot from the samsung evo 960 throught the adapter…..ive been very happy with asus board so far but this one gives me creeps cos i am building 9000 dollar workstation and dont want or cant affor to mess up or end up in a rma loop…..woudl you please share your thoughts on this?

  14. I have a Xeon 2683 which turbos to 3ghz when gaming and 2.5ghz all cores turbo. I got the OEM part $450. This video gave me a big erection.

  15. Have you considered running a RAM disk? That would be 10x faster than even your SSD…

    I have 2 SSDs in RAID 0, so I don't have to wait too long for the RAM disk being read/written from/to physical disk when starting the system or shutting it down…

  16. Guy,

    Much like you, I have been a dedicated Apple user and I haven't had a Windows machine since XP. I feel as though Apple has gone down hill and I am considering switching but I am very hesitant. I had a couple questions.

    Are you concerned with privacy with Windows 10, and now that's it been 5 months, are you still content with switching to Windows?

  17. I had the top spec 2012 MacBook Pro Retina with Nvidia card and the higher clocked i7… Long story short, It shat itself and Apple replaced it with the top spec mid 2015.. again i7, yes newer architecture but this has an AMD card and i suspect thats why its always running so hot.. i downloaded TG Pro and much to my surprised the CPU kept hitting 98 degrees and at times the AMD card would get really hot… sometimes all i was doing is watching YouTube videos…. Major issue was the CPU though…. I thought it was only my machine… the 2012 MacBook Pro Retina was perfect almost never heard the fans even in some of the adobe suite applications. Time to sell it and move on i guess… Id downgrade since i used it for less intensive task these days and let my PC take care of them but the New 2016 13" MBP only come is 8gb RAM unless you custom order it… AND its DDR3 and 1866mhz .. like wtf? then the price.. and no native SD card support … I'm done.

  18. Just for information I have an Asrock EP2C602 that takes 20 seconds before seeing anything on the screen. I see the Windows 10 logo at 30 seconds and fully booted up at 45sec. That is using Windows Boot Manager because it won't boot directly from the hard drive because of the Intel 600P Series cannot be converted from GPT to MBR (or the other way around, can't remember)

    Cinebench score of 1500 with 2 E5-2628l V2. Not the greatest score since I can get over 900 with single X5660 ASUS P6X58D-E LGA 1366 but went in favor of lower TDP. Maybe price will drop on the V2's as more of the interest will shift towards the V4's and grab a pair of 2670 V2's..

    I would put up with the slow boot times for your setup 🙂

  19. that arctic cooler is really underrated, its priced lower than the coolermaster 212 and works better. plus with it being semi passive, the fans turn off when not needed

  20. If i had a need for many VMs i would buy a sidekick computer and use both with a 10Core cpu as this scales currently better economically and i would go for DDR3 which is as fast but still costs only 1,5 Euro per GB now on EBay. The only use case i would see for me is huge compilation. If you ever worked on Chrome source code you know you can't have enough cores. I'm not doing AI or maths so i dont know about that. And with just a 512G disk your dataset can't be too large anyway.

  21. Thanks Guy.  Very informative and creative video (the filming of the build was excellent).  Did I miss it or did you mention what kind of expenses you incurred on this build?Keep them coming, thanks again.Christopher

  22. arghhh that boot time!!!!
    I'm getting frustrated just by watching a video about how long it takes to boot.

  23. These boot times must be something specific to the D16 board or that board when running Broadwell-EP procs.

    My Z10PE-D8 WS board from cold boot into Windows is around 10 s or so.

  24. The slow boot is most likely because the firmware waits for the BMC to boot so that you can remote control the machine from the very beginning. Workstation and server boards also have more tests turn on and aren't optimized for booting fast. My server has a BMC and cold boots are insanely slow. You could use fast boot to make Windows or what ever OS you want to use, to boot faster once firmware and BMC is done.

  25. Great Video ! Running a desktop Dell I5 3rd Gen. LGA 1155 @ 3.2 GHz… The LGA 1155 Socket maxes out at four cores… And may up-grade to a XEON E3-1270 V2 @ 3.5 GHz (still only four cores but eight Threads). Must be fun at a party unless you bump into some one from a National Lab with a Super Computer… Thank you for the video ! tjl

  26. I built a system like this at the end of 2012. Only 24 threads on mine though 🙁 Still using it now and love it still. Best development machine ever.

  27. Bro, you talk so much about cores this and that. Blah-blah blah-blah 😐🔫 shut up already and build, Jesus christ

  28. Is there compatibility between the Asus Z10PE-D16 WS Motherboard and an NVIDIA Titan RTX 24GB GPU graphics card?

  29. You MAC boys flush SO MUCH $ up Apples ass, then run crying back to the PC/Windows bed where prior you were hating and talking shit. FUK off!
