10Gb Home Network (P1) – Introduction


Welcome to the first video in our Ten Gigabit Home Area Network series. I’m going to cover five major points about why it is finally time for ten gigabit networking at home and in the small business office. I’m going to be as brief and realistic as possible. Then, in the following video, I’m going to show you how to do it. It’s essential to watch this video so you understand the important concepts necessary to blow the doors off your tired one gig networking gear. So let’s get started.

What:
Ten gigabit networking has been around for a while now; in fact, 10Gb fiber was introduced back in 2002. Then in 2006, SFP+ Direct Attach Copper (DAC) hit the market and grew in popularity because it was cheaper, with lower latency and lower power consumption. Also in 2006, 10GBASE-T copper was released, which requires Category 6, 6a, or 7 twisted-pair cable. These cables showed slightly higher latency and higher power consumption, in addition to a bulkier cable profile. Many believe these factors are what led to the greater adoption of Direct Attach Copper. However, as switch vendors work to lower the power consumption of 10GBASE-T and pricing comes down, many will start to consider 10GBASE-T over Direct Attach Copper, as it’s more flexible from a structured cabling point of view. So currently it’s a trade-off: low latency and low power (Direct Attach Copper) versus longer runs and custom cable lengths terminated in-house (Cat 6, 6a, and 7).

Why:
Why do I need a ten gigabit network for my Home Area Network (HAN) or Small Office/Home Office (SOHO), you ask? Well, there are many reasons, and I’ll list a few of them here for you to chew on. Obviously it’s not for everyone; that’s a decision you will have to make for yourself. I’m just presenting the case for why I feel it’s finally time to hit the “10G Turbo Button” on your pent-up PC.
First, I’ll start with the 800-pound gorilla in the room, or should I say the 1/10th-pound SSD in the PC. Solid State Drives have revolutionized storage transfer speeds across the consumer spectrum. Add to that PCIe SSDs, mSATA, M.2, and the many other SSD-based technologies to follow. Let me put it this way: a single Western Digital Green drive can already max out a one gigabit network link! What?!? You say that’s crazy talk? Nope. Western Digital’s specs for a Green drive currently show sustained transfer speeds of 123 MB/s for the 3TB model, and 123 MB/s is 984 Mb/s. There goes your one gig link, up in smoke!
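To make that arithmetic explicit, here is a quick back-of-the-envelope check (a plain Python sketch; the 123 MB/s figure is from the spec sheet quoted above):

    # Convert the drive's sustained throughput from megabytes to megabits
    # per second and compare it against a nominal gigabit link.
    drive_mb_per_s = 123                    # WD Green 3TB sustained transfer
    drive_mbit_per_s = drive_mb_per_s * 8   # 1 byte = 8 bits -> 984 Mb/s
    link_mbit_per_s = 1000                  # 1 GbE line rate

    print(f"drive: {drive_mbit_per_s} Mb/s, link: {link_mbit_per_s} Mb/s")
    print(f"link utilization: {drive_mbit_per_s / link_mbit_per_s:.0%}")  # 98%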
Second, RAM matters! Have you ever noticed on vendor spec sheets that they sometimes list the buffer-to-host transfer speed separately from the drive-to-host (sustained) speed? Well, if we look back at the “buffer to host” line, that same Green drive says 6 Gb/s. Really? Sweet! Oh yeah, and you have your operating system caching files on top of that too! And here’s the kicker: your operating system is not limited to 6 Gb/s; that is a limitation of the SATA III interface. What I’m trying to say is that your various levels of cache provide an incredible boost on top of your raw storage drive speeds. So let’s review: we are currently somewhere north of 6 Gb/s transfer speeds.
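A rough way to see the cache effect yourself is to time the same large file read twice; the second pass is usually served from the OS page cache rather than the disk. A sketch, assuming a test file you created beforehand (results vary by OS and hardware):

    import os, time

    PATH = "testfile.bin"   # hypothetical test file, e.g. a few hundred MB

    def timed_read(path):
        # Stream the whole file in 16 MiB chunks and return MB/s.
        start = time.perf_counter()
        with open(path, "rb") as f:
            while f.read(16 * 1024 * 1024):
                pass
        elapsed = time.perf_counter() - start
        return os.path.getsize(path) / 1e6 / elapsed

    print(f"first read : {timed_read(PATH):8.0f} MB/s")  # disk, if not already cached
    print(f"second read: {timed_read(PATH):8.0f} MB/s")  # page cache, often several GB/s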
Third, there are a lot of miscellaneous considerations depending on what tech you employ at home or in the small office. For example, are you using RAID-based storage to serve up your data? Depending on how the RAID is implemented, it could be pushing a ton of data, but you’ll never see that speed because you are capped at 1 Gb/s (125 MB/s).
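As a hypothetical illustration (the per-drive figure below is an assumption, not a benchmark), even a small striped array outruns the link:

    # A modest 4-drive stripe versus what gigabit Ethernet can deliver.
    drives = 4
    per_drive_mb_s = 150                   # assumed sequential read per disk
    array_mb_s = drives * per_drive_mb_s   # ~600 MB/s aggregate
    link_cap_mb_s = 125                    # 1 Gb/s expressed in MB/s

    print(f"array supplies up to {array_mb_s} MB/s; "
          f"the wire caps you at {link_cap_mb_s} MB/s")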
But what about Link Aggregation (LAG) or teamed NICs? I should be able to combine multiple interfaces to get more than 1 Gb/s, right? Yes and no. There’s a big caveat here: you still can’t exceed 1 Gb/s if you are only connected using a single session. Now, if you are transferring over the link aggregation pipe and someone else makes another connection across the same pipe, they will have access to another 1 Gig slice of that aggregate. Keep in mind that protocols play a major role in performance and network utilization. After all, this is a complicated topic. You didn’t think this was going to be easy, did you? In upcoming videos, we are going to walk you through a few strategies to squeeze the remaining bits from this sometimes ugly tech lemon.
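The single-session cap comes from how LAG hashes traffic onto member links: every packet of a given flow is pinned to one physical link so packets arrive in order. A simplified model of the idea (real vendor hash policies vary, and may also mix in MAC addresses or ports):

    # Simplified flow-hashing model used by LAG/bonding: every packet of a
    # given flow hashes to the same member link, so one session never
    # exceeds a single link's speed.
    def choose_link(src_ip, dst_ip, links=2):
        return hash((src_ip, dst_ip)) % links   # stand-in for the vendor hash

    # One client transferring to the NAS: always the same link.
    print(choose_link("192.168.1.10", "192.168.1.20"))
    # A second client may land on the other link, adding another 1 Gig slice.
    print(choose_link("192.168.1.11", "192.168.1.20"))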
Who:
Okay, now we are going to talk about who would benefit from 10 gig connectivity. I’m not going to spend much time on this topic, since you probably have a good idea by now if you are interested in exploring what 10 gig performance can do for you. This video series is targeting your average geek, small businesses (SOHO), technology enthusiasts, gamers, and really just people who enjoy working with stupid fast speeds. If set up correctly, you’ll be transferring files pointlessly just to partake in the sheer power and awesomeness that is 10 Gig networking.
When:
When should people demand 10 gig at home? Right now! Do you like massive amounts of performance even if you can’t justify it? Absolutely! Let’s review: the Direct Attach Copper and 10GBASE-T standards, along with available hardware, have been in place since 2007. That was 8 years ago! Seriously, why am I not using this in my business or at home? Eight years is a long time in the IT world, and a lot can happen in that timeframe. Case in point: it would take 14 hours 39 minutes to transfer 6 TB over a 1 Gig connection. With a 10 Gig connection, that time would be reduced to 1 hour 28 minutes. If you are wondering, I pulled these figures from an online calculator. Of course these numbers are based on pure calculation, but you get the point.
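Those times line up with pure line-rate arithmetic if the calculator counted 6 TB in binary units (TiB); a quick check:

    # Line-rate transfer time for 6 TiB, ignoring protocol overhead.
    bits = 6 * 1024**4 * 8            # 6 TiB expressed in bits
    for gbps in (1, 10):
        seconds = bits / (gbps * 1e9)
        print(f"{gbps:>2} Gb/s: {seconds / 3600:.2f} hours")
    # -> about 14.66 hours at 1 Gb/s and 1.47 hours at 10 Gb/s,
    #    matching the calculator's 14 h 39 m and 1 h 28 m.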
Where:
Alright, so now you’ve decided you want a piece of this delicious 10 Gig pie. I’m going to talk about which ingredients to use, and where exactly to use them. You will need to identify the location in your network where you stand to gain the most benefit, and work outward from there. Using my Home Area Network as an example, I identified three main systems that would see substantial gains by migrating to a 10 Gig pipe: my workstation, my VMware server, and my Network Attached Storage (NAS) are poised to take full advantage of 10 Gig speeds. My workstation is using SSDs, the VMware server has local SSD storage, and my NAS is using software RAID in the form of ZFS. Now that I’ve identified which systems will be upgraded to 10 Gig, I need to consider connectivity. The goal is not to replace your existing 1 Gig connections with 10 Gig links, but instead to add a 10 Gig port to each system. What we want to do, in effect, is create a separate “storage” or “high speed” network. This will simplify troubleshooting and make implementation easier in a number of ways: it eliminates path issues and isolates your high speed network, keeping chatter to a minimum.
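Once a dedicated link like that is in place, it’s worth verifying you actually get 10 Gig throughput. Tools like iperf are the usual choice; as a minimal sketch (hypothetical storage-network address 10.0.0.1, arbitrary port 5201), a Python probe could look like this:

    import socket, time

    # Minimal throughput probe for a point-to-point link. On the far side
    # (e.g. the NAS), run a sink that accepts and discards data:
    #   srv = socket.create_server(("10.0.0.1", 5201))
    #   conn, _ = srv.accept()
    #   while conn.recv(1 << 20):
    #       pass
    # Then run this sender on the workstation:
    payload = b"\0" * (1 << 20)                 # 1 MiB chunks
    sent, start = 0, time.perf_counter()
    with socket.create_connection(("10.0.0.1", 5201)) as s:
        while time.perf_counter() - start < 5:  # push data for five seconds
            s.sendall(payload)
            sent += len(payload)
    print(f"{sent * 8 / (time.perf_counter() - start) / 1e9:.2f} Gb/s")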
Join me in the following video, where we start with a simplified configuration. You only need two hosts to talk, and this is known as peer-to-peer. I will circle back around in the third video of this series and cover the scenario above, where I described connecting three systems together. In that video I have an awesome tip to share. If you want to keep things affordable and upgrade your geek badge in the process, make sure you continue to the next video in this series on the topic of ten gig peer-to-peer networking.
Don’t forget to like this video, share it, and subscribe to get the follow-on videos! Your support is tremendously appreciated, and I always look forward to your comments.

88 thoughts on "10Gb Home Network (P1) – Introduction"

  1. Hey everyone! I've been working on this one longer than I wanted to, but I'm extremely happy with the result. I really hope you enjoy it. I put a tremendous amount of work into all the content, and had a blast doing it. Subscribe to get the additional videos that follow. Enjoy!

  2. Great video!
    When is the next video? I want to know because I have a full box of 10GbE network cards at home. All the cards are from a friend, but he said I can play with them 🙂

  3. Thanks for the nice work, I learned a lot. I have been wanting to build my first home wired network and this video inspired me to dream big! /cheers!

  4. Nice video! You got one more subscriber! I also shared the video with my IT friends 🙂 Please publish your next video soon.

  5. +1 subscriber here! Great video, easy to understand! Can't wait to see the next one!

    Best regards from Sweden 🙂

  6. You can get 10GbE relatively cheap with two Intel X520-DA1 cards for around $100 and an SFP+ direct attach cable, which can range from $10-$30 depending on length. I have my desktop connected to my 6TB RAID 10 server and I get around 7Gbps. With perfect optimisation you can achieve around 9.2Gbps; you'll never reach 10Gbps. The same bottleneck can be seen on 1Gbps, where you will only get around 920-930Mbps max.

  7. Great work here. I'm an IT pro focused on IT education, and I wanted to give you props for it. Can't wait for the next one in the series.

  8. Nice video & looking forward to more. New subscriber! I'm an IT guy and a software guy… and I kind of doubt some of the claims about cache-backed drives. As a software guy, I have written a lot of sockets-based software, and typically with large data transfers over a network, the initial short bursts at higher transfer rates can, I think, be attributed to those caches, but once a large transfer is underway and the cache size is exceeded, I think the cache has little to no impact on performance. After all, a 10Gb network really benefits these scenarios… Also, as an aside, I was the lead engineer for the first Cisco commercial 10Gb network. It was a US government project for one of their combat colleges, circa 2002. Things certainly have come a long way since then 😉 Great channel!

  9. Excellent video. It probably took you a lot of time to finish it, but the result is great.
    Subscribed.

  10. 10Gb Home Network (P2) – Peer-to-Peer: Update:
    https://www.facebook.com/itechstorm/posts/1740581282823775

  11. But Mb or Megabits are smaller than MB, Megabytes. The conversion that you did wasn't necessary besides the fact that you tried to multiply Megabytes by 8 to convert to a smaller unit. The point at 2:33 is wrong. The single WD Caviar Green drive can saturate maybe around 10% realistically of the 1GB network connection since 123MB / 1024 MB(1GB) = 12% . But you could boost the speed with RAID 0

  12. Correction: no way in hell you will get sustained 125 Mbytes/sec from a spinning 7200 rpm drive; you might get an occasional burst of data at SATA3 speeds, but good luck otherwise.

  13. The price is still insane when you can have USB 3.1 with 10Gbit for $20, but network cards have cost $200 for the last 5 years.

  14. Subscribe and Like my Facebook channel for frequent updates on the release of my latest videos!
    Check it out:
    https://www.facebook.com/itechstorm

  15. Awesome series, I was thinking of doing something similar, but wow, you did this very nicely! I will certainly recommend it to anyone looking for something similar! 😀

    I released my own video about a 10Gbps backbone and router on a budget a few days ago myself.

  16. Ah ok, I see what you mean. I would actually quite like to get a media server. Last month I ordered a 2U HP DL380 G5 server off eBay, but the item was listed wrong and the server actually had only 1 CPU and 8GB of RAM rather than 2 CPUs and 24GB. There were lots of other things wrong too. Anyway, that got sent back and I got my full money back. Point is, I now have money to spend and want a media PC to run a Plex server on, and also Active Directory, as I want to mess about with that. I will probably run ESXi and have them set up in separate virtual machines. What are the specs of your PC that runs ESXi? So my question to you is: should I buy one of those HP MicroServers or an HP ProLiant ML-series tower server? (I don't want a rack server now.) I want a low-power, fairly quiet, 24/7 media PC. But I don't want a NAS, as I want to have PCI Express x8 slots, multiple of them, to make it possible to put a 10Gb NIC in and upgrade to a 10Gb network in the future. So any ideas, should I buy or build (from new and/or old PC parts, probably from eBay) a media PC? Any opinions or help would be amazing. Thanks.

  17. Nice video. With the dual-port cards, is it still possible to use link aggregation/teaming? I know you need Windows Server to do it with Windows, but for Windows 7-10 is there any software available from Mellanox to achieve this?

  18. Can't we now do more than 1Gbps to a single PC with SMB 3.0 Multichannel and multiple 1Gbit links? It no longer limits a single host to 1Gbps.

  19. Hi Techstorm, I want to set up 10Gbps for my gaming cafe (for CCBoot), e.g. connect my server with a 10Gbps card to a 10Gbps router. Can you help with the cheapest quality network to do that?

  20. Why? You mention numbers about what is good about it, but what real-world use does it have? Also, your cache speed number only applies to what is in cache; on that drive, that's 64MB, so you have 6Gb/s for the first 64MB and then it's doing direct reads. I like the video, but your reasons for "why" don't seem to add up.

  21. 2007 is now 11 years ago… I still have 10Mb/s down and 0.75Mb/s up over DSL, & no ethernet lol. America is screwed internet-wise.

  22. I work at Aquantia and we supply Intel, Cisco, Dell, HPE, Lenovo and even Apple with our 5G and 10Gbps Ethernet controllers. We will soon be launching a PCIe 10G Ethernet adapter for under $100. Loved the video. It's all about market timing. We all want faster network speeds; various pieces of the ecosystem are there, and others are coming quickly behind. Join the 10G revolution!

  23. 1. I don't have the Money for 10Gb Ethernet NICs and Switches
    2. I don't have the Money for a 4TB SSD (I use a 4TB HDD in my NAS)
    3. I don't have a PC that can handle 100% 10Gb Ethernet

  24. Nice video. A lot of clicking in the audio. Are you using a noise gate or just really chopping up the audio? Leave your inhales in before speaking in the audio track. It makes for much more natural-sounding dialog.

  25. Yeah, I'm going to build a 10G LAN so my daughters and I can play State of Decay 2 without lag.

  26. Great video, but you definitely need to clarify some stuff. In this video you mix SAN (storage area network) and LAN (local area network), with all of their purposes and standards, together, where realistically they are separate systems and don't work together. With switches, SFPs, and network cards, there are Ethernet-based and Fibre Channel-based parts. Fibre Channel is exclusively for SAN and Ethernet is obviously LAN. The two won't mix, and that can result in wasted money on parts or confusion when it doesn't work. A SAN is only for providing storage and cannot be used for regular network traffic. At my office we have a 10Gbps LAN with fiber Ethernet switches to connect servers together and run one line to 20 PCs; we also have a 10Gbps Fibre Channel SAN fabric which is used to provide storage for VMs and thin clients. They are separate systems used for very different purposes. If you want more information about this, feel free to contact me; "Back Yard Tech" and "Eli the Computer Guy" both have great SAN vs LAN videos.

  27. I've actually gone backwards. I had a 1Gbit ethernet in 2005, but now I have a 100Mb ethernet because that's what my ISP's modem/router does.

  28. He's not considering ROI at all; yes, 10Gb is awesome, but it's still relatively expensive to implement in mid-2018, even just adding a 10Gb NIC to your NAS and computer and doing a direct-connect crossover; so it's not even really networking… My opinion is 1Gbps is still perfect for home; no home user (with exceptions, such as leasers / etc. that essentially operate a MDW out of their house) will ever get a valuable ROI on upgrading to 10Gbps, but not gonna argue that a home office that serves clients on a WAN, or a geek who doesn't fiscally care, totally might…

  29. So wait, does this mean I have to pay extra to my service provider if this is way more than they give?

  30. Point-to-point between two computers is easy and cheap–two used 10G NICs and a cable. But if you want a third machine, you need a switch (or messy forwarding/routing among the computers), which has been expensive. Mikrotik has a 4-port SFP+ switch now for under $150, making it somewhat more cost-effective to network a few machines with 10G links.

  31. Actually VERY VERY FEW people need 10Gbps networks… interesting, but for home use it's kind of unnecessary.

  32. 4:16 – 5:00 As you mentioned this is a complicated topic but I'm happy you pointed out all factors. I think you probably know already it boils down to the hashing algorithm. Some Juniper platforms support using L4 hashing algorithms that would utilize multiple links for one "session". Although I haven't seen this on any other platform and wish it was available more commonly. Great video!

  33. Yeah, the price is insane… A single PCIe network card is still like $120, and then you have to buy a cable spool, crimper, and all the ends, which adds another $500. So you are looking at $1000 just to do a small 1000 sqft area with 2 computers. It is currently (as of 2/2019) more economical to do LACP with regular cabling on managed switches.

  34. Thank you for posting this video. A lot of useful information. You sir have a new subscriber. Keep up the great work.

  35. Amazing how dated these videos get. Now we have cheap SSDs, cheap RAM, and cheap CPU cores, but weirdly, 10GbE is STILL fucking expensive.

  36. Throw in a DELL 0272F MELLANOX CONNECTX-4 100GBE DUAL PORT NIC for $350.00 and you'll have 10/100/1000/10000/100000 Ethernet capability.

  37. Not just for specific point-to-point connections, but also for the network backbone between switches and other appliances. Having a 1Gb switch in one area and uplinking it to the router or firewall over a fiber or copper 10Gb backbone helps a ton with aggregation and latency, even if the endpoint device, like a workstation or media server, doesn't have a 10Gb NIC. If you had a server on the network with a 10Gb NIC on top of that, you would still see improvements in transfer rate at the client over just having a 1Gb network throughout.
