NVMe and RAID?

IT Discussion
    • 1
      1337 @taurex
      last edited by 1337

      @taurex said in NVMe and RAID?:

      @Pete-S I'd stay away from the 7xx Intel NICs; I've heard lots of bad things on different IT forums about how they play up. The Mellanox NICs would be my first choice for anything with RDMA support.

      I just picked it because that is what Dell sells. It's a simple card, no RDMA, but I don't think RDMA is needed in a file server application like this with huge files.

      I'm surprised to hear that people have problems with it, because it's been around for 5-6 years now and Intel has newer cards as well. You would kind of assume they've worked out the kinks by now.

      Anyway, it's more a proof-of-concept at this point. You've got to have some numbers to play with to see if it's economically feasible for the customer. What you end up with will depend on the budget and what the needs actually are. Switches are also a big cost when it comes to 10GbE and faster.

      And yes, Mellanox is good stuff.
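To put rough numbers on the "RDMA probably isn't needed for huge files" point, here is a back-of-the-envelope throughput check. The ~80% efficiency factor and the file sizes are illustrative assumptions, not measurements:

```python
# Back-of-the-envelope: how long does a huge file take over plain 10GbE?
# Assumption: ~80% of line rate is usable after protocol overhead,
# giving roughly 1 GB/s. File sizes below are arbitrary examples.
LINE_RATE_GBIT = 10
usable_gb_per_s = LINE_RATE_GBIT / 8 * 0.8  # Gbit/s -> GB/s, minus overhead

for file_gb in (10, 50, 200):
    seconds = file_gb / usable_gb_per_s
    print(f"{file_gb} GB file: ~{seconds:.0f} s")
```

With large sequential transfers the link is saturated either way, which is why RDMA buys little in this workload; RDMA mainly helps with latency and CPU overhead on small I/O.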

      • B
        biggen
        last edited by

        I appreciate all the help, guys. Yeah, I'm compiling a price list, but it ain't cheap. The server alone would be about $7k, and that's on the low end with smaller NVMe drives (1.6TB). Then I still have to purchase the switch, and then the 10GbE NICs for the workstations themselves.

        It's a large investment that I bet never sees the light of day. It will turn into "I have $2k, what can you build with that?"
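The price-list math can be sketched as a quick roll-up. All figures here are the ballpark numbers from this thread, and the workstation count is a hypothetical example, not anyone's actual quote:

```python
# Rough BOM roll-up using the illustrative prices mentioned in the thread.
server = 7000       # low-end NVMe server build (1.6TB drives)
switch = 1000       # 12-port 10GbE switch class
nic_each = 125      # used Intel X540 per workstation (high end of range)
workstations = 8    # hypothetical count

total = server + switch + nic_each * workstations
print(f"Estimated total: ${total}")  # -> Estimated total: $9000
```

Even with used NICs and a budget switch, the per-seat networking cost adds up fast next to a $2k budget.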

        • PhlipElderP
          PhlipElder @biggen
          last edited by

          @biggen said in NVMe and RAID?:

          I appreciate all the help, guys. Yeah, I'm compiling a price list, but it ain't cheap. The server alone would be about $7k, and that's on the low end with smaller NVMe drives (1.6TB). Then I still have to purchase the switch, and then the 10GbE NICs for the workstations themselves.

          It's a large investment that I bet never sees the light of day. It will turn into "I have $2k, what can you build with that?"

          FleaBay is your best friend. 😉

          10GbE pNIC: Intel X540, $100 to $125 each.

          For a 10GbE switch, go for the NETGEAR XS712T, XS716T, or XS728T depending on the port density needed. The 12-port is $1K.

          As far as the server goes, is this a proof-of-concept driven project?

          • ASRock Rack board
            ** Dual 10GbE on board (designated by -2T)
          • Intel Xeon Scalable or AMD EPYC Rome
          • Crucial/Samsung ECC memory
          • Power supply

          The board should have at least one SlimSAS x8, or preferably two. Each of those ports gives you two NVMe drives. An SFF-8654 Y-cable would be needed to connect to a two-drive enclosure. I suggest ICY DOCK.

          The build will cost a fraction of a Tier 1 box.

          Once the PoC has been run and the kinks worked out, then go for the Tier 1 box tailored to your needs.
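The SlimSAS drive math above can be sketched as follows. The "each x8 port feeds two drives" rule comes from the post; the port counts passed in are hypothetical examples, not a specific board's spec:

```python
# Each SlimSAS x8 (SFF-8654) port carries 8 PCIe lanes. A typical NVMe
# drive uses x4, so one x8 port feeds two drives via an SFF-8654 Y-cable.
def nvme_drive_capacity(slimsas_x8_ports: int, lanes_per_drive: int = 4) -> int:
    lanes = slimsas_x8_ports * 8
    return lanes // lanes_per_drive

print(nvme_drive_capacity(1))  # one port  -> 2 drives
print(nvme_drive_capacity(2))  # two ports -> 4 drives
```

This is why a board with two SlimSAS x8 ports gets you a four-drive NVMe setup without touching the PCIe slots.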

          • scottalanmillerS
            scottalanmiller @PhlipElder
            last edited by

            @PhlipElder said in NVMe and RAID?:

            FleaBay is your best friend.

            Is that where people trade their pets?

            • M
              marcinozga @PhlipElder
              last edited by

              @PhlipElder said in NVMe and RAID?:

              The build will cost a fraction of a Tier 1 box.

              I love ASRock Rack products, and their support is great, provided they can actually fix the damn issues; if not, you're SOL. My next server refresh will have this board: https://www.asrockrack.com/general/productdetail.asp?Model=ROMED8-2T#Specifications

              • scottalanmillerS
                scottalanmiller @marcinozga
                last edited by

                @marcinozga said in NVMe and RAID?:

                I love ASRock Rack products

                I'm on an ASRock right now. Got another sitting beside me.

                • PhlipElderP
                  PhlipElder @marcinozga
                  last edited by PhlipElder

                  @marcinozga said in NVMe and RAID?:

                  My next server refresh will have this board: https://www.asrockrack.com/general/productdetail.asp?Model=ROMED8-2T#Specifications

                  We just received two ROMED6U-2L2T boards:
                  https://www.asrockrack.com/general/productdetail.asp?Model=ROMED6U-2L2T#Specifications

                  They are a perfect board for our cluster storage nodes, with two built-in 10GbE ports. An AMD EPYC Rome 7262 processor, 96GB or 192GB of ECC memory, four NVMe drives via SlimSAS x8 on board, and up to twelve SATA SSDs or HDDs for capacity, and we have a winner.

                  FYI: we only use EPYC Rome processors with a TDP of 155 watts or higher. Cost-wise, there's very little increase, while the performance benefits are there.

                  EDIT: I missed the Slimline x8 beside the MiniSAS HD ports. That's six NVMe drives if we go that route.

                  • M
                    marcinozga @PhlipElder
                    last edited by

                    @PhlipElder said in NVMe and RAID?:

                    An AMD EPYC Rome 7262 processor, 96GB or 192GB of ECC memory, four NVMe drives via SlimSAS x8 on board, and up to twelve SATA SSDs or HDDs for capacity, and we have a winner.

                    You're probably overpaying with that CPU. Here's a deal not many know about: the EPYC 7302P for $713.
                    https://www.provantage.com/hpe-p16667-b21~7CMPTCR7.htm

                    • PhlipElderP
                      PhlipElder @marcinozga
                      last edited by

                      @marcinozga said in NVMe and RAID?:

                      You're probably overpaying with that CPU. Here's a deal not many know about: the EPYC 7302P for $713.
                      https://www.provantage.com/hpe-p16667-b21~7CMPTCR7.htm

                      We're in Canada. We overpay for everything up here. :S

                      • scottalanmillerS
                        scottalanmiller @PhlipElder
                        last edited by

                        @PhlipElder said in NVMe and RAID?:

                        We're in Canada. We overpay for everything up here. :S

                        And even when you pay a lot, you often can't get things. We tried to order stuff from Insight Canada for our Montreal office and after a week of not being able to ship, they eventually just told us that they couldn't realistically service Canada.

                        • B
                          biggen
                          last edited by biggen

                          Yeah, I have no problem whiteboxing stuff for me (or close family), but when you do it for others, they expect tech support for life. I don't really want to go down that road 🙂

                          But a PoC build may be more in line with his budget needs. Thanks for that @PhlipElder!

                          • scottalanmillerS
                            scottalanmiller @biggen
                            last edited by

                            @biggen said in NVMe and RAID?:

                            Yeah, I have no problem whiteboxing stuff for me (or close family), but when you do it for others, they expect tech support for life. I don't really want to go down that road

                            That's great in an IT setting. Either you as the employee have to support it forever anyway, or you as a consultant get them to keep coming back to hire you.

                            If you think about it, all IT is whiteboxing; just the boxes are different sizes. Sometimes it's tiny, inside the server. Sometimes it's the server. Sometimes it's the cabinet. Sometimes it's the whole company. But to a company, everything we provide as a solution is a whitebox in some sense.

                            • PhlipElderP
                              PhlipElder @scottalanmiller
                              last edited by

                              @scottalanmiller said in NVMe and RAID?:

                              And even when you pay a lot, you often can't get things. We tried to order stuff from Insight Canada for our Montreal office and after a week of not being able to ship, they eventually just told us that they couldn't realistically service Canada.

                              We're creative with our procurement process, so we don't have issues getting product.

                              Insight is tied to Ingram Micro. If Ingram doesn't have it, Insight doesn't either.

                              Our Canadian distribution network used to be quite homogeneous, with all three major distributors having similar line cards. The competition was good, though pricing was fairly consistent across the three.

                              We have a number of niche suppliers that help when we can't get product from the Big Three, always making sure we're dealing with legit product, not grey market. We verify that with our vendor contacts.

                              PING if you need anything. 😉

                              • PhlipElderP
                                PhlipElder @biggen
                                last edited by PhlipElder

                                @biggen said in NVMe and RAID?:

                                Yeah, I have no problem whiteboxing stuff for me (or close family), but when you do it for others, they expect tech support for life. I don't really want to go down that road 🙂

                                But a PoC build may be more in line with his budget needs. Thanks for that @PhlipElder!

                                That's what we do as a business.

                                We've been system builders since day one of MPECS in 2003, and since the late 1990s for myself.

                                We have a parts bin full of broken promises.

                                But we also have a defined solution set that we know works, so we run with it.

                                Our support terms are clearly defined and require a contract.

                                We either build a mutually beneficial business relationship, or it ain't gonna happen. We don't do one-offs unless there's a good reason to.

                                • B
                                  biggen @PhlipElder
                                  last edited by biggen

                                  @PhlipElder

                                  The ROMED6U-2L2T is mATX? What's the advantage there over a full-size ATX board?

                                  • scottalanmillerS
                                    scottalanmiller @biggen
                                    last edited by

                                    @biggen said in NVMe and RAID?:

                                    The ROMED6U-2L2T is mATX? What's the advantage there over a full-size ATX board?

                                    It's smaller, so it takes up less space 😉

                                    • B
                                      biggen
                                      last edited by

                                      Ha, I just found an AnandTech article about that exact board: https://www.anandtech.com/show/15835/asrock-rack-offers-rome-matx-motherboard-with-only-6-memory-channels

                                      • PhlipElderP
                                        PhlipElder @biggen
                                        last edited by

                                        @biggen said in NVMe and RAID?:

                                        @PhlipElder

                                        The ROMED6U-2L2T is mATX? What's the advantage there over a full-size ATX board?

                                        A smaller chassis. It's the next best thing to Mini-ITX, but without the pains of dealing with Mini-ITX.

                                        • B
                                          biggen
                                          last edited by biggen

                                          So this ICY DOCK enclosure would connect to both of those SlimSAS ports with what, exactly? Four of these?

                                          Edit: No, that wouldn't work. Like you said, I need a Y-cable. Something like this?

                                          • PhlipElderP
                                            PhlipElder @biggen
                                            last edited by

                                            @biggen said in NVMe and RAID?:

                                            So this ICY DOCK enclosure would connect to both of those SlimSAS ports with what, exactly? Four of these?

                                            Edit: No, that wouldn't work. Like you said, I need a Y-cable. Something like this?

                                            Correct on both counts.
                                            https://blog.mpecsinc.com/2020/07/27/custom-build-s2d-the-elusive-slimsas-8x-sff-8654-cable/
