ServerBear Performance Comparison of Rackspace, Digital Ocean, Linode and Vultr

IT Discussion
Tags: serverbear, server benchmarking, rackspace, iaas, vps, digital ocean, vultr, centos, centos 7, linux, linux server, kvm, xen
wirestyle22:

updated above

brianlittlejohn:

Ping statistics for 108.61.151.173:
Packets: Sent = 204, Received = 203, Lost = 1 (0% loss),
Approximate round trip times in milli-seconds:
Minimum = 58ms, Maximum = 62ms, Average = 58ms

Ping statistics for 104.236.119.59:
Packets: Sent = 231, Received = 229, Lost = 2 (0% loss),
Approximate round trip times in milli-seconds:
Minimum = 56ms, Maximum = 66ms, Average = 56ms

Ping statistics for 162.242.243.171:
Packets: Sent = 95, Received = 94, Lost = 1 (1% loss),
Approximate round trip times in milli-seconds:
Minimum = 51ms, Maximum = 56ms, Average = 51ms

About the same from west Texas.
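
Aside: Windows ping truncates the loss percentage rather than rounding, which is why 1 lost packet out of 204 and 2 out of 231 both read as "0% loss" while 1 out of 95 reads as "1% loss". A quick Python sketch of that arithmetic, using only the packet counts from the summaries above (the helper name is made up, not anything in ping itself):

def windows_loss_pct(sent: int, received: int) -> int:
    # Windows ping truncates, so 1 lost of 204 reports as 0%.
    return (sent - received) * 100 // sent

# Packet counts taken from the three summaries in this post:
for host, sent, received in [
    ("108.61.151.173", 204, 203),
    ("104.236.119.59", 231, 229),
    ("162.242.243.171", 95, 94),
]:
    print(f"{host}: {windows_loss_pct(sent, received)}% loss "
          f"({sent - received}/{sent} packets)")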

scottalanmiller:

OMG WE HAVE A WINNER!!!!

Linode just took everyone out back and stole their lunch money!! They have load balancers too!! (a la Rackspace and Amazon). Look at that IO capacity!!! And that UNIX Bench! Their single thread was by far the fastest too!

wirestyle22 @scottalanmiller:

@scottalanmiller said:

OMG WE HAVE A WINNER!!!!

Linode just took everyone out back and stole their lunch money!! They have load balancers too!! (a la Rackspace and Amazon). Look at that IO capacity!!! And that UNIX Bench! Their single thread was by far the fastest too!

Wow. That's fantastic.

scottalanmiller @wirestyle22:

@wirestyle22 said:

Wow. That's fantastic.

I'm so excited. No question that they are by FAR the hardest to use, but who cares. That performance is crazy!!

wirestyle22 @scottalanmiller:

@scottalanmiller said:

I'm so excited. No question that they are by FAR the hardest to use, but who cares. That performance is crazy!!

Rewarded complexity is fine by me 😄

scottalanmiller:

Throughout the range, Linode comes in as cheap as or cheaper than everyone else, too. It pretty much tracks Vultr until it outscales them, then it matches or beats DO.

scottalanmiller:

Of additional consideration... Vultr and RS cap out pretty small. DO and Linode make massive single nodes, which is important when we are running epic databases, which we are doing. The growth rate on the database is quite healthy.

Alex Sage:

http://www.theregister.co.uk/2016/01/04/linode_back_at_last_after_ten_days_of_hell/

travisdh1 @Alex Sage:

@aaronstuder said:

http://www.theregister.co.uk/2016/01/04/linode_back_at_last_after_ten_days_of_hell/

That's just painful, but you have to expect that to happen now and then. Guess I'll start an overnight ping test and see how bad it still is.

Edit: Never mind, I don't have the IP address for the Linode.
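
For anyone wanting to run the same kind of overnight test, here is a rough Python sketch (not anything posted in the thread) that logs one timestamped probe per second to a file. The host is a placeholder, and the ping flags assume Linux iputils:

import datetime
import subprocess
import time

HOST = "192.0.2.1"  # placeholder; substitute the Linode's IP

with open("overnight_ping.log", "a") as log:
    while True:
        stamp = datetime.datetime.now().isoformat(timespec="seconds")
        # One probe with a 2-second timeout; a nonzero exit code
        # means the packet was lost or timed out.
        probe = subprocess.run(
            ["ping", "-c", "1", "-W", "2", HOST],
            capture_output=True,
        )
        log.write(f"{stamp} {'ok' if probe.returncode == 0 else 'lost'}\n")
        log.flush()
        time.sleep(1)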

scottalanmiller:

Here is the blog response to that...

https://blog.linode.com/2016/01/29/christmas-ddos-retrospective/

scottalanmiller @travisdh1:

@travisdh1 said:

Edit: Never mind, I don't have the IP address for the Linode.

It's already offline but I will get you a new one privately in a few minutes.

Alex Sage:

Will Mangolassi be moving to Linode?

DustinB3403 @Alex Sage:

@aaronstuder said:

Will Mongolassi be moving to Linode?

Nope, Mongolassi doesn't exist!

scottalanmiller @Alex Sage:

@aaronstuder said:

Will Mangolassi be moving to Linode?

Yes, going to make an attempt at it.

wrx7m @scottalanmiller:

@scottalanmiller said:

Here is the blog response to that...

https://blog.linode.com/2016/01/29/christmas-ddos-retrospective/

This was really interesting.

Dashrender @wrx7m:

@wrx7m said:

This was really interesting.

Wow - this sounds nearly the same as the GRC DDoS attack, only on a HUGE scale.

scottalanmiller @Dashrender:

@Dashrender said:

Wow - this sounds nearly the same as the GRC DDoS attack, only on a HUGE scale.

Yeah, the scale here is crazy.

Dashrender @scottalanmiller:

@scottalanmiller said:

Yeah, the scale here is crazy.

80 Gbps attack - damn!

JaredBusch:

@scottalanmiller I got busy and never got back to this.

I just cancelled the pings. Here are the results.

64 bytes from 172.99.75.133: icmp_seq=34832 ttl=50 time=39.6 ms
^C
--- 172.99.75.133 ping statistics ---
34832 packets transmitted, 20406 received, 41% packet loss, time 34860502ms
rtt min/avg/max/mdev = 38.082/39.183/122.033/2.152 ms
[root@keygen ~]#

64 bytes from 108.61.151.173: icmp_seq=34830 ttl=54 time=49.9 ms
^C
--- 108.61.151.173 ping statistics ---
34830 packets transmitted, 34821 received, 0% packet loss, time 34873888ms
rtt min/avg/max/mdev = 45.852/47.901/842.335/7.692 ms
[root@keygen ~]#

64 bytes from 104.236.119.59: icmp_seq=34838 ttl=56 time=48.6 ms
^C
--- 104.236.119.59 ping statistics ---
34838 packets transmitted, 34807 received, 0% packet loss, time 34890326ms
rtt min/avg/max/mdev = 48.341/48.750/1051.915/5.427 ms, pipe 2
[root@keygen ~]#
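
Summaries like those are easy to tabulate if the terminal output was saved. A minimal Python sketch (the regex, function, and file name are illustrative, not from the thread) that pulls loss and average RTT out of iputils summary blocks like the ones above, making the 41% loss outlier on 172.99.75.133 obvious at a glance:

import re

# Matches the three-line iputils summary blocks shown above.
SUMMARY = re.compile(
    r"--- (?P<host>\S+) ping statistics ---\s*\n"
    r"(?P<tx>\d+) packets transmitted, (?P<rx>\d+) received, "
    r"(?P<loss>\d+)% packet loss.*\n"
    r"rtt min/avg/max/mdev = [\d.]+/(?P<avg>[\d.]+)/(?P<max>[\d.]+)/[\d.]+"
)

def report(raw: str) -> None:
    # One line per host: loss percentage, average and worst RTT.
    for m in SUMMARY.finditer(raw):
        print(f"{m['host']:>16}: {m['loss']}% loss, "
              f"avg {m['avg']} ms, max {m['max']} ms")

# Usage: report(open("ping_results.txt").read())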
                                            