FARVEL BIG TECH

Web design in the early 2000s: Every 100ms of latency on page load costs visitors.

72 Posts, 43 Posters, 224 Views
This thread has been deleted. Only users with topic management privileges can see it.
  • mark@mastodon.fixermark.com

    @david_chisnall I like this one specifically because the Cloudflare gate is there to address the problem of "Too many visitors."

    autiomaa@mementomori.social
    wrote (last edited)
    #5

    @mark @david_chisnall Instead of fixing broken code with proper logging and code performance observability, let's stop all the effort and expect Cloudflare to care about actual humans (and not just about their PaaS billing). 😓

    • david_chisnall@infosec.exchange

      Web design in the early 2000s: Every 100ms of latency on page load costs visitors.

      Web design in the late 2020s: Let's add a 10-second delay while Cloudflare checks that you are capable of ticking a checkbox in front of every page load.

      meilin@tech.lgbt
      wrote (last edited)
      #6

      @david_chisnall
      It's also the tens of megabytes of frameworks, JavaScript, and ad services that have to be loaded every single time.

      • david_chisnall@infosec.exchange

        jackeric@beige.party
        wrote (last edited)
        #7

        @david_chisnall I'd like to automate the process of responding to Cloudflare's checks

        • david_chisnall@infosec.exchange

          alexskunz@mas.to
          wrote (last edited)
          #8

          @david_chisnall why is that there? Bots and AI scraping. None of this would be necessary otherwise.

          • mark@mastodon.fixermark.com

            danherbert@mastodon.social
            wrote (last edited)
            #9

            @mark @david_chisnall I don't think that's actually the case, at least not entirely. The main issue is that the Internet is currently being inundated with LLM content crawlers, to the point that they overwhelm websites or scrape content some sites don't want sucked into AI training data. That has caused a massive number of sites to serve those bot-detection pages to everyone. So it's not quite an issue of too many visitors, but of "too many non-human visitors".

            • mark@mastodon.fixermark.com

              david_chisnall@infosec.exchange
              wrote (last edited)
              #10

              @mark

              This morning, Cloudflare decided that a company I wanted to place an order with shouldn't trust me, so I went to one of their competitors.

              • david_chisnall@infosec.exchange

                benhm3@saint-paul.us
                wrote (last edited)
                #11

                @david_chisnall

                On top of all the broken links we'll send if you're not using the proper browser.

                • alexskunz@mas.to

                  nothacking@infosec.exchange
                  wrote (last edited)
                  #12

                  @alexskunz @david_chisnall

                  The thing is, you don't need a CAPTCHA. Just three if statements on the server will do it:

                  1. If the user agent is Chrome, but it didn't send a "Sec-Ch-Ua" header: Send garbage.
                  2. If the user agent is a known scraper ("GPTBot", etc): Send garbage.
                  3. If the URL is one we generated: Send garbage.
                  4. Otherwise, serve the page.

                  The trick is that instead of blocking them, serve them randomly generated garbage pages.

                  Each of these pages includes links that will always return garbage. Once these get into the bot's crawler queue, they will be identifiable regardless of how well they hide themselves.

                  I use this on my site: after a few months, it's 100% effective. Every single scraper request is being caught. At this point, I could rate-limit the generated URLs, but I enjoy sending them unhinged junk. (... and it's actually cheaper than serving static files!)

                  This won't do anything about vuln scanners and other non-crawler bots, but those are easy enough to filter out anyway. (URL starts with /wp/?)
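                  The three checks described above can be sketched as a small request classifier. This is a minimal sketch, not the poster's actual code: the scraper user-agent list and the `/trap/` URL prefix are illustrative assumptions, and the `Sec-Ch-Ua` heuristic relies on real Chrome always sending that client-hint header while many scrapers forging a Chrome UA string do not.

```python
# Known scraper user agents (illustrative, not exhaustive)
SCRAPER_UAS = ("GPTBot", "CCBot", "ClaudeBot", "Bytespider")

# Assumed prefix under which the server mints garbage-only trap links
TRAP_PREFIX = "/trap/"

def classify(headers: dict, path: str) -> str:
    """Return 'garbage' or 'page' for an incoming request.

    1. Chrome UA without a Sec-Ch-Ua header -> garbage
       (real Chrome sends it; most UA-forging bots don't).
    2. Known scraper user agent -> garbage.
    3. URL under our generated trap prefix -> garbage.
    4. Otherwise, serve the real page.
    """
    ua = headers.get("User-Agent", "")
    if "Chrome/" in ua and "Sec-Ch-Ua" not in headers:
        return "garbage"
    if any(bot in ua for bot in SCRAPER_UAS):
        return "garbage"
    if path.startswith(TRAP_PREFIX):
        return "garbage"
    return "page"
```

Every "garbage" response would then embed more links under `TRAP_PREFIX`, so once a crawler queues one, rule 3 catches all its future requests no matter how it disguises itself.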

                  • autiomaa@mementomori.social

                    internic@mathstodon.xyz
                    wrote (last edited)
                    #13

                    @autiomaa @mark @david_chisnall Honestly I'm kind of surprised there isn't a "pay Cloudflare for X connections without a challenge/CAPTCHA" option, because it would be another revenue stream for them.

                    • david_chisnall@infosec.exchange

                      hp@mastodon.tmm.cx
                      wrote (last edited)
                      #14

                      @david_chisnall This was when the tech bros realized that it is all in comparison to everything else.

                      If you just make EVERYTHING worse then it doesn't matter that you're bad.

                      The real story of computing (and perhaps all consumer goods).

                      • david_chisnall@infosec.exchange

                        hex0x93@mastodon.social
                        wrote (last edited)
                        #15

                        @david_chisnall It's funny: every time I try to access a website that uses Cloudflare, I have to use something else or disable my VPN && my DNS resolver.
                        So if they can have my data, they let me use them. So don't tell me it is about protection against bots.
                        It's about gathering data - or am I just paranoid af?

                        • david_chisnall@infosec.exchange

                          jernej__s@infosec.exchange
                          wrote (last edited)
                          #16

                          @david_chisnall I don't even care about Cloudflare (and Anubis) checks – those at least rarely last more than a few seconds. What I loathe are the throbbing placeholders that seem to be everywhere now, causing simple text pages to load slower than similarly-looking pages (once the content renders) loaded on dial-up.

                          • jernej__s@infosec.exchange

                            jernej__s@infosec.exchange
                            wrote (last edited)
                            #17

                            RE: https://infosec.exchange/@jernej__s/116028286564917007

                            @david_chisnall Oh, and also this:

                            • internic@mathstodon.xyz

                              autiomaa@mementomori.social
                              wrote (last edited)
                              #18

                              @internic There is such a payment model on Cloudflare for the LLM companies (giving them much faster download speeds for third-party content scraping), but not for regular consumers.

                              @mark @david_chisnall

                              • danherbert@mastodon.social

                                mark@mastodon.fixermark.com
                                wrote (last edited)
                                #19

                                @danherbert @david_chisnall I wasn't limiting "visitors" to humans.

                                • david_chisnall@infosec.exchange

                                  mark@mastodon.fixermark.com
                                  wrote (last edited)
                                  #20

                                  @david_chisnall There is a hilarious possible future where the government fails to do anything about monopolies, but Cloudflare has a de facto competition-increasing effect: it makes it so onerous to use one site that people start self-selecting into using other sites.

                                  • jackeric@beige.party

                                    mo@mastodon.ml
                                    wrote (last edited)
                                    #21

                                    @jackeric that's exactly what their code is designed to prevent
                                    It's still possible, but... not without some fighting

                                    @david_chisnall

                                    • meilin@tech.lgbt

                                      drwho@masto.hackers.town
                                      wrote (last edited)
                                      #22

                                      @david_chisnall @MeiLin 400-500 separate data-tracking recipients on each page.

                                      • david_chisnall@infosec.exchange

                                        eldersea@expressional.social
                                        wrote (last edited)
                                        #23

                                        @david_chisnall BRB, going to use my landline phone to call my local lunch spot to place an order that I will walk to go get.

                                        • david_chisnall@infosec.exchange

                                          the_wub@mastodon.social
                                          wrote (last edited)
                                          #24

                                          @david_chisnall Just had a bunch of these whilst trying to do a reverse lookup on a number used to call me this evening.

                                          I think that the peak effective internet speed was in the early 1990s. Dial-up was slow, but pages were static HTML with no JavaScript/font/whatever-else calls to other sites hosting the resources.

                                          Each search on AltaVista would produce a first page full of genuinely useful websites one of which would be guaranteed to answer your question.

                                          This is NOT just nostalgia.

                                          Powered by NodeBB Contributors
                                          Graciously hosted by data.coop