Esben Rasmussen
@esbenrasmussen.bsky.social
Senior #SEO manager specializing in automation, tech and AI.

Digital optimizer with a love for metal music 🤘 and a nerdy interest in #data + #webanalytics.

Working @ Transact Denmark.

I love SEO news - when reported honestly (sorry SEJ for being a pain)
I know this is really geeky, but I would love to see examples from Google of what Gbot server load/priority could look like based on the request, response headers, speed, and content fetched.

Not giving away specifics, but rather so we gain a better understanding of what carries weight when "budgeting" for Gbot.
September 22, 2025 at 12:27 PM
Love the answer and completely agree!
Thanks.

My curiosity just drove me to look into whether Google had made any public declarations as to what constitutes a crawl/hit in terms of their own processes.

If it is the total load on Gbot infrastructure, then why not just have an article exemplifying this?
September 22, 2025 at 12:27 PM
Thanks. Yeah, I am trying to get to the bottom of it. It seems they are part of some scripts, which seem to generate the URLs with unique IDs on each request.

Next up: reaching out to the developers 😅

Am I correct: the help docs have no definition of what constitutes a crawl in terms of budget?
September 22, 2025 at 9:12 AM
Well, my plan is to tell the client to block those URLs with robots.txt.

I have no idea why it is crawlable 🙈

But it just made me think about the crawl budget - and made me wonder if returning no content is still classified as a crawl or not.
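
For what it's worth, a minimal sketch of how I'd sanity-check such a block before shipping it - the /scripts/ path and example URLs are made up, since I can't share the real ones:

```python
# Minimal sketch (hypothetical paths): verify that a robots.txt rule
# actually blocks the script-generated URLs before rolling it out.
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt - /scripts/ stands in for the real path.
robots_txt = """\
User-agent: Googlebot
Disallow: /scripts/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Script-generated URLs with unique IDs, as described above (made-up examples).
test_urls = [
    "https://example.com/scripts/loader?id=a1b2c3",
    "https://example.com/page",  # should stay crawlable
]

for url in test_urls:
    allowed = parser.can_fetch("Googlebot", url)
    print(f"{url} -> {'allowed' if allowed else 'blocked'}")
```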
September 22, 2025 at 8:28 AM
Also my hunch. Would be quite impressive with a time-travelling Gbot though!

Is the crawl time reported using the PDT timezone (so when I see 21:28 in GSC and I am located in the UK, I need to add 8 hours), or using the user's timezone?

If PDT, where do I suggest that this is made much clearer in GSC? 😊
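
Just to spell out the arithmetic (assuming the timestamps really are Pacific time - that is the open question), a quick sketch with Python's zoneinfo:

```python
# Sketch of the timezone math, assuming GSC crawl times are reported
# in Pacific time (America/Los_Angeles) - the very thing being asked.
from datetime import datetime
from zoneinfo import ZoneInfo

# A crawl reported as 21:28 on a summer date (PDT, UTC-7).
crawl_pdt = datetime(2025, 5, 23, 21, 28, tzinfo=ZoneInfo("America/Los_Angeles"))

# The same instant in UK time (BST, UTC+1, in summer) - 8 hours later.
crawl_uk = crawl_pdt.astimezone(ZoneInfo("Europe/London"))
print(crawl_uk.strftime("%Y-%m-%d %H:%M %Z"))  # 2025-05-24 05:28 BST
```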
May 23, 2025 at 3:27 PM
Awesome. Thanks for that. I tried looking up the answer on your long page w. general feature info, but this info wasn't listed. Might be worth adding, in terms of making sure your computer clock is set to the correct time zone 🤦‍♂️😅
March 10, 2025 at 12:06 PM
How did you identify the need for the content brief in the first place?

Often I would use a combo of 1 and 2 to identify main topics that do not overlap.

Then I would use 2 and create a page strategy matching intent with topic. From that I would build the brief.
March 6, 2025 at 5:53 AM
Haha, they could just as well have said:

As famous captain Picard of Star Wars once said: "So long and thanks for all the fish"

But yes, love seeing how IT issues are combined with climate challenges... I guess cloud computing and AI reinforce that.
March 5, 2025 at 4:40 PM
Thanks! Will definitely also be my recommendation.

I also just learned that the server sometimes serves a different variant of the robots.txt file.
Sometimes it includes the line "Disallow: /oplevelser/*$" and sometimes it doesn't.

🤯
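
A quick way to catch that kind of flapping is to fetch robots.txt repeatedly and hash each response - a rough sketch (example.com is a placeholder for the client's domain):

```python
# Rough sketch: fetch robots.txt repeatedly and hash the body to spot
# servers (e.g. load-balanced variants) returning different files.
# example.com is a placeholder domain.
import hashlib
import time
import urllib.request

URL = "https://example.com/robots.txt"
seen = set()  # digests of distinct variants observed so far

for i in range(10):
    with urllib.request.urlopen(URL) as resp:
        body = resp.read()
    digest = hashlib.sha256(body).hexdigest()[:12]
    if digest not in seen:
        seen.add(digest)
        print(f"request {i}: new variant {digest}")
        print(body.decode("utf-8", errors="replace"))
    time.sleep(2)  # small pause between requests

print(f"{len(seen)} distinct robots.txt variant(s) observed")
```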
March 3, 2025 at 11:54 AM
😂
February 13, 2025 at 8:23 AM
Awesome! I guess I could have figured that out myself. 🤦‍♂️😅
Will check it out.
February 13, 2025 at 6:56 AM
Sounds really interesting!

Is there any way to watch your TikTok if I do not have TikTok installed (it's a tinfoil-hat thing)?
February 12, 2025 at 9:55 PM
Interesting!

1) What do you monitor? Is it a certain prompt?

2) GPTs return different answers for each request, so how do you evaluate output that can be plain text, bullets, or tables?
February 11, 2025 at 6:02 PM
Thanks. Could be that I should look into Make at some point to see if it makes sense to use for some projects.
February 10, 2025 at 2:53 PM
What does using Make add to the process, instead of just connecting ChatGPT directly to the Google Sheet using an extension?
February 8, 2025 at 11:12 AM
Just joined a week ago and already loving it!
So much more focused than Elon's nightmare. It reminds me of old Twitter.
February 1, 2025 at 9:23 PM
Zoomed out a bit.
February 1, 2025 at 12:53 PM