Digital optimizer with a love for metal music 🤘 and a nerdy interest in #data + #webanalytics.
Working @ Transact Denmark.
I love SEO news - when reported honestly (sorry SEJ for being a pain)
Not asking you to give away specifics - more so that we gain a better understanding of what weighs in when "budgeting" for Gbot.
Thanks.
My curiosity just drove me to look into whether Google has made any public declarations about what constitutes a crawl/hit in terms of their own processes.
If it is the total load on Gbot infrastructure, then why not just have an article exemplifying this?
Next up: reaching out to the developers 😅
Am I correct that the help docs have no definition of what constitutes a crawl in terms of budget?
I have no idea why it is crawlable 🙈
But it just made me think about the crawl budget - and made me wonder if returning no content is still classified as a crawl or not.
Is the crawl time reported in the PDT timezone (so when I see 21:28 in GSC and I am located in the UK I need to add 8 hours), or in the user's timezone?
If PDT, where do I suggest that this is made much clearer in GSC? 😊
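If it is Pacific time, the conversion I have in mind would look like this (a quick Python sketch; the America/Los_Angeles assumption is exactly what I'm asking about):

```python
# Sketch: convert a GSC crawl timestamp to UK local time, ASSUMING
# GSC reports in America/Los_Angeles (PDT/PST) - unconfirmed.
from datetime import datetime
from zoneinfo import ZoneInfo

crawl_pdt = datetime(2023, 8, 14, 21, 28, tzinfo=ZoneInfo("America/Los_Angeles"))
crawl_uk = crawl_pdt.astimezone(ZoneInfo("Europe/London"))
print(crawl_uk.strftime("%H:%M %Z"))  # 05:28 BST the next day, i.e. +8 hours
```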
Often I would use a combo of 1 and 2 to identify main topics that do not overlap.
Then I would use 2 and create a page strategy matching intent with topic. From that I would build the brief.
As the famous Captain Picard of Star Wars once said: "So long and thanks for all the fish"
But yes, I love seeing how IT issues are combined with climate challenges... I guess cloud computing and AI reinforce that.
I also just learned that the server sometimes serves a different variant of the robots.txt file.
Sometimes it includes the line "Disallow: /oplevelser/*$" and sometimes it doesn't.
🤯
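For anyone wanting to reproduce that kind of check, something like this is enough (the URL is a placeholder, not the actual site):

```python
# Fetch robots.txt repeatedly and hash the body; more than one
# distinct hash means the server alternates between variants.
import hashlib
import time
import urllib.request

seen = {}
for _ in range(10):
    body = urllib.request.urlopen("https://example.com/robots.txt").read()
    digest = hashlib.sha256(body).hexdigest()[:12]
    seen[digest] = seen.get(digest, 0) + 1
    time.sleep(1)

print(seen)  # more than one key => the server serves different variants
```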
Will check it out.
Is there any way to watch your TikTok if I do not have TikTok installed (it's a tinfoil-hat thing)?
1) What do you monitor? Is it a certain prompt?
2) GPTs return different answers for each request, so how do you evaluate output that can be plain text, bullets, or tables?
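My own guess at how a monitoring tool could cope with that variance (just a sketch; `ask_model` is a hypothetical stand-in, not any real API):

```python
# Run the same prompt several times, flatten bullets/tables/markdown to
# plain text, and report how often a target mention shows up at all.
import re

def normalize(text: str) -> str:
    text = re.sub(r"[|*#>-]", " ", text)  # strip bullet/table/markdown chars
    return re.sub(r"\s+", " ", text).lower()

def mention_rate(ask_model, prompt: str, target: str, runs: int = 10) -> float:
    hits = sum(target.lower() in normalize(ask_model(prompt)) for _ in range(runs))
    return hits / runs  # fraction of runs mentioning the target
```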
So much more focused than Elon's nightmare. It reminds me of old Twitter.