Deliver search-friendly JavaScript-powered websites (Google I/O ’18)

Comments 21

  • Will Googlebot use Chrome 59 in 2018?
    Because, you know, ES20**.

  • Great and helpful information! Thanks, Google!

  • Awesome info.. Thank you!

  • Thanks, Google!

  • Superb work, Thanks a lot. Keep it up. 🙂

  • 23:55 proposes implementing server-side rendering only for Googlebot. That might be a good solution; however, I thought that was considered search-engine cloaking (serving different results to users and bots), which would penalize your SEO… isn't it?

  • I have a question: we're working on a brand-new site built on JS. We've blocked it with robots.txt because we're afraid the bot might index a lot of "empty" pages, i.e. pages without dynamic rendering… However, I want to test and see how Googlebot will render those pages. But I can't test that until I unblock it in robots.txt, right? I mean, I can't even use GWT's "Fetch as Google" while the site is blocked by robots.txt. So what might be a way to check how Googlebot will render my site without opening up the robots.txt file?

  • How can we make sure that Google won't consider dynamic rendering to be cloaking? Previously the recommendation was not to check for Googlebot.
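    To make the cloaking question concrete: dynamic rendering as described in the talk branches on the crawler's user-agent string and serves pre-rendered HTML to bots while users get the normal client-side app. A minimal sketch of that check is below; the list of crawler tokens is an assumption for illustration, not an official list, and any real setup must serve the *same content* to bots as to users, only rendered differently.

    ```javascript
    // Hypothetical list of crawler user-agent tokens -- adjust for the
    // bots you actually need to serve pre-rendered HTML to.
    const BOT_TOKENS = ['googlebot', 'bingbot', 'twitterbot', 'linkedinbot'];

    // Returns true if the user-agent string looks like a known crawler.
    function isBot(userAgent) {
      const ua = (userAgent || '').toLowerCase();
      return BOT_TOKENS.some(token => ua.includes(token));
    }

    // In a (hypothetical) request handler you would branch on the result:
    //   if (isBot(req.headers['user-agent'])) -> serve pre-rendered HTML
    //   else                                  -> serve the client-side JS bundle
    ```

    The reason this is not treated as cloaking, per the talk, is that both branches deliver equivalent content; only the rendering step moves from the client to the server.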

  • Very useful information, loving the transparency.

  • Google, please provide a link to the documentation regarding dynamic rendering and the official policy change.

  • Dynamic rendering is so ridiculous… What makes you think I'm going to code like a #$%#! just to make your job simpler, when implementing it requires significant infrastructure? Google does incredible things all the time, but this… this goes nowhere. I really don't think people are going to implement this, or if they try, they'll abandon it afterwards…

  • Questions: You mention using the mobile friendly tool and the rich results testing tool as rendering test platforms, essentially. Why do this instead of using Fetch and Render in Search Console? In fact the first time I tried to use the rich results tool it told me that the page was not eligible for "rich results known by this test."

  • Interesting point about the complete dynamic rendering for search bots user-agent and not users!

  • You're missing a 'b' in a part of the info 🙂
    "Watch more >Wemasters< sessions from I/O '18 here"
    Just trying to help, keep being awesome and an inspiration! 🙂

    Awesome video <3

  • On my website I use the #! fragment, and it's perfect for users: I show the content without refreshing the whole page. But now Google doesn't recommend this, and my site has dropped in indexed pages, and therefore in ranking too.

    I don't understand why they don't take the content that comes after #!. Google always recommends focusing on users when building a site, but that's no longer the case here. In my case the site works perfectly for users, they see the content, but now for Google that's insignificant, and if I have to change something in my code, it's only so Google can interpret it. Contradictory, no?

    Anyway, in search engines like Bing or DuckDuckGo this doesn't happen; there they do crawl all the content of my site. They say to use the History API, which I was trying to do, and I can't make it work for my case.

    So, do we focus on users or search engines?
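    For anyone in the same situation: the migration away from #! URLs usually starts with a small helper that maps the old hash-bang fragment to a clean path, then uses `history.replaceState` so the address bar updates without a reload. This is a hedged sketch; `hashBangToPath` is a hypothetical helper, and it assumes a client-side router elsewhere re-renders the content for the new path.

    ```javascript
    // Hypothetical helper: turn an old "#!/section/page" URL into a
    // clean "/section/page" path, or null if there is no hash-bang.
    function hashBangToPath(url) {
      const idx = url.indexOf('#!');
      if (idx === -1) return null;       // no hash-bang fragment present
      return url.slice(idx + 2) || '/';  // bare "#!" maps to the root path
    }

    // In the browser (not runnable in Node) you would then do:
    //   const path = hashBangToPath(location.href);
    //   if (path) history.replaceState(null, '', path);
    // so crawlers and users both see the clean URL, with no page reload.
    ```

    Server-side you also need to answer requests for those clean paths directly (or redirect old #! URLs), since the fragment after # is never sent to the server.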

  • This technical aspect is really important for the ongoing work of website building. Truly, thanks.

  • Use React Static. Problem solved 🙂

  • You must make a video tutorial in the Indonesian language too.

  • Why can Googlebot still not update its version of Chrome??? (

  • Just learned about Rendora and dynamic rendering; the SEO problem is now solved.

  • This presentation just begs the same question over and over: why not make Googlebot better? Shifting the burden onto all these web developers… or just improve Googlebot to handle modern practices? Oh, your indexing bot doesn't know how to read/index pages that a human can reason about? Sounds like your bot could be improved. It uses Chrome 41—why? etc.

    Don't get me wrong, I think web developers should do all they can to improve SEO (especially with JSON-LD structured data), but some of these limitations of Googlebot are just annoying.
