English Google Webmaster Central office-hours hangout

JOHN MUELLER: All right. Welcome everyone to today’s Google Webmaster Central office-hours hangout. My name is John Mueller. I’m a Webmaster Trends Analyst here at Google in Switzerland. And part of what we do are these webmaster office-hours hangouts with webmasters, and publishers, and SEOs– all kinds of people who make websites. And as always, a bunch of questions got submitted already. But if any of you want to get started with the first question or two, feel free to jump on in now.

SPEAKER 1: I have a couple of questions, if that’s OK.

JOHN MUELLER: Right, go for it.

SPEAKER 1: So the first one is about the new mobile-first index, which has gone live, and the best practices for the mobile-first index. Those say that the content has to be the same on desktop as it is on mobile, and if it’s not, then you need [INAUDIBLE] so that they match. But if you have content that is shorter on mobile than on desktop, would you receive a penalty for that? Or would that count against you when ranking? Or what? Because I imagine a lot of sites out there have had to condense a lot because of the user experience, and they had to condense what they usually have on desktop.

JOHN MUELLER: Yeah, so I don’t know. I guess on the one hand, when people internally ask me about this, I, like, re-ask whether it really is the case that you need to provide less content on mobile. Because I assume, more and more, as people use mobile as their primary device, they want the full experience. And if something is not necessary for the full experience, then why do you keep it on your desktop site?
So that’s kind of just as an aside. But in general, there are two things that come into play here. On the one hand, when we switch sites over to the mobile-first index, we do check to see if the content is equivalent enough so that we feel, from a content point of view, there wouldn’t be any issues with ranking. So if we see that there are significant differences between the mobile and the desktop version, then probably we wouldn’t switch that site over. If we did switch that site over, what would happen is we would only index the mobile content. So if there is anything on the desktop version that is critical for that site’s ranking and it’s not on the mobile version, then we wouldn’t index it, and we wouldn’t be able to rank that site based on that content. So it’s not that there’s a penalty, or that anyone manually looks at these and says, oh, you’re missing five words here, therefore I’ll demote you. It’s just that those five words won’t be taken into account for indexing and ranking. And in a lot of cases, if this is just additional, I don’t know, filler content that you have on your pages– things like your terms of service, or just general descriptive content that doesn’t really matter so much for ranking at all– that’s something that probably doesn’t make a difference. Like, whether you have it there or not, whether it’s indexed or not, it doesn’t really change that much. But I think, kind of going back to the first point, I’d really think about whether it actually makes sense to reduce the amount of content and reduce functionality on mobile. Because for people who use mobile as their primary device, they will want the full content. They’ll want the full functionality.

SPEAKER 1: When you said you won’t switch it over, are you saying that you would likely show the desktop version instead of the mobile version, then? When you said you wouldn’t switch it over, is that what you mean?
JOHN MUELLER: It’s not so much showing the desktop or the mobile version. It’s just about the content that we use for indexing. So if you have separate mobile URLs, then we have the desktop URL and the mobile URL, and we know they belong together. We only index the content once. We just swap out the URL and show the appropriate one. So that doesn’t change with the mobile-first index. It’s really just a matter of what content we use for indexing.

SPEAKER 1: OK. And my second question is a quick one. But it seems to be– I’m going to call it an [INAUDIBLE] box now. So if you access a website and then you come back, around that URL, you know, at the top, you can write a description. There’s a kind of “people also searched for.” Is that just directly related to the phrases? So is that directly relevant to the phrases that you search for? Or is there more being taken into consideration there? Because the box is now, rather than being at the bottom of the page, around the website that you visited, which seems quite new.

So what is that based on?

JOHN MUELLER: I don’t know. [LAUGHTER] A lot of search features are things that teams within Google are experimenting with to try to figure out what the right approach is, what the thresholds are, where to use content, what kind of signals to use for these things. So, especially with the newer features, I would assume that they’re very much in flux, and people are experimenting and seeing which way it makes sense to show this to people, and how people actually get value out of this additional information that we provide in search.

SPEAKER 1: OK. OK, thank you.

JOHN MUELLER: Sure. All right, I think someone else wanted to jump in with a question or two.

SPEAKER 2: Hi, John.

JOHN MUELLER: Hi.

SPEAKER 2: I have a question. For a few of our clients, when I checked their Google [INAUDIBLE], we got some notification, like, we discovered an issue. So we got into a discussion, like, you have this issue, and you could fix this. Now, we are getting this issue because we have added a noindex tag to some pages which are less important, or which don’t have much content– so we don’t want Google to index all those pages. So is this bad for the website, if we get this type of notification? Or if we are adding this tag to a page?

JOHN MUELLER: If you’re doing that on purpose and you know why they’re there? Well, that’s perfect. We send out this notification because a lot of people put noindex on their pages accidentally, and then we want to make sure that they are aware of this problem so that they can fix it. But if you put that out there on purpose and you say, this is how I want it, then that’s perfectly fine.

SPEAKER 2: Do you think it is better if we removed those pages from the [INAUDIBLE] XML so that Google will not find those pages anyway?
JOHN MUELLER: We’ll probably continue crawling them anyway, just because we’ve seen them before. I think if you want to remove them from search, then I would also remove them from the [INAUDIBLE] file, just to be consistent. But that’s usually not–

SPEAKER 2: OK. Thank you, John.

JOHN MUELLER: Sure.

SPEAKER 3: Hi, John.

JOHN MUELLER: Hi.

SPEAKER 3: I have one question about canonicals. If we have a product category page with a URL that’s like, example, example.com slash product category, with some pages– page two, page three, and more– should the correct canonical be the product category? Or for every page must we set the same canonical– page two, page three? What do you think?

JOHN MUELLER: It depends on how you want them indexed. So if you don’t want the other pages indexed, you can certainly set the canonical to the first one. But if there’s something on those pages that you want to have indexed, or if you want to have links followed to individual products, then I would not use the canonical tag there.

SPEAKER 3: OK. If we have thousands of pages, is it better if we use a canonical to the product category page, so that we don’t lose our crawl budget?

JOHN MUELLER: I think most sites don’t have to worry about crawl budget, especially if you’re looking at a range of, I don’t know, let me make up a number, like less than a couple hundred thousand pages. We can crawl those pages just fine for the most part, unless you’re on a really, really old server that’s, like, barely able to make it through. But then you know that your server is old and slow anyway, so that’s something you’d probably want to fix anyway. But otherwise, if you have a reasonable server and you have, I don’t know, a couple hundred thousand pages, I would not worry about crawl budget.

SPEAKER 3: OK. And another option: if we use, for the same page– like, example, page two– a canonical for page two, we will have a problem in Google Search Console for meta descriptions and title tags. Must we worry about them, or–?
JOHN MUELLER: No, that’s perfectly fine. And that, again, is kind of like a warning if you weren’t aware of it. But if you know that this is how you have your website set up and you think that’s OK, no problem.

SPEAKER 3: OK, John, thank you so much.

JOHN MUELLER: Sure. All right, let me run through some of the questions that were submitted. There are a whole bunch of them here, but we will see how far we get. And as always, if you have questions or comments in between, feel free to jump on in.

My client’s developer duplicated content in a way that looks like cloaking.

After that, the traffic just dropped critically. Is that something that would be bad? Should I just remove it? Will it return to normal? So I guess I don’t really know what exactly was implemented here. If you’re saying it looks like cloaking, that sounds pretty bad. It sounds like something you’d want to fix anyway. It might also be that the change in traffic is just unrelated to that. So it’s really hard to say, based on just the information here, what exactly was done on the website, and how exactly that could be, like, cloaking. For example, if you serve Googlebot empty pages and users would see a full page, that would be cloaking, and that would essentially result in us dropping all of the information from the index for that site. So it really kind of depends on what exactly happened there. My recommendation here would be to go to the Webmaster help forum and check with the folks there to look at your specific case, so that people can take a look and say, this is technically OK, or, this is not 100% great, but it’s not bad enough that it would cause any issues from a technical point of view. And based on that information, you can make a judgment call a little bit better on how urgently you need to fix this.

If we have a site that was hit by the March 7 through 9 quality update– a legitimate site that’s generating world-class quality content and getting lots of links– and it gets scraped, and now after that, the scraper sites are ranking before us. I guess, kind of like, what happened? What can we do to fix that?
For the most part, I think that can be kind of tricky, because a lot of the updates that we’ve made are more around relevance, where we’re trying to figure out which sites are relevant for certain queries, and not so much a matter of quality overall. So that might be something where we just don’t think your site is exactly relevant for those specific queries. It doesn’t mean that it’s a bad site. It’s just not relevant for those specific queries. So that’s something that I think just happens to a lot of sites over time, in that they might have really high quality content, but over time they’re just not seen as being as relevant in the overall picture of the web. So I understand this is always kind of a tricky situation as a site owner, because you want to figure out what you can do to get back to the previous situation. And sometimes, especially when it comes to relevance, it’s not the case that you did anything wrong that you need to change to get back. It’s just that things have changed. So what I would recommend doing there is, on the one hand, trying to get feedback from your users to figure out how they feel about your website, and to try to get really objective feedback on what you could be doing differently, or what you might want to target differently, or set up differently. As always with a lot of these changes in traffic, I’d also still recommend double-checking all of the technical details– so things like, are we able to crawl all of your content? Are we able to index the content properly? All of these things could potentially change, depending on even small changes that you sometimes make on a website. So I’d really double-check those as well.

SPEAKER 4: John, do you have an example of that?

JOHN MUELLER: Of which part?
SPEAKER 4: Of a site that would rank for something and then was no longer relevant for those queries. I mean, I can think of an example where we had a site and it ranked for something completely unrelated, and over time, that corrected itself. But I’m not sure if that’s the best case of what you’re describing.

JOHN MUELLER: I think that can be the case where a site kind of accidentally ranks for some really popular keywords. And you get a lot of traffic from that, and you think, wow, this is awesome. And at some point our algorithms kind of figure it out, and they try to fix it. We also see this– so I don’t know if this is related to the March updates, but it’s something that I’ve seen over time in the forums as well– where sites might rank for someone else’s brand name, for example, because maybe they have a blog post about it.

Or maybe they have a page like “How to Log into Gmail.” And then suddenly it ranks for “Gmail log-in,” which is kind of nice, because you get a lot of traffic, I guess, and people probably click on your ads, and it’s kind of worthwhile for you. But from an algorithmic point of view, from a quality point of view, your site is probably not what people are searching for. So that’s something that our algorithms will probably try to figure out over time– like, what would be the better result for a query like this– and that might be a change that would happen. So it’s not that your site is worse or is seen as being worse. It’s just not as relevant for those particular keywords. And our algorithms change over time, and it’s something where we try to reflect what the web is also looking for with regards to relevance. So these things can change even without any change on your website.

Besides the more technical issues that are frequently discussed, such as speeding up a site and making it mobile-friendly, what are the most important things business owners should be working towards today– perhaps the things they are not doing so well? That’s a really broad question. It’s really hard to say. So one of the things that I see regularly, especially from small businesses, is that it’s sometimes really unclear what the website is meant for, or what it’s meant to target– what people are meant to do when they reach your website. They might have worked together with a designer to make a really nice and fancy website, but you go there and– you know, are they trying to sell me software? Are they offering a service? Do they do consulting? What is it exactly that they want me to do?
It’s really unclear. Or sometimes they’ll have lots of fancy photos and images on the website, which are more for an emotional attachment to the website and less about the products and services that they sell. So it’s not so much a matter of SEO directly– technical things like speed and having the right meta tags on the page. It’s really more about the basics of: what do you want to be found for? What do you think people will be searching for, where your website, where your business, is supposed to be relevant? And how can you make that clear? On the one hand for users, when they go to your website, and on the other hand also for search engine crawlers, when they go to your website and try to figure out where we can recommend this site. So that’s really one of the bigger things that I see over and over again. I see it regularly at site clinics when we do them. Even for bigger companies, you go to the website and it looks really nice and fancy, but you have no idea what it is they’re actually doing and what they would like to rank for.

SPEAKER 4: So describing their content as [INAUDIBLE]. So that’s useful to know. I think my question was– [INAUDIBLE]

JOHN MUELLER: Whoops.

SPEAKER 4: [INAUDIBLE] –I don’t know what to call it. [INAUDIBLE] –view of things?

JOHN MUELLER: I didn’t get that last part.

SPEAKER 4: So your focus is sharing it– [INAUDIBLE]

JOHN MUELLER: Aw, man. You have a really bad connection. It’s really hard to hear you.

SPEAKER 4: If you’re– [INAUDIBLE] Sorry– [INAUDIBLE]

JOHN MUELLER: Oh, man. Maybe you can type it into the chat? Into a comment? Or maybe the connection will get better over time, and you can jump in again.

SPEAKER 5: Hey. Hi, John. On the same line, basically– you wanted to know [INAUDIBLE], but you’re saying, though, the message is not really clear on most basic websites. So is it really necessary to give out that kind of summary, or, you could say, a little about what– like, this is what our department is all about– on the specific pages?
Or do you want to do it on the home page– like, I mean, we are doing certain things, and this is something we do? And is it a good idea to present it on specific pages, saying this is something we do and this is part of it, on a page-by-page basis?

JOHN MUELLER: I think that’s totally up to you, if you want to do that on the home page or on a product page.

For a large part, it’s a matter of knowing what we should be able to show your page for. So kind of understanding and thinking about what people would be searching for, where you think your website is really useful. So that’s–

SPEAKER 5: So kind of understanding–

JOHN MUELLER: –reverse–

SPEAKER 5: –and thinking about what people would be searching for, where you think–

JOHN MUELLER: –echo. So essentially it’s about kind of– [INAUDIBLE] –so that people can recognize what it is that you offer, and search engines can figure out where we should actually rank this content. A lot of times it will be easy for us to rank that site based on the business name. If you search for the business name, we can find you. But you probably want to get visitors for, I don’t know, whatever specific activity it is that you do. And for that, it’s not a matter of putting hundreds of keyword variations on the page. It’s just, like, I don’t know, just a tag line sometimes, or just an extra sentence saying, we do this, and this is our speciality, and we love doing X, Y, and Z for people in this area. That kind of information makes a really big difference. Also, for local businesses, what we sometimes see is that they won’t list their location. Like, they’ll be, I don’t know, a bakery that makes really fancy cakes, and trying to find their address is really hard. So if you know that they’re in the city, then obviously you can find them there. But if you don’t know where they’re located, you could come to their website and wonder, do they deliver to Switzerland? Or do they deliver to this other city?
So that’s something that you can make really clear on a website, and it doesn’t take a lot of work. It’s not magic. You can do it with any CMS. Just putting a bit more textual content on those pages makes a really big difference.

SPEAKER 5: Thanks, John. Some of the biggest fights we had were mostly with the dev people in [INAUDIBLE]– basically, whenever we’d bring it up, they’d say it was not that important. And it’s sometimes really hard to make them understand why it is important to tell people about the products and everything.

JOHN MUELLER: Yeah. OK, let me see. There’s something in the chat as well. “We changed the domain and configured our 301 redirect from the old to the new domain more than a year ago, and Google is still showing the site page with the old domain. What can we do to solve that?” I think one thing that you might want to watch out for is that if you are explicitly looking for the old domain name, then probably we’ll try to show it to you, even if we’ve seen the new one. So if you’re looking for something general where your site should rank, then probably we should be showing the new one, especially if this redirect has been in place for a year now. But if you’re explicitly looking for the old one, then we’ll try to show you the old one, because we think this is probably what you’re looking for, and we don’t want to confuse you with a different domain name. So that’s sometimes what throws people off, in that they think the redirect is not processed, but actually it is processed, and we’re just trying to be extra friendly and give you the old domain because it looks like you’re asking for it. So that might be something to double-check.

All right. So in the new Search Console there’s the error “Submitted URL marked noindex” and “Excluded by noindex tag.” It’s pretty confusing, but I guess it also falls under error. And I guess it depends on how it was submitted or found. Yes, this is something that we differentiate by the source of the URL. Because if we think
that you’re submitting the URL to us, then we think it’s most likely a mistake on your side. Because you wouldn’t be, on the one hand, saying, here’s a URL that Google needs to index, and at the same time saying, oh, but, by the way, it’s noindex– you can’t index it, actually. That’s kind of almost a sign that you might have a mistake on your side. On the other hand, if it’s just a random URL we found while crawling that has a noindex on it, then it’s a noindex. We don’t really need to worry about it too much. It might be something that you put noindex on on purpose. And this is not so much something

that we use for ranking, or for figuring out if your website is high quality. Obviously, a noindex page is dropped from the index completely. So it’s more a matter of: this looks like you have a mistake, versus, this looks like we just found a noindex, and that could be perfectly fine. We’re just trying to make it a little bit clearer for you to figure out where maybe there was a mistake, and where maybe everything is just normal.

If a site has generic content but it provides unique functionality, such as features or other interactive elements, is this enough to be considered original by the algorithms? Some webmaster guidelines hint that it might be OK, but it’s not very clear. I think in a case like that, you really want to make sure that your site can stand on its own– that it has enough unique and compelling content so that people, when presented with these different versions of the content, will say, well, this is a site that I would recommend to my friends. It’s really the best one of its kind. It provides the most functionality that I actually need to get to wherever it is I want to go. So that’s kind of the level you should be aiming for. Instead of just saying– let me take an extreme case: you have a bunch of affiliate feeds that lead to a site, and you just offer, like, a price comparison between two affiliate sites. I don’t know, that seems like very low effort and not extremely useful. On the other hand, if you take that to the next level and make it a really high quality comparison system, where you really see the differences between the different suppliers, and why you might want to go here and why you might want to go there, then that’s something that does provide a lot of significant value, and people will probably want to recommend it directly. So that’s kind of the level that I would aim for. Also, in general, when it comes to websites and the quality of a website, I would not aim for the lowest bar that you can possibly reach just to be acceptable,
but rather really aim to be something that is seen as number one by far. Because only if you’re aiming to actually be number one by far are you in the situation where you don’t have to worry about individual quirks of the algorithm, because things fluctuate a little bit anyway. And if you’re number 10 and sometimes number 15, then that could make a really big difference in the visibility of your site. But if you’re clearly number one and you fluctuate between number one and number two, then you’re kind of in a comfortable spot, right? So that’s what I would aim for there– not just the lowest bar of what you can just get away with, but really try to figure out a way to be the best by far.

How do we change a site’s domain name correctly? The site has not been used in a long time, and as a result, Google has indexed a bad site with duplicate pages and out-links to other sites. Now we have developed a new site, filled it with content, and want to launch it on a new domain. How do we do it right? Do we delete the old site? Do we need to change contacts on the new site?
So I guess it kind of depends on your situation here. It sounds like you’ve left the old site more or less to get stale and obsolete, and you want to move to a new domain with essentially a new website, not really related to the old one. And if it’s essentially a new website, then maybe it just makes sense to set up the new website, leave the old one, and remove it at some point. On the other hand, if it’s kind of an additional variation of your old website that you’re putting out, then I would just do a normal site move and really set up all of those signals– the redirects, everything that we have in our documentation– to do a normal site move to your new domain. And in both of these cases, it’s not going to immediately jump up and be the best website in search. But over time, we should be able to process that and deal with it fairly well.

SPEAKER 4: Hello, John. I have a question.

JOHN MUELLER: All right.

SPEAKER 4: One of our clients– they are actually share trading brokers, so they have a website. They provide good national and international share trading services.

Now, we suggested they create two separate pages– one for the national share trading, another one for the international share trading. But they don’t want to create pages. They want to publish blog posts on this topic. So which one would be the better option– creating a landing page, or posting content on a blog?

JOHN MUELLER: I think that’s more a matter of the general strategy that the site or the business wants to pursue. So from our point of view, both of these variations could show up in search and could be fairly reasonable. I think having blog posts is a really useful way to bring content out there and to kind of engage with the audience directly. Essentially, it’s more a matter of how you want to position your website and your business. That’s more up to you.

SPEAKER 4: Thank you.

JOHN MUELLER: All right. Now we have a long question about single-page apps. We have a case in which we’ve been called in to optimize a certain site with the following conditions. The site uses AngularJS– a single-page application. So when a call is issued, the framework is loaded, and then JavaScript pulls in the content. At the moment, it’s a demo site with a noindex tag. Our main question is about the appropriate use of caching mechanisms for bots. Our tests show that the response time varies between 2 and 4 seconds. Would that be a problem?
Let’s see. Oh, OK. This is the response time– so they’re basically serving a static HTML version to crawlers. I don’t see any problem with that. I think that’s one way to deal with this. For the large part, we can render JavaScript-based pages fairly well. For really large websites with a lot of content, I realize it sometimes makes sense to pre-render them– to dynamically serve a static HTML version to search engines, to social media services, or to whatever else needs to process content from a website. That sometimes makes what happens a little bit more predictable. And for search engines in particular, it probably makes it so that we’re able to crawl a little bit faster and index the content a little bit faster. So if this is something that you can set up, and that you feel comfortable maintaining– in the sense that you can confirm that the pre-rendered version is really equivalent to the version that a user would see– then that sounds like a reasonable approach. One thing to watch out for here, especially for search engines, is that most search engines have a desktop and a mobile-type browser or user agent that they use. And if you’re dynamically generating the pages for these, you need to make sure that you’re serving the appropriate version to the appropriate device type. So don’t just look for a Googlebot user agent and then serve the desktop version of the page to Googlebot all the time. Because when we crawl with our smartphone crawler, it will also have Googlebot in the user agent name, but we’d like to see the mobile version of the page then. With regards to the delay, if this is a couple of seconds, that’s generally OK. I would still try to find ways to improve it. So if you can do something fancier with caching to make it go even faster, that would be fantastic. Usually the delay has more of an effect on how much we can crawl on your website overall. So if we see that a page takes a long time to be served to Googlebot, then we assume that your
server is kind of overloaded. And we generally back off from crawling a little bit, just to make sure that we don’t cause any additional problems for your server or your normal users. So that’s where I would say, well, you want to serve these static pages as quickly as you can, so that search engines don’t feel that they need to back off– like, oh, I’ll just crawl a few hundred pages a day from this website because it’s really slow– which is probably not what you’d like.

SPEAKER 3: Thank you, John.

This is very important for us.

JOHN MUELLER: Cool. Yeah, I think it’s one of those topics that’s going to be more and more important, because these frameworks are really, really popular, and you can do really fancy things with them. And getting experience with how to set them up properly is, I think, really valuable, especially for SEOs. Because traditionally, it’s always been like, oh, this website uses JavaScript, it will never be indexed. And now you kind of have a new reality in the last couple of years that SEOs, for a large part, have been kind of slow in recognizing. So getting started on this, I think, is fantastic.

SPEAKER 3: OK, John, where can we test the demo version?

JOHN MUELLER: What I would use is the Rich Results test–

SPEAKER 3: Yeah.

JOHN MUELLER: –from the Search Console, because what you have there is the ability to see the rendered DOM as well– so the rendered HTML– and that way you can double-check to see that things like image tags are pulled in properly, the titles are working, all of that.

SPEAKER 3: OK. Thank you. Thank you, really.

JOHN MUELLER: Sure. [HUMMING] Let’s see. A question about bounce rates. “I feel that Google has been focused more on user intent and–” doo-doo-doo, doo-doo-doo, let’s see– “does Google count the visitor bounce when ranking a site? I don’t think Google would keep a site at number one if it has a high bounce rate.” So what would it be? So we do look at a lot of signals, in particular when we evaluate algorithms. So if we have different algorithms that we’re looking at, that we’re trying out, then we track a whole bunch of user signals to see, are we getting it right?
Sometimes we have to be careful with how we evaluate that, though, because there are things like click-baity titles that people click on, and they always go there, but actually it’s not really that great content. So this is the kind of thing that we have to watch out for when we evaluate algorithms. On a per-site level or per-URL level, I think that’s really, really hard and tricky. A lot of sites have kind of short-form content, and that’s perfectly fine. So if you have information that people can pick up fairly quickly– I don’t know, maybe you have a website for an event that’s happening in the near future, and you just want to get the date and time and the location out there. If people can go to your page, get that information, and they’re done, that’s perfect. That’s not something bad that we think we should hold sites back for. So with things like that in mind, it’s not a metric that you can just blindly use and say, well, it has a high bounce rate, therefore it must be bad. Maybe it’s particularly good, because people are getting the information that they need fairly quickly. So I think this is one of those metrics that people focus on a little bit too much, and actually, from a search point of view, it’s not really what you think it is or what you think it’s doing.

Does showing ads inside content– is that considered poor quality by the Google algorithm, even though we properly label them and put them in a container so that people know they’re an ad? For the most part, ads are perfectly fine. There is the Better Ads Standard, which I believe the Chrome team is working with to try to recognize sites that are particularly problematic with regard to ads. But it sounds like that wouldn’t be the case here. The other thing that comes to mind is that we want to be able to recognize or find the content above the fold for most sites. So if a user is landing on a product page, or a page for an individual piece of content on your website, and all they
see on top are like a whole bunch of ads, then that’s a really bad user experience And that’s something that our algorithms try to figure out On the other hand, if the primary content is visible to users, even if there are some ads on the page, that’s perfectly fine That’s not something to worry about We’re seeing a ranking difference in mobile and desktop Is this the mobile-first indexing effect? No That would not be from mobile-first indexing
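As an aside on the content-equivalence theme that runs through this session: the mobile/desktop parity that mobile-first indexing relies on can be roughly sanity-checked by diffing the visible text of the two page versions. A minimal sketch — the word-overlap score and all names here are illustrative assumptions, not anything Google exposes:

```python
import re
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects visible words from HTML, ignoring script/style contents."""
    def __init__(self):
        super().__init__()
        self.words = set()
        self._skip = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip:
            self.words.update(re.findall(r"[a-z0-9]+", data.lower()))

def visible_words(html):
    parser = TextExtractor()
    parser.feed(html)
    return parser.words

def content_parity(desktop_html, mobile_html):
    """Fraction of distinct desktop words that also appear on mobile."""
    desktop = visible_words(desktop_html)
    mobile = visible_words(mobile_html)
    return len(desktop & mobile) / len(desktop) if desktop else 1.0

desktop_page = "<html><body><h1>Blue Widgets</h1><p>Full specs and shipping details.</p></body></html>"
mobile_page = "<html><body><h1>Blue Widgets</h1><p>Full specs.</p></body></html>"
score = content_parity(desktop_page, mobile_page)  # 4 of 7 desktop words survive
```

Anything scoring well below 1.0 flags desktop-only wording worth reviewing before a site is switched over; the threshold is up to you.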

Mobile-first indexing is basically when we index the content with a mobile device So in a case like that, that indexed content would be used for both desktop and mobile rankings If you’re seeing a difference in rankings for mobile and desktop, that’s for the most part just the normal differences that there are between desktop and mobile search results It’s not a matter of the content being indexed differently So indexing is basically collecting the information from the page and storing it on our side And then the ranking side is taking that collected information and trying to find the right order in the search results for that site So if you’re seeing differences with ranking on desktop and mobile, that would not be from mobile-first indexing At what period of time and to what percentage of users does Google start to have issues with A/B tests? There are some areas of the website where traffic is low so it takes longer to get a good sample set Sometimes this takes six months to get a significant sample set For the most part, that’s OK What we really need to have is that the content is equivalent during that time So if you’re doing an A/B test and you’re tweaking things like colors and button placement and things like that, that for us is totally unproblematic You can run those tests forever from our point of view On the other hand, if you’re doing an A/B test where one page has just the title and the other page has all of the textual content on it, then that would be a bit more tricky That’s also the kind of thing where if we index maybe the B version or the A version of a page, then suddenly we have vastly different content available for ranking And that’s something that you’d probably want to avoid as well We have structured data for paywalls in the planning but not yet integrated When will Google identify our pages as cloaking? Or how long is the transition period, if there is one?
It sounds like you know what you need to do So from that point of view, I’d kind of try to get this finished as quickly as possible For the most part, we’re aware that this is sometimes tricky to do And we do kind of take that into account when we look at these things on our side But I would really try to get this resolved as quickly as possible so that you don’t have to worry about, will someone from the webspam team take a look at my site and think that I’m doing something bad? So that you’re really on the safe side I’ve seen some sites ranking even though 95% of their content is copied Are those sites ranking due to a lot of backlinks? Should we focus on unique content or backlinks? So I think, for the most part, the important part here is you should not be– [BACKGROUND NOISE] Let me just mute you for a second So I think the important part here is you should not be focusing on some random other spammy sites and saying, oh, I want to be just as bad as them You should really aim to be significantly better and make sure that what you’re providing is not just 94% copy if the others are 95% copy, but really, that you’re writing something significantly valuable and useful to the web, where our algorithms when they look at your site, they’re not saying, oh, this is mostly bad, but look there is one little piece of thing or this small signal that says it’s kind of OK But rather, it should be the case that when our algorithms look at your site, they’re saying all of this stuff is fantastic And maybe there are some things that they forgot or that they missed And I know that happens That’s the kind of situation where our algorithms will try to rank that a lot better Whereas if you’re just trying to be as bad as all of the others, then that’s a really bad strategy So we see this a lot in the forums as well where an affiliate site owner will come in and say, well, for this query, you’re showing nine other cheap affiliate sites that are just scraping the feeds
that they’re getting from the provider Why don’t you show my affiliate site as well, which is also scraping the feeds? If we’re already showing nine low-quality sites, then why would we need to show another low-quality site? On the other hand, if we’re showing nine low-quality sites, and we have this one really fantastic site that we could also show, then that’s an argument That’s something that works for our algorithms And in particular, that works really well for our teams as well where if we see this kind of situation in the help forum or somewhere else, then we can go to the team and say, hey, for this query, this is a site that we should actually be showing, and we’re not showing it at all And they can look at that and say, yeah, you’re probably right We should be showing this for this query We should be showing this as number one because it’s clearly the best site of its kind for this query On the other hand, if we go to them and say, hey, there are nine sites that are just doing the same thing Why can’t this one guy here get a break and also be shown there? They’re like, why? Why do we need to add yet another low-quality site to the mix? It doesn’t significantly improve things There’s no reason to do it So that’s kind of what I would aim for there, not just to be kind of similar to everyone else, but really to be significantly better Let’s see According to the Developers site, Google uses a web rendering service based on Chrome 41 For mobile-first indexing, will that be changed or not? At the moment, we’re still using this I expect over time that we’ll switch to kind of a more modern Chrome release schedule But that’s probably still a little bit out For mobile-first indexing, it’ll continue to use kind of the Chrome 41 set-up With regards to PageSpeed– it seems a lot of SEOs and developers get hung up on metrics like Time To First Byte or DOM Complete, possibly because it’s easier to report progress Any comment on that?
With regards to speed, we do look at the overall picture to try to figure out what really is happening here with regards to speed In particular with the changes that are happening I think in June or July with regards to mobile speed, we try to look at a number of different metrics to figure out what is actually relevant here or what isn’t so relevant here Because we know that people like to focus on individual numbers and then try to find ways to optimize those without actually improving things for users, and so we want to focus on a variety of different signals to figure out what is actually relevant for these individual sites I think using tools that pull out individual metrics is a great way to recognize where low-hanging fruit might be, where you can make a significant improvement on your site speed in an easy way sometimes It’s also a really useful way to kind of monitor if your site is still in the range where you’d like to be And that’s also really important to do, that you have some kind of automated monitoring set up that tells you when changes that you’ve made on your site have resulted in significant slowdowns And for all of these, you might use any one of these metrics You might use multiple tools that look at multiple metrics That’s really up to you Does HTTPS count as a ranking signal if the website has implemented HTTPS but is using an exploitable cipher For example, it is vulnerable to the OpenSSL Padding Oracle vulnerability Yes So if we can recognize that your site is using a modern HTTPS certificate and that works in a modern browser, then for the most part, we will accept that And we’ll kind of use that as a signal that we should be indexing the HTTPS version of your pages And once we index the HTTPS version of your page, then that’s kind of what we look for there And the ranking signal is not something where sites jump up to number one from number ten It’s really a more subtle signal that we use mostly as a tiebreaker in situations where all
else might be kind of equal So that’s kind of there With regards to exploitable setups on a server, at the moment at least, we don’t differentiate between the exact type of HTTPS set-up We don’t differentiate between the exact certificate type that you have or the certificate validity period, anything like that We just try to see if there is any valid HTTPS certificate or not Let’s see How does Google treat sites that are created on the basis of the marketplace? The site is hosted on its own domain but has the same template and links from the home page through to the marketplace itself Does this affect the ranking of the site and the marketplace? I’m not really sure how you mean marketplace It sounds like this might be kind of an affiliate site where you have essentially the same content with your affiliate links, and you’re guiding people to the same content In a case like that, I’d really make sure that your site has significant unique and compelling content so that we can actually recognize that there is a reason to index this page separately There’s a reason to show this page in the search results separately rather than us looking at it and saying, well, it’s actually exactly the same as the primary site as well, so we don’t really need to index both of these I have a site with very long content When I use fetch and render, only one third of the page renders for Googlebot and users Would this affect ranking or is this something with the scroll bar length? I’ve seen a few of these cases And it’s mostly just the testing tools have some limits with regards to what they think makes sense to kind of show directly in the tool And for the most part that seems to be what people are running into A simple way to double-check indexing is to just take a snippet of text that’s only on the lower part of your page and just to search for that And if your pages show up for that snippet of text, then obviously we can index the page that far May business review markups, star ratings, be applied to all pages of the website?
Structured data should be specific to the primary topic of the page So if the primary topic of your page is something that people can review, for example, if you are selling your product and people can review that product, then sure You can put that on all the pages that have products like that On the other hand, if the markup is specific to your website in general or specific to one product and not all of them, then that’s something where putting the same structured data across your whole site would be wrong So kind of depends on what you have there I’m trying to move my website to a different domain The Google forum is not responding, and we can’t move So you’re welcome to kind of drop me a link to your thread, or if you have a thread in the English forum, I can double-check there as well But in general, site moves are really kind of unproblematic these days in the sense that we can process them fairly well If you set up the appropriate redirects, if you follow the instructions in our help center, then for the most part, these are pretty uneventful They should just work How does one optimize to get into software carousels that show up above the search results? Where does Google pull these results from?
These show up for all kinds of queries So I think there are two main things there On the one hand, there are kind of these general carousels that we sometimes show based on information we have crawled from the web in general And I believe there are also unique kind of carousels or OneBoxes that we have for some types of content based on the structured data on the pages So in particular, I believe for software we have kind of a structured data type that you can apply to your pages And with this structured data, we can understand what your pages are about a little bit easier We can recognize maybe images on the page a little bit easier And we can show that content as matching into this kind of category of content a little bit easier So that’s kind of what I would aim for there Make sure that you have the right structured data so that everything kind of aligns there All right We’re running low on time What else can I help you all with? SPEAKER 6: Hi, John JOHN MUELLER: Hi SPEAKER 6: So I found in the Lighthouse Reports the widget for Google Chrome that tests font sizes on mobile devices And I wanted to ask you how important is that for SEO? Or is it just for user experience, from a user experience perspective [INAUDIBLE]? JOHN MUELLER: That’s mostly for user experience We do take that into account for mobile friendliness So I don’t know if the threshold is exactly the same as in Lighthouse, but with the mobile-friendly test we do try to understand how much of a page has a small font size that can’t be read And if that’s a lot, then we wouldn’t treat the page as being mobile-friendly I don’t know if the thresholds are exactly the same with the Lighthouse tests though The Lighthouse test is really more oriented for user experience SPEAKER 6: Is there any specific metric for the font sizes, something like a standard for all kinds of mobile devices for programming for [INAUDIBLE] JOHN MUELLER: I thought we had something documented in the mobile-friendly sites section in the developer docs, but I’m not 100% sure Yeah I’d double-check the developer documentation for mobile-friendly sites I believe we have something there SPEAKER 6: So you recommend going through the developer documents to check it OK JOHN MUELLER: Yup SPEAKER 6: OK Thank you Thank you JOHN MUELLER: Sure SPEAKER 7: Hello, John JOHN MUELLER: Hi SPEAKER 5: Hey Hey, John So I had one question regarding this Fetch as Google So I have a few URLs where, when I do this Fetch as Google, I see sometimes my render view is not showing my full kind of rendering of the page But for some of the URLs it is showing full kind of renders So just wondering, why is it happening? For some URLs, it is not showing the full render, and for some URLs, it is showing full JOHN MUELLER: So how do you mean not showing the full render? What is it?
SPEAKER 5: There is a tool online where I’m getting this view One is a Google one and one is a user one, basically, and then you scroll down in that particular window So I only see it at one point of time I mean, it’s not true in all of the data But when I’m checking for some other URLs from the same website, it’s showing the full kind of data So that’s why I’m wondering, why is it happening? JOHN MUELLER: So it’s rendering the view, but not like all the way down, or? SPEAKER 5: Yes Yeah Yeah JOHN MUELLER: I don’t know I assume that’s just a limitation of the tool with regard to how high of a viewport we show in that screenshot So that, from my point of view, wouldn’t necessarily be a bad sign Again, I would just double-check the content and try to find a snippet on the lower part of the page and search for that And if you can find that in search, then that’s perfectly fine SPEAKER 5: All right Thank you SPEAKER 7: Hello, John JOHN MUELLER: Hi Go ahead SPEAKER 7: So I want to ask one question about HTML validation issues Does it really affect site quality? I mean, we know that most browsers know how to correct these problems But what do you think if you have a page with a lot of HTML errors? JOHN MUELLER: For the most part, it doesn’t matter So we can deal with a lot of broken HTML because most of the world has broken HTML You’re not alone But in particular for structured data, if you want to mark things up on a page, it helps a lot to have clean HTML because then you can really say, this is the section that has the title, and this is the image, and this is the reviews for this product And that makes it a lot easier if you have clean HTML because you don’t have to worry about the HTML breaking the structured data apart But for the most part, for normal content pages, if it works in a browser, it should work for Googlebot SPEAKER 7: OK Thank you SPEAKER 8: Hello? JOHN MUELLER: Hi SPEAKER 8: How are you doing? Hello?
JOHN MUELLER: Yeah Go for it SPEAKER 8: Yeah OK John, I have a couple of questions Do canical tags pass link juice from one page to another page like redirects do? JOHN MUELLER: Sorry? I didn’t get the first part of what you– SPEAKER 8: All right Do canical tags– JOHN MUELLER: Canonical tags SPEAKER 8: Yeah, yeah, yeah JOHN MUELLER: OK SPEAKER 8: –pass link juice from one page to another page like redirects do? JOHN MUELLER: So with the canonical tag, you’re basically saying that two pages are the same And we combine all of the signals that we have for those pages So that includes things like PageRank, which people usually call link juice, but– Yeah SPEAKER 8: All right All right So another– [INAUDIBLE] –like, let’s say page A has a canonical pointing to page B So I suppose our page A will get the index, right? JOHN MUELLER: Usually But not always So when we have this situation with multiple pages that have the same content, we use signals like the canonical tag, like redirects, like internal links, like sitemaps to figure out which one of these, A or B, should be the one that we keep So if everything tells us that A is the one that is the canonical, that we should keep, then we will use A If there is a mix– that some signals say this one, some signals say that one– then sometimes we will use this one and sometimes we will use the other one It’s not a 100% guarantee SPEAKER 8: All right All right OK The other one, if we are not able to see the cache of some of the [INAUDIBLE]– some of the pages are not showing they are cached So what might be the reason for this? JOHN MUELLER: That can be completely normal That’s something where our algorithms or our systems sometimes just don’t have a cached page that we can link to directly And that’s completely normal SPEAKER 8: OK, OK Great Thank you SPEAKER 9: John?
JOHN MUELLER: All right Go for it SPEAKER 9: Let’s say you have a site, and it’s pure content It’s very sterile Is there value in adding imagery just to make it kind of easier for the users to read, assuming that adding imagery is going to make them want to read the articles, if that makes sense? JOHN MUELLER: I think there’s value in doing that from the long-term point of view It’s not the case that we would have any kind of SEO signal saying, there are images in this article, therefore it must be better Obviously, if the images are things that you want to have indexed for image search, then, by having them in the article, we can pick them up But just putting images into a textual article doesn’t kind of make our algorithms think more of the page than without those images SPEAKER 9: Right, right No, I understand Thank you JOHN MUELLER: All right We’re a bit over time Let’s take a break here It’s been great having you all here, and lots of cool questions and comments in-between Hope to see you all again in one of the future Hangouts I’ll set the next ones up probably later today It’s a bit tricky with the timing, but I’ll figure something out And again, I wish you all a great weekend Thanks for dropping by and have a good day SPEAKER 3: Thanks for these ten minutes more, John Thank you We appreciate it JOHN MUELLER: Sure Very welcome SPEAKER 5: Have a great day JOHN MUELLER: Bye, everyone SPEAKER 3: Bye SPEAKER 1: All right Bye, John