Posted:

Webmaster level: all

Security is a top priority for Google. We invest a lot in making sure that our services use industry-leading security, like strong HTTPS encryption by default. That means that people using Search, Gmail and Google Drive, for example, automatically have a secure connection to Google.

Beyond our own stuff, we’re also working to make the Internet safer more broadly. A big part of that is making sure that websites people access from Google are secure. For instance, we have created resources to help webmasters prevent and fix security breaches on their sites.

We want to go even further. At Google I/O a few months ago, we called for “HTTPS everywhere” on the web.

We’ve also seen more and more webmasters adopting HTTPS (also known as HTTP over TLS, or Transport Layer Security) on their websites, which is encouraging.

For these reasons, over the past few months we’ve been running tests taking into account whether sites use secure, encrypted connections as a signal in our search ranking algorithms. We've seen positive results, so we're starting to use HTTPS as a ranking signal. For now it's only a very lightweight signal — affecting fewer than 1% of global queries, and carrying less weight than other signals such as high-quality content — while we give webmasters time to switch to HTTPS. But over time, we may decide to strengthen it, because we’d like to encourage all website owners to switch from HTTP to HTTPS to keep everyone safe on the web.


In the coming weeks, we’ll publish detailed best practices (they're in our help center now) to make TLS adoption easier, and to avoid common mistakes. Here are some basic tips to get started:

  • Decide the kind of certificate you need: single, multi-domain, or wildcard
  • Use 2048-bit key certificates
  • Use relative URLs for resources that reside on the same secure domain
  • Use protocol-relative URLs for all other domains (see the example after this list)
  • Check out our Site move article for more guidelines on how to change your website’s address
  • Don’t block your HTTPS site from crawling using robots.txt
  • Allow indexing of your pages by search engines where possible. Avoid the noindex robots meta tag.
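
As a minimal illustration of the URL tips above (the file names and CDN host are made up), resources on your own secure domain can be referenced with relative URLs, and resources on other domains with protocol-relative URLs:

    <!-- Same secure domain: relative URL -->
    <img src="/images/logo.png" alt="Example logo">

    <!-- Other domain: protocol-relative URL, inherits https:// on an HTTPS page -->
    <script src="//cdn.example.net/library.js"></script>

Likewise, the robots.txt file on the HTTPS site should not contain a blanket Disallow, and the pages you want indexed should not carry a noindex robots meta tag.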

If your website is already serving on HTTPS, you can test its security level and configuration with the Qualys SSL Labs tool. If you are concerned about TLS and your site’s performance, have a look at Is TLS fast yet?. And of course, if you have any questions or concerns, please feel free to post in our Webmaster Help Forums.

We hope to see more websites using HTTPS in the future. Let’s all make the web more secure!

Posted:
(Cross-posted on the Google News Blog)

Webmaster level: All

If you're a news publisher, your website has probably evolved and changed over time -- just like your stories. But in the past, when you made changes to the structure of your site, we might not have discovered your new content. That meant a lost opportunity for your readers, and for you. Unless you regularly checked Webmaster Tools, you might not even have realized that your new content wasn’t showing up in Google News. To prevent this from happening, we are letting you make changes to our record of your news site using the newly launched Google News Publisher Center.

With the Publisher Center, your potential readers can be more informed about the articles they’re clicking on and you benefit from better discovery and classification of your news content. After verifying ownership of your site using Google Webmaster Tools, you can use the Publisher Center to directly make the following changes:

  • Update your news site details, including changing your site name and labeling your publication with any relevant source labels (e.g., “Blog”, “Satire” or “Opinion”)
  • Update your section URLs when you change your site structure (e.g., when you add a new section such as http://example.com/2014commonwealthgames or http://example.com/elections2014)
  • Label your sections with a specific topic (e.g., “Technology” or “Politics”)

Whenever you make changes to your site, we’d recommend also checking our record of it in the Publisher Center and updating it if necessary.

Try it out, or learn more about how to get started.

At the moment the tool is only available to publishers in the U.S., but we plan to introduce it in other countries soon and add more features. In the meantime, we’d love to hear from you about what works well and what doesn’t. Ultimately, our goal is to make this a platform where news publishers and Google News can work together to provide readers with the best, most diverse news on the web.

Posted:

Webmaster level: intermediate-advanced

To crawl, or not to crawl, that is the robots.txt question.

Making and maintaining correct robots.txt files can sometimes be difficult. While most sites have it easy (tip: they often don't even need a robots.txt file!), finding the directives within a large robots.txt file that are or were blocking individual URLs can be quite tricky. To make that easier, we're now announcing an updated robots.txt testing tool in Webmaster Tools.

You can find the updated testing tool in Webmaster Tools within the Crawl section.

Here you'll see the current robots.txt file, and can test new URLs to see whether they're disallowed for crawling. To guide your way through complicated directives, it will highlight the specific one that led to the final decision. You can make changes in the file and test those too; you'll just need to upload the new version of the file to your server afterwards to make the changes take effect. Our developers site has more about robots.txt directives and how the files are processed.
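
For example, with a hypothetical robots.txt file like this:

    User-agent: *
    Disallow: /archive/
    Disallow: /private/
    Allow: /archive/public/

testing http://example.com/private/page.html would show the URL as blocked and highlight the Disallow: /private/ line as the directive responsible, while http://example.com/archive/public/report.html would be reported as allowed, because for Googlebot the longer, more specific Allow rule wins over the shorter Disallow.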

Additionally, you'll be able to review older versions of your robots.txt file, and see when access issues block us from crawling. For example, if Googlebot sees a 500 server error for the robots.txt file, we'll generally pause further crawling of the website.

Since there may be some errors or warnings shown for your existing sites, we recommend double-checking their robots.txt files. You can also combine it with other parts of Webmaster Tools: for example, you might use the updated Fetch as Google tool to render important pages on your website. If any blocked URLs are reported, you can use this robots.txt tester to find the directive that's blocking them and then, of course, improve it. A common problem we've seen comes from old robots.txt files that block CSS, JavaScript, or mobile content — fixing that is often trivial once you've seen it.
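
For instance, a leftover section like the following (a made-up example) would keep Googlebot from fetching the files it needs to render your pages, and removing those lines is usually all the fix that's required:

    User-agent: *
    Disallow: /css/
    Disallow: /js/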

We hope this updated tool makes it easier for you to test & maintain the robots.txt file. Should you have any questions, or need help with crafting a good set of directives, feel free to drop by our Webmaster Help Forum!

Posted:
Webmaster level: all
A common annoyance for web users is when websites require browser technologies that are not supported by their device. When users access such pages, they may see nothing but a blank space or miss out on a large portion of the page's contents.
Starting today in our English search results in the US, we will indicate to searchers when our algorithms detect pages that may not work on their devices. For example, Adobe Flash is not supported on iOS devices or on Android versions 4.1 and higher, and a page whose contents are mostly Flash may be noted in the search results.

Developing modern multi-device websites

Fortunately, making websites that work on all modern devices is not that hard: websites can use HTML5, which is supported by all modern devices and, on some of them, is the only option. To help webmasters build websites that work on all types of devices regardless of the type of content they wish to serve, we recently announced two resources:
  • Web Fundamentals: a curated source for modern best practices.
  • Web Starter Kit: a starter framework supporting the Web Fundamentals best practices out of the box.
By following the best practices described in Web Fundamentals you can build a responsive web design, which has long been Google's recommendation for search-friendly sites. Be sure not to block any Googlebot from crawling your page assets (CSS, JavaScript, and images), whether with robots.txt or otherwise. Being able to access these external files fully helps our algorithms detect your site's responsive web design configuration and treat it appropriately. You can use the Fetch and render as Google feature in Webmaster Tools to test how our indexing algorithms see your site.
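
As a rough sketch of the responsive approach (the breakpoint and class name below are arbitrary), a page declares a viewport and lets CSS media queries adapt the layout to the screen width:

    <meta name="viewport" content="width=device-width, initial-scale=1">
    <style>
      .content { width: 100%; }
      @media (min-width: 640px) {
        .content { width: 60%; margin: 0 auto; }
      }
    </style>
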
As always, if you need more help you can ask a question in our webmaster forum.

Posted:
If you are targeting users in more than one country, chances are you've already heard about rel-alternate-hreflang. If you haven't, in short, this annotation enables Google and other search engines to serve the correct language or regional version of pages to searchers, which can lead to increased user satisfaction.

Making sure the deployed annotations are usable by search engines can be rather difficult, especially on sites with many pages, and site owners all around the world haven’t been shy about telling us this. Today we're releasing a feature that should make debugging rel-alternate-hreflang annotations much easier.

The Language Targeting section in the International Targeting feature enables you to identify two of the most common issues with hreflang annotations:
  • Missing return links: annotations must be confirmed from the pages they are pointing to. If page A links to page B, page B must link back to page A, otherwise the annotations may not be interpreted correctly.
    For each error of this kind we report where and when we detected it, as well as where the return link is expected to be.

  • Incorrect hreflang values: The value of the hreflang attribute must either be a language code in ISO 639-1 format such as "es", or a combination of language and country code such as "es-AR", where the country code is in ISO 3166-1 Alpha 2 format.
    In case our indexing systems detect language or country codes that are not in these formats, we provide example URLs to help you fix them (see the example below).
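
For example, if http://example.com/page is the English version and http://example.com/es/page is its Spanish equivalent (made-up URLs), both pages would carry the same pair of annotations:

    <link rel="alternate" hreflang="en" href="http://example.com/page" />
    <link rel="alternate" hreflang="es" href="http://example.com/es/page" />

Because each page also lists the other, the return links are in place, and the values use the expected ISO 639-1 language codes.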


Additionally, we've moved the geographic targeting setting to this part of Webmaster Tools, so that you can find all information relevant to international and multilingual targeting in the same place.

We hope you'll find this new feature useful and that it helps you to identify issues with the rel-alternate-hreflang implementation on your site. If you have comments or questions about the feature, please post in our Webmaster Help Forum.

Posted by Gary Illyes, Webmaster Trends

Posted:
Webmaster level: All

Do you have an Android app in addition to your website? You can now connect the two so that users searching from their smartphones and tablets can easily find and reach your app content.

App deep links in search results help your users find your content more easily and re-engage with your app after they’ve installed it. As a site owner, you can show your users the right content at the right time: by connecting pages of your website to the relevant parts of your app, you control when your users are directed to your app and when they go to your website.


Hundreds of apps have already implemented app indexing. This week at Google I/O, we’re announcing a set of new features that will make it even easier to set up deep links in your app, connect your site to your app, and keep track of performance and potential errors.

Getting started is easy

We’ve greatly simplified the process to get your app deep links indexed. If your app supports HTTP deep linking schemes, here’s what you need to do:

  1. Add deep link support to your app (see the sketch after these steps)
  2. Connect your site and your app
  3. There is no step 3 (:
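
To give an idea of step 1: for an Android app that supports HTTP deep links, this typically means declaring an intent filter in AndroidManifest.xml. The activity name, host, and path prefix below are hypothetical:

    <activity android:name=".ArticleActivity">
      <intent-filter>
        <!-- Handle links such as http://example.com/articles/123 -->
        <action android:name="android.intent.action.VIEW" />
        <category android:name="android.intent.category.DEFAULT" />
        <category android:name="android.intent.category.BROWSABLE" />
        <data android:scheme="http"
              android:host="example.com"
              android:pathPrefix="/articles" />
      </intent-filter>
    </activity>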

As we index your URLs, we’ll discover and index the app / site connections and may begin to surface app deep links in search results.

We can discover and index your app deep links on our own, but we recommend that you publish them. This also applies if your app only supports a custom deep link scheme. You can publish them, for example, with a rel="alternate" link element on the corresponding web pages or in your sitemap.
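
For instance, a web page could point to its app counterpart with a link element like this; the package name and article path are hypothetical, and the same android-app:// URI format can also be used for that page's entry in a sitemap:

    <link rel="alternate"
          href="android-app://com.example.android/http/example.com/articles/123" />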

There’s one more thing: we’ve added a new feature in Webmaster Tools to help you debug any issues that might arise during app indexing. It will show you what types of errors we’ve detected for the app page and web page pairs, together with example app URIs so you can debug.



We’ll also give you detailed instructions on how to debug each issue, including a QR code for the app deep links, so you can easily open them on your phone or tablet. We’ll send you Webmaster Tools error notifications as well, so you can keep up to date.



Give app indexing a spin, and as always, if you need more help ask questions on the Webmaster help forum.

Posted:
Webmaster level: intermediate-advanced
Few topics confuse and scare webmasters more than site moves. To help you avoid surprises, we've created an in-depth guide on how to handle site moves in a Googlebot-friendly way. So what, exactly, is a site move and how do you go about moving a site correctly?

Basics of site moves

A site move is, broadly, one of two types of content migrations:
  • Site moves without URL changes. Only the underlying infrastructure serving the website is changed without any visible changes to the URL structure. For example, you might move www.example.com to a different hosting provider while keeping the same URLs and site structure on www.example.com.
  • Site moves with URL changes. Here, the URLs on the website change in any number of ways:
    • The protocol: http://www.example.com to https://www.example.com
    • The domain name: example.com to example.net
    • The URL paths: http://example.com/page.php?id=1 to http://example.com/widget
We've seen cases where webmasters implemented site moves incorrectly, or missed steps that would have greatly increased the chances of the site move completing successfully. To help webmasters design and implement site moves correctly, we've updated the site move guidelines in our Help Center. In parallel, we continue to improve our crawling and indexing systems to detect and handle site moves if you follow our guidelines.
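
For moves with URL changes, the heart of the work is usually serving permanent (301) redirects from every old URL to its new equivalent. As a small sketch, an Apache .htaccess rule for the domain change example above (example.com to example.net) might look like this; your server software and URL mapping will differ:

    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^(www\.)?example\.com$ [NC]
    RewriteRule ^/?(.*)$ http://example.net/$1 [R=301,L]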

Moving to responsive web design

A related and increasingly common question we're seeing is how a website can move from having separate mobile URLs or dynamic serving to using responsive web design. To help you implement this configuration change, please see this new page on our smartphones recommendations site.
And, as always, please ask on our Webmaster Help forums if you have more questions.