Welcome to the latest issue of our newsletter. Here are the latest website promotion and Internet marketing tips for you.

In this issue:

1. Top article of the week
2. Internet marketing news of the week
3. Previous articles

We hope that you enjoy this newsletter and that it helps you to get more out of your website. Please forward this newsletter to your friends.

Best regards,
1. How Google handles JavaScript

Google can handle websites with JavaScript. However, there are several things that you should do to ensure that Google can index the content of your web pages correctly.

Google wants to show unique URLs

If a dynamically created web page does not have a unique URL, Google has difficulty showing it in the search results. For that reason, your web pages should have individual URLs. Do not create a JavaScript website that uses the same URL for all of the pages. Here's an excerpt from an official Google video:

"We only index pages that can be jumped right into. [...] I click on the main navigation for this particular page and then I click on this product and then I see everything works. But that might not do the trick because you need that unique URL. It has to be something we can get right to. Not a hashed URL, and the server also needs to be able to serve that right away. If I do this journey and then basically take this URL and copy and paste it into an incognito browser... I want people to see the content. Not the home page and not a 404 page."

You should pre-render your JavaScript pages for Googlebot

Pre-rendering your web pages means that your web server processes much of the JavaScript before the page is sent to the user and/or the search engine crawler. Search engine crawlers get much more parseable content if you pre-render your JavaScript web pages. Pre-rendering is also good for website visitors:

"I think most of the time it is because [pre-rendering] has benefits for the user on top of just the crawlers. [...] Giving more content over is always a great thing. That doesn't mean that you should always give us a page with a bazillion images right away because that's just not going to be good for the users. [...] It should always be a mix between getting as much crucial content as possible, but then figuring out which content [images and other non-crucial content] you can load lazily at the end of it.
So for SEOs that would be, you know... we know that different queries have different intents. Informational, transactional... so elements critical to that intent should really be in that initial rush [i.e. pre-rendered]."

JavaScript should not make a page slower

If fancy JavaScript effects make your website slower, it's better to avoid them. Website speed is important, and Google doesn't like slow-loading pages.

"As a user, I don't care if the first byte has arrived quicker if I'm still looking at a blank page because JavaScript is executing or something is blocking a resource."

Check how search engine crawlers see your web pages

Use the website audit tool in SEOprofiler to check how search engines see the content of your web pages. If the website audit tool cannot find the content of your web pages, there are issues on your website that have to be fixed:

Check your web pages now

2. Internet marketing news of the week

Google: the Page Experience update is more than a tie breaker
"It's more than a random ranking factor, it's also something that affects your site's usability after it ranks (when people actually visit). If you get more traffic (from other SEO efforts) and your conversion rate is low, that traffic is not going to be as useful as when you have a higher conversion rate (assuming UX/speed affects your conversion rate, which it usually does). CWV is a great way of recognizing and quantifying common user annoyances."

Bing: Make every feature binary
"To make search more accurate and dynamic, MEB better harnesses the power of large data and allows for an input feature space with over 200 billion binary features that reflect the subtle relationships between search queries and documents."
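Bing has not published MEB's code, but the idea of a huge sparse binary feature space over query/document relationships can be illustrated with feature hashing. The sketch below is a loose, illustrative assumption of how one might derive binary features from query and document terms; the dimension, hash function, and pairing scheme are all hypothetical, not Bing's actual design:

```javascript
// Illustrative sketch of "binary features" for a query/document pair.
// Each (query term, document term) co-occurrence becomes one binary
// feature; feature hashing maps the huge sparse space (MEB's real space
// has over 200 billion features) down to a fixed-size index range.

const DIM = 1 << 20; // hashed feature space size (hypothetical)

// FNV-1a 32-bit hash, reduced to a feature index.
function hashFeature(str) {
  let h = 2166136261;
  for (let i = 0; i < str.length; i++) {
    h ^= str.charCodeAt(i);
    h = Math.imul(h, 16777619);
  }
  return (h >>> 0) % DIM;
}

// Returns the set of active (value = 1) feature indices for the pair.
function binaryFeatures(query, docTitle) {
  const active = new Set();
  for (const q of query.toLowerCase().split(/\s+/)) {
    for (const d of docTitle.toLowerCase().split(/\s+/)) {
      active.add(hashFeature(q + "|" + d)); // one binary feature per term pair
    }
  }
  return active;
}
```

Because every feature is either present or absent, the model only needs to store and sum weights for the active indices, which is what makes such an enormous feature space practical.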
John Mueller: the disavow tool is a big hammer for big problems
"The disavow tool is not about relevance, it's about discounting bad links which you placed and can't remove. I would not use it as a SEO tweak -- it's a big hammer to fix big link problems."

Editor's note: You can find bad links with the link disinfection tool.

Google: the cached pages in our results are not important
"The Google cached page is just the HTML from the page as we fetched it. Sometimes images, javascript, css work in that context, sometimes they don't. I wouldn't optimize for the cached page view - that's not representative of indexing."

+++ SEARCH +++ ENGINE +++ NEWS +++ TICKER +++

Google: myths and facts about crawling.
Discussion: No address, stars, call or website button for all local pack results?
Google says it has created a time crystal in a quantum computer, and it's weirder than you can imagine.
Google: Simplifying the Page Experience report.
Leaked document says Google fired dozens of employees for data misuse.

3. Previous articles

Google releases link spam update
Official: How Google ranks web pages
Google: don't confuse search engines
Official: how Google ranks search results and prevents spam
Google has released the July 2021 core update
Google has released a spam algorithm update

View all past issues...

Do you need better rankings for your website? SEOprofiler is a web-based search marketing tool that helps you to get better Google rankings. You can create a free SEOprofiler account here. The step-by-step manual will help you to get started as quickly as possible.
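P.S. The pre-rendering advice in section 1 is often implemented as "dynamic rendering": the server detects crawler user agents and serves a pre-rendered HTML snapshot, while regular visitors get the client-side app. Here is a minimal sketch of that idea; the crawler list, URLs, and page contents are illustrative assumptions, not a production implementation:

```javascript
// Minimal dynamic-rendering sketch (illustrative, not production code).
// Crawler substrings to look for in the User-Agent header (hypothetical list).
const CRAWLER_PATTERNS = ["Googlebot", "bingbot", "DuckDuckBot"];

function isCrawler(userAgent) {
  return CRAWLER_PATTERNS.some((p) => userAgent.includes(p));
}

// Hypothetical store of pre-rendered HTML, keyed by each page's unique URL.
const prerendered = {
  "/products/blue-widget":
    "<html><body><h1>Blue Widget</h1><p>Full product text.</p></body></html>",
};

// Decide what HTML to return for a given path and visitor.
function respond(path, userAgent) {
  if (isCrawler(userAgent) && prerendered[path]) {
    // Crawlers get fully parseable content right away.
    return prerendered[path];
  }
  // Regular visitors get the client-side app shell; JavaScript then renders
  // the same unique URL (no hash fragments, per the advice in section 1).
  return '<html><body><div id="app"></div><script src="/app.js"></script></body></html>';
}
```

Note how every product keeps its own unique, directly fetchable URL: pasting it into an incognito window reaches real content, not the home page and not a 404 page.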