How To Create SEO Friendly Websites & Rank Faster on Google

I’ve been reading Matt McGee’s Small Business SEM blog in recent months and I must say the guy really knows his stuff. In addition to great posts on practical search engine optimization strategies for small businesses, Matt usually recaps the SEO-related conferences that he has attended in the past (I actually found the Small Business SEM site while browsing Flickr for conference photos).

Most small business owners I meet frequently ask me to evaluate their websites and perhaps propose a comprehensive SEO plan that would rank them higher on Google. This is a fairly broad question to answer in one sitting, and I’ve never been a fan of giving half-baked answers.

As search engine optimization and search marketing become more mainstream, business owners looking to create, improve, or extend their marketing strategies online are constantly looking for guidelines to develop and implement a related SEO strategy.

From Matt’s SEO guides, I’ve prepared a handy checklist of things to look for when evaluating the “SEO Quality” of your website (or your competitors’).

Here is a list of search engine friendly factors that every website owner or webmaster should consider when building or rebuilding a website with SEO in mind.

How To Create SEO Friendly Websites for Google

The Code

Whether you are using HTML, PHP, or any other programming language to build your web pages, consider the following:

  • Use a CSS file to define font characteristics, page properties, and visual appearance. In today’s design environment, a good CSS designer should be able to create an entire website whose visual settings are entirely controlled using one central CSS file.
  • If you are not going that far, at least make certain that all CSS and JavaScript files are referenced from an external location. This means the actual code is not embedded in the web page itself, but linked from a separate file in the website’s structure (see the sketch after this list).
  • Avoid “code bloat”. When building a website from scratch, limiting the amount of extraneous HTML and design code gives search engine bots a clean environment in which to crawl and index the website efficiently.
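
As a quick illustration of the external-file approach, here is a minimal sketch of a page head that references one central CSS file and one JavaScript file (the file names and paths are placeholders, not a prescribed structure):

    <head>
      <!-- all visual styling lives in one central, external CSS file -->
      <link rel="stylesheet" type="text/css" href="/css/site.css" />
      <!-- scripts are referenced externally too, keeping the page itself lean -->
      <script type="text/javascript" src="/js/site.js"></script>
    </head>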

Web Address Management

If the text that appears in your web address can be controlled, then it should be controlled.

  • Use keyword-specific text within each web address and separate words with hyphens (not underscores). Remember, as with any opportunity to use keywords, don’t overuse them.
  • My recommendation is to create a web address that is similar to the HTML title of the web page, shortened a bit to limit character length. A longer string of characters in a web address is less friendly from a user perspective (have you ever emailed a MapQuest link?) and runs a greater risk of breaking through copying, typing, or other random errors when distributed.
  • When considering the folder structure, it’s best to limit the number of sub-directories whenever possible. It’s debatable whether more than four levels of sub-directories will cause issues with a search engine’s ability to crawl the deeper pages, but whether or not that idea is accurate, if it’s possible to avoid deeply nested sub-directories, I would do so.
  • Make certain that either the “non-www” versions of web pages 301 redirect to the “www” versions, or vice versa. Having both versions of a web address live can dilute the value of inbound links (for example, when different sites link to both the www and non-www versions). A minimal redirect sketch follows this list.
  • Avoid the use of session variables or session tracking directly attached to the web address. While Googlebot and other search engine crawlers are not supposed to see these variables, they can be captured in user bookmarks or scraped material – which could potentially add confusion in indexing (although I just heard – from Google Product Management – that this is resolved at the indexing level).
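
To make the www/non-www point concrete, here is a minimal sketch of a permanent redirect, assuming a PHP site and the hypothetical domain example.com (the check would sit in a shared include near the top of each page; your host or CMS may offer its own way to do the same thing):

    <?php
    // Redirect the non-www host to the www host with a permanent 301.
    // "example.com" is a placeholder domain; swap in your own.
    if ($_SERVER['HTTP_HOST'] === 'example.com') {
        header('HTTP/1.1 301 Moved Permanently');
        header('Location: http://www.example.com' . $_SERVER['REQUEST_URI']);
        exit;
    }
    ?>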

Miscellaneous Technical Requirements

  • Take advantage of a robots.txt file to let search engines know which files and folders should not be indexed for search.
  • Add an XML Sitemap to the website structure. This lets search engines know which pages your website includes, when they were last updated, and whether certain pages are of higher priority than others.
  • Utilize a custom 404 error page which allows users to click into specific sections of the website instead of having to use the back button in the web browser.
  • Ensure all broken pages and web addresses actually generate a 404 browser response. I’ve seen CMS systems that don’t handle error pages correctly and instead return a 200 response (which means “OK”). This can be detrimental because outdated or removed pages keep looking “OK” to search engines and never drop out of their indexes. A sketch of serving a proper 404 follows this list.
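
Here is a minimal sketch of that last point, assuming a PHP site; page_exists() is a hypothetical lookup standing in for whatever check your CMS uses, and 404.php is a hypothetical custom error template. The key detail is that a missing page returns a real 404 status rather than a “soft” 200:

    <?php
    // page_exists() is a hypothetical function that checks the requested
    // URL against the site's pages; replace it with your CMS's own lookup.
    if (!page_exists($_SERVER['REQUEST_URI'])) {
        header('HTTP/1.1 404 Not Found');   // real 404 status for search engines
        include '404.php';                  // custom error page with site navigation
        exit;
    }
    ?>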

SEO Specific HTML Tags

SEO Tagging includes HTML Titles, Meta Tags (at the very least, Meta Descriptions and Meta Keywords), and Page Headings. Website administrators need to be able to individually create, change, and manage this information on a regular basis. If the site is built using standard HTML or through a software application like Macromedia Dreamweaver, then this is usually not an issue.

But if the site is built in a CMS, or uses templated page information (server-side includes for header files, etc.), you need to ensure that these HTML tags can be set individually for each page, as needed (see the sketch at the end of this section).

In summary, website owners must have the capability to create unique:

  • Title information (HTML Titles)
  • Meta descriptions and keywords
  • Page Headings

For specific recommendations on SEO tagging (always a popular topic), Search Engine Watch has a great article that reviews and discusses proper meta tag creation.
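
To make the requirement concrete, here is a minimal sketch of a templated page head, assuming a PHP template where $pageTitle, $pageDescription, and $pageKeywords are hypothetical per-page variables supplied by the CMS (and “Example Company” is a placeholder brand):

    <head>
      <!-- each page supplies its own unique title, description, and keywords -->
      <title><?php echo htmlspecialchars($pageTitle); ?> | Example Company</title>
      <meta name="description" content="<?php echo htmlspecialchars($pageDescription); ?>" />
      <meta name="keywords" content="<?php echo htmlspecialchars($pageKeywords); ?>" />
    </head>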

Layout of Textual Content

Page text should be presented in a clean, organized manner. The best analogy I can think of comes from the lessons learned in high school and college about writing an exam paper. Consider the following:

  • Clearly defined main headings and sub-headings, when multiple sections of content are used on the same page.
  • Organized lists and bullet points when summarizing and ordering information.
  • The proper usage of font styles to accent specific points or ideas, within reason (if you bold the entire page of content, then everything on the page effectively carries the same weight).
  • Proper grammar and spelling.
  • It’s also recommended that the main points be written towards the top of the page, especially in the area that the user will initially view without having to scroll down. A markup sketch of these layout points follows this list.
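
Putting these layout points together, here is a minimal markup sketch (the headings and copy are placeholders): the main point sits at the top under a single main heading, sub-topics get their own sub-headings and lists, and bold is reserved for genuine emphasis.

    <h1>Main Topic of the Page</h1>
    <p>The key point of the page, stated up front where visitors see it without
       scrolling. <strong>Only the truly important phrase</strong> is bolded.</p>

    <h2>First Sub-Topic</h2>
    <ul>
      <li>Supporting point one</li>
      <li>Supporting point two</li>
    </ul>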

Overall Website-Specific Factors

  • Create and utilize an end-user sitemap: a single landing page from which search engines (and visitors) can reach, crawl, and index all important content on your website.
  • Create content beyond product information and company detail. This includes articles, tutorials, and resources applicable to the specific industry.
  • Utilize a navigational strategy that continually connects users to the most important sections of the website. Examples include the integration of a breadcrumb trail and the organization of content into structured sub-sections of the website (a breadcrumb sketch follows this list).
  • It’s preferable to use text-based navigation versus image-based navigation. This can also be said for headings and other navigational/organizational elements of the website.
  • Use the image “alt” attribute to describe what the image represents. This is especially important in navigational circumstances.
  • Cross-link relevant material between web pages. If the most important pages are easily accessible and referenced (when appropriate) through cross-links, search engines will recognize this.
  • Try to avoid excessive Flash, AJAX, and other technologies that search engines historically have been known to have difficulty crawling. If they must be used, embed the applet or technology (such as a video) into an HTML page, so that there is still an opportunity to add keyword-rich text and meta information.
  • Finally, configure a web analytics package with your website to allow you to track visitors, referrals, and keyword information. I always recommend Google Analytics which, in my opinion, is the best web reporting tool available. It’s also 100% FREE, so why not?
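
As a small sketch of the breadcrumb and alt-text points above (the section names, URLs, and file names are all placeholders):

    <!-- text-based breadcrumb trail that keeps important sections one click away -->
    <p class="breadcrumb">
      <a href="/">Home</a> &raquo; <a href="/widgets/">Widgets</a> &raquo; Blue Widget
    </p>

    <!-- descriptive alt text tells search engines what the image represents -->
    <img src="/images/blue-widget.jpg" alt="Blue Widget product photo" />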

Final Words

These recommendations provide a framework for search engine optimization success but are by no means the only things that website owners need to do to achieve high rankings for traffic-generating keywords. Always use your own judgment based on your audience when incorporating these recommendations. Keep in mind that your website has to be written for your users first. Keyword spam and content manipulation exclusively for search is never a good idea.
