Posts Tagged ‘SEO’

Who’s Your Audience and What’s Their Word?

Thursday, April 14th, 2011 by Butch Whitmire

The latest data from Experian Hitwise shows that the number of keywords typed into search engines is declining. Longer searches of 5-8 words were down 3% last month, while one-word searches made up a full 24% of all searches, with two-word searches a close second.

When it comes to people finding your business or organization organically on the web, it is essential to understand who your potential customers are, how they think, and what they may type into their favorite search engine.

At Force 5, we help clients determine their customer’s “word” as part of our Soul Searching™ brand development process and through the free SEO consultation included in our web development packages.

Black Hat SEO will git you run out of town, Pardner

Monday, March 28th, 2011 by David Morgan

Just like in the ole West, or at least as our Western movies tell us, black hats are villains, and usually up to “no good”.

The New York Times recently noticed something odd when performing Google searches on terms as diverse as bedding, skinny jeans, area rugs and grommet-top curtains. “You could imagine a dozen contenders for each of these searches,” writes David Segal. “But in the last several months, one name turned up, with uncanny regularity, in the No. 1 spot for each and every term: JCPenney.” The retailer’s ranking even bested manufacturer Samsonite.com in Google searches for Samsonite carry-on luggage.

The Times’s investigation turned up strikingly unsubtle “black hat” optimization, including an array of phony sites that appeared to exist for the sole purpose of linking to the store’s website.

“There are links to JCPenney.com’s dresses page on sites about diseases, cameras, cars, dogs, aluminum sheets, travel, snoring, diamond drills, bathroom tiles, hotel furniture, online games, commodities, fishing, Adobe Flash, glass shower doors, jokes and dentists—and the list goes on,” noted Doug Pierce of Blue Fountain Media, a firm hired by the New York Times to investigate.

Though not illegal, black-hat tactics are strictly verboten in the Google rulebook. The company draws a pretty thick line between techniques it considers deceptive and “white hat” approaches, which are offered by hundreds of consulting firms and are legitimate ways to increase a site’s visibility.

Google retaliated with a “manual action” against JCPenney. In the space of two hours, for instance, the retailer’s No. 1 ranking for Samsonite carry-on luggage plummeted to No. 71. Rankings for other search terms underwent similarly dramatic demotions.

The Po!nt: In the end, cheaters never win. Sure, everyone’s trying to boost their search-engine rankings. Just make sure you follow Google’s ground rules when you do it.
Source: the full (and very interesting) article at The New York Times.

Prevent Duplicate Content

Thursday, December 9th, 2010 by Force 5

Duplicate content in the search engine indexes is a very common problem that is often overlooked. You might think duplicated content would be good for your search rankings, but that is not the case; it simply is not useful to the end user.

www-vs-non-www (Problem)

Search engines treat www.your-domain.com and your-domain.com as two separate websites. You may not be penalized for having multiple results, but consolidating them makes the search results less confusing for end users. In addition, settling on a single domain convention means everyone linking to your site strengthens the same set of backlinks.

www-vs-non-www (Solution)

Since search engines track these as separate sites, pick either the www domain or the non-www domain. Both versions should remain active, but the one you do not choose should issue a 301 Permanent Redirect to the desired domain. For example, http://discoverforce5.com will redirect your browser to http://www.discoverforce5.com. The 301 Permanent Redirect tells search engines not to store/index the content at the redirected page/location.

One thing not to forget: make sure the URL path is preserved during the 301 Permanent Redirect. For example, http://discoverforce5.com/Media-Hub/ should send the end user to http://www.discoverforce5.com/Media-Hub/, so visitors still land on the page they originally requested.

ASP.NET Code Example:

In this example we use the www domain as the main convention and redirect the non-www domain. The code below grabs the current domain and the URL path of the request, checks whether the domain includes the www, and, if not, performs the 301 Permanent Redirect.

// get server name/domain, e.g. discoverforce5.com
string sDomain = Request.Url.Host.ToLower();
// get the requested URL path (and query string)
string sPath = Request.RawUrl;
// check whether the host name contains www
if (!sDomain.Contains("www."))
{
    // host does not contain www - proceed with the 301 Permanent Redirect
    Response.RedirectPermanent("http://www." + sDomain + sPath);
}

Apache Server Example:

On Apache servers this process is easier, using the .htaccess functionality. Below is the Apache equivalent of the ASP.NET example above.

RewriteEngine On
RewriteCond %{HTTP_HOST} ^discoverforce5\.com$ [NC]
RewriteRule ^(.*)$ http://www.discoverforce5.com/$1 [R=301,L]

Inconsistent Linking

Try to keep internal and external page links to your content consistent. For example, don’t link to http://www.discoverforce5.com/Services/ and http://www.discoverforce5.com/Services and http://www.discoverforce5.com/Services/Default.aspx, since search engines treat all three as different URLs.

If you are interested in learning how to submit your site to search engines, feel free to read “The little things to not forget about during development [Part: 2]”.

Have any SEO needs or questions? Please give Force 5 a call.

How-to: 301 Permanent Redirect with ASP.NET 4 – Response.RedirectPermanent()

Friday, July 9th, 2010 by Force 5

During the process of migrating our development over to the .NET 4 Framework, we have come across some noticeable improvements.

One of the new features we now use quite often is Response.RedirectPermanent(), which performs a permanent (301) redirection from a requested URL to a specified URL.

For a quick flashback to how this was previously completed review the code below:

/*
 * Previous 301 Permanent Redirect
 */
Response.Status = "301 Moved Permanently";
Response.AddHeader("Location", "NewPage.aspx");

With the new API, a 301 Permanent Redirect is simply Response.RedirectPermanent("URL-path-goes-here"). Here is the same example the new way, with fewer lines of code.

/*
 * .NET 4 301 Permanent Redirect
 */
Response.RedirectPermanent("NewPage.aspx");

Implementing 301 redirects is a good practice for informing search engines that content has moved to a new location. Here are some situations where Force 5 uses 301 redirects (a short sketch follows the list):

  • a web page that has moved
  • a web page that has been removed
  • web page content that has been consolidated with another web page
  • non-www permanent redirect to the www (or vice versa)
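
As a minimal sketch of the first and third cases, assume a hypothetical OldPage.aspx whose content has moved to, or been consolidated with, NewPage.aspx. Its Page_Load simply issues the permanent redirect:

// OldPage.aspx.cs - minimal sketch; the page names are hypothetical
protected void Page_Load(object sender, EventArgs e)
{
    // the content now lives at NewPage.aspx, so issue a 301 instead of rendering this page
    Response.RedirectPermanent("NewPage.aspx");
}

Like Response.Redirect, this overload ends the response by default, so no further code runs for the old page.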

How-to: Adding META Keywords & META Description with ASP.NET 4

Tuesday, July 6th, 2010 by Force 5

In a previous post I showed how to add page-specific keywords & descriptions while using ASP.NET with Master Pages. Compared with the latest ASP.NET 4 version, that technique was not very coder friendly.
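
For reference, that older approach looked roughly like the sketch below (reconstructed here rather than copied from the earlier post): HtmlMeta controls (from System.Web.UI.HtmlControls) are created in code and added to the page header, which must be declared with runat="server".

protected void Page_Load(object sender, EventArgs e)
{
    // pre-ASP.NET 4 approach: build each meta tag by hand
    HtmlMeta metaKeywords = new HtmlMeta();
    metaKeywords.Name = "keywords";
    metaKeywords.Content = "keywords go here...";

    HtmlMeta metaDescription = new HtmlMeta();
    metaDescription.Name = "description";
    metaDescription.Content = "description goes here...";

    // requires <head runat="server"> so Page.Header is available
    Page.Header.Controls.Add(metaKeywords);
    Page.Header.Controls.Add(metaDescription);
}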

In ASP.NET 4, adding page keywords and descriptions is as easy as the example shown below.

protected void Page_Load(object sender, EventArgs e)
{
    // page keywords
    Page.MetaKeywords = "keywords go here...";
    // page description
    Page.MetaDescription = "description goes here...";
}

Overall, page keywords & descriptions are still important for SEO success, along with page titles and page content. If you need any help with your website search rankings, please feel free to contact Force 5 for some guidance.

Is Your Website Customer-Centric?

Friday, May 21st, 2010 by Force 5

In case you haven’t noticed, the web is changing. There is a huge movement underway toward clean, simple, user-friendly websites that promote productivity and consumer interaction. The average consumer doesn’t care how creative your design team is or how technically advanced your development team is. They want information. Honest, peer-driven information.

Whether a potential customer is at your site to buy a product, comment on your organization, read reviews, download a podcast or read your latest blog entry, it’s your job to keep them interested and coming back. This is what customer-centric website design is all about.

So what is a customer-centric website?

Simple. A customer-centric website focuses on your customers and what they want. Today’s online consumer is not interested in your company, products or services; they are interested in themselves. A customer-centric website is structured so the customer can easily find what they want or get answers to their questions. By focusing your website on customer benefits and ensuring a unique user experience, you will not only increase loyalty, you’ll generate the much-coveted word-of-mouth advertising; both are key drivers of online sales.

There are a few basic steps you can take to get started on the road to a customer-centric website:
• Clearly define your product or service and how customers will benefit from it
• Make sure your contact information is never more than a click away
• Organize site content clearly and make it easy to navigate
• Place links in consistent locations and include them on every page
• Review your content for spelling and grammar mistakes
• Allow customer feedback on products, services and the site
• Make it easy for customers to get what they want
• Ask customers for a bare minimum of information to register or sign up
• Include in-depth, well-written FAQs
• Make it easy for a customer to get support

A successful customer-centric website is created by meeting customers’ needs better than anyone else. If you focus every aspect of your website on meeting your customers’ needs, you’re much more likely to remain a preferred provider. Remember, your customer is your best source of advertising. Give them what they want and they will tell the world.

Don’t get left behind.  Take the next step toward a true customer-centric website. Contact Force 5 today at 574-234-2060 or info@discoverforce5.com.

The little things to not forget about during development [Part: 2]

Friday, May 7th, 2010 by Force 5

Sitemap.xml

What are Sitemaps and why are they important?

Sitemaps are a tool developers use to inform search engines about the website content that is available to be indexed. The Sitemap protocol is an XML format containing a list of your site’s URLs, along with optional details such as last-modified dates and page priorities.

Here is a quick, simple sample of a sitemap.xml file:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.discoverforce5.com/</loc>
  </url>
  <url>
    <loc>http://www.discoverforce5.com/About/</loc>
  </url>
</urlset>

For more information on sitemap protocol and XML tag definitions visit: http://www.sitemaps.org/protocol.php#xmlTagDefinitions

Below is the same sitemap with two optional tags included: <changefreq> and <priority>. As described in the sitemap documentation:

  • <changefreq> indicates how often a page is likely to change, even though search engines may not crawl it that often.
  • <priority> indicates the priority of a URL relative to the other URLs on your site.

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.discoverforce5.com/</loc>
    <changefreq>monthly</changefreq>
    <priority>1.00</priority>
  </url>
  <url>
    <loc>http://www.discoverforce5.com/About/</loc>
    <changefreq>monthly</changefreq>
    <priority>0.80</priority>
  </url>
</urlset>

For more information about sitemaps and more advanced topics, head over to www.sitemaps.org.
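
For ASP.NET sites the sitemap does not have to be maintained by hand; it can be generated on the fly. Below is a minimal sketch, assuming a hypothetical generic handler (Sitemap.ashx) and a hard-coded list of example URLs; on a real site the URLs would come from your content database.

// Sitemap.ashx - minimal sketch of a dynamically generated sitemap; the handler name and URL list are hypothetical
using System.Web;
using System.Xml;

public class SitemapHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        context.Response.ContentType = "text/xml";

        using (XmlWriter writer = XmlWriter.Create(context.Response.OutputStream))
        {
            writer.WriteStartDocument();
            writer.WriteStartElement("urlset", "http://www.sitemaps.org/schemas/sitemap/0.9");

            // one <url> entry per page you want indexed
            string[] pages = { "http://www.discoverforce5.com/", "http://www.discoverforce5.com/About/" };
            foreach (string page in pages)
            {
                writer.WriteStartElement("url");
                writer.WriteElementString("loc", page);
                writer.WriteElementString("changefreq", "monthly");
                writer.WriteEndElement(); // </url>
            }

            writer.WriteEndElement(); // </urlset>
            writer.WriteEndDocument();
        }
    }

    public bool IsReusable { get { return false; } }
}

If the handler is mapped to /sitemap.xml (via routing or a web.config handler entry), the sitemap stays current without manual edits.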

Next steps – Submitting your sitemap to the search engines

Each search engine handles sitemap submission a little differently; check each engine’s webmaster tools to find out where to submit your sitemap.xml.

Tying in both the sitemap.xml with robots.txt

As discussed in part 1 of this series, you can declare the location of your sitemap.xml file for web crawlers/bots in robots.txt. Here is an example of what that robots.txt would look like:

User-agent: *
Allow: /
Sitemap: http://www.yourdomain.com/sitemap.xml

A quick refresher on the syntax above: “User-agent: *” applies the rules to all bots, and “Allow: /” states that all folders may be indexed.

If you have multiple sitemaps, you can declare all of them in your robots.txt file. Here is an example of how to do so:

User-agent: *
Allow: /
Sitemap: http://www.yourdomain.com/sitemap-1.xml
Sitemap: http://www.yourdomain.com/sitemap-2.xml

In conclusion, a sitemap.xml is not required for successful search engine indexing, but at Force 5 we believe it is beneficial: submitting a sitemap helps ensure your site gets properly indexed. A great example is when a new site goes live and older pages no longer exist in the same location. Even though search engines use other methods to index your site, a sitemap will help the indexing process along.

If you have any SEO needs or questions please give Force 5 a call.

The little things to not forget about during development [Part: 1]

Friday, March 19th, 2010 by Force 5

Robots.txt – Telling bots where to go and where not to

What is the robots.txt and why is the robots.txt file important?

The robots.txt file is placed in the root folder of your website and instructs search engine bots what can and cannot be indexed. All you need to do is define the criteria for what may be indexed and what may not.

Examples:

Below, “User-agent: *” applies the rules to all bots, and “Allow: /” states that all folders may be indexed.

User-agent: *
Allow: /

Here is the direct opposite, disallowing indexing for all bots.

User-agent: *
Disallow: /

Blocking indexing for your whole website is generally not recommended, although there are special situations where you would want to block everything. For example, you may be working on a ‘beta’ website that you do not want indexed but that is not protected behind an authentication process.

A few years ago, when bandwidth was costly, I ended up creating a special rule to block my portfolio image folder. This example shows how to disallow a single folder for all bots.

User-agent: *
Disallow: /images/

For more information regarding robots.txt, there are a number of resources online worth checking out.

Lastly, it should be noted that robots.txt is merely a recommendation, not an absolute rule; a bot can completely ignore the rules you have laid out. So if you have important content that must not be indexed, the best practice is to put login credentials on the folder storing that content to keep it away from the bots.

If you need help with your robots or SEO please feel free to give Force 5 a call.