Posts Tagged ‘ASP.NET’

How-To: Use Wget to Automate the Karamasoft UltimateSearch Indexing Process for Your Website

Wednesday, January 5th, 2011 by Christian Mattix

Almost every modern website has a “search my site” module of some sort added to it. In this How-To I’m going to explain how to set up Karamasoft UltimateSearch to automatically rebuild its index on a recurring schedule in a Microsoft Windows hosted environment.

First, you will need to obtain the UltimateSearch software from the Karamasoft site, found at: http://www.karamasoft.com/UltimateSearch/Features.aspx. Follow the directions provided to get the tool installed into your particular hosting environment. For this How-To I’m going to assume that you have a website, www.example.com, with the tool installed on it. Starting the indexing process is as simple as accessing a particular webpage on your site and passing it an operation code in the query string. To start the full indexing process for our example site you would navigate to:

http://www.example.com/UltimateSearchInclude/Admin/UltimateSearch.admin.aspx?cmd=IndexFull

We want to be able to call this process via a script, so we will need something lightweight and easily used from within a scripting language. A perfect tool for this is the Open Source GNU Wget utility. From the GNU Wget site:

GNU Wget is a free software package for retrieving files using HTTP, HTTPS and FTP, the most widely-used Internet protocols. It is a non-interactive commandline tool, so it may easily be called from scripts, cron jobs, terminals without X-Windows support, etc.

Download and install the GNU Wget utility. The Windows port can be downloaded from http://gnuwin32.sourceforge.net/packages/wget.htm. Download the setup program, follow the setup wizard, and install the utility to your server. After the installation process has completed, add the directory where you installed wget.exe to the PATH environment variable.

Once you have UltimateSearch and Wget installed on your server, you are ready to set up the indexing job. For the site to be indexed on a schedule, the admin link needs to be visited on a scheduled basis. To do this, we are going to call the Wget utility that we just downloaded from a batch file. That batch file will then be run as a scheduled task by Windows.

The batch file that I created is named USearchIndexTask.bat. It contains the following:

@echo off
wget -O - http://www.example.com/UltimateSearchInclude/Admin/UltimateSearch.admin.aspx?cmd=IndexIncremental > nul 2> nul

I have saved this file in c:\Program Files\Force5\USearchIndexTask.bat. You can save it wherever it makes the most sense in your hosting environment.

Once the batch file is saved, you need to create a scheduled task to run it. In our environment I used Windows Task Scheduler to create the task. Use that tool to create a task that runs the USearchIndexTask.bat file. Choose a frequency that makes sense for your environment, based on how often the site changes. If there are very few changes made on a daily basis, then having it run once a day at midnight is an appropriate setting.

Manually run the scheduled task to verify that it completed successfully, and then go get lunch. You are done!

To see more bright ideas from the Left and Right brains of Force5, check out the rest of our blog as well as our work!

Prevent Duplicate Content

Thursday, December 9th, 2010 by Force 5

Having duplicated content indexed by search engines is a very common problem that is often overlooked. You might think duplicated content would be good for your search rankings, but that is not the case, because it is not useful to the end user.

www-vs-non-www (Problem)

Search engines recognize www.your-domain.com and your-domain.com as two separate websites. You may not be penalized for having multiple results, but consolidating them improves the experience for end users combing through the search results. In addition, if you settle on a single convention for your domain, everyone linking to your site strengthens the same set of link backs.

www-vs-non-www (Solution)

Since search engines track these as separate sites, you need to pick either the www domain or the non-www domain. You still want both versions active, but you should create a 301 Permanent Redirect to your desired domain. For example, http://discoverforce5.com will redirect your browser to http://www.discoverforce5.com. By performing a 301 Permanent Redirect you are telling search engines not to store/index the content at the redirected page/location.

One thing not to forget is to make sure your website URL paths are preserved during the 301 Permanent Redirect process. For example, http://discoverforce5.com/Media-Hub/ will send the end user to http://www.discoverforce5.com/Media-Hub/. As you can see, the end user still arrives at the page they originally requested.

ASP.NET Code Example:

In this example we want to use the www domain as the main convention and redirect the non-www domain. In the code below we grab the current domain and the URL path of the request. We then check to see if the domain includes the www; if not, we perform the 301 Permanent Redirect.

// get server name/domain
string sDomain = Request.Url.Host.ToLower(); // e.g. discoverforce5.com
// get url path
string sPath = Request.RawUrl;
// check if www is in the server name
if (!sDomain.Contains("www."))
{
    // server name does not contain www - proceed with 301 Permanent Redirect
    Response.RedirectPermanent("http://www." + sDomain + sPath);
}

Apache Server Example:

With Apache servers this process is easier, utilizing the .htaccess functionality. Below you will see the Apache equivalent of the ASP.NET example above.

RewriteEngine On
RewriteCond %{HTTP_HOST} ^discoverforce5\.com$ [NC]
RewriteRule ^(.*)$ http://www.discoverforce5.com/$1 [R=301]

Inconsistent Linking

Try to keep internal and external page links to your content consistent. For example, don’t link to http://www.discoverforce5.com/Services/ and http://www.discoverforce5.com/Services and http://www.discoverforce5.com/Services/Default.aspx, as search engines treat all three as different URLs. The same 301 approach shown above can be used to funnel the variations to a single form; see the sketch below.
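Here is a minimal sketch of that idea, reusing the 301 approach from the example above. The /Services/ paths come straight from the example in this paragraph, and the choice of the trailing-slash form as the preferred one is only an assumption for illustration:

// grab the path of the incoming request
string sUrlPath = Request.Url.AbsolutePath;
// funnel the alternate forms of the page to the single form we link to everywhere
if (sUrlPath.Equals("/Services", StringComparison.OrdinalIgnoreCase) ||
    sUrlPath.Equals("/Services/Default.aspx", StringComparison.OrdinalIgnoreCase))
{
    // 301 Permanent Redirect to the one preferred URL
    Response.RedirectPermanent("/Services/");
}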

If you are interested in learning how to submit your site to search engines, feel free to read “The little things to not forget about during development [Part: 2]”.

Have any SEO needs or questions? Please give Force 5 a call.

Hide IIS7 Response Headers

Thursday, September 16th, 2010 by Force 5

What are Response Headers?

Response headers are data that gets sent from the server to the browser. The data can include the date & time, the content type (ex: text/html, text/javascript), and server information (ex: Microsoft-IIS/7.0).

Why is it important to hide them?

Security. At the most basic level, hiding them keeps your server from broadcasting what operating system and web server software it is running and which version, whether it is a Microsoft IIS web server or an Apache PHP web server.

Here is an example of the kind of information that is passed back to the browser by an IIS 7 web server (the values below are representative rather than an actual capture):
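HTTP/1.1 200 OK
Date: Thu, 16 Sep 2010 12:00:00 GMT
Content-Type: text/html; charset=utf-8
Server: Microsoft-IIS/7.0
X-AspNet-Version: 2.0.50727
X-Powered-By: ASP.NET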

Notes: This was completed on a Windows 2008 web server with IIS 7. One prerequisite we noticed is that you may need to install the IIS 6 Metabase Compatibility feature.

Steps to hide Response Headers in IIS:

  1. Download & Install UrlScan
  2. Configure UrlScan.ini settings
    • Open UrlScan.ini with Notepad (run as administrator).
      • C:\%OS-Directory%\System32\inetsrv\urlscan\UrlScan.ini
    • Change RemoveServerHeader value
      • RemoveServerHeader=1
  3. Edit the Machine.Config settings file
    • You will need to make this edit for each ASP.NET Framework version that you currently run. For example, Force 5 has applications that run ASP.NET versions 2.0 & 4.0.
    • You will need to add this configuration setting to each Machine.Config file inside the <system.web> section.
      <system.web>
           <httpRuntime enableVersionHeader="false" />
            .....
      </system.web>

    • An important note: if you do not have access to the Machine.Config files, you can make this same change in the Web.Config file inside your individual website. The only difference is that you will need to include the setting in each website, whereas the Machine.Config file makes the change global across the web server.
    • You can find your .NET Framework versions and Machine.Config files in either of these two directories:
      • C:\%OS-Directory%\Microsoft.NET\Framework
      • C:\%OS-Directory%\Microsoft.NET\Framework64
  4. Edit the php.ini configuration settings (skip this step if you do not have PHP installed)
    1. Locate and open up your php.ini file
    2. Scroll down until you locate “expose_php”
    3. Set expose_php = Off
  5. Restart IIS.
    • After restarting IIS and reviewing the changes, you will notice that the identifying server response headers no longer show.

Just a quick recap of what we just did:

  • Steps 1 & 2 only remove the “Server: Microsoft-IIS/{version}” response header.
  • Steps 3 & 4 prevent the ASP.NET & PHP version response headers (example: X-AspNet-Version: 2.0.50727) from being displayed.
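One more option, if you only control a single application and cannot install UrlScan or edit the Machine.Config files: the identifying headers can also be stripped in code. This is only a minimal sketch, assuming the IIS 7 integrated pipeline (Response.Headers is not available in classic mode), and it is not part of the steps above:

// Global.asax - remove identifying headers just before the response is sent
protected void Application_PreSendRequestHeaders(object sender, EventArgs e)
{
    HttpApplication app = sender as HttpApplication;
    if (app != null && app.Context != null)
    {
        // requires the IIS 7 integrated pipeline
        app.Context.Response.Headers.Remove("Server");
        app.Context.Response.Headers.Remove("X-Powered-By");
        app.Context.Response.Headers.Remove("X-AspNet-Version");
    }
}

This only covers the one application it is placed in; UrlScan and the enableVersionHeader setting above remain the server-wide fix.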

If you have any questions please feel free to leave a comment below or to contact us.

How-to: 301 Permanent Redirect with ASP.NET 4 – Response.RedirectPermanent()

Friday, July 9th, 2010 by Force 5

During the process of migrating our development over to the .NET 4 Framework, we have noticed several improvements.

One of the new features we now use quite often is Response.RedirectPermanent(). It performs a permanent redirection from a requested URL to a specified URL.

For a quick flashback to how this was previously accomplished, review the code below:

/*
 * Previous 301 Permanent Redirect
 * */
Response.Status = "301 Moved Permanently";
Response.AddHeader("Location", "NewPage.aspx");

Implementing the new 301 Permanent Redirect is as simple as calling Response.RedirectPermanent("URL-path-goes-here"). Here is an example of the new way, with fewer lines of code.

/*
 * .NET 4 301 Permanent Redirect
 * */
Response.RedirectPermanent("NewPage.aspx");

Implementing 301 redirects is a good practice for informing search engines that content has moved to a new location. Here are examples of where Force 5 gets involved with 301 redirects (a short sketch follows the list):

  • a web page that has moved
  • a web page that has been removed
  • web page content that has been consolidated with another web page
  • non-www permanent redirect to the www (or vice versa)
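To illustrate a moved or retired page, one common place to issue the redirects is Application_BeginRequest in Global.asax. This is only a sketch; the page names below are hypothetical examples, not actual Force 5 URLs.

// Global.asax - the page names are hypothetical examples
protected void Application_BeginRequest(object sender, EventArgs e)
{
    // path of the incoming request, e.g. /OldPage.aspx
    string sPath = Request.Url.AbsolutePath;

    if (sPath.Equals("/OldPage.aspx", StringComparison.OrdinalIgnoreCase))
    {
        // a web page that has moved
        Response.RedirectPermanent("/NewPage.aspx");
    }
    else if (sPath.Equals("/RetiredPage.aspx", StringComparison.OrdinalIgnoreCase))
    {
        // a removed page whose content was consolidated into another page
        Response.RedirectPermanent("/Services/");
    }
}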

How-to: Adding META Keywords & META Description with ASP.NET 4

Tuesday, July 6th, 2010 by Force 5

In a previous post I showed how to add page-specific keywords & descriptions while using ASP.NET Master Pages. Compared with the latest ASP.NET 4 release, that technique was not coder friendly.

In ASP.NET 4, adding page keywords and descriptions is as easy as the code shown below.

    protected void Page_Load(object sender, EventArgs e)
    {
        // page keywords
        Page.MetaKeywords = "keywords go here...";
        // page description
        Page.MetaDescription = "description goes here...";
    }

Overall, page keywords & descriptions are still important for SEO success along with page titles and page content. If you need any help with your website search rankings please feel free to contact Force 5 for some guidance.

SQL Case Study – Convert data rows to columns

Friday, March 5th, 2010 by Force 5

We recently had a project that involved putting together a survey. This survey comprised almost 150 questions. As we brainstormed the best way to construct the data tables to store this information, the thought of a table with 150 columns made us cringe. Time constraints also called for something we could put together relatively quickly. We decided to create a table that stored each question as a row of data. Then we made a table that referenced the primary key of the question table along with the user’s answer to that question. So instead of having a table with 150 columns, we have one table with 150 rows and another table that stores a data row for each question answered on the survey. Now if a question needs to be added to or removed from the survey, all that needs to happen is adding or removing a row in the questions table.

It also made collecting the survey data through an ASP.NET Web Site a lot easier, but that can be a future blog topic.

All of that was a setup for the following solution that we created. In order to display the data correctly for reporting purposes, we needed to transform the 150 rows of data in the questions table into a table with that data as column names. In simpler terms, we needed to convert a set of data rows into the columns of a temporary table. Then we needed to populate that table with the data from the answers table.

Here is the solution we came up with, using the power of stored procedures in Microsoft SQL Server.

CREATE PROCEDURE [dbo].[Survey_Answers]
AS
BEGIN
	SET NOCOUNT ON;

	-- Declare variables
	DECLARE @QuestionID varchar(20), @sql varchar(MAX)

	-- Create empty temporary table with id column
	CREATE TABLE #tempTable (SurveyID int NULL)

	---- Insert Columns into pivot table ----
	-- Declare cursor to loop through table
	DECLARE curQuestions CURSOR FOR
	SELECT     QuestionID
	FROM         Survey_Questions

	OPEN curQuestions

	FETCH NEXT FROM curQuestions INTO @QuestionID
	WHILE @@FETCH_STATUS=0
	BEGIN
		-- Defines each column
		SET @sql = 'ALTER TABLE #tempTable ADD ' + @QuestionID + ' varchar(1024) NULL'
		-- Executes the command which creates the column in the temp table
		EXEC(@sql)
		FETCH NEXT FROM curQuestions INTO @QuestionID
	END

	-- Clean up cursor
	CLOSE curQuestions
	DEALLOCATE curQuestions
	---- End of Insert Columns section ----

	---- Insert id values into pivot table ----
	-- Create rows in temp table using IDs from Survey table
	INSERT INTO [#tempTable] (SurveyID)
		SELECT     SurveyID
		FROM         Survey

	---- Insert data into pivot table ----
	-- Loop through each row in Survey_Answers
	-- Update values in pivot table

	-- Declare variables
	DECLARE @SurveyID int, @QuestionID2 varchar(20), @Answer varchar(1024), @CurrentSurveyID int

	-- Initialize variables
	SET @CurrentSurveyID = -1
	SET @sql = ''

	-- Declare cursor to loop through table
	DECLARE curAnswers CURSOR FOR
	SELECT     Survey_Answers.SurveyID, Survey_Answers.QuestionID, Survey_Answers.Answer
	FROM         Survey_Answers INNER JOIN
						  Survey ON Survey_Answers.SurveyID = Survey.SurveyID
	ORDER BY Survey_Answers.SurveyID

	OPEN curAnswers

	FETCH NEXT FROM curAnswers INTO @SurveyID, @QuestionID2, @Answer
	WHILE @@FETCH_STATUS=0
	BEGIN
		IF @CurrentSurveyId<>@SurveyId
			BEGIN
				-- This will run at the end of a set of questions related to one survey
				-- And initializes variables for next set of questions
				IF @sql<>''
					BEGIN
						SET @sql = STUFF(@sql, LEN(@sql), 1, ' WHERE (SurveyID = ' + CONVERT(varchar, @CurrentSurveyId) + ');')
						EXEC(@sql)
					END
				SET @sql = 'UPDATE [#tempTable] SET'
				SET @CurrentSurveyId = @SurveyId
			END

		-- Update values in pivot table
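		-- Note: an answer containing a single quote will break this dynamic SQL;
		-- escaping it first, e.g. REPLACE(@Answer, '''', ''''''), guards against that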
		SET @sql = @sql + ' ' + @QuestionId2 + ' = ''' + @Answer + ''','
		FETCH NEXT FROM curAnswers INTO @SurveyID, @QuestionID2, @Answer

		-- This section takes care of the last row since it will not go through the IF @sql<>'' code above. Uses same code as that section
		IF @@FETCH_STATUS = -1
			BEGIN
				SET @sql = STUFF(@sql, LEN(@sql), 1, ' WHERE (SurveyID = ' + CONVERT(varchar, @CurrentSurveyId) + ');')
				EXEC(@sql)
			END
	END

	-- Clean up answers cursor
	CLOSE curAnswers
	DEALLOCATE curAnswers

	-- Select values from created table
	SELECT     [#tempTable].*, Survey.DateCreated
	FROM         Survey INNER JOIN
		[#tempTable] ON Survey.SurveyID = [#tempTable].SurveyID
	ORDER BY Survey.SurveyID 

	-- Clean up the pivot table
	DROP TABLE #tempTable
END
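To bring the pivoted results into an ASP.NET page for the actual report, a plain ADO.NET call to the procedure is enough. The sketch below assumes a connection string named “SurveyDb” and a GridView named gvSurveyAnswers, both placeholder names rather than anything from the actual project:

// Requires the System.Data, System.Data.SqlClient and System.Configuration namespaces.
using (SqlConnection conn = new SqlConnection(
           ConfigurationManager.ConnectionStrings["SurveyDb"].ConnectionString))
using (SqlCommand cmd = new SqlCommand("dbo.Survey_Answers", conn))
using (SqlDataAdapter adapter = new SqlDataAdapter(cmd))
{
    // call the stored procedure above
    cmd.CommandType = CommandType.StoredProcedure;

    // Fill opens and closes the connection for us
    DataTable results = new DataTable();
    adapter.Fill(results);

    // one column per survey question, one row per completed survey
    gvSurveyAnswers.DataSource = results;
    gvSurveyAnswers.DataBind();
}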

Let us know what you think about our approach or if you have any questions.

Tutorial: How to add specific page keywords & descriptions while using ASP.NET Master Pages

Friday, October 23rd, 2009 by Force 5

One of the most common questions that gets asked when starting out with ASP.NET Master Pages is how to add page-specific keywords & descriptions.

For those not familiar with ASP.NET and/or Master Pages: when using Master Pages, the normal HTML tags, including META tags, are not part of the individual pages. The individual ASP.NET pages only refer to content areas called content placeholders. The advantage of using Master Pages is that your site’s HTML template is referenced in one place, so you are able to make site-wide changes to the Master Page instead of making the same changes to every individual web page within the site.

So to add either keywords or a description to an individual page, you will need to create an HtmlMeta object as shown below.

/* how to code */
protected void Page_Load(object sender, EventArgs e)
{
    // variables
    String sKeywords = String.Empty;
    String sDescription = String.Empty;

    // page keywords
    sKeywords = "Place your page keywords and phrases here.";

    // page description
    sDescription = "Place your page description here.";

    // meta tag keywords
    HtmlMeta mKeywords = new HtmlMeta();
    mKeywords.Name = "keywords";
    mKeywords.Content = sKeywords;
    Header.Controls.Add(mKeywords);

    // meta tag description
    HtmlMeta mDescription = new HtmlMeta();
    mDescription.Name = "description";
    mDescription.Content = sDescription;
    Header.Controls.Add(mDescription);
}