Monday, August 22, 2011

ASP.NET: Moving ViewState to bottom of page

You might want to move the ViewState to the bottom of the page so that Google pays more attention to your page content and less to the wad of Base64'ed ViewState.
 
protected override void Render(System.Web.UI.HtmlTextWriter writer)
{
    // Render the page into an in-memory buffer first
    System.IO.StringWriter stringWriter = new System.IO.StringWriter();
    HtmlTextWriter htmlWriter = new HtmlTextWriter(stringWriter);
    base.Render(htmlWriter);
    string html = stringWriter.ToString();

    // Locate the hidden __VIEWSTATE input
    int StartPoint = html.IndexOf("<input type=\"hidden\" name=\"__VIEWSTATE\"");
    if (StartPoint >= 0)
    {
        int EndPoint = html.IndexOf("/>", StartPoint) + 2;
        string viewstateInput = html.Substring(StartPoint, EndPoint - StartPoint);
        html = html.Remove(StartPoint, EndPoint - StartPoint);

        // Re-insert it just before the closing </form> tag
        int FormEndStart = html.IndexOf("</form>");
        if (FormEndStart >= 0)
        {
            html = html.Insert(FormEndStart, viewstateInput);
        }
    }
    writer.Write(html);
}
 
This method averaged 0.000995s per render. It consistently beat a Regex-based version, even though that version was very simple and its Regexes were precompiled.

Page Life Cycle in ASP.NET

In this article, we discuss the different methods, and the order in which they are executed, during the load of an .aspx web page.
When a visitor first requests an .aspx page on your server, the server sends it to the HTTP Pipeline. The HTTP Pipeline handles all processes involved in converting all of the application code into HTML to be interpreted by the browser. The first class instantiated is called HttpRuntime. This class finds a free HttpApplication object to start processing the request. The HttpApplication object then runs the appropriate handler assigned in the web.config and machine.config files for the requested extension.
The extension .aspx can be handled by the HandlerClass or HandlerFactory class. The HttpApplication object starts the IHttpHandler interface, which begins processing the application code by calling the ProcessRequest() method.


The ProcessRequest() method then calls the FrameworkInitialize() method, which begins building the control tree for the requested page. Now ProcessRequest() cycles through the page's life cycle in the order listed below.
Method                     Description
Page_Init                  Page initialization
LoadViewState              View state loading
LoadPostData               Postback data processing
Page_Load                  Page loading
RaisePostDataChangedEvent  PostBack change notification
RaisePostBackEvent         PostBack event handling
Page_PreRender             Page pre-rendering phase
SaveViewState              View state saving
Page_Render                Page rendering
Page_Unload                Page unloading
The first processed method is Page_Init(). Once the control tree has been created, the controls declared in the .aspx file are initialized. The controls can modify some of the settings set in this method to be used later in the page life cycle. Obviously no other information is available to be modified at this time.
The next processed method is LoadViewState(). The Viewstate contains stored information that is set by the page and controls of the page. This is carried to and from every aspx page request per visitor.

The next processed method is LoadPostData(). These are the values associated with the HTML form elements the visitor has typed, changed or selected. Now the controls have access to this information, which they can use to update the stored information pulled from the Viewstate.
The next processed method is Page_Load(). This method should look familiar and is usually the most commonly used method in the server-side application code for an .aspx file. All code inside of this method is executed once at the beginning of the page.
The next processed method is RaisePostDataChangedEvent(). When a visitor completes a form and presses the submit button, an event is triggered. This change in state signals the page to do something.
The next processed method is RaisePostBackEvent(). This method allows the page to know what event has been triggered and which method to call. If the visitor clicks Button1, then Button1_Click is usually called to perform its function.


The next processed method is Page_PreRender(). This method is the last chance for the Viewstate to be changed based on the PostBackEvent before the page is rendered.
The next processed method is SaveViewState(). This method saves the updated Viewstate to be processed on the next page. The final Viewstate is encoded into the __VIEWSTATE hidden field on the page during the page render.
The next processed method is Page_Render(). This method renders all of the application code to be outputted on the page. This action is done with the HtmlWriter object. Each control uses the render method and caches the HTML prior to outputting.
The last processed method is Page_Unload(). During this method, data can be released to free up resources on the server for other processes. Once this method is completed, the HTML is sent to the browser for client side processing.
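To watch part of this order yourself, you can override or wire up a few of these methods in a page's code-behind. A minimal sketch (assuming the default AutoEventWireup="true"; the page class name is illustrative):

using System;
using System.Web.UI;

public partial class DemoPage : Page
{
    protected void Page_Init(object sender, EventArgs e)
    {
        // Runs first: the control tree exists, but ViewState is not loaded yet.
    }

    protected void Page_Load(object sender, EventArgs e)
    {
        // Runs after ViewState and postback data have been applied.
    }

    protected override object SaveViewState()
    {
        // Last chance to stash data before it is encoded into the
        // __VIEWSTATE hidden field during rendering.
        return base.SaveViewState();
    }
}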
Now you should have a somewhat better understanding of the order of methods executed in the request of an .aspx file.

Wednesday, August 10, 2011

Default Values Table for ASP.NET

The following table shows the default values of value types returned by the default constructors. Default constructors are invoked by using the new operator, as follows:
int myInt = new int();
The preceding statement has the same effect as the following statement:
int myInt = 0;
Remember that using uninitialized variables in C# is not allowed.

Value type   Default value
bool         false
byte         0
char         '\0'
decimal      0.0M
double       0.0D
enum         The value produced by the expression (E)0, where E is the enum identifier.
float        0.0F
int          0
long         0L
sbyte        0
short        0
struct       The value produced by setting all value-type fields to their default values and all reference-type fields to null.
uint         0
ulong        0
ushort       0
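As a quick check of the equivalence described above, the default(T) operator yields the same values as the default constructors:

// Both forms yield the type's default value.
int a = new int();        // 0
int b = default(int);     // 0
bool flag = new bool();   // false
char c = new char();      // '\0'
System.Console.WriteLine(a == b && flag == false && c == '\0'); // True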

Nullable Types in C#

Nullable types are instances of the System.Nullable<T> struct. A nullable type can represent the correct range of values for its underlying value type, plus an additional null value. For example, a Nullable<Int32>, pronounced "Nullable of Int32," can be assigned any value from -2147483648 to 2147483647, or it can be assigned the null value. A Nullable<bool> can be assigned the values true, false, or null. The ability to assign null to numeric and Boolean types is especially useful when you are dealing with databases and other data types that contain elements that may not be assigned a value. For example, a Boolean field in a database can store the values true or false, or it may be undefined.

class NullableExample
{
    static void Main()
    {
        int? num = null;
        if (num.HasValue == true)
        {
            System.Console.WriteLine("num = " + num.Value);
        }
        else
        {
            System.Console.WriteLine("num = Null");
        }

        // y is set to zero
        int y = num.GetValueOrDefault();

        // num.Value throws an InvalidOperationException if num.HasValue is false
        try
        {
            y = num.Value;
        }
        catch (System.InvalidOperationException e)
        {
            System.Console.WriteLine(e.Message);
        }
    }
}
 
The example will display the output:
num = Null
Nullable object must have a value.
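The null-coalescing operator (??) is a shorthand for the GetValueOrDefault pattern above:

int? num = null;
int y = num ?? 0;   // 0 when num is null, otherwise num.Value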

Using Structs in ASP.NET

The struct type is suitable for representing lightweight objects such as Point, Rectangle, and Color. Although it is just as convenient to represent a point as a class with Auto-Implemented Properties, a struct might be more efficient in some scenarios. For example, if you declare an array of 1000 Point objects, you will allocate additional memory for referencing each object; in this case, a struct would be less expensive. Because the .NET Framework contains an object called Point, the struct in this example is named "CoOrds" instead.

public struct CoOrds
{
    public int x, y;

    public CoOrds(int p1, int p2)
    {
        x = p1;
        y = p2;
    }
} 
 
It is an error to define a default (parameterless) constructor for a struct. It is also an error to initialize an instance field in a struct body. You can initialize struct members only by using a parameterized constructor or by accessing the members individually after the struct is declared. Any private or otherwise inaccessible members can be initialized only in a constructor.
When you create a struct object using the new operator, it gets created and the appropriate constructor is called. Unlike classes, structs can be instantiated without using the new operator. In such a case, there is no constructor call, which makes the allocation more efficient. However, the fields will remain unassigned and the object cannot be used until all of the fields are initialized.
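For instance, a minimal sketch using the CoOrds struct above:

CoOrds p1 = new CoOrds(10, 20);   // parameterized constructor is called

CoOrds p2;                        // no new: no constructor call
p2.x = 1;
p2.y = 2;                         // p2 is usable only once all fields are set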
When a struct contains a reference type as a member, the default constructor of the member must be invoked explicitly, otherwise the member remains unassigned and the struct cannot be used. (This results in compiler error CS0171.)
There is no inheritance for structs as there is for classes. A struct cannot inherit from another struct or class, and it cannot be the base of a class. Structs, however, inherit from the base class Object. A struct can implement interfaces, and it does that exactly as classes do.
You cannot declare a class using the keyword struct. In C#, classes and structs are semantically different. A struct is a value type, while a class is a reference type. For more information, see Value Types.
Unless you need reference-type semantics, a small class may be more efficiently handled by the system if you declare it as a struct instead.
 

Structure vs Classes in ASP.NET

Structs share most of the same syntax as classes, although structs are more limited than classes:
  • Within a struct declaration, fields cannot be initialized unless they are declared as const or static.
  • A struct cannot declare a default constructor (a constructor without parameters) or a destructor.
  • Structs are copied on assignment. When a struct is assigned to a new variable, all the data is copied, and any modification to the new copy does not change the data for the original copy (see the sketch after this list). This is important to remember when working with collections of value types such as Dictionary<string, myStruct>.
  • Structs are value types and classes are reference types.
  • Unlike classes, structs can be instantiated without using a new operator.
  • Structs can declare constructors that have parameters.
  • A struct cannot inherit from another struct or class, and it cannot be the base of a class. All structs inherit directly from System.ValueType, which inherits from System.Object.
  • A struct can implement interfaces.
  • A struct can be used as a nullable type and can be assigned a null value.
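A minimal demonstration of the copy-on-assignment behavior noted above:

struct MyStruct { public int Value; }

class Program
{
    static void Main()
    {
        MyStruct a = new MyStruct { Value = 1 };
        MyStruct b = a;      // full copy of the data
        b.Value = 2;         // modifies only the copy

        System.Console.WriteLine(a.Value); // 1
        System.Console.WriteLine(b.Value); // 2
    }
}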

Sunday, August 7, 2011

HTTP Status and Error Codes

During your HTTP sessions, you'll receive various numbered codes from Web servers. When connected via HTTP, CuteFTP displays these codes in the log window. Some codes represent errors; most others simply communicate the status of the connection. Here are brief explanations for the most common status and error codes.
The list below contains the standard HTTP codes. Numbers outside this list are proprietary to the server or client that you are using.

Error or Status Code Description
100 Series Informational - These status codes indicate a provisional response. The client should be prepared to receive one or more 1xx responses before receiving a regular response.
100 Continue.
101 Switching protocols.
200 Series Success - This class of status codes indicates that the server successfully accepted the client request.
200 The client request has succeeded. This status code indicates that the Web server has successfully processed the request.
201 Created.
202 Accepted.
203 Non-authoritative information.
204 No content.
205 Reset content.
206 Partial content.


300 Series Redirection - The client browser must take more action to fulfill the request. For example, the browser may have to request a different page on the server or repeat the request by using a proxy server.
302 Object moved.
304 Not modified. The client requests a document that is already in its cache and the document has not been modified since it was cached. The client uses the cached copy of the document instead of downloading it from the server.
307 Temporary redirect.


400 Series Client Error - An error occurs, and the client appears to be at fault. For example, the client may request a page that does not exist, or the client may not provide valid authentication information.
400 Bad request.
401 Access denied.
401.1 Logon failed. The logon attempt is unsuccessful, probably because of a user name or password that is not valid.
401.2 Logon failed due to server configuration.
401.3 Unauthorized due to ACL on resource. This indicates a problem with NTFS permissions. This error may occur even if the permissions are correct for the file that you are trying to access. For example, you see this error if the IUSR account does not have access to the C:\Winnt\System32\Inetsrv directory.
401.4 Authorization failed by filter.
401.5 Authorization failed by ISAPI/CGI application.
401.7 Access denied by URL authorization policy on the Web server. This error code is specific to IIS 6.0.
403 Forbidden.
403.1 Execute access forbidden. The following are two common causes of this error message:
  • You do not have enough Execute permissions. For example, you may receive this error message if you try to access an ASP page in a directory where permissions are set to None, or you try to execute a CGI script in a directory with Scripts Only permissions.
  • The script mapping for the file type that you are trying to execute is not set up to recognize the verb that you are using (for example, GET or POST).
403.2 Read access forbidden. Verify that you have Read access to the directory. Also, if you are using a default document, verify that the document exists.
403.3 Write access forbidden. Verify that you have Write access to the directory.
403.4 SSL required. Use HTTPS instead of HTTP to access the page.
403.5 SSL 128 required.
403.6 IP address rejected.
403.7 Client certificate required. You do not have a valid client certificate installed.
403.8 Site access denied.
403.9 Too many users. The number of users who are connected to the server exceeds the connection limit.
403.10 Invalid configuration.
403.11 Password change.
403.12 Mapper denied access. The page that you want to access requires a client certificate, but the user ID that is mapped to your client certificate has been denied access to the file.
403.13 Client certificate revoked.
403.14 Directory listing denied.
403.15 Client Access Licenses exceeded.
403.16 Client certificate is untrusted or invalid.
403.17 Client certificate has expired or is not yet valid.
403.18 Cannot execute requested URL in the current application pool. This error code is specific to IIS 6.0.
403.19 Cannot execute CGIs for the client in this application pool. This error code is specific to IIS 6.0.
403.20 Passport logon failed. This error code is specific to IIS 6.0.
404 Not found. This error may occur if the file that you are trying to access has been moved or deleted.
404.0 File or directory not found.
404.1 Web site not accessible on the requested port.
404.2 Web service extension lockdown policy prevents this request.
404.3 MIME map policy prevents this request.
405 HTTP verb used to access this page is not allowed (method not allowed).
406 Client browser does not accept the MIME type of the requested page.
407 Proxy authentication required.
412 Precondition failed.
413 Request entity too large.
414 Request-URL too long.
415 Unsupported media type.
416 Requested range not satisfiable.
417 Expectation failed.
423 Locked error.


500 Series Server Error - The server cannot complete the request because it encounters an error.
500 Internal server error. You see this error message for a wide variety of server-side errors.
500.12 Application is busy restarting on the Web server. Indicates that you tried to load an ASP page while IIS was in the process of restarting the application. This message should disappear when you refresh the page. If you refresh the page and the message appears again, it may be caused by antivirus software that is scanning your Global.asa file.
500.13 Web server is too busy.
500.15 Direct requests for Global.asa are not allowed.
500.16 UNC authorization credentials incorrect. This error code is specific to IIS 6.0.
500.18 URL authorization store cannot be opened. This error code is specific to IIS 6.0.
500.100 Internal ASP error. You receive this error message when you try to load an ASP page that has errors in the code.
501 Header values specify a configuration that is not implemented.
502 Bad Gateway. Web server received an invalid response while acting as a gateway or proxy. You receive this error message when you try to run a CGI script that does not return a valid set of HTTP headers.
502.1 CGI application timeout.
502.2 Error in CGI application.
503 Service unavailable. This error code is specific to IIS 6.0.
504 Gateway timeout.
505 HTTP version not supported.

HTTP 403 Error and Substatus Error Codes for IIS

In the HTTP protocol used on the World Wide Web, 403 Forbidden is an HTTP status code returned by a web server when a user requests a web page or media that the server does not allow them to access. In other words, the server can be reached, but the server declined to allow access to the page. This response is returned by the Apache web server when directory listings have been disabled. Microsoft IIS responds in the same way when directory listings are denied. (This response may also be returned by the server if the client issued a WebDAV PROPFIND request but did not also issue the required Depth header, or issued a Depth header of infinity.)
403 Substatus Error Codes for IIS

    403.1 - Execute access forbidden.
    403.2 - Read access forbidden.
    403.3 - Write access forbidden.
    403.4 - SSL required.
    403.5 - SSL 128 required.
    403.6 - IP address rejected.
    403.7 - Client certificate required.
    403.8 - Site access denied.
    403.9 - Too many users.
    403.10 - Invalid configuration.
    403.11 - Password change.
    403.12 - Mapper denied access.
    403.13 - Client certificate revoked.
    403.14 - Directory listing denied.
    403.15 - Client Access Licenses exceeded.
    403.16 - Client certificate is untrusted or invalid.
    403.17 - Client certificate has expired or is not yet valid.

Source :- http://en.wikipedia.org/wiki/HTTP_403

Friday, August 5, 2011

What are delegates and how do we use them in ASP.NET?

A delegate declares a reference type that can be used to encapsulate a named or anonymous method. Delegates are similar to function pointers in C++; however, delegates are type-safe and secure. In ASP.NET we use delegates to create custom events for custom controls.
For example, a custom pager control most likely needs to have a PageChanged event.
We can declare it like this:

public delegate void PageChangedHandler(object sender, EventArgs e);
public event PageChangedHandler PageChanged;

Then, whenever we need to fire the event:

if (PageChanged != null) // Checks if user assigned any event handler
    PageChanged(this, new EventArgs());

And then we can use our custom pager control as follows:

<cc:PostsPager ID="PostsPager" runat="server" OnPageChanged="PostsPager_PageChanged" />
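On the page side, the handler named in the OnPageChanged attribute is just a regular method in the code-behind; a minimal sketch:

protected void PostsPager_PageChanged(object sender, EventArgs e)
{
    // Re-bind the data for the newly selected page here.
}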

Friday, July 1, 2011

Speed up your ASP.NET site with compression(GZip Compression)

Loading speed is a crucial issue for every professional web site. If your website takes too long to load, your visitors, who are your potential customers, will go away and never come back, no matter how much effort you have put into developing professional code or designing a charming user interface. There are several techniques to speed up your website: pre-creation of pages, caching... and compression.
Server level and page level compression

Compression can be performed at two levels: IIS and each individual web site. If you enable compression on IIS (6.0 and later), all web sites and virtual directories are affected. Therefore, this technique is recommended only if you have full control over that server and you own all the web sites on it. You must also make sure that none of the web sites on the server have a problem with being compressed. If there are several web sites on the server of which you are not the owner or administrator, you had better not try to compress them.
Page level compression lets you compress your own web site or even only specific pages or file types. For example we can set our compression mechanism to compress .aspx files only, and not html files, or not pages that use ASP.NET Ajax controls.
What do I need?

If you are using ASP.NET 2.0 and above (not 1.1) you do not need any 3rd party compression module or tool. All you need is provided by .NET Framework.
How to do it?

All we need to do is compress the response output. This can easily be done with the GZipStream or DeflateStream classes. Both of them use the same algorithm; the only difference is that GZipStream stores more header information in the compressed file/stream so that it can be used by 3rd-party zip tools, which makes DeflateStream more lightweight. Before we compress the response output, we should make sure that the browser requesting the page supports compression. Otherwise, the user's browser might not be able to decompress and display the page.
There are two ways of enabling response compression: first via an HttpHandler and the Web.Config file, and second by adding some code to the global.asax file. Using an HttpHandler lets you change your compression module later without recompiling and re-deploying your web application.
To compress the response output, all you need to do is wrap Response.Filter in a compression stream and assign it back to Response.Filter. Response.Filter allows you to wrap the response output before it is transmitted to the web browser.

context.Response.Filter = new DeflateStream(context.Response.Filter, CompressionMode.Compress);

To create an HttpHandler, add a new class library project to your solution, including a public class which implements the IHttpHandler interface. Then implement it as in the following code:

// Requires: using System.Web; using System.IO.Compression;
public class CompressMe : IHttpHandler
{
    public bool IsReusable
    {
        get { return true; }
    }

    public void ProcessRequest(HttpContext context)
    {
        string pageEncoding = context.Request.Headers["Accept-Encoding"];
        if (!string.IsNullOrEmpty(pageEncoding))
        {
            if (pageEncoding.ToLower().Contains("gzip"))
            {
                // Tell the browser the response is compressed
                context.Response.AppendHeader("Content-encoding", "gzip");
                context.Response.Filter = new GZipStream(context.Response.Filter, CompressionMode.Compress);
            }
        }
    }
}
As seen in the above code, first we check the request headers to see whether the browser supports compression. If it does, we create a new instance of GZipStream (which belongs to .NET Framework 2.0+) and pass Response.Filter to it; this is the stream that we want to compress. The resulting stream is assigned back to Response.Filter, meaning that a compressed copy of the response is transmitted to the browser.
The context.Response.AppendHeader("Content-encoding", "gzip") call tells the browser that it must decompress the page before displaying it. Today's browsers are usually smart enough to figure it out, but better safe than sorry.
Now build the class library and copy it into the /bin folder of your web site, or reference it from within your web application. Suppose that your class library is called CompressionModule and it includes a namespace called CompressionModule with a class named CompressMe (which implements the IHttpHandler interface). Now add the following line to the web.config file, under the <httpHandlers> tag:

<add verb="*" path="*.aspx" type="CompressionModule.CompressMe, CompressionModule" />

This means that all .aspx requests, with both GET and POST methods, are routed through the CompressMe class before the response is sent to the browser.
NOTICE: YOU MUST RUN YOUR WEB SITE UNDER IIS AND NOT VISUAL STUDIO DEVELOPMENT SERVER. OTHERWISE YOUR PAGE SIZE WILL BE ZERO BYTES.
Another way to activate the compression is to implement Application_PreRequestHandlerExecute in the global.asax file. To do so, add an event handler to global.asax, cast the sender argument to HttpApplication, and use its Response property exactly as in the code above:

void Application_PreRequestHandlerExecute(object sender, EventArgs e)
{
    HttpApplication app = sender as HttpApplication;
    // ... same compression logic as in ProcessRequest, using app.Response
}

More benefits of compression in ASP.NET applications

Compression can be of even more help in your ASP.NET web site. For example, you can compress uploaded files, especially if you store them in a database, since database storage is expensive and accessing it is time- and resource-consuming.
The following code demonstrates how to compress an uploaded file and save it.
As the code shows, we access the uploaded file via FileUpload.PostedFile.InputStream. We create a FileStream to store the compressed data in a .zip file, wrapped in a GZipStream that performs the compression. Then we read the upload in 2048-byte chunks and write them through the GZipStream into the FileStream. That's all.

// Requires: using System.IO; using System.IO.Compression; using System.Web.UI.WebControls;
private void DoUpload(FileUpload fileUpload)
{
    const int bufferSize = 2048;

    string fileName = Path.ChangeExtension(
        Path.Combine(Server.MapPath(Request.Url.AbsolutePath),
                     fileUpload.PostedFile.FileName), ".zip");

    FileStream fs = new FileStream(fileName, FileMode.Create);
    GZipStream zip = new GZipStream(fs, CompressionMode.Compress);

    try
    {
        byte[] buffer = new byte[bufferSize];
        int count = fileUpload.PostedFile.InputStream.Read(buffer, 0, bufferSize);
        while (count > 0)
        {
            // Write the chunk through the compressor into the file
            zip.Write(buffer, 0, count);
            count = fileUpload.PostedFile.InputStream.Read(buffer, 0, bufferSize);
        }
    }
    finally
    {
        zip.Close();
        fs.Close();
    }
}

Thursday, June 23, 2011

Google to FTS Syntax Cheat Sheet

OPERATOR / EXAMPLE      DESCRIPTION

nut                     Searches for inflectional forms of the word nut.

crank arm,
crank AND arm           Searches for documents containing inflectional forms of the words crank and arm. The keyword AND is optional.

tire OR air             Searches for documents containing inflectional forms of the words tire or air.

"reflector bracket"     Performs a phrase search for the phrase "reflector bracket".

hardware -bracket       Searches for documents containing inflectional forms of the word hardware but not the word bracket.

+clamp                  Searches for the exact word clamp without generating inflectional forms.

~seat                   Searches for thesaurus forms of the word seat.

assemb*                 Searches for words that begin with the prefix assemb.

<washer nut>            Searches for documents that contain the word washer in close proximity to the word nut.
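As a worked example, here is how one of these predicates might be issued from C# against SQL Server full-text search. The table and column names are illustrative assumptions; the connection string is supplied by the caller:

using System.Data.SqlClient;

// FTS equivalent of the bare Google query "nut": match inflectional forms.
void SearchNuts(string connectionString)
{
    using (SqlConnection conn = new SqlConnection(connectionString))
    using (SqlCommand cmd = new SqlCommand(
        "SELECT Description FROM Products " +
        "WHERE CONTAINS(Description, 'FORMSOF(INFLECTIONAL, nut)')", conn))
    {
        conn.Open();
        using (SqlDataReader reader = cmd.ExecuteReader())
        {
            while (reader.Read())
                System.Console.WriteLine(reader["Description"]);
        }
    }
}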



The SOUNDEX coding algorithm

The SOUNDEX code is a substitution code using the following rules:

The first letter of the surname is always retained.

The rest of the surname is compressed to a three digit code using the following coding scheme:
A E I O U Y H W     not coded
B F P V             coded as 1
C G J K Q S X Z     coded as 2
D T                 coded as 3
L                   coded as 4
M N                 coded as 5
R                   coded as 6

Consonants after the initial letter are coded in the order they occur:

HOLMES = H-452

ADOMOMI = A-355

The code always uses initial letter plus three digits. Further consonants in long names are ignored:

VONDERLEHR = V-536

Zeros are used to pad out shorter names:

BALL = B-400

SHAW = S-000

Double consonants are treated as one letter:

BALL = B-400

As are adjacent consonants from the same code group:

JACKSON = J-250

A consonant following an initial letter from the same code group is ignored:

SCANLON = S-545

Abbreviated prefixes should be spelt out in full:

ST JOHN = SAINTJOHN = S-532

Apostrophes and hyphens are ignored:

KING-SMITH = KINGSMITH = K-525

Consonants from the same code group separated by W or H are treated as one:

BOOTH-DAVIS = BOOTHDAVIS = B-312
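The rules above translate fairly directly into code. A minimal C# sketch (it reproduces the examples in this post, e.g. HOLMES = H452, JACKSON = J250, SHAW = S000; it omits the hyphen in the printed form, and spelling out prefixes such as ST is left to the caller):

using System;
using System.Text;

static class Soundex
{
    // Maps an uppercase letter to its Soundex digit; '0' means "not coded".
    static char Code(char c)
    {
        switch (c)
        {
            case 'B': case 'F': case 'P': case 'V': return '1';
            case 'C': case 'G': case 'J': case 'K':
            case 'Q': case 'S': case 'X': case 'Z': return '2';
            case 'D': case 'T': return '3';
            case 'L': return '4';
            case 'M': case 'N': return '5';
            case 'R': return '6';
            default: return '0'; // A E I O U Y H W
        }
    }

    public static string Encode(string name)
    {
        // Apostrophes, hyphens and anything non-alphabetic are ignored.
        var letters = new StringBuilder();
        foreach (char c in name)
            if (char.IsLetter(c)) letters.Append(char.ToUpperInvariant(c));
        if (letters.Length == 0) return string.Empty;

        var result = new StringBuilder();
        result.Append(letters[0]);          // the first letter is always retained
        char lastCode = Code(letters[0]);   // so a same-group consonant right after it is ignored

        for (int i = 1; i < letters.Length && result.Length < 4; i++)
        {
            char code = Code(letters[i]);
            if (code != '0' && code != lastCode)
                result.Append(code);        // adjacent same-group consonants count once
            if (letters[i] != 'H' && letters[i] != 'W')
                lastCode = code;            // H and W do not separate code groups
        }
        return result.ToString().PadRight(4, '0'); // zeros pad out shorter names
    }
}

// Console.WriteLine(Soundex.Encode("HOLMES"));      // H452
// Console.WriteLine(Soundex.Encode("KING-SMITH"));  // K525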



ASP.NET GZIP compression

// Compress the response; Request and Response objects must be initialised by this stage
protected void Application_PreRequestHandlerExecute(Object sender, EventArgs e)
{
    HttpApplication app = sender as HttpApplication;
    string acceptEncoding = app.Request.Headers["Accept-Encoding"];
    System.IO.Stream prevUncompressedStream = app.Response.Filter;

    // Only compress regular pages and session-less handlers; skip MS AJAX requests
    if (!(app.Context.CurrentHandler is System.Web.UI.Page ||
          app.Context.CurrentHandler.GetType().Name == "SyncSessionlessHandler") ||
        app.Request["HTTP_X_MICROSOFTAJAX"] != null)
    {
        return;
    }
    if (string.IsNullOrEmpty(acceptEncoding))
    {
        return;
    }

    acceptEncoding = acceptEncoding.ToLower();
    if (acceptEncoding.Contains("deflate") || acceptEncoding == "*")
    {
        // deflate
        app.Response.Filter = new System.IO.Compression.DeflateStream(prevUncompressedStream, System.IO.Compression.CompressionMode.Compress);
        app.Response.AppendHeader("Content-Encoding", "deflate");
    }
    else if (acceptEncoding.Contains("gzip"))
    {
        // gzip
        app.Response.Filter = new System.IO.Compression.GZipStream(prevUncompressedStream, System.IO.Compression.CompressionMode.Compress);
        app.Response.AppendHeader("Content-Encoding", "gzip");
    }
}

Removing the Whitespaces while rendering the webpage




// Assumed definitions for the two precompiled patterns used below; adjust the patterns to taste.
private static readonly Regex REGEX_BETWEEN_TAGS = new Regex(@">\s+<", RegexOptions.Compiled);
private static readonly Regex REGEX_LINE_BREAKS = new Regex(@"\n\s+", RegexOptions.Compiled);

// Overrides the Render method
protected override void Render(HtmlTextWriter writer)
{
    using (HtmlTextWriter htmlwriter = new HtmlTextWriter(new System.IO.StringWriter()))
    {
        base.Render(htmlwriter);
        string html = htmlwriter.InnerWriter.ToString();

        if ((ConfigurationManager.AppSettings.Get("RemoveWhitespace") + string.Empty).Equals("true", StringComparison.OrdinalIgnoreCase))
        {
            html = REGEX_BETWEEN_TAGS.Replace(html, "> <");
            html = REGEX_LINE_BREAKS.Replace(html, string.Empty);

            html = Regex.Replace(html, @"(?<=[^])\t{2,}|(?<=[>])\s{2,}(?=[<])|(?<=[>])\s{2,11}(?=[<])|(?=[\n])\s{2,}", string.Empty);
            html = Regex.Replace(html, @"[ \f\r\t\v]?([\n\xFE\xFF/{}[\];,<>*%&|^!~?:=])[\f\r\t\v]?", "$1");
            html = html.Replace(";\n", ";");
        }

        writer.Write(html.Trim());
    }
}

Friday, May 27, 2011

Remove whitespace from your ASP.NET page

In a normal webpage there is a lot of whitespace. We put it there in the form of tabs and carriage-returns in order to make the document more maintainable and easier to read. But it has its price in terms of adding to the overall weight of the document in kilobytes, and it takes longer for the browser to render a page with a lot of whitespace. This is especially true for IE6, which renders whitespace a lot slower than its counterparts.
The problem is, we don’t want to write html without whitespace. It would make it impossible to maintain. What we are looking for is an automatic way to filter the whitespace at runtime. In ASP.NET this is easy. You can override the page’s Render method and do the filtering in there. We also want to be able to turn the filtering on and off easily, because when we develop we often want to look at the rendered html code and make sense of it.
Here is an example that does just that. It overrides the render method of the page and is turned on/off in the web.config.
Implement this on the aspx page or even better on the master page if you use ASP.NET 2.0:
//Add this to the top of the page
    using System.Configuration;
    using System.Web.UI;
    using System.Text.RegularExpressions;
//Overrides the Render method
    protected override void Render(HtmlTextWriter writer)
    {
        using (HtmlTextWriter htmlwriter = new HtmlTextWriter(new System.IO.StringWriter()))
        {
            base.Render(htmlwriter);
            string html = htmlwriter.InnerWriter.ToString();

            if ((ConfigurationManager.AppSettings.Get("RemoveWhitespace") + string.Empty).Equals("true", StringComparison.OrdinalIgnoreCase))
            {
                html = Regex.Replace(html, @"(?<=[^])\t{2,}|(?<=[>])\s{2,}(?=[<])|(?<=[>])\s{2,11}(?=[<])|(?=[\n])\s{2,}", string.Empty);
                html = Regex.Replace(html, @"[ \f\r\t\v]?([\n\xFE\xFF/{}[\];,<>*%&|^!~?:=])[\f\r\t\v]?", "$1");
                html = html.Replace(";\n", ";");
            }

            writer.Write(html.Trim());
        }
    }
and add this to the web.config appSettings section:
<add key="RemoveWhitespace" value="true"/>
It's as easy as that, and the overhead is reasonable. I use this on most of the websites I build and have never noticed any negative impact.


Source : -http://madskristensen.net/post/Remove-whitespace-from-your-ASPNET-page.aspx

Wednesday, May 11, 2011

10 Highly Effective Free Ways of Generating Traffic

In this highly digital world, targeted traffic to your website could not be any more valuable. You can have the best, most accessible, most beautifully designed website in the world... if no one visits it, you might as well just take it down. Does this sound like the old "If a tree falls in the woods, does anyone hear it?" phrase? That is because it is not that far off.
If you have a huge advertising budget, getting targeted traffic is not that difficult. There are countless tools available for identifying the right markets, and you can simply write checks to your heart's desire. However, most of us are placed in situations where we (or our clients) simply cannot afford to spend tens of thousands of dollars on internet advertising.
Luckily, if you know the right strategies, you can get plenty of targeted traffic without spending any hard cash.

1. Build an E-mail List

Building an e-mail list is something that many of us seem to have forgotten about. It may be due to the association with spam, or that it has been around so long, or maybe social media and RSS seem like a better option. E-mail offers a unique difference and advantage, however. First, you have the ability to send specific e-mails and messages to specific groups and segments of your audience (where RSS is an all-or-nothing approach); additionally, you tend to have a more captive audience, which results in higher response and conversion rates.

2. Focus on Organic Search

Organic Search Engine Optimization (SEO) has the capability to pull in large amounts of traffic without having to spend a dime (in theory). Organic search is simply the unpaid listing of websites that show up when someone performs an internet search on a site such as Google, Yahoo or Bing. With 80% of all transactions starting with an internet search, not only do you have an audience that is ready to buy... it has the potential to bring in more traffic than almost any other approach available.
While we would all like to get more organic traffic, it often is easier said than done. It is important to keep in mind that there is no way to trick your way to good rankings. The best practice is always to create high-quality, valuable content and ensure your website is built in a way that is search engine friendly. Keep in mind that when you are focusing on organic traffic, getting high-quality links with targeted anchor text tends to be most effective (for example, DIY Solar Energy).

3. Social Media

Social media is the practice of using sites and media that allow you to connect with other people on a social level. The common examples would be Twitter, YouTube, Facebook and LinkedIn. These sites let you connect and converse with other users who have similar interests to yours. Because all websites have some overarching theme or topic, you can connect with people who are also interested in that topic. This gives you the opportunity to build awareness of your site/product/service with your audience on a personal level, which is much more effective than on a push-media level.
An important note about social media: make sure anything you do, publish or say on these networks is done in a way that actually adds value to someone's day. If you are simply signing up to spam links to your site, you are likely to have sub-par results.

4. Develop Some Public Relations

With so much focus on digital media, public relations has lost a lot of the attention that it used to hold. Public relations is a valuable part of any marketing mix, and it is an approach to building awareness and traffic that shouldn't be ignored, especially considering it can be a low-cost (or no-cost) solution. There really are two areas in which we can benefit from public relations: traditional forms of media and new forms of media.
While there are many different facets of PR, where you stand to gain a lot of new traffic is by having stories and articles written about your website.

Offline

Back before the internet became what it is today, companies would send a small one-page write-up with "newsworthy" content regarding their company to any relevant media sources. These went out to newspapers, magazines, journals, etc. This write-up was called a "press release" and simply covered the "who, what, when, why and how" of the story. Despite the fact that the internet is changing the way people consume news, it is still an effective way of building awareness about your website and/or company.
There have been plenty of studies and reports that show a direct correlation between traditional media mentions and website traffic. Even if your website isn't listed in the article, you will find more branded searches for your website (and thus more traffic). If you know that you are going to be covered, you can also try to capitalize on the increase in traffic and offer special discounts or personalized messages.
The process is fairly simple. Round up any related publications and look for the news contacts; almost every publication will have an e-mail address where you can submit news. Write up a short press release and distribute it to the list you created previously. Don't be afraid to call up the media and pitch the story to them a few days after you have sent the release.

Online

This process doesn't have to be limited to offline publications. There are lots of industry blogs and websites that regularly post content and news to an audience consisting of your target market. While the concept is similar (get an online publication or blog to talk about you, your product, your company, etc.), it tends to be slightly different in approach. In most cases, getting articles and news published can be a bit trickier, as you are not always working with professional journalists who always need that next story.
When pitching stories to bloggers or online publications, you may need to spend more time developing a relationship with the authors first. Additionally, finding a list of relevant blogs can be a bit more tricky and require more searching. The more open and honest you are about the situation, the better your results will be. It is unnecessary (and ineffective) to write a formal press release; focus on starting a conversation with the author instead.

5. Add Value to Other Sites

Ok "Web 2.0" might be a cliche term, but there are a countless websites that thrive on user participation and contribution. I assure you that there are some related to your industry or that consist of your target market, as people naturally want to connect and discuss things that are important to them. Some industries and topics may be harder to find than others, but if you are listening there are people talking.
Once you have found these types of sites, take a little bit of time every week to add value to them through participation. Over time the users will begin to become familiar with you and trust you. By offering your knowledge you will build awareness, goodwill and trust for your company/website/profession. Additionally, most of these sites will even let you have a profile or signature that can contain a link (or several) to your website, so you don't have to be so direct about your motives.
The more you participate and the more you contribute the more you will get out of this technique.

6. Get Others Involved

Sometimes the best way to build awareness and traffic is through word of mouth. Word of mouth is actually one of the most effective forms of marketing, online and off. Not far behind word of mouth in terms of effectiveness is celebrity endorsement. This method can drive traffic to your website by using both of these proven techniques. When you get other people involved in your website (as guest bloggers, by giving them an interview, etc.) you are doing two very powerful things. First, you are actually giving people a small portion of ownership in what you are doing. When they invest the time to contribute and assist the site, they are buying into the idea that it is worth their time and effort. Because of this, they are going to be much more likely to want your website to succeed, and thus tell other people. If nothing else, they will likely link to their contribution and tell people about it.
Additionally, by having a well-known figure contributing to your website, you are getting their endorsement, and it can boost your credibility significantly (even to the point of matching their own). This subconsciously resonates with users and puts the site at a higher status than before, making them much more likely to share your site, contribute to it and return for future visits.

7. Syndicate Your Content

Chances are, if you are looking for more traffic, there are at least a handful of other sites on the web that cover the same (or similar) subjects as yours. By creating high-quality content you can offer a win/win situation. Approach popular websites and offer to publish (or syndicate) your articles on their site. They win by providing relevant content to their readers without having to perform the work of creating it, and you can build awareness of and traffic to your site through an article byline.
If you can't find any websites that will bite, you could always submit your articles to article directory sites. They often have huge mailing lists on various topics and tend to rank fairly well themselves.

8. Contests For Your Services

Contests have always been a great way to build awareness and interest. After all, who doesn't like winning something for free? Since we are talking about free ways to generate traffic, purchasing an item to give away is out of the picture. Sure, you might be able to find something you already have to give away, but how likely is that? Instead, offer up your expertise. Chances are you have skills or knowledge that are valuable to other people or companies. By offering a contest for your services, you don't have to spend any additional capital, and you build awareness among all of those people who previously didn't know about your expertise but could use your help.
This is a perfect time to build your e-mail list. Ask those who are signing up for your contest if they would like to be added to your newsletter for future opportunities and giveaways.
If you don't want to do a contest you could also do a free seminar or webinar, both of which can drive a lot of traffic and attention to your site as well.

9. Be Remarkable

Marketing guru Seth Godin often talks about the importance of being remarkable. The idea is that if you are "the best in the world" at something, it will naturally market itself for you. An important part of achieving this status is picking a very specific and narrow niche or specialty. If you have a content-focused website, what very specific topic can you cover better than anyone else in the marketplace? If you run a service company, what very narrow and specific niche can you serve better than any of the other companies in the market?
By focusing your attention on becoming the best at this narrow specialty, you will become remarkable and an expert. Your demand will increase dramatically and people will start finding your site naturally. This actually is more effective and will generate more traffic than if you were to try to be merely "good" in a market that is already saturated with those who can do a better job than you.

10. Start a Group or Organization

Throughout human history we have always desired a sense of belonging and community. It doesn't take much looking to find a group of people who are interested in the same things that you are. Start a weekly meet-up, tweet-up, chat (if it is virtual), forum, digest, etc. that gets the group of people together to talk and discuss the issues. You would be surprised at how quickly a group like that can grow. As the founder and head of the group or organization, you will receive a lot of notoriety and interest. It will also give you great opportunities to place links and advertisements for your company or website throughout the areas in which members interact.

Conclusion

There are lots of different ways to drive traffic to your website without spending much (or any) money. Sometimes they are more effective than paid methods. The general rule of thumb, though, is that traffic tends to be high involvement and low cost, or low involvement and high cost. Search marketing is a good example of this: where pay-per-click can get you traffic very quickly and without much in terms of time investment, organic rankings can take significantly more work and much longer, but at a much lower cost.
It is important to understand which techniques, tactics and strategies are going to be most effective for any given situation. Blindly trying them all is not going to work very well; take the time to think through and plan your approach to building traffic.


Search engine optimization (SEO) - Things to do and not to do

Search engine optimization (SEO) is the process of improving the volume and quality of traffic to a web site from search engines via "natural" ("organic" or "algorithmic") search results for targeted keywords. In this article, find out how to do basic SEO, code search-engine-friendly pages, and do basic promotion of your site.

Things to do
  • Keywords in URL
    For example, http://www.aboutdoghealth.org/ uses whole words; pick the keywords that best describe your site. Don't rely on this alone if you don't have keywords in other parts of your site.
  • Keywords in <title> tag
    This shows up in search results as your page title, so it is one of the most important things. It shouldn't be long: 5-6 words max, with the keyword at the beginning.
  • Keywords in anchor texts
    Also very important, because if you have the keyword in the anchor text of a link from another site, this is regarded as getting a vote from that site not only about your site in general, but about the keyword in particular.
  • Keywords in headings (<H1>, <H2>, etc. tags)
    One more place where keywords count a lot. But make sure your page has actual text about the particular keyword.
  • Keywords in the beginning of a document
    While coding your page, put your main content before the sidebar, because this also counts, though not as much as anchor text, the title tag or headings.
  • Keywords in <alt> tags
    Spiders don't read images but they do read their textual descriptions in the <alt> tag, so if you have images on your page, fill in the <alt> tag with some keywords about them.
  • Anchor text of inbound links
    This is one of the most important factors for good rankings. It is best if you have a keyword in the anchor text but even if you don't, it is still OK.
  • Origin of inbound links
    It matters whether the site that links to you is a reputable one or not. Generally, sites with greater Google PR are considered reputable, and .edu and .gov sites are the most reputable.
  • Links from similar sites
    Having links from similar sites is very, very useful. It indicates that the competition is voting for you and you are popular within your topical community.
  • Metatags
    Metatags are becoming less and less important but if there are metatags that still matter, these are the <description> and <keywords> ones.
  • Unique content
    Having more content (relevant content, which is different from the content on other sites both in wording and topics) is a real boost for your site's rankings.
  • Frequency of content change
    Frequent changes are favored. It is great when you constantly add new content but it is not so great when you only make small updates to existing content.
  • Site Accessibility
    Another fundamental issue, which is often neglected. If the site (or separate pages) is inaccessible because of broken links, 404 errors, password-protected areas and other similar reasons, then the site simply can't be indexed.
  • Sitemap
    It is great to have a complete and up-to-date sitemap, spiders love it, no matter if it is a plain old HTML sitemap or the special Google sitemap format.
Things not to do
  • Keyword stuffing
    Any artificially inflated keyword density (10% and over) is keyword stuffing and you risk getting banned from search engines.
  • Keyword dilution
    When you are optimizing for an excessive amount of keywords, especially unrelated ones, this will affect the performance of all your keywords and even the major ones will be lost (diluted) in the text.
  • Single pixel links
    When you have a link that is a pixel or so wide, it is invisible to humans, so nobody will click on it, and it is obvious that this link is an attempt to manipulate search engines.
  • Cross-linking
    Cross-linking occurs when site A links to site B, site B links to site C, and site C links back to site A.
  • Duplicate content
    When you have the same content on several pages of the site, this will not make your site look larger, because the duplicate content penalty kicks in. To a lesser degree, duplicate content applies to pages that reside on other sites, but obviously these cases are not always banned.
  • Doorway pages
    Creating pages that aim to trick spiders into thinking your site is a highly relevant one when it is not is another way to get kicked out of search engines.
  • Cloaking
    Cloaking is another illegal technique, which partially involves content separation because spiders see one page (highly-optimized, of course), and everybody else is presented with another version of the same page.
  • Invisible text
    This is a black hat SEO practice and when spiders discover that you have text specially for them but not for humans, don't be surprised by the penalty.
  • Illegal Content
    Using other people's copyrighted content without their permission or using content that promotes legal violations can get you kicked out of search engines.
  • Flash
    Spiders don't index the content of Flash movies, so if you use Flash on your site, don't forget to give it an alternative textual description. Also, don't have just a Flash home page without an HTML one.
  • Frames
    Frames are very bad for SEO. Avoid using them unless really necessary.
  • Redirects (301 and 302)
    When not applied properly, redirects can hurt a lot: the target page might not open, or worse, a redirect can be regarded as a black hat technique when the visitor is immediately taken to a different page.
  • Bans in robots.txt
    If indexing of a considerable portion of the site is banned, this is likely to affect the non-banned part as well, because spiders will come less frequently to a "noindex" site.
  • Session IDs
    This is even worse than dynamic URLs. Don't use session IDs for information that you'd like to be indexed by spiders.