ASP.NET Performance Optimization

Introduction

The performance of an application, as experienced by the client, is very important. If it is degraded by too many round trips, too many resources, or too many Ajax or server calls, it gives the end user no option other than leaving your otherwise useful resource. To address this problem, you must keep an eye on ways to boost your application's performance by using performance boosters.

The performance of an application is the most important aspect to monitor every day in order to provide uninterrupted service to your clients, and the better the approaches your system uses, the better the results you will get. I will write a series of optimization and performance techniques for SQL, jQuery, Ajax, JavaScript, C#, query optimization, and website optimization in upcoming articles, but for now I am covering ASP.NET performance metrics.

But wait, how do you judge what kind of optimization your web application needs?

Optimization and Performance Metrics

Below is the list of optimization and performance metrics that you should be aware of.

  1. Speed
  2. Use Logs
  3. Proper Exception Handling
  4. View State
  5. Proper Use of Caching
  6. Avoid server-side validation
  7. Minify and Compress JS, CSS Resources
  8. Session Management
  9. Paging for Large Result set
  10. Avoid Unnecessary RoundTrips to Server
  11. Pages Must be Batch Compiled
  12. Partition Application Logically
  13. HTTP Compression
  14. Resource Management
  15. String Handling

We will discuss each one in detail. So, let's start.

Speed

The speed of your application is the most important factor, and you need to keep an eye on it. Several factors are involved in boosting the speed of your application:

Reduce page size

  • Reduce page size by using external CSS and JavaScript files instead of inline CSS and JavaScript.
  • The other way to reduce page size is to use only the minified versions of JavaScript files; you can also beautify CSS and JS with online tools.
  • To beautify Cascading Style Sheet files, follow this Link.
  • To beautify JavaScript files, follow this Link.
  • To minify Cascading Style Sheet files, follow this Link.
  • To minify JavaScript files, follow this Link.
  • Beautify means formatting unformatted files (files lacking white space, comments, and indentation) so they are readable.
  • Minify means removing white space, comments, and indentation; a nice additional feature is reducing size by giving your functions and variables single-character names, for example function a(b){ if (b == '4') { b = 'good'; } }, and moving your code onto a single line.
  • It is also very efficient to separate the logic of your page, just as we separate the data access and business layers when creating an application; at the page level, you can separate it by making user controls for the header, body, and footer.

Reduce the number of requests to the server

  • The fewer requests served by the server, the more efficient your page behavior is.
  • Reduce the number of requests by reducing the number of resources, for example by moving the inline CSS from all your files into a single CSS file, and likewise for JavaScript. You can also cache static resources and remove unnecessary headers from responses, such as the version number and the X-Powered-By header (see the sketch after this list). Use a CDN (Content Delivery Network) so that files are downloaded from the nearest available server, and concurrent requests are saved if other websites use the same jQuery plugin files.
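
As a sketch of the header clean-up mentioned above, the X-AspNet-Version header can be turned off and the X-Powered-By header removed in web.config (assuming IIS 7 or later running in integrated mode):

<system.web>
  <!-- Stops ASP.NET from emitting the X-AspNet-Version response header. -->
  <httpRuntime enableVersionHeader="false" />
</system.web>
<system.webServer>
  <httpProtocol>
    <customHeaders>
      <!-- Removes the X-Powered-By response header added by IIS. -->
      <remove name="X-Powered-By" />
    </customHeaders>
  </httpProtocol>
</system.webServer>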

Use Logs

Use the IIS logs to trace issues in your application on a weekly or monthly basis; ideally, you should watch them daily. An IIS log contains information about your server, the date and time, the referring page, the original URL, the HTTP status response codes, and much more, through which you can understand the nature of an issue.

I have written a post on the HTTP Status Response code. You can read it Here.

Another way to customize what you log is to create a database table and insert the exception details into it. I have done it this way in my organization, along with a tracking module that can search, generate daily reports, insert log entries, and more, so that you don't need to query the table again and again for daily error checking or reporting. A minimal sketch of this approach is shown below.
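
Here is a minimal sketch of that idea; the "LoggingDb" connection string name and the dbo.ErrorLog table with its columns are assumptions for illustration only, not part of any existing module.

using System;
using System.Configuration;
using System.Data.SqlClient;

public static class ErrorLogger
{
    public static void LogException(Exception ex, string pageUrl)
    {
        // "LoggingDb" and dbo.ErrorLog are hypothetical names used for illustration.
        string connectionString =
            ConfigurationManager.ConnectionStrings["LoggingDb"].ConnectionString;

        using (SqlConnection connection = new SqlConnection(connectionString))
        using (SqlCommand command = new SqlCommand(
            "INSERT INTO dbo.ErrorLog (LoggedAt, PageUrl, Message, StackTrace) " +
            "VALUES (@LoggedAt, @PageUrl, @Message, @StackTrace)", connection))
        {
            command.Parameters.AddWithValue("@LoggedAt", DateTime.UtcNow);
            command.Parameters.AddWithValue("@PageUrl", pageUrl);
            command.Parameters.AddWithValue("@Message", ex.Message);
            command.Parameters.AddWithValue("@StackTrace", ex.StackTrace ?? string.Empty);

            connection.Open();
            command.ExecuteNonQuery();
        }
    }
}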

There are other good approaches to tracking errors. One of the best available tools is ELMAH, which is driven by configuration settings and can even be configured to email the responsible person.

ELMAH (Error Logging Modules and Handlers) is an open-source error logging library for ASP.NET web applications. When it is added to a running web application on a machine, exceptions that are thrown trigger event handlers in the ELMAH tool.

You can get it from the Google Code Link, or

you can get it from the NuGet Packages Link.

Proper Exception Handling

Many developers do not apply proper exception handling techniques, and as a result the outcome is not satisfactory: they cannot tell whether the application is crashing. The best way is to use a try...catch block appropriately to handle exceptions in the right manner. You can use an if statement to check whether a database connection is open and close it when it is no longer needed, or you can wrap the connection in a try...catch block and throw an exception if it cannot be closed. With try...catch, it is best practice to also use a finally block, as shown below, so that resources are properly disposed of whether or not an exception occurs.

try
{
    // Code that may throw an exception goes here.
}
catch (Exception)
{
    // Log the error, then rethrow so the caller still sees the original exception.
    throw;
}
finally
{
    // Clean-up code runs whether or not an exception occurred.
}
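
As a concrete illustration of the database connection scenario described above, here is a minimal sketch (the connection string is a hypothetical placeholder) in which the finally block closes the connection whether or not an exception occurs:

using System.Data;
using System.Data.SqlClient;

SqlConnection connection = new SqlConnection("your-connection-string-here");
try
{
    connection.Open();
    // Execute commands against the database here.
}
catch (SqlException)
{
    // Log the exception (for example to your error table or ELMAH), then rethrow it.
    throw;
}
finally
{
    // Runs on both the success and failure paths, so the connection is never leaked.
    if (connection.State == ConnectionState.Open)
    {
        connection.Close();
    }
}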

Exception handling is the most important technique for finding genuine run-time unhandled exceptions in an application, but it should be used wisely and judiciously.

View state

View state is a Web Forms mechanism that maintains the state of a page across post-backs. It is stored in a hidden field, which you can see by viewing the source of the page, and it is encoded (and can optionally be encrypted). If you use view state to maintain the state of data on a large form, it bloats your page, and as a result you have a performance issue.

The potential issue with view state is long page load times caused by the increased size of the page state.

So, what are the best practices to avoid these kinds of problems?

Here are some performance guidelines that must be taken into account while using view state.

  • Use it only where it is needed on a page, and keep it as small as possible.
  • Don't use multiple forms on a single page with state management enabled.
  • Use it wisely, as required, at the page level, the control level, or the application level (see the sketch after this list).
  • Monitor the size of the view state by enabling tracing.
  • Avoid storing large objects, as the view state size is directly proportional to the objects stored.
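
For example, view state can be turned off declaratively where a page or control does not need it. A minimal Web Forms sketch (the page and grid names are hypothetical):

<%@ Page Language="C#" EnableViewState="false" %>  <%-- disables view state for the entire page --%>

<%-- Or disable it only for an individual control that does not need to keep post-back state. --%>
<asp:GridView ID="ProductsGrid" runat="server" EnableViewState="false" />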

Proper use of caching

With the proper use of caching, you can get a lot of benefits, such as reducing round trips to the server and reducing the server resources used, and pages render faster than in normal mode. Caching can improve performance many times over by reusing data across multiple HTTP requests. It can store a page, or part of one, for a specified time with an expiration value. It boosts application performance by storing data in memory so that it can be accessed quickly, much like CPU RAM. By default, the cache is accessible only within a single application; to share cached data across a web farm, you can use a distributed cache manager such as memcached.

The best use of cache is as follows.

  • Use it in all layers: data access, business, and UI. Using it correctly can give you a performance boost.
  • Caching static or rarely used resources for a long time, and adding an expiration to the cache, also gives the best performance.
  • Don't cache expensive objects such as connections and similar resources.
  • Use the output cache for static pages, with the expiration time and location set as per your need (see the sketch after this list).
  • Use partial (fragment) caching to cache part of a page.
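
As an illustration, here is a minimal sketch of the output cache directive and of the Cache API with an absolute expiration (the cache key, the LoadTopProducts call, and the durations are hypothetical):

<%@ OutputCache Duration="60" VaryByParam="None" Location="Server" %>

// Cache application data for ten minutes with an absolute expiration.
HttpContext.Current.Cache.Insert(
    "top-products",                     // hypothetical cache key
    LoadTopProducts(),                  // hypothetical expensive call being cached
    null,                               // no cache dependency
    DateTime.UtcNow.AddMinutes(10),     // absolute expiration
    System.Web.Caching.Cache.NoSlidingExpiration);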

Avoid server-side validation

Validating input is an important part of your system, as it lets you filter exactly what you want to store in the database; if you have a free-form text input, chances are that the required data will not be received correctly, which is why validation is required.

Types of Validations

Validations are of two types: client-side and server-side validation.

Server Side Validation

Server-side validation is important for securing sensitive information, such as passwords and other sensitive data. However, it is not ideal to rely on it for everything, as it always submits a request and sends a response back to the client, which costs the user time. This type of validation occurs when the submit button is hit.

The best tip is to use server-side validation only where you need to ensure that security is not bypassed; otherwise, it is better to check formats such as email, URL, phone number, masking, and other required information with client-side validation.
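
A minimal Web Forms sketch of this split (the control names are hypothetical): the email format check runs on the client, while the server still confirms Page.IsValid so the check cannot be bypassed.

<asp:TextBox ID="EmailBox" runat="server" />
<asp:RegularExpressionValidator ID="EmailFormat" runat="server"
    ControlToValidate="EmailBox"
    ValidationExpression="^[^@\s]+@[^@\s]+\.[^@\s]+$"
    ErrorMessage="Please enter a valid email address."
    EnableClientScript="true" />

protected void Submit_Click(object sender, EventArgs e)
{
    // Server-side safety net in case client-side validation was bypassed.
    if (!Page.IsValid)
    {
        return;
    }
    // Save the validated data here.
}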

Minify and Compress JS, CSS Resources

The best approach with static content files is to minify them, which means making them smaller. For production environments, jQuery likewise recommends using the min.js version of its library for best results. You should also include minified versions of your application's JavaScript files, because the number of requests greatly increases page load time. Sometimes the size of an individual file doesn't matter much, but it can still be reduced by using minified JavaScript and CSS files.

In IIS there is also a setting for compressing static and dynamic content; you can try this to enable compression for the website (see the HTTP Compression section below).

It is also best practice to include Cascading Style Sheet files in the head of the web page, while scripts should be included at the bottom of the page so the page renders faster.

As you are all aware, we use bundles of JavaScript libraries every day to do our work, but we forget about the impact they have on our system. To address this, Microsoft announced the ASP.NET Web Optimization Framework, which is also useful.

You can read more about this framework Here.
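
A minimal bundling and minification sketch using that framework (the bundle names and file paths are hypothetical):

using System.Web.Optimization;

public class BundleConfig
{
    public static void RegisterBundles(BundleCollection bundles)
    {
        // Combine and minify the site's scripts so they are served as one request.
        bundles.Add(new ScriptBundle("~/bundles/site").Include(
            "~/Scripts/jquery-{version}.js",
            "~/Scripts/site.js"));

        // Combine and minify the stylesheets as well.
        bundles.Add(new StyleBundle("~/Content/css").Include(
            "~/Content/site.css"));
    }
}

RegisterBundles is typically called from Application_Start, and the bundles are then referenced in pages or views with Scripts.Render("~/bundles/site") and Styles.Render("~/Content/css").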

You may refer to section 1 (Speed) of this article for related concerns.

Session Management

Session state is an important part of ASP.NET applications, but its effects are adverse if it is not handled properly.

Here are some best practices to use it intelligently.

  • Do not store large amounts of data in the session.
  • Store basic types of data, not complex types such as objects.
  • Choose wisely among the available session state modes: in-process, out-of-process using the state server, and out-of-process using SQL Server (see the sketch after this list).
  • Out-of-process is the best option for resilience, as the session data survives configuration changes and application restarts, but it is slower because the state lives on a state server or in SQL Server; in-process is fast because it uses the same memory as the application, so retrieval is also fast.
  • Do not store sensitive data in session state.
  • Always call the Abandon() method to sign the user out when sessions are enabled.
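
A minimal web.config sketch for choosing a session state mode (the connection string is a hypothetical placeholder):

<system.web>
  <!-- InProc is fastest; StateServer or SQLServer survive application restarts. -->
  <sessionState mode="SQLServer"
                sqlConnectionString="Data Source=YourDbServer;Integrated Security=True"
                timeout="20" />
</system.web>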

Paging for Large Result set

Paging on large result sets is an extremely useful approach: we restrict the result to 10 to 30 records per page and load more records only when the next page is requested. This reduces the extra load the server bears when fetching all the records and returning them, which would otherwise increase page load times, add extra cost for your users, and leave the whole page unresponsive for a long time. So the best approach is to keep your result sets as small as possible; using ROWCOUNT also enhances paging a great deal.

If your client has limited resources, returning a large result set also has an impact on the client.

The basic backbone of paging is the ROW_NUMBER ranking function. If you compare the time taken to return all records versus only the first page of records, you will see a great improvement.
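
A minimal T-SQL sketch of ROW_NUMBER-based paging (the table, columns, and page boundaries are hypothetical):

-- Return the second page of 20 rows, ordered by most recent order first.
WITH NumberedOrders AS
(
    SELECT OrderId, CustomerName, OrderDate,
           ROW_NUMBER() OVER (ORDER BY OrderDate DESC) AS RowNum
    FROM dbo.Orders
)
SELECT OrderId, CustomerName, OrderDate
FROM NumberedOrders
WHERE RowNum BETWEEN 21 AND 40;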

Avoid unnecessary roundtrips to the Server

The best method to avoid round trips to the server is to ensure that no unnecessary calls are sent to it. As the number of requests sent to the server increases, page load time increases, and as a result the client suffers.

So you should use client-side mechanisms to check whether data really needs to be fetched from the server, since that avoids postbacks and server callbacks, which involve the server and trigger the full request-response cycle.

You can follow these guidelines to minimize the round trips between the web server and the browser.

Use Server.Transfer instead of Response.Redirect when redirecting within the current application; Server.Transfer's scope is the current application, so for redirecting outside your application, use Response.Redirect.

If your data is static, you can use caching for the best performance. Use output buffering, as it reduces round trips by building the whole page before making it available to the client. If you want to keep sending data only while the client is still connected, check HttpResponse.IsClientConnected, as it avoids doing work for a client that has already disconnected.
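
A minimal sketch of these points (the page path and the streaming helper are hypothetical):

// Redirecting within the same application: Server.Transfer avoids an extra
// browser round trip, while Response.Redirect always costs one.
Server.Transfer("~/Reports/Summary.aspx");      // current application only
// Response.Redirect("https://example.com/");   // use this for external URLs

// Only keep streaming output while the browser is still connected.
while (hasMoreChunks && Response.IsClientConnected)
{
    Response.Write(GetNextChunk());             // hypothetical helper
}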

Pages must be Batch Compiled

The more assemblies loaded into a process, the greater the chance that the process runs out of memory and throws out-of-memory exceptions. To avoid this, pages need to be batch compiled: when the first request triggers compilation, all the pages in the same directory are batch compiled into a single assembly. The basic advantage is that the process does not have to load the maximum number of assemblies, so server load is not compromised, and only a single batch-compiled assembly is loaded into the process.

You should also ensure a few things while doing this:

  • The debug attribute in the configuration file must always be set to false in the production environment; if it is set to true, pages are not batch compiled (see the sketch after this list).
  • With debug set to true, pages also do not time out when a web service used by the page fails to respond in the desired time, which can tie up server resources.
  • Make sure that different languages are not used in the same directory, as that reduces the chances of batch compilation.
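
A minimal web.config sketch for a production environment (the batch timeout value is an arbitrary example):

<system.web>
  <!-- debug="false" allows batch compilation and keeps normal request timeouts. -->
  <compilation debug="false" batch="true" batchTimeout="900" />
</system.web>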

Partition Application Logically

This means logically partitioning your application into layers such as business, presentation, and data access. It is very useful, as you have control over what happens where, and everyone can do their respective work in the appropriate layer. It doesn't mean that you have to write more lines of code; proper code, with reusability and scalability, is the key property of your application's overall performance.

Don't confuse this with the physical separation of logic, as it only separates the code logically.

Below are the key pros of separating application logic.

  • The main advantage is that you can choose to host the layers on separate servers in a web farm environment for your convenience, but doing so increases the latency of calls.
  • The closer your logical layers are to each other, the better; for example, keeping all logic assemblies in the bin directory.

HTTP Compression

As the name suggests, HTTP compression means compressing the content, mostly in Gzip or Deflate format, and sending it with the appropriate content headers after compression is applied. It provides faster transmission between IIS and the browser.

There are two types of compression supported in IIS.

Static Compression

Static compression compresses and caches static content for the path specified in the directory attribute. After the first request has been compressed, subsequent requests use the cached compressed copy, which decreases the time needed to serve the content and increases the throughput and performance of the application. You should only compress static content this way, that is, content that does not change.

Dynamic Compression

Unlike static content, dynamic content often changes, so IIS supports compressing it without adding it to the cache; the content is simply compressed for each response.
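
A minimal web.config sketch for enabling both kinds of compression in IIS 7 and later (assuming the static and dynamic compression features are installed on the server):

<system.webServer>
  <!-- Compress static files and dynamically generated responses. -->
  <urlCompression doStaticCompression="true" doDynamicCompression="true" />
</system.webServer>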

Resource Management

Resource management means managing the overall resources of your application, and it is directly related to application performance. Poor resource management decreases performance and loads your server's CPU.

Below is the list of the most useful techniques for resource management.

  • Good use of pooling.
  • Proper use of connection objects.
  • Dispose of unused resources after using them (see the sketch after this list).
  • Handle memory leaks.
  • Remove unused variables.
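
A minimal C# sketch of deterministic disposal with using blocks, which also returns the connection to the pool promptly (the connection string and query are hypothetical):

using System.Data.SqlClient;

// The using blocks guarantee Dispose() is called, closing the connection and
// returning it to the connection pool even if an exception is thrown.
using (SqlConnection connection = new SqlConnection("your-connection-string-here"))
using (SqlCommand command = new SqlCommand("SELECT COUNT(*) FROM dbo.Orders", connection))
{
    connection.Open();
    int orderCount = (int)command.ExecuteScalar();
}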

String Handling

String handling is one of the keys to managing the memory of your application.

There are many techniques that are very useful in handling strings. Some of them are listed below.

  • Use Response.Write() as the fastest way to show output in the browser.
  • Use StringBuilder when you don't know the number of iterations needed to concatenate strings (see the sketch after this list).
  • Use the += operator to concatenate strings only when you know the number of strings is small and fixed.
  • Do not use .ToLower() while comparing strings, as it creates a temporary string; use string.Compare instead, which has built-in support for case-insensitive comparison via the CultureInfo class.
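
A minimal sketch of the last two points (the reportLines collection is hypothetical):

using System.Globalization;
using System.Text;

// Unknown number of iterations: StringBuilder avoids allocating a new string per loop.
StringBuilder builder = new StringBuilder();
foreach (string line in reportLines)            // hypothetical collection
{
    builder.AppendLine(line);
}
string report = builder.ToString();

// Case-insensitive comparison without creating temporary lowercased strings.
bool areEqual = string.Compare("ASP.NET", "asp.net", true, CultureInfo.InvariantCulture) == 0;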
