Performance Tuning in ASP.NET Web Applications

In this article, we will walk through some performance improvement techniques in ASP.NET Web Applications.

Good performance is something every customer expects from a web application. Performance issues typically surface as the number of users grows, after the application has been in production for a while, or during load testing in UAT. At that point, fixing them is a tedious task because the effort shifts toward code changes, and developers usually take the blame. The stakes are even higher for an application like online banking, where many customers are engaged at the same time.

Once we understand that our web application has performance issues, we can set about fixing them. First, we list everything that could potentially cause a performance issue; then we determine which one is the actual culprit and fix it promptly. I categorize the major performance-tuning tasks as: collect data, analyze the results, fine-tune the code and test again to measure performance, in an iterative fashion.


CPU and memory are the two places where most performance issues occur. The following are the points a .NET developer should check before collecting data.
 
Set debug=false in web.config

When you create an ASP.NET web application, this attribute is set to "true" by default in the web.config file, which is very useful during development. However, when you deploy your application, always set it to "false". Setting it to "true" requires the PDB (debug) information to be inserted into the compiled output, which results in comparatively larger files and slower processing. So always set debug="false" before deployment.
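For reference, here is a minimal sketch of the setting in web.config (the targetFramework value is only an example):

  <compilation debug="false" targetFramework="4.5" />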

Turn off Tracing unless required


Tracing lets us track an application's execution so a developer can monitor its sequence of events. When tracing is enabled, extra information is added to every page. Always set it to "false" in the production web.config unless you need to monitor the trace logging.
  <trace enabled="false" requestLimit="10" pageOutput="false" traceMode="SortByTime" localOnly="true" />

Choose Session State management carefully

One extremely powerful feature of ASP.NET is its ability to store session state for the users of a web application. Since ASP.NET manages session state by default, you pay its cost in memory even if you don't use it. In other words, whether you store your data in-process, on a state server or in a SQL database, session state requires memory, and storing or retrieving data from it is also time consuming. You may not require session state when your pages are static or when you do not need to store information captured in the page. In such cases, disable it on your web form using the directive:
  <%@ Page EnableSessionState="false" %>
If you use session state only to read data and not to update it, make the session state read-only using the following directive:
  <%@ Page EnableSessionState="ReadOnly" %>

If the application needs out-of-process session state, consider carefully whether to use the state server or SQL Server session mode. SQL Server session mode provides lower performance than state server session mode.

Don't use session state to store objects of any type (including objects you create), since they are stored by serializing and then de-serializing them, and that results in a big performance penalty.

Where possible, use client-side state management techniques, such as the query string or client-side storage, to transfer data from one page to another.

Deploy with Release Build

Be sure you use a Release build, not a Debug build, when you deploy your site to production. If you think this doesn't matter, think again. By running in debug mode, you are creating PDBs and cranking up timeouts. Deploy in Release mode and you will see the speed improvements.

Disable View State of a Page if possible


View state is a technique in ASP.NET for storing some state data in a hidden input field inside the generated page. When the page is posted back to the server, the server can parse, validate and apply this view state data back to the page's tree of controls.

View state is a very powerful capability since it allows state to be persisted with the client and it requires no cookies or server memory to save this state. Many ASP.NET server controls use view state to persist settings made during interactions with elements on the page; for example, saving the current page that is being displayed when paging through data. 

There are a number of drawbacks to the use of view state, however. It increases the total payload of the page, both when served and when posted back, and there is additional overhead incurred when serializing and de-serializing the view state data on each round trip. View state also increases the memory allocations on the server. Several server controls, the best known of which is the DataGrid, tend to make excessive use of view state, even in cases where it is not needed. Pages that do not have any server postback events can have view state turned off entirely.

View state is enabled by default, but if you don't need it, you can turn it off at the control or page level. Within a control, simply set the EnableViewState property to false, or set it globally within the page using this directive:

  <%@ Page EnableViewState="false" %>
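At the control level, a minimal sketch (the grid name is hypothetical):

  <asp:DataGrid ID="dgOrders" runat="server" EnableViewState="false" />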

Avoid Response.Redirect


To do client-side redirection in ASP.NET, you can call Response.Redirect and pass the URL. When Response.Redirect is called, the server sends a command back to the browser telling it to request the redirected page, so an extra round trip happens, which affects performance. We can pass information from the source page using a query string, but there is a limit on the length of a query string, so it cannot be used to pass large amounts of data over the wire.
To do server-side redirection, use Server.Transfer. Since the execution is transferred on the server, Server.Transfer does not require the client to request another page, and by using HttpContext we can access the source page's items collection in the target page. The drawback of this method is that the browser does not know a different page was returned: it displays the first page's URL in the address bar, which can confuse the user and cause problems if the user tries to bookmark the page. Server.Transfer is also not recommended when the operation flows through several pages.
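As a minimal sketch of the Server.Transfer approach (the page names, handler names and the "OrderId" key are hypothetical):

  // In SourcePage.aspx.cs
  protected void btnNext_Click(object sender, EventArgs e)
  {
      Context.Items["OrderId"] = 42;       // stash data for the target page
      Server.Transfer("TargetPage.aspx");  // no extra client round trip
  }

  // In TargetPage.aspx.cs
  protected void Page_Load(object sender, EventArgs e)
  {
      var orderId = Context.Items["OrderId"];  // read what the source page stored
  }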
 
Use StringBuilder to concatenate strings
 
Every manipulation you perform on a string creates a separate string in memory, and this should be avoided as much as possible. In other words, when a string is modified, the runtime creates a new string and returns it, leaving the original to be garbage collected. Most of the time this is a fast and simple way to work, but when a string is modified repeatedly, all of those allocations eventually get expensive. Use a StringBuilder whenever repeated concatenation is needed: it appends into an internal buffer rather than allocating a new string for every operation.
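A minimal sketch of the difference:

  using System.Text;

  string slow = string.Empty;
  for (int i = 0; i < 1000; i++)
  {
      slow += i;                 // allocates a brand-new string on every pass
  }

  var sb = new StringBuilder();
  for (int i = 0; i < 1000; i++)
  {
      sb.Append(i);              // appends into an internal buffer instead
  }
  string fast = sb.ToString();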

Don't throw exceptions unnecessarily

Exceptions cause slowdowns you will rarely see elsewhere, in web applications as well as Windows applications. You can use as many try/catch blocks as you want; it is throwing exceptions gratuitously that costs you performance. For example, stay away from things like using exceptions to control program flow.
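A small sketch of the control-flow anti-pattern and the cheaper alternative:

  string input = "123a";
  int value;

  // Costly when bad input is common: an exception is thrown and caught per value.
  try { value = int.Parse(input); }
  catch (FormatException) { value = 0; }

  // Cheaper: TryParse reports failure through its return value.
  if (!int.TryParse(input, out value))
  {
      value = 0;
  }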
 
Use Finally blocks to release resources

A finally block executes regardless of the outcome of the try block. Always use a finally block to release resources, such as closing database connections and closing files, so that the cleanup runs whether the code in the try block succeeded or control passed to the catch.
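A minimal sketch, assuming a connectionString defined elsewhere:

  using System.Data.SqlClient;

  var conn = new SqlConnection(connectionString);
  try
  {
      conn.Open();
      // ... execute commands ...
  }
  finally
  {
      conn.Close();  // runs whether the try block succeeded or threw
  }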
 
Use Client-side Scripts for validations
 
User input is evil and must be thoroughly validated before processing to avoid overhead and possible injection attacks on your application. Client-side validation helps reduce the round trips required to process a user's request, and in ASP.NET you can use the validation controls to validate user input on the client. However, always repeat the check on the server side as well, to cover the infamous JavaScript-disabled scenario.
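A minimal sketch of a client-side validator (the control names are hypothetical); the server-side half is simply to test Page.IsValid before processing, as discussed below:

  <asp:TextBox ID="txtName" runat="server" />
  <asp:RequiredFieldValidator ID="rfvName" runat="server"
      ControlToValidate="txtName" ErrorMessage="Name is required." />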
 
Avoid unnecessary round trips to the server
 
Round trips significantly affect performance: they are subject to network latency and to downstream server latency. Many data-driven web sites hit the database heavily for every user request; while connection pooling helps, the increased network traffic and processing load on the database server can adversely affect performance. Keep round trips to an absolute minimum, and implement an Ajax UI whenever possible: the idea is to avoid full-page refreshes and update only the portion of the page that needs to change.
 
Use Page.IsPostBack
 
Be sure you don't execute code needlessly. Use the Page.IsPostBack property to ensure that you only perform page-initialization logic when a page is loaded for the first time and not in response to client postbacks.
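A typical sketch:

  protected void Page_Load(object sender, EventArgs e)
  {
      if (!IsPostBack)
      {
          BindGrid();  // hypothetical one-time initialization
      }
  }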
 
Include Return Statements within the Function/Method
 
Explicitly using a return allows the JIT to perform slightly more optimizations. Without a return statement, each function/method is given several local variables on the stack to transparently support returning values without the keyword. Keeping these around makes it harder for the JIT to optimize and can impact the performance of your code. Look through your functions/methods and insert a return as needed. It doesn't change the semantics of the code at all and it can help you get more speed from your application.
 
Use Foreach loop instead of For loop for String Iteration

A foreach statement is far more readable, and for special cases like string iteration it is effectively as fast as a for loop. Unless string iteration is a real performance hog for you, the slightly messier for-loop code may not be worth it. Use a LINQ query expression when needed.
 
Avoid Unnecessary Indirection

When you pass by reference (byRef), you pass a pointer instead of the actual object. Many times this makes sense (for side-effecting functions, for example), but you don't always need it. Passing pointers results in more indirection, and that is slower than accessing a value that is on the stack. When you don't need to go through the heap, it is best to avoid it, thereby avoiding the indirection.

Choose correct collections object

Choose the collection object that performs best for the task. For example, an ArrayList can be better than an array because it has everything that is good about an array plus automatic sizing and helper methods such as Add, Insert, Remove, Sort and BinarySearch, all of which come from implementing the IList interface; the downside of an ArrayList is the need to cast objects upon retrieval. Carefully choosing one collection over another gives better performance.

Measure the candidate collection types against your usage pattern to decide which one gives the better performance.
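As a small illustration, a generic List<T> keeps the ArrayList conveniences while removing the retrieval cast (and, for value types, the boxing):

  using System.Collections;
  using System.Collections.Generic;

  var untyped = new ArrayList();
  untyped.Add(42);
  int fromArrayList = (int)untyped[0];  // unbox and cast required

  var typed = new List<int> { 42 };
  int fromList = typed[0];              // no cast, no boxing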

Always check Page.IsValid when using Validation Controls


Always be sure you check Page.IsValid before processing your forms when using Validation Controls.
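A minimal sketch (the button handler and SaveForm are hypothetical):

  protected void btnSubmit_Click(object sender, EventArgs e)
  {
      if (!Page.IsValid)
      {
          return;      // client-side validation can be bypassed; re-check here
      }
      SaveForm();      // hypothetical processing
  }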

Use Paging: The Grid Choice

Most web applications need to show data in a tabular format, which calls for a grid: GridView, DataGrid, jqGrid, the Telerik grid, Kendo UI Grid and so on. When choosing one, prefer a thin grid that gives better performance. The ASP.NET GridView runs server-side code and makes the page heavy, whereas jqGrid is faster since it does everything on the client side. Use paging to display data on demand instead of pulling a huge amount of data and showing it in the grid all at once. Also consider a Repeater control in place of a GridView, DataGrid or DataList: it is efficient, customizable and programmable, and faster than any of the three.

Do Ajax calls instead of ASP.NET code behind code

Call a web service from JavaScript or jQuery rather than from server-side code, and use asynchronous calls to invoke the web methods.
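A minimal sketch of the server-side half, assuming a static page method that client script (for example jQuery's $.ajax, posted to PageName.aspx/GetServerTime) calls asynchronously:

  [System.Web.Services.WebMethod]
  public static string GetServerTime()
  {
      return DateTime.Now.ToString("u");  // returned to the client as JSON
  }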

Store your content by using caching
 
ASP.NET allows you to cache entire pages, fragments of pages or individual controls. You can also cache variable data by specifying the parameters that the data depends on. Caching helps the ASP.NET engine return repeated requests for the same page much faster.

Proper use and fine-tuning of your caching approach will improve the performance and scalability of your site. However, improper use of caching will actually slow the site down and consume a lot of server memory. Good candidates for caching are data that changes infrequently and the static content of a web page.
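For example, page-level output caching is a one-line directive (the duration is only an example):

  <%@ OutputCache Duration="60" VaryByParam="none" %>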
 
Use low cost authentication

Authentication can also have an impact on the performance of your application. For example, Passport authentication is slower than form-based authentication and that in turn is slower than Windows authentication.

Minimize the number of web server controls


Web server controls increase the response time of your application because they must be processed on the server before they are rendered to the client. One way to minimize their number is to use plain HTML elements wherever they are suitable, for example when you want to display static text.
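For example, for static text:

  <!-- A server control is processed on every request: -->
  <asp:Label ID="lblTitle" runat="server" Text="Order History" />

  <!-- A plain HTML element renders the same output with no server-side work: -->
  <span>Order History</span>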

Avoid using unmanaged code

Calls into unmanaged code involve a costly marshalling operation. Try to reduce the number of transitions between managed and unmanaged code, and consider doing more work in each call rather than making frequent calls that each do a small task. Wrap unmanaged resources in using blocks so they are released deterministically.
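A minimal sketch of a using block, which guarantees Dispose even if an exception is thrown (the file name is an example):

  using (var stream = new FileStream("data.bin", FileMode.Open))
  {
      // ... read from the stream ...
  }  // Dispose runs here, releasing the underlying file handle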

Avoid making frequent calls across processes

If you are working with distributed applications, calls across processes involve the additional overhead of negotiating network and application-level protocols, and network speed can also become a bottleneck. Try to do as much work as possible in as few calls over the network as you can.

Design with Value Types


Use simple structs when you can and when you don't do much boxing and unboxing. Value types are far less flexible than objects and can degrade performance if used incorrectly, so be very careful about treating them like objects: doing so adds boxing and unboxing overhead to your program and can end up costing you more than if you had stuck with objects.
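A small sketch of the boxing cost:

  int n = 7;
  object boxed = n;          // box: heap allocation plus a copy
  int unboxed = (int)boxed;  // unbox: runtime type check plus a copy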

Remove unused HTTP Modules

Several default ASP.NET HTTP modules sit in the request pipeline and intercept each and every request. For example, SessionStateModule intercepts each request, parses the session cookie and loads the appropriate session into HttpContext. Not all of these modules are necessary in every page-execution life cycle.

For example, if you aren't using Forms authentication (or the Membership and Profile providers that rely on it), you don't need the FormsAuthentication module; remove it from the application's web.config file.
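A minimal sketch, assuming classic-mode configuration under system.web (integrated mode uses system.webServer/modules instead):

  <httpModules>
    <remove name="FormsAuthentication" />
  </httpModules>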

Minimize assemblies


Minimize the number of assemblies you use to keep your working set small. If you load an entire assembly just to use one method, you're paying a tremendous cost for very little benefit. See if you can duplicate that method's functionality using code that you already have loaded.

Use Reflection wisely if needed

Avoid reflection when there is no functional need for it. Activator.CreateInstance() in reflection takes an especially long time to execute; I have noticed many times in the ANTS profiler that dynamically creating an instance using Activator.CreateInstance (with two constructor parameters) takes a considerable amount of time. If you use reflection in a core module, test it in a profiler and measure the performance hit before using it in any UI application.
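A small sketch of the difference being measured (ReportGenerator is a hypothetical type):

  var direct = new ReportGenerator("sales", 2014);              // bound at compile time

  var viaReflection = (ReportGenerator)Activator.CreateInstance(
      typeof(ReportGenerator), "sales", 2014);                  // resolved at run time, far slower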

Encode Using ASCII When You Don't Need UTF

By default, ASP.NET comes configured to encode requests and responses as UTF-8. If ASCII is all your application needs, eliminating the UTF-8 overhead can return a few cycles. Note that this can only be done on a per-application basis.
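A minimal sketch of the web.config change, assuming plain ASCII really is sufficient for your content:

  <globalization requestEncoding="us-ascii" responseEncoding="us-ascii" />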

Avoid Recursive Functions / Nested Loops

These are general rules to adopt in any programming language. Nested loops and recursive functions consume a lot of memory and CPU, so avoid them where you can to improve performance.

Minimize the Use of Format()

When you can, use ToString() instead of String.Format(). In most cases, it will provide you with the functionality you need with much less overhead.
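For example (orderId is hypothetical):

  string s1 = string.Format("{0}", orderId);  // parses a format string first
  string s2 = orderId.ToString();             // same result, less overhead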

Make JavaScript and CSS External


Using external files generally produces faster pages because the JavaScript and CSS files are cached by the browser. Inline JavaScript and CSS increases the HTML document size but reduces the number of HTTP requests. With cached external files, the size of the HTML is kept small without increasing the number of HTTP requests, thus improving the performance.

Use multiple threads when calling multiple operations

Problems arise when single-threaded code gets stuck on a long-running operation. When one method must call multiple services, invoke them on separate threads; using multiple threads makes the application more responsive.
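A minimal sketch using tasks, assuming .NET 4.5 and two hypothetical service objects:

  using System.Threading.Tasks;

  var ordersTask  = Task.Run(() => orderService.GetOrders());
  var profileTask = Task.Run(() => profileService.GetProfile());
  Task.WaitAll(ordersTask, profileTask);   // both calls ran concurrently

  var orders  = ordersTask.Result;
  var profile = profileTask.Result;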

Do Minification


Recent Visual Studio versions can bundle all the styles and scripts together to reduce their size. Minify your style sheets and script files to reduce the size of your pages.
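A minimal sketch using the ASP.NET Web Optimization framework (the file paths are examples; the package is Microsoft.AspNet.Web.Optimization):

  using System.Web.Optimization;

  public static void RegisterBundles(BundleCollection bundles)
  {
      bundles.Add(new ScriptBundle("~/bundles/site").Include(
          "~/Scripts/jquery-{version}.js",
          "~/Scripts/site.js"));

      bundles.Add(new StyleBundle("~/Content/css").Include(
          "~/Content/site.css"));

      BundleTable.EnableOptimizations = true;  // minify and bundle even in debug
  }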

Do compression at IIS Level

Using compression is the single most effective way to reduce page load times. The .aspx output the server sends to the browser consists of HTML, which is highly compressible by algorithms such as gzip. Because of this, modern web servers, including IIS 5 and later, can compress outgoing files, and modern browsers can decompress incoming files.
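On IIS 7 and later, a minimal web.config sketch (dynamic compression requires the corresponding IIS feature to be installed):

  <system.webServer>
    <urlCompression doStaticCompression="true" doDynamicCompression="true" />
  </system.webServer>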

Use Ngen.exe to optimize managed code performance

The Native Image Generator (Ngen.exe) is a tool that improves the performance of managed applications. Ngen.exe creates native images that are files containing compiled processor-specific machine code and installs them into the native image cache on the local computer. The runtime can use native images from the cache instead of using the Just-In-Time (JIT) compiler to compile the original assembly.
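A minimal usage sketch from a Visual Studio command prompt (the assembly name is a placeholder):

  ngen install MyApp.exe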

Do a load test in the end of development cycle

Visual Studio provides load-testing options with which we can test the application as development nears completion. We can simulate any number of users, set the duration and run the test. By doing so we can easily learn which pages have performance issues, or whether the application as a whole is performing badly. Making this part of the development process reduces the cost of repair, with less effort than waiting for someone to identify performance issues in the live application.

I have observed that teams that follow strict code reviews report far fewer performance issues, because they avoid all of these problems at an early stage. When working on a customer-facing website with a huge transaction volume, there should be a proper process, proper guidelines and proper planning during the development phase.

Script rendering order and cleaning up Html code

If possible, move your <script> tags to the very bottom of the page. This matters because, during rendering, when the browser comes across a <script> tag, it stops to process the script before proceeding. If you put the script tags at the bottom of the page, the page/HTML renders faster and the scripts execute after the DOM elements have loaded.

Sometimes moving the script to the bottom of the page is not possible since some DOM elements or CSS may depend on these scripts, so they can be rendered. In such cases, you could move those scripts further up the page. However, as a rule of thumb, try to keep the scripts as low, towards the bottom, as possible.

Positioning the <script> tag towards the bottom of the page is not the only option to defer the load of script files. There are other ways too, for example, you can use the defer attribute.
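A minimal sketch of the defer attribute (the script path is an example):

  <script src="/Scripts/site.js" defer></script>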

Image "optimization" does not mean reducing the quality of the image; it re-arranges the pixels and palettes to make the overall file size smaller. The web definition of image optimization is "the process of image slicing and resolution reduction, done to make file sizes smaller so images will load faster."

Style sheets are the opposite case. When style sheets are placed near the bottom of the HTML, most browsers stop rendering to avoid redrawing elements of the page if their styles change, which degrades the performance of the page, so always place style sheets in the header. Also minimize the number of iframes and the amount of DOM access, use div elements instead of tables, and use a single CSS style sheet and a single script file for the entire website.

Remove the unused View Engines from the ASP.NET MVC Pipeline

By default, the ASP.NET runtime registers two view engines for ASP.NET MVC applications: the ASPX (Web Forms) engine and the Razor engine. Unused view engines should be removed from the runtime. For example, if you use only Razor views, add the following code to your Global.asax.cs so that the RazorViewEngine alone is used.

  ViewEngines.Engines.Clear();
  ViewEngines.Engines.Add(new RazorViewEngine());

Do not put C# code in your MVC view

Your ASP.NET MVC views are compiled at run time, not at compile time. Therefore, if you include too much C# code in them, that code will not be compiled into your DLL files during the build. Not only does that damage the testability of your software, it also makes your site slower, because every view takes longer to display (it must be compiled first). Another downside of adding code to the views is that it cannot run asynchronously: if you decide to build your site based on the Task-based Asynchronous Pattern (TAP), you won't be able to take advantage of asynchronous methods and actions from within the views.

Use the high-performance libraries

Recently I was diagnosing the performance issues of a web site and came across a hotspot in the code where JSON messages coming from a third-party web service had to be de-serialized many times. Those JSON messages were de-serialized by Newtonsoft.Json, and it so happened that Newtonsoft.Json was not the fastest library for de-serialization. We replaced Json.NET with a faster library (ServiceStack, for example) and got a much better result.

Tips for Database Operations

Profile Database and check the High Response Time in any pages

Run SQL Profiler against the solution's database while hitting all the key web pages. Identify all SQL operations that have high durations or CPU values and review them with an eye to optimizing them. Also identify how many SQL operations are involved in rendering each page and see whether any of them can be coalesced. Aim for the goal of at most one SQL call to render any page.

Return Multiple Resultsets

If the database code has request paths that go to the database more than once, then these round-trips decrease the number of requests per second your application can serve. Return multiple resultsets in a single database request, so that you can cut the total time spent communicating with the database. You'll be making your system more scalable, too, since you'll reduce the work the database server is doing managing requests.
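A minimal sketch, assuming a connectionString defined elsewhere:

  using System.Data.SqlClient;

  using (var conn = new SqlConnection(connectionString))
  using (var cmd = new SqlCommand(
      "SELECT * FROM Orders; SELECT * FROM Customers;", conn))
  {
      conn.Open();
      using (var reader = cmd.ExecuteReader())
      {
          while (reader.Read()) { /* consume Orders */ }
          reader.NextResult();                  // advance to the second result set
          while (reader.Read()) { /* consume Customers */ }
      }
  }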

Connection Pooling and Object Pooling
 
Connection pooling is a useful way to reuse connections for multiple requests, rather than paying the overhead of opening and closing a connection for each request. It's done implicitly, but you get one pool per unique connection string. Be sure you call Close or Dispose on a connection as soon as possible. When pooling is enabled, calling Close or Dispose returns the connection to the pool instead of closing the underlying database connection. Account for the following issues when pooling is a part of your design:
  • Share connections.
  • Avoid per-user logons to the database.
  • Do not vary connection strings.
  • Do not cache connections.

Use SqlDataReader instead of DataSet wherever possible
 
If you are reading a table sequentially, use a DataReader rather than a DataSet. A DataReader object creates a read-only, forward-only stream of data that increases your application's performance because only one row is in memory at a time.

Keep Your Datasets Lean 
 
Remember that a DataSet stores all of its data in memory, and the more data you request, the longer it takes to transmit across the wire. Therefore, put only the records you need into the DataSet.

Avoid Inefficient queries 
 
Queries that process and then return more columns or rows than necessary waste processing cycles that could better be spent servicing other requests. Too much data in your results is usually the sign of an inefficient query; the SELECT * query often causes this problem, since you do not usually need all the columns in a row. Also analyze the WHERE clause in your queries to ensure you are not returning too many rows: make it as specific as possible so that the fewest possible rows are returned. Queries that do not take advantage of indexes may also perform poorly.

Too many open connections

Connections are an expensive and scarce resource that should be shared between callers by using connection pooling. Opening a connection for each caller limits scalability.
To ensure the efficient use of connection pooling, avoid keeping connections open and avoid varying connection strings.

Avoid Transaction misuse

If you select the wrong type of transaction management, you may add latency to each operation. Additionally, if you keep transactions active for long periods of time, the active transactions may cause resource pressure. Transactions are necessary to ensure the integrity of your data, but you need to ensure that you use the appropriate type of transaction for the shortest duration possible and only where necessary.


Avoid Over-Normalized tables

Over-normalized tables may require excessive joins for simple operations. These additional steps may significantly affect the performance and scalability of your application, especially as the number of users and requests increases.

Reduce Serialization

DataSet serialization is implemented more efficiently in .NET Framework 1.1 than in version 1.0, but it still often introduces performance bottlenecks. You can reduce the performance impact in a number of ways: use column-name aliasing, avoid serializing multiple versions of the same data, and reduce the number of DataTable objects that are serialized.

Do Not Use CommandBuilder at Run Time

CommandBuilder objects, such as SqlCommandBuilder and OleDbCommandBuilder, are useful when you are designing and prototyping your application. However, you should not use them in production applications, because the processing required to generate the commands affects performance.
Manually create stored procedures for your commands, or use the Visual Studio .NET design-time wizard and customize them later if necessary.

Use Stored Procedures Whenever Possible
 
Stored procedures are highly optimized and give excellent performance when used effectively. Set up stored procedures to handle inserts, updates and deletes with the data adapter. Stored procedures do not need to be re-parsed or re-compiled on every call, nor does their text need to be transmitted from the client, so they cut down on both network traffic and server overhead. Be sure to use CommandType.StoredProcedure instead of CommandType.Text.
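A minimal sketch of calling a stored procedure (the procedure and parameter names are hypothetical, and connectionString is assumed defined elsewhere):

  using System.Data;
  using System.Data.SqlClient;

  using (var conn = new SqlConnection(connectionString))
  using (var cmd = new SqlCommand("usp_UpdateOrder", conn))
  {
      cmd.CommandType = CommandType.StoredProcedure;   // not CommandType.Text
      cmd.Parameters.AddWithValue("@OrderId", 42);
      conn.Open();
      cmd.ExecuteNonQuery();
  }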

Avoid Auto-Generated Commands

When using a data adapter, avoid auto-generated commands. They require additional trips to the server to retrieve metadata and give you a lower level of control over the interaction. While using auto-generated commands is convenient, it is worth the effort of writing them yourself in performance-critical applications.


Use Sequential Access as Often as Possible

With a data reader, use CommandBehavior.SequentialAccess. It is essential for BLOB data types, since it allows data to be read off the wire in small chunks. Although you can only work with one piece of the data at a time, the latency of loading a large data type disappears. If you don't need to work with the entire object at once, SequentialAccess will give you much better performance.
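A minimal sketch of streaming a BLOB column in chunks (column 0 is assumed to be the BLOB, and cmd is an existing SqlCommand):

  using (var reader = cmd.ExecuteReader(CommandBehavior.SequentialAccess))
  {
      while (reader.Read())
      {
          var buffer = new byte[8192];
          long offset = 0, read;
          while ((read = reader.GetBytes(0, offset, buffer, 0, buffer.Length)) > 0)
          {
              // process `read` bytes of the chunk
              offset += read;
          }
      }
  }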

Database general rule for high performance

Use SET NOCOUNT ON in stored procedures. With the NOCOUNT option on, a stored procedure does not return row-count information to the client, which prevents SQL Server from sending the DONE_IN_PROC message for each statement in the stored procedure.

Do not use the sp_ prefix for custom Stored Procedures. Microsoft does not recommend the use of the prefix "sp_" in user-created Stored Procedure names because SQL Server always looks for a Stored Procedure beginning with "sp_" in the master database.

Use the following rules:
  • Create indexes based on use.
  • Keep clustered index keys as small as possible.
  • Consider range data for clustered indexes.
  • Create an index on all foreign keys.
  • Create highly selective indexes.
  • Consider a covering index for often-used, high-impact queries.
  • Use multiple narrow indexes rather than a few wide indexes.
  • Create composite indexes with the most restrictive column first.
  • Consider indexes on columns used in WHERE, ORDER BY, GROUP BY and DISTINCT clauses.
  • Remove unused indexes.
  • Use the Index Tuning Wizard.
  • Consider table and row partitioning.
  • Use RAID for better read performance.

WCF tips for performance improvement

Select proper WCF Binding

The choice of WCF binding also affects performance. There are many binding types available for WCF services, each with a particular purpose and security model; select the binding type that matches your requirements. For example, suppose we create a WCF service that initially uses WSHttpBinding. That binding carries extra cost for security, reliable sessions and transaction flow; if we select BasicHttpBinding instead, where those features are not needed, performance improves dramatically.

Throttling

Throttling of services is another key element of WCF performance tuning. WCF throttling provides the properties maxConcurrentCalls, maxConcurrentInstances and maxConcurrentSessions, which help us limit the number of instances or sessions created at the application level.

  <serviceThrottling maxConcurrentCalls="16"
                     maxConcurrentSessions="100"
                     maxConcurrentInstances="10" />

Use data contract serialization

Serialization is the process of converting an object into a transferable format, and it is essential when transferring objects over the network. XML serialization is very popular for its interoperability, while binary serialization is typically used when transferring objects between two .NET applications.

Data contract serialization is about 10% faster than XML serialization, which can be significant when working with a large amount of data. The DataContractSerializer can serialize public members as well as private and protected members.
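A minimal sketch of a data contract type (the names are hypothetical); note that non-public members can opt in with [DataMember]:

  using System.Runtime.Serialization;

  [DataContract]
  public class Order
  {
      [DataMember]
      public int Id { get; set; }

      [DataMember]
      private decimal total;  // private members can be serialized too
  }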

Caching
 
External dependencies are another major problem for WCF service performance, and caching helps us avoid it. Caching allows us to store data in memory, or in some other store, from which we can retrieve it quickly. We have two options for caching: in-memory and external caching.

In-memory Caching

WCF services do not have access to the ASP.NET cache by default. We can enable it through ASP.NET compatibility mode by adding the AspNetCompatibilityRequirements attribute and the matching configuration:
  [AspNetCompatibilityRequirements(RequirementsMode = AspNetCompatibilityRequirementsMode.Allowed)]

  <system.serviceModel>
    <serviceHostingEnvironment aspNetCompatibilityEnabled="true" />
  </system.serviceModel>

External Caching


The problem with in-memory caching is that it is very difficult to expire an item from the cache when a user changes it. Sticky sessions can help resolve this: all requests from the same source IP address are routed to the same server. We can also use Windows Server AppFabric as the cache server.

If your application is small (non-clustered, non-scalable and so on) and your service is not stateless (though you might want to make it stateless), consider in-memory caching; for large-scale applications, look at AppFabric, Coherence and the like.

Compress data

Only serialize the data that actually has to be sent across the network and that the end user actually requires. In other words, avoid sending unneeded data over the wire.

Setting the WCF transport and reader quota properties

The WCF transport properties, such as timeouts, memory allocation limits and collection size limits, also help improve the performance of a service. Timeouts are used to mitigate Denial of Service (DoS) attacks; memory allocation limits prevent a single connection from exhausting system resources and denying service to all other connections; and collection size limits restrict the consumption of resources. The reader quota properties (MaxDepth, MaxStringContentLength, MaxArrayLength, MaxBytesPerRead and MaxNameTableCharCount) restrict message complexity, providing further protection from DoS attacks.

Do Load Balancing and Server Addition

Load balancing should not be seen merely as a means of achieving scalability. While it definitely increases scalability, it often also improves the performance of a web application, since requests and users are distributed across multiple servers.

Write clean code in the development phase by using FxCop

FxCop is a static-analysis tool from Microsoft that keeps developers from writing code that can cause performance issues. If we write custom rules in FxCop and enforce them, the build will not succeed until the developer fixes the reported errors.

Some of these rules are:

  • Avoid excessive locals
  • Avoid uncalled private code
  • Avoid uninstantiated internal classes
  • Avoid unnecessary string creation
  • Avoid unsealed attributes
  • Review unused parameters
  • Dispose methods should call SuppressFinalize
  • Do not call properties that clone values in loops
  • Do not cast unnecessarily
  • Do not concatenate strings inside loops
  • Do not initialize unnecessarily
  • Initialize reference type static fields inline
  • Override equals and operator equals on value types
  • Prefer jagged arrays over multidimensional
  • Properties should not return arrays
  • Remove unused locals
  • Test for empty strings using string length
  • Use literals where appropriate

Tools used for performance tuning

These tools are used to monitor the performance of the code. There are various performance counters for each .NET object that help decide which area to focus on during performance tuning. Performance counters provide information about how well the operating system, or an application, service or driver, is performing; the counter data can help determine system bottlenecks and fine-tune system and application performance.

  • .NET Memory Profiler
  • AppDynamics
  • Red Gate ANTS Profiler
  • Fiddler
  • Performance counters via PerfMon

Conclusion: I have presented some tips for performance tuning. Performance tuning is not a one-day job; it takes iterative work to improve performance. Understanding the performance counters is the fundamental way to fix any performance issue.

In the next article I will write about how to deal with performance counters. Thanks for reading.

Reference:

Improving .NET Application Performance and Scalability

