
When should you use the 'var' keyword in C#?

Posted by Vulpes Articles | C# Language April 25, 2011

What is 'var' and what is its purpose?

'var' is a contextual keyword that was first introduced in C# 3.0. It's described as contextual because it's only a keyword when used in a certain context - in other contexts it can be used as an ordinary identifier.

Contextual keywords are nothing new. C# 1.0 introduced get, set, value, add and remove as contextual keywords for defining properties and events, and a number of others have been added in subsequent versions.
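As a brief illustration (the type and member names here are my own), 'value' is only a keyword inside a property's set accessor; anywhere else it's an ordinary identifier:

```csharp
class Temperature
{
    double celsius;

    public double Celsius
    {
        get { return celsius; }
        set { celsius = value; } // 'value' is a keyword only inside this accessor
    }

    static void Demo()
    {
        int value = 42; // outside an accessor, 'value' is just an identifier
        System.Console.WriteLine(value);
    }
}
```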

'var' is an instruction to the C# compiler to infer the type of a local variable from the type of the expression assigned to it. For example:

var list = new List&lt;string&gt;();  // list inferred to be of type List&lt;string&gt;
var count = 3;                  // count inferred to be of type int
var greeting = "Hello";         // greeting inferred to be of type string
var var = 'c';                  // var (as an ordinary identifier) inferred to be of type char

The last example, though perfectly legal, is of course bad practice.

Once the type of a variable has been inferred, it cannot be changed to something else:

greeting = 5; // not allowed as greeting is a string variable

So, despite appearances to the contrary, 'var' is strongly typed.

So what's the problem then?


'var' is one of the most controversial additions to C#.

Some C# developers love it and use it as often as they can; other developers hate it and only use it when they have to. The remainder lie somewhere between these two extremes, though in my experience many do not have a considered or consistent policy.

Those who love it say that it reduces typing, shortens lines and avoids the need to duplicate the type name on both sides of the assignment operator when 'newing up' an object which many developers had grumbled about in versions 1.0 and 2.0. If there's any doubt about the type of a variable, then Intellisense will resolve it.

Those who hate it say that it makes code harder to read (and code is read more often than written!) and encourages the use of 'ugly' devices such as Hungarian notation to make a variable's type clear from its name. For example: iCount for an int variable or sGreeting for a string variable. Also, Intellisense may not always be available when reading code.

Even the choice of keyword for type inference is controversial. Many developers believe 'var' was a bad choice because it is suggestive of the weakly typed 'var' keyword in JavaScript or of Variants in COM-based languages (such as VB6) which can assume values of different types.

The latest version of Visual C++ uses the 'auto' keyword to fulfil the same role which, arguably, would have been a better choice for C# though you might then have had some complaints from programmers involved in the automobile industry!

When must you use 'var'?

You must use it when declaring a variable of an anonymous type or a collection of anonymous types because, by definition, these types do not have a name (well they don't have one which is normally accessible from C#). So, you must write:

var anonymous = new { FirstName = "John", LastName = "Doe" };
var query = from name in names
            where name.StartsWith("A")
            select new { Initial = name[0], LastName = name.Split(' ')[1] };
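Although the type itself has no usable name, its members are still strongly typed and can be read back normally; they just can't be reassigned, because anonymous type properties are read-only:

```csharp
var anonymous = new { FirstName = "John", LastName = "Doe" };
System.Console.WriteLine(anonymous.FirstName + " " + anonymous.LastName); // prints "John Doe"
// anonymous.FirstName = "Jane"; // error: anonymous type properties are read-only
```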

When can't it be used?

Its use is restricted to declaring local variables within methods or properties, including iteration variables in 'for' or 'foreach' statements and resource variables in 'using' statements. This means that it can't be used in any of the following scenarios:

  • As the type of a field
  • As the type of a parameter
  • As the return type of a method or property
  • As a type parameter in a generic type or method
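For example, each of the following is rejected by the compiler (the member names are hypothetical):

```csharp
class Example
{
    // var total = 10;                       // error: 'var' cannot be the type of a field
    // int Square(var n) { return n * n; }   // error: 'var' cannot be a parameter type
    // var Answer() { return 42; }           // error: 'var' cannot be a return type
    // List<var> items;                      // error: 'var' cannot be a type argument

    static void Main()
    {
        var local = 42;                      // fine: 'var' is allowed for local variables
        System.Console.WriteLine(local);
    }
}
```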

It also can't be used if there's a type called 'var' within the same scope:

class var
{
    static void Main()
    {
        var v = "Hello"; // cannot implicitly convert type 'string' to 'var'  
    }
}


Nor can it be used in statements like this where the type of 'var' cannot be inferred:

var n = null; // Cannot assign <null> to an implicitly-typed local variable
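At the time of writing, the same is true of lambda expressions and method groups, since neither has a type of its own for the compiler to infer:

```csharp
using System;

class LambdaDemo
{
    static void Main()
    {
        // var f = x => x + 1;          // error: cannot assign lambda expression to an
        //                              //        implicitly-typed local variable
        // var m = Console.WriteLine;   // error: cannot assign method group to an
        //                              //        implicitly-typed local variable

        Func<int, int> g = x => x + 1;  // fine: the delegate type is stated explicitly
        Console.WriteLine(g(2));        // prints 3
    }
}
```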

When should it be used?

So, for those of us who are not committed one way or the other to the use of 'var', is there a sensible middle ground that most of us could agree upon? 

The main objection to the indiscriminate use of 'var' is that it makes code harder to read because you can't tell just by inspection what the type of the variable is - you have to work it out first and may still struggle even then.

This suggests that it would be reasonable to confine the use of 'var' to those cases where the type is obvious because it explicitly appears on the right hand side of the assignment. These cases would include:

  • 'new' expressions
  • 'static' methods which return a value of the same type
  • Expressions involving a simple cast
  • Expressions including an 'as' conversion

Here are some examples:

var ht = new Hashtable();
var dt = DateTime.Parse("1/1/2011");
ht.Add(1, dt);
var dt2 = (DateTime)ht[1];
var dict = ht as IDictionary;

Now, in all these cases, you could argue that you might not be assigning these values to a variable of the 'obvious' type but to a variable of some type to which the obvious type is implicitly convertible (say, 'object' in the above cases). However, in that situation there would be nothing to stop you specifying the type of the variable explicitly, since following these conventions already gives you a mixture of implicit and explicit declarations.

Even if you accept these restrictions as reasonable, you then have to decide how long a type name needs to be for the saving in typing, line length and duplication to be worthwhile. I think you could certainly say that it would be worthwhile for any generic type but what about non-generic types?

Although it's highly subjective, my own view is that it's not worthwhile unless the name of the non-generic type consists of at least eight characters, which conveniently excludes the built-in types (string, double, decimal, etc.) from consideration.
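By way of illustration, here is how that eight-character threshold plays out for a few common types (the variable names are just examples):

```csharp
using System.Collections;
using System.Text;

class NamingDemo
{
    static void Main()
    {
        var builder = new StringBuilder();  // StringBuilder (13 characters): 'var' is worthwhile
        var table = new Hashtable();        // Hashtable (9 characters): 'var' is worthwhile
        string greeting = "Hello";          // string (6 characters): spell it out
        decimal price = 9.99m;              // decimal (7 characters): spell it out

        builder.Append(greeting);
        table.Add(greeting, price);
        System.Console.WriteLine(builder + ": " + table[greeting]);
    }
}
```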

Does it really matter what you do?

Yes, I believe it does.

When you're writing code, you don't want to be agonizing over whether you should be using 'var' or not! You need a clear policy on when (if at all) you should use it, so the decision can be made instantly.

Similarly, when you're reading code, you don't want to spend several seconds trying to decide what the type of a variable is - you should be able to tell instantly.

Although you may not necessarily agree with my 'sensible middle ground' proposal, even if you were previously uncommitted, I hope this article will at least help you to decide on a policy which is right for you!
