Chapter 10: Processes, AppDomains, Contexts, and Threads

Posted by Apress Free Book | C#.NET February 02, 2009
In this chapter, you drill deeper into the constitution of a .NET executable host and come to understand the relationship between Win32 processes, application domains, contexts, and threads.

Concurrency Revisited

Given the previous example, you might be thinking that threads are the magic bullet you have been looking for: simply create a thread for each part of your application, and the result will be increased performance for the user. You already know this is a loaded assumption, as the statement is not necessarily true. If not used carefully and thoughtfully, multithreaded programs can be slower than single-threaded programs. Even more important is the fact that each and every thread in a given AppDomain has direct access to the shared data of the application. In the current example this is not a problem. However, imagine what might happen if the primary and secondary threads were both modifying a shared point of data. As you know, the thread scheduler forces threads to suspend their work at random. Given that fact, what if thread A is kicked out of the way before it has fully completed its work? Thread B may now be reading unstable data.

To illustrate, let's build another C# console application named MultiThreadSharedData. This application has a class named WorkerClass, which maintains a private System.Int32 that is manipulated by the DoSomeWork() helper method. Notice that this helper method also leverages a for loop to print out the value of the private integer, the iterator's current value, and the name of the current thread. Finally, to simulate additional work, each iteration of this logic puts the current thread to sleep for approximately one second. Here is the type in question:

    using System;
    using System.Threading;

    internal class WorkerClass
    {
        // Shared state: every thread calling DoSomeWork() touches this field.
        private int theInt;

        public void DoSomeWork()
        {
            theInt++;
            for (int i = 0; i < 5; i++)
            {
                Console.WriteLine("theInt: {0}, i: {1}, current thread: {2}",
                    theInt, i, Thread.CurrentThread.Name);
                Thread.Sleep(1000);
            }
        }
    }

The Main() method is responsible for creating three uniquely named secondary threads of execution, each of which is making calls to the same instance of the WorkerClass type:

    public class MainClass
    {
        public static int Main(string[] args)
        {
            // Make the single worker object.
            WorkerClass w = new WorkerClass();

            // Create and name three secondary threads,
            // each of which makes calls to the same shared object.
            Thread workerThreadA =
                new Thread(new ThreadStart(w.DoSomeWork));
            workerThreadA.Name = "A";

            Thread workerThreadB =
                new Thread(new ThreadStart(w.DoSomeWork));
            workerThreadB.Name = "B";

            Thread workerThreadC =
                new Thread(new ThreadStart(w.DoSomeWork));
            workerThreadC.Name = "C";

            // Now start each one.
            workerThreadA.Start();
            workerThreadB.Start();
            workerThreadC.Start();
            return 0;
        }
    }


Now before you see some test runs, let's recap the problem. The primary thread within this AppDomain begins life by spawning three secondary worker threads. Each worker thread is told to make calls on the DoSomeWork() method of a single WorkerClass instance. Given that we have taken no precautions to lock down the object's shared resources, there is a good chance that a given thread will be kicked out of the way before the WorkerClass is able to print out the results for the previous thread. Because we don't know exactly when (or if) this might happen, we are bound to get unpredictable results. For example, you might find the output shown in Figure 10-14.



Figure 10-14. Possible output of the MultiThreadSharedData application

Now run the application a few more times. Figure 10-15 shows another possibility (note the ordering among thread names).



Figure 10-15. Another possible output of the MultiThreadSharedData application

Hmm. There are clearly some problems here. As each thread is telling the WorkerClass to "do some work," the thread scheduler is happily swapping threads in the background. The result is inconsistent output. What we need is a way to programmatically enforce synchronized access to the shared resources.

As you would guess, the System.Threading namespace provides a number of synchronization-centric types. The C# programming language also provides a particular keyword for the very task of synchronizing shared data in multithreaded applications.

Synchronization Using the C# "lock" Keyword

The first approach to providing synchronized access to our DoSomeWork() method is to make use of the C# "lock" keyword. This intrinsic keyword allows you to lock down a block of code so that incoming threads must wait in line for the current thread to finish its work completely. The "lock" keyword requires you to pass in a token (an object reference) that must be acquired by a thread to enter within the scope of the lock statement. When you are attempting to lock down an instance-level method, you can simply pass in a reference to the current object:

    internal class WorkerClass
    {
        private int theInt;

        public void DoSomeWork()
        {
            lock (this)
            {
                theInt++;
                for (int i = 0; i < 5; i++)
                {
                    Console.WriteLine("theInt: {0}, i: {1}, current thread: {2}",
                        theInt, i, Thread.CurrentThread.Name);
                    Thread.Sleep(1000);
                }
            } // Lock token released here!
        }
    }

Now, once a thread enters into a locked block of code, the token (in this case, a reference to the current object) is inaccessible by other threads until the lock is released. Thus, if threadA has obtained the lock token, and threadB or threadC are attempting to enter, they must wait until threadA relinquishes the lock.

NOTE If you are attempting to lock down code in a static method, you obviously cannot use the "this" keyword. If this is the case, you can simply pass in the System.Type of the current class using the C# "typeof" operator (although any object reference will work).

If you now rerun the application, you can see that the threads are instructed to politely wait in line for the current thread to finish its business (Figure 10-16).



Figure 10-16. Consistent output of the MultiThreadSharedData application
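To see what the earlier note describes in code, here is a minimal sketch of the same locking idea applied to a static member; the StaticWorkerClass name and its field are illustrative only and are not part of the chapter's sample code:

    using System;
    using System.Threading;

    internal class StaticWorkerClass
    {
        private static int theInt;

        public static void DoSomeStaticWork()
        {
            // "this" is unavailable in a static member, so lock on the type
            // (or on any shared object reference) instead.
            lock (typeof(StaticWorkerClass))
            {
                theInt++;
                Console.WriteLine("theInt: {0}, current thread: {1}",
                    theInt, Thread.CurrentThread.Name);
            }
        }
    }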

SOURCE CODE The MultiThreadSharedData application is included under the Chapter 10 subdirectory.

Synchronization Using the System.Threading.Monitor Type

The C# lock statement is really just a shorthand notation for working with the System.Threading.Monitor class type. Under the hood, the previous locking logic (via the C# "lock" keyword) actually resolves to the following (which can be verified using ildasm.exe):

    internal class WorkerClass
    {
        private int theInt;

        public void DoSomeWork()
        {
            // Enter the monitor with the lock token.
            Monitor.Enter(this);
            try
            {
                theInt++;
                for (int i = 0; i < 5; i++)
                {
                    Console.WriteLine("theInt: {0}, i: {1}, current thread: {2}",
                        theInt, i, Thread.CurrentThread.Name);
                    Thread.Sleep(1000);
                }
            }
            finally
            {
                // Error or not, you must exit the monitor
                // and release the token.
                Monitor.Exit(this);
            }
        }
    }

If you run the modified application, you will see no changes in the output (which is good). Here, you make use of the static Enter() and Exit() members of the Monitor type to enter (and leave) a locked block of code. Now, given that the "lock" keyword seems to require less code than making explicit use of the System.Threading.Monitor type, you may wonder about the benefits. The short answer is control.

If you make use of the Monitor type, you are able to instruct the active thread to wait for some duration of time (via the Wait() method), inform waiting threads when the current thread is completed (via the Pulse() and PulseAll() methods), and so on. As you would expect, in a great number of cases, the C# "lock" keyword will fit the bill. If you are interested in checking out additional members of the Monitor class, consult online Help.
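For instance, Wait() and Pulse() can be used to coordinate two threads around a shared condition. The following is only a sketch of that idea; the SimpleSignal class and its members are hypothetical, not taken from the chapter's code:

    using System;
    using System.Threading;

    internal class SimpleSignal
    {
        private readonly object token = new object();
        private bool dataReady;

        public void Producer()
        {
            lock (token)
            {
                dataReady = true;
                // Wake up a thread blocked in Wait() on the same token.
                Monitor.Pulse(token);
            }
        }

        public void Consumer()
        {
            lock (token)
            {
                // Wait() releases the lock and blocks until a Pulse() arrives.
                while (!dataReady)
                    Monitor.Wait(token);
                Console.WriteLine("Data is ready.");
            }
        }
    }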

Synchronization Using the System.Threading.Interlocked Type

Although it is always hard to believe until you look at the underlying CIL code, assignments and simple arithmetic operations are not atomic. For this reason, the System.Threading namespace also provides a type that allows you to operate on a single point of data atomically. The Interlocked class type defines the static members shown in Table 10-9.

Table 10-9. Members of the Interlocked Type
 

Member of the System.Threading.Interlocked Type    Meaning in Life
Increment()                                        Safely increments a value by one
Decrement()                                        Safely decrements a value by one
Exchange()                                         Safely swaps two values
CompareExchange()                                  Safely tests two values for equality and, if they are equal, replaces one of the values with a third

Although it might not seem like it at the outset, the process of atomically altering a single value is quite common in a multithreaded environment. Thus, rather than writing synchronization code such as the following:

            int i = 9;
            lock (this)
            { i++; }

you can simply write:

            // Pass by reference the value you wish to alter.
            int i = 9;
            Interlocked.Increment(ref i);

Likewise, if you wish to set a previously assigned System.Int32 to the value 83, you can avoid the need for an explicit lock statement (or Monitor logic) and make use of the Interlocked.Exchange() method:

            int i = 9;
            Interlocked.Exchange(ref i, 83);

Finally, if you wish to test two values for equality and change the point of comparison in a thread-safe manner, you can leverage the Interlocked.CompareExchange() method as follows:

            // If the value of i is currently 83, change i to 99.
            Interlocked.CompareExchange(ref i, 99, 83);
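To see why atomic increments matter, consider several threads bumping the same counter. The following sketch (the class, field, and iteration counts are illustrative only) always prints 5000, whereas a plain counter++ might not:

    using System;
    using System.Threading;

    public class InterlockedDemo
    {
        private static int counter;

        private static void IncrementManyTimes()
        {
            for (int i = 0; i < 1000; i++)
                Interlocked.Increment(ref counter);   // atomic; counter++ is not
        }

        public static void Main()
        {
            Thread[] threads = new Thread[5];
            for (int t = 0; t < threads.Length; t++)
            {
                threads[t] = new Thread(new ThreadStart(IncrementManyTimes));
                threads[t].Start();
            }
            foreach (Thread thread in threads)
                thread.Join();

            // Because each increment is atomic, this always prints 5000.
            Console.WriteLine("Final counter value: {0}", counter);
        }
    }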

Synchronization Using the [Synchronization] Attribute

The final synchronization primitive examined here is the [Synchronization] attribute, which, as you recall, is a contextual attribute that can be applied to context-bound objects. When you apply this attribute to a .NET class type, you are effectively locking down all members of the object for thread safety:

    using System.Runtime.Remoting.Contexts;

    // This context-bound type will only be loaded into a
    // synchronized (and hence, thread-safe) context.
    [Synchronization]
    public class MyThreadSafeObject : ContextBoundObject
    { /* All methods on the class are now thread safe. */ }

In some ways, this approach can be seen as the lazy way to write thread-safe code, given that you are not required to determine which aspects of the type are truly manipulating thread-sensitive data. The major downfall of this approach, however, is that even if a given method is not making use of thread-sensitive data, the CLR will still lock invocations to it. Obviously, this can degrade the overall performance of the type, so use this technique with care.
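To illustrate the behavior, the following sketch fleshes out the MyThreadSafeObject placeholder and drives it from two named threads. It assumes the classic .NET Framework remoting context infrastructure shown above, and the method bodies are my own illustration rather than the chapter's sample code:

    using System;
    using System.Runtime.Remoting.Contexts;
    using System.Threading;

    [Synchronization]
    public class MyThreadSafeObject : ContextBoundObject
    {
        private int theInt;

        public void DoWork()
        {
            // No explicit lock is required; the synchronized context
            // allows only one thread into the object at a time.
            theInt++;
            Console.WriteLine("theInt: {0}, thread: {1}",
                theInt, Thread.CurrentThread.Name);
            Thread.Sleep(500);
        }
    }

    public class SyncAttributeDemo
    {
        public static void Main()
        {
            MyThreadSafeObject obj = new MyThreadSafeObject();

            Thread threadA = new Thread(new ThreadStart(obj.DoWork));
            threadA.Name = "A";
            Thread threadB = new Thread(new ThreadStart(obj.DoWork));
            threadB.Name = "B";

            threadA.Start();
            threadB.Start();
        }
    }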

Thread Safety and the .NET Base Class Libraries

Although this chapter has illustrated how you can build custom thread-safe types, you should also be aware that many types in the base class libraries have been preprogrammed to be thread-safe. In fact, when you look up a given type using online Help (such as System.Console), you will find information regarding its level of thread safety (Figure 10-17).



Figure 10-17. Many (but not all) .NET types are already thread-safe.

Sadly, many .NET types in the base class libraries are not thread-safe, and therefore you will have to make use of the various locking techniques you have examined to ensure that a shared object can survive requests from multiple threads.
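For example, the instance members of System.Collections.ArrayList make no thread-safety guarantees, so concurrent writers should funnel through a single lock token. A minimal sketch follows (the class and field names are illustrative only):

    using System;
    using System.Collections;
    using System.Threading;

    public class SharedListExample
    {
        private static ArrayList sharedList = new ArrayList();
        private static object listLock = new object();

        public static void AddItems()
        {
            for (int i = 0; i < 100; i++)
            {
                // ArrayList instance members are not thread-safe,
                // so guard every access with the same lock token.
                lock (listLock)
                {
                    sharedList.Add(i);
                }
            }
        }

        public static void Main()
        {
            Thread t1 = new Thread(new ThreadStart(AddItems));
            Thread t2 = new Thread(new ThreadStart(AddItems));
            t1.Start();
            t2.Start();
            t1.Join();
            t2.Join();

            // With the lock in place, this always prints 200.
            Console.WriteLine("Items added: {0}", sharedList.Count);
        }
    }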
