Thread Pooling
Thread pooling is the process of creating
a collection of threads during the initialization of a multithreaded
application and then reusing those threads for new tasks as and when required,
instead of creating new threads each time. The pool starts with some fixed
number of threads, determined by factors such as the amount of memory available
and the needs of the application, and we are free to increase that number.
Each thread in the pool is given a specific task; when that task is completed,
the thread returns to the pool and waits for its next assignment.
Usually, a thread pool is used when we
have a number of tasks to perform, organized in a queue, and a number of
threads to perform them. Typically, we have more tasks than threads. As soon as a
thread completes its task, it requests the next task from the queue until
all tasks have been completed. The thread can then terminate, or sleep until
new tasks become available.
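To make this idea concrete, here is a minimal hand-rolled sketch of the mechanism just described (not the built-in .NET ThreadPool, which is introduced below): a small, fixed set of worker threads pulls work items from a shared queue until the queue is drained. The class name is purely illustrative, and the sketch assumes .NET Framework 4.0 or later for BlockingCollection<T>.

using System;
using System.Collections.Concurrent;
using System.Threading;

class HandRolledPoolSketch
{
    static void Main()
    {
        // A shared queue of work items; workers pull from it until it is marked complete.
        var queue = new BlockingCollection<Action>();

        // Fewer threads than tasks: each worker loops, taking the next task as soon as it is free.
        Thread[] workers = new Thread[3];
        for (int i = 0; i < workers.Length; i++)
        {
            workers[i] = new Thread(() =>
            {
                foreach (Action task in queue.GetConsumingEnumerable())
                    task();   // run the task, then loop back for the next one
            });
            workers[i].Start();
        }

        // Queue ten tasks onto three threads.
        for (int i = 0; i < 10; i++)
        {
            int id = i;
            queue.Add(() => Console.WriteLine("Task {0} on thread {1}",
                id, Thread.CurrentThread.ManagedThreadId));
        }

        queue.CompleteAdding();              // no more tasks; workers exit when the queue drains
        foreach (Thread w in workers) w.Join();
    }
}

In practice you rarely write this yourself; the point is only to show the queue-of-tasks pattern that the ThreadPool class implements for you.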
Creating a thread pool
The .NET Framework class library includes the
System.Threading.ThreadPool class, and it is very easy to use. You need not
create the pool of threads yourself, nor do you have to specify how many consuming
threads you require in the pool. The ThreadPool class handles the creation of
new threads and the distribution of the queued work items amongst those threads.
There are a number of ways to enter the thread pool:
Via the Task Parallel Library (from Framework 4.0).
By calling ThreadPool.QueueUserWorkItem.
Via asynchronous delegates (sketched below).
Via BackgroundWorker (sketched below).
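The first two of these are demonstrated in the sections that follow. Since asynchronous delegates and BackgroundWorker are not shown again in this article, here is a brief sketch of both; it assumes the classic .NET Framework (delegate BeginInvoke/EndInvoke is not supported on .NET Core), and the class name and sample values are purely illustrative.

using System;
using System.ComponentModel;

class OtherWaysSketch
{
    static void Main()
    {
        // 1. Asynchronous delegate: BeginInvoke queues the call on the thread pool;
        //    EndInvoke blocks until it finishes and returns the result.
        Func<int, int> square = n => n * n;
        IAsyncResult ar = square.BeginInvoke(5, null, null);
        Console.WriteLine("Square = {0}", square.EndInvoke(ar));

        // 2. BackgroundWorker: DoWork runs on a thread-pool thread;
        //    RunWorkerCompleted fires when it is done.
        var worker = new BackgroundWorker();
        worker.DoWork += (sender, e) => e.Result = "done on the pool";
        worker.RunWorkerCompleted += (sender, e) => Console.WriteLine(e.Result);
        worker.RunWorkerAsync();

        Console.Read();   // keep the console open while the worker finishes
    }
}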
Entering the Thread Pool via TPL
The Task Parallel Library provides the Task class, which makes entering
the thread pool easy. The Task class is part of .NET Framework 4.0. If
you're familiar with the older constructs, consider the nongeneric Task class a
replacement for ThreadPool.QueueUserWorkItem, and the generic Task<TResult> a
replacement for asynchronous delegates. The newer constructs are faster, more
convenient, and more flexible than the old.
To use the nongeneric Task class, call
Task.Factory.StartNew,
passing in a delegate of the target method:
using System;
using System.Threading.Tasks;

class Akshay
{
    static void Run()
    {
        Console.WriteLine("Welcome to the C# corner thread pool!");
    }

    static void Main()   // The Task class is in System.Threading.Tasks
    {
        Task.Factory.StartNew(Run);
        Console.Read();
    }
}
Output :
Task.Factory.StartNew
returns a Task object, which you can then use to monitor the task; for instance,
you can wait for it to complete by calling its Wait method.
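For instance, a small variation of the example above that blocks until the task has finished might look like the following sketch (it reuses the Run method from the listing above):

Task task = Task.Factory.StartNew(Run);
// ... do other work on the main thread here ...
task.Wait();   // blocks until Run has completed (any exception it threw is rethrown, wrapped in an AggregateException)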
The generic Task<TResult> class is a subclass of the nongeneric
Task. It lets you get a return value back from the task after it finishes
executing. In the following example, we download a web page using Task<TResult>:
using System;
using System.Threading.Tasks;

class Akshay
{
    static void Main()
    {
        // Start the task executing:
        Task<string> task = Task.Factory.StartNew<string>(
            () => DownloadString("http://www.c-sharpcorner.com/"));

        // We can do other work here and it will execute in parallel:
        // RunSomeOtherMethod();

        // When we need the task's return value, we query its Result property:
        // if it's still executing, the current thread will now block (wait)
        // until the task finishes.
        string result = task.Result;

        Console.WriteLine(result);
        Console.Read();
    }

    static string DownloadString(string uri)
    {
        using (var wc = new System.Net.WebClient())
            return wc.DownloadString(uri);
    }
}
Entering the Thread Pool Without TPL Using ThreadPool.QueueUserWorkItem
You can't use the Task Parallel Library if you're
targeting an earlier version of the .NET Framework (prior to 4.0). Instead, you
must use one of the older constructs for entering the thread pool:
ThreadPool.QueueUserWorkItem
and asynchronous delegates.
The ThreadPool.QueueUserWorkItem
method allows us to launch the execution of a function on the system thread
pool. It is called as follows:
ThreadPool.QueueUserWorkItem(new WaitCallback(Consume), ware);
The first parameter specifies the function that
we want to execute on the pool. Its signature must match the delegate
WaitCallback:
public delegate void WaitCallback(object state);
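For illustration, a minimal call whose callback matches this signature might look like the sketch below; the lambda and the "Akshay" state value are purely illustrative. The second argument is handed to the callback as its object parameter.

// Queue a tiny work item; the second argument arrives in the callback as "state".
ThreadPool.QueueUserWorkItem(
    state => Console.WriteLine("Hello from the pool, {0}", state), "Akshay");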
Again, the simplicity of C# and the .NET
Framework shines through. In just a few lines of code, I've recreated a
multithreaded producer-consumer application.
using System;
using System.Threading;

public class Akshay
{
    public int id;

    public Akshay(int _id)
    {
        id = _id;
    }
}

class Class1
{
    public int QueueLength;

    public Class1()
    {
        QueueLength = 0;
    }

    public void Produce(Akshay ware)
    {
        // QueueLength is touched by the main thread and by pool threads, so update it atomically.
        Interlocked.Increment(ref QueueLength);
        ThreadPool.QueueUserWorkItem(new WaitCallback(Consume), ware);
    }

    public void Consume(Object obj)
    {
        Console.WriteLine("Thread {0} consumes {1}",
            Thread.CurrentThread.GetHashCode(),   // {0}
            ((Akshay)obj).id);                    // {1}
        Thread.Sleep(100);
        Interlocked.Decrement(ref QueueLength);
    }

    public static void Main(String[] args)
    {
        Class1 obj = new Class1();
        for (int i = 0; i < 100; i++)
        {
            obj.Produce(new Akshay(i));
        }
        Console.WriteLine("Thread {0}", Thread.CurrentThread.GetHashCode());   // {0}
        while (obj.QueueLength != 0)
        {
            Thread.Sleep(1000);
        }
        Console.Read();
    }
}
Output :
Synchronization Objects
The previous code contains some rather inefficient
coding when the main thread cleans up: I repeatedly test the queue length every
second until it reaches zero. This may mean that the process continues
executing for up to a full second after the queue is finally drained.
I can't have that.
The following example extends the previous one: it uses a ManualResetEvent
object (stored in the Event field) to signal the main thread that it can exit.
Produce and Main are as before, except that Main now calls Wait instead of polling.
using System;
using System.Threading;

public class Akshay
{
    public int id;

    public Akshay(int _id)
    {
        id = _id;
    }
}

class Class1
{
    private volatile bool WaitForComplete;
    private ManualResetEvent Event;
    public int QueueLength;

    // Produce is unchanged from the previous example.
    public void Produce(Akshay ware)
    {
        Interlocked.Increment(ref QueueLength);
        ThreadPool.QueueUserWorkItem(new WaitCallback(Consume), ware);
    }

    public void Wait()
    {
        if (QueueLength == 0)
        {
            return;
        }
        Event = new ManualResetEvent(false);
        WaitForComplete = true;
        Event.WaitOne();   // block until a consumer signals that the queue is empty
    }

    public void Consume(Object obj)
    {
        Console.WriteLine("Thread {0} consumes {1}",
            Thread.CurrentThread.GetHashCode(),   // {0}
            ((Akshay)obj).id);                    // {1}
        Thread.Sleep(100);
        Interlocked.Decrement(ref QueueLength);
        if (WaitForComplete)
        {
            if (QueueLength == 0)
            {
                Event.Set();   // wake the main thread waiting in Wait()
            }
        }
    }

    // Main is the same as before, except that the polling loop is replaced by a call to Wait().
    public static void Main(String[] args)
    {
        Class1 obj = new Class1();
        for (int i = 0; i < 100; i++)
        {
            obj.Produce(new Akshay(i));
        }
        Console.WriteLine("Thread {0}", Thread.CurrentThread.GetHashCode());   // {0}
        obj.Wait();
        Console.Read();
    }
}
Output :
When the consuming thread finishes consuming a
ware and detects that WaitForComplete is true, it triggers the Event
once the queue length reaches zero. Instead of polling in a while loop when it wants
to exit, the main thread calls the Wait instance method, which sets the
WaitForComplete flag and waits on the Event object.
Why do we need thread pooling?
Thread pooling is essential in multithreaded
applications for the following reasons.
Thread pooling improves the response time of an application as threads are already available in the thread pool waiting for their next assignment and do not need to be created from scratch.
Thread pooling saves the CLR from the overhead of creating an entirely new thread for every short-lived task and reclaiming its resources once it dies.
Thread pooling optimizes the thread time slices according to the processes currently running on the system.
Thread pooling enables us to start several tasks without having to set the properties for each thread.
Thread pooling enables us to pass state information as an object to the procedure arguments of the task that is being executed.
Thread pooling can be employed to fix the maximum number of threads for processing a particular request (see the sketch below).
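For example, capping (and inspecting) the pool size might look like the sketch below. SetMaxThreads and GetMaxThreads are members of the ThreadPool class; the specific limits chosen here are arbitrary, and SetMaxThreads returns false if the request cannot be honored (for example, when it is below the number of processors).

using System;
using System.Threading;

class PoolLimitsSketch
{
    static void Main()
    {
        int workers, completionPorts;
        ThreadPool.GetMaxThreads(out workers, out completionPorts);
        Console.WriteLine("Current max: {0} worker threads, {1} I/O threads", workers, completionPorts);

        // Cap the pool; the numbers are arbitrary for this sketch.
        if (ThreadPool.SetMaxThreads(25, 25))
            Console.WriteLine("Pool capped at 25 worker and 25 I/O threads.");
        else
            Console.WriteLine("The requested cap was not accepted.");
    }
}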