Action Speaks Louder Than Words
My life-long learning journal.
I guess there are more benefits, but these are the ones most striking to me. However, be mindful that reuse is no 'silver bullet' for software development's woes: overrun projects, late delivery, requirements mismatch and so on. Nonetheless, the adoption of reuse is one of the key drivers in many software project success stories.
In instances where deploying with administrative rights is not an option, e.g. externally hosted solutions, the EL must be recompiled with certain compiler directives removed to disable the built-in instrumentation.
This blog explains the steps required to achieve this, and I quote it here for ease of reference:
Of course, with this solution, we introduce additional configuration management overhead. Which applications work with instrumentation? Which ones don't? In addition, two versions of the EL assemblies must be maintained: one with instrumentation and one without.
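To illustrate the general pattern (a minimal sketch only; the USE_INSTRUMENTATION symbol and helper method here are hypothetical, not the actual EL source), the instrumentation calls are gated by conditional compilation directives, so rebuilding without the symbol defined compiles them out of the assembly:

public class DataAccessComponent
{
    public void Execute(string command)
    {
#if USE_INSTRUMENTATION
        // Hypothetical instrumentation call; only compiled in when the
        // USE_INSTRUMENTATION symbol is defined in the build configuration.
        IncrementPerformanceCounter("Commands Executed");
#endif
        // ... actual data access work ...
    }

    // Hypothetical helper standing in for the real instrumentation plumbing.
    private void IncrementPerformanceCounter(string counterName)
    {
        // e.g. incrementing a performance counter, which requires the counter
        // to be registered on the machine - hence the administrative rights.
    }
}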
Headache...
"To support one or more writers, all operations on the Hashtable must be done through the wrapper returned by the Synchronized method.
Enumerating through a collection is intrinsically not a thread-safe procedure. Even when a collection is synchronized, other threads could still modify the collection, which causes the enumerator to throw an exception. To guarantee thread safety during enumeration, you can either lock the collection during the entire enumeration or catch the exceptions resulting from changes made by other threads."
Now what this really means is this: the Hashtable supports multiple concurrent read operations; however, if the collection is modified during these iterations, e.g. via Hashtable.Add() or Hashtable.Remove(), then an InvalidOperationException will be thrown.
For example:
using System;
using System.Collections;
using System.Threading;

public class TestMultithreadingHashtable
{
    private static Hashtable table = new Hashtable();

    public static void Main()
    {
        // Code to insert a lot of elements.
        for (int i = 0; i < 100000; i++)
        {
            table.Add(i, i.ToString());
        }

        Thread readThread = new Thread(new ThreadStart(ReadFromHashtable));
        Thread writeThread = new Thread(new ThreadStart(WriteToHashtable));

        readThread.Start();
        Thread.Sleep(500);   // Sleep is static; Thread.CurrentThread.Sleep() does not compile.
        writeThread.Start();
    }

    public static void ReadFromHashtable()
    {
        // Use this to iterate through the elements.
        IDictionaryEnumerator enumerator = table.GetEnumerator();

        // InvalidOperationException is thrown on the next MoveNext() after the
        // write thread has executed Add(), as the enumerator is no longer valid.
        while (enumerator.MoveNext())
        {
            // Code to display elements.
            Console.WriteLine("{0} = {1}", enumerator.Key, enumerator.Value);
        }
    }

    public static void WriteToHashtable()
    {
        // Modifying the collection invalidates any live enumerator.
        table.Add("newKey", "newValue");
    }
}
After some more thorough digging in MSDN on IEnumerator, the following details clarify things a little:
"An enumerator remains valid as long as the collection remains unchanged. If changes are made to the collection, such as adding, modifying or deleting elements, the enumerator is irrecoverably invalidated and the next call to MoveNext or Reset throws an InvalidOperationException. If the collection is modified between MoveNext and Current, Current will return the element that it is set to, even if the enumerator is already invalidated.
The enumerator does not have exclusive access to the collection; therefore, enumerating through a collection is intrinsically not a thread-safe procedure. Even when a collection is synchronized, other threads could still modify the collection, which causes the enumerator to throw an exception... "
So to sum it up, multiple concurrent read operations on a Hashtable are fine. Any operation that modifies the contents of the table, however, must be synchronised against the read operations.
To achieve this, we can use the simple 'lock' (C#) or 'SyncLock' (VB.NET) mechanism, or, for better performance and finer-grained control, the ReaderWriterLock class. Maybe I will elaborate a little more on this next time.
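As a quick illustration, here is a minimal sketch (the syncRoot field and method names are mine, not from the example above) of guarding both the enumeration and the modification with the same lock object, so a writer can never invalidate an enumerator mid-iteration:

using System;
using System.Collections;

public class SynchronisedTableAccess
{
    private static Hashtable table = new Hashtable();
    // Dedicated lock object shared by readers and writers.
    private static readonly object syncRoot = new object();

    public static void ReadFromHashtable()
    {
        // Holding the lock for the whole enumeration guarantees no writer
        // can modify the table and invalidate the enumerator.
        lock (syncRoot)
        {
            foreach (DictionaryEntry entry in table)
            {
                Console.WriteLine("{0} = {1}", entry.Key, entry.Value);
            }
        }
    }

    public static void WriteToHashtable(object key, object value)
    {
        // The same lock serialises writes against the enumeration above.
        lock (syncRoot)
        {
            table.Add(key, value);
        }
    }
}

The trade-off is that the lock is held for the entire enumeration, which is exactly where ReaderWriterLock (a reader lock for enumeration, a writer lock for Add/Remove) can do better when reads dominate.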