Devised to extend the capabilities of the Application intrinsic object, the Cache object lacks one key feature that the now old-fashioned Application object supplies: the ability to execute a bunch of update statements atomically.
Already in classic ASP, the Application object provides a pair of methods to lock and unlock the object, thus serializing access to it. Both Application and Cache are global application objects, exposed to all the threads active at a given time. In ASP.NET, both the Application and Cache objects are thread-safe, meaning that their contents can be freely accessed by any thread in the AppDomain. However, this applies only to operations that read or write an individual item. The ASP.NET Cache and Application objects guarantee that no other concurrently running thread can interfere with the single read or write operation you're executing.
However, if you need to execute multiple operations on the Cache object in an atomic way, that's a different story. Consider the following code snippet:
int counter = -1;
object o = Cache["Counter"];
if (o == null)
{
    // Retrieve the last good known value from a database
    // or return a default value
    counter = RetrieveLastKnownValue();
}
else
{
    counter = (int) Cache["Counter"];
    counter++;
    Cache["Counter"] = counter;
}
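To see why the snippet above is fragile, consider what a correctly serialized version must guarantee: the read, the increment, and the write have to happen as one indivisible unit. The following console sketch stands in for the ASP.NET Cache with a plain dictionary (the names CounterDemo, RunDemo, and syncRoot are hypothetical) and shows that, with the whole sequence wrapped in a lock, no updates are lost even with several threads hammering the counter:

```csharp
using System;
using System.Collections.Generic;
using System.Threading;

public static class CounterDemo
{
    // A plain dictionary standing in for the ASP.NET Cache in this sketch.
    static readonly Dictionary<string, object> cache = new Dictionary<string, object>();

    // The synchronizer: a dedicated object all threads agree to lock on.
    static readonly object syncRoot = new object();

    static void Increment()
    {
        for (int i = 0; i < 1000; i++)
        {
            // Without this lock, two threads could both read the same value,
            // both increment it, and one of the two updates would be lost.
            lock (syncRoot)
            {
                object o;
                int counter = cache.TryGetValue("Counter", out o) ? (int)o : 0;
                counter++;
                cache["Counter"] = counter;
            }
        }
    }

    public static int RunDemo()
    {
        var threads = new Thread[4];
        for (int i = 0; i < threads.Length; i++)
            threads[i] = new Thread(Increment);
        foreach (var t in threads) t.Start();
        foreach (var t in threads) t.Join();

        // 4 threads x 1000 increments = 4000, with no lost updates.
        return (int)cache["Counter"];
    }

    public static void Main()
    {
        Console.WriteLine(CounterDemo.RunDemo());  // prints 4000
    }
}
```

If you remove the lock, the final count will intermittently come out lower than 4000, which is exactly the lost-update problem the article describes.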
The Cache object is accessed repeatedly in the context of what should be an atomic operation: incrementing a counter. While individual accesses to Cache are thread-safe, there's no guarantee that other threads won't kick in between the various calls. If there's potential contention on the cached value, you should consider using additional locking constructs, such as the C# lock statement (whose Visual Basic .NET counterpart is the SyncLock keyword).
Where should you put the lock? If you directly lock the Cache object, you might run into serious trouble. ASP.NET uses the Cache object extensively, and locking it might affect the overall performance of the application. If you look under the hood of the Cache object's implementation, though, you realize that most of the time ASP.NET doesn't internally access the cache store via the Cache object. Rather, it accesses the underlying data container directly, an internal class named CacheSingle (or CacheMultiple if your application is configured to run in web garden mode).
In this regard, a lock on the Cache object probably won't affect many native ASP.NET components. However, it's still a risk because you might block a bunch of HTTP modules and handlers in the pipeline, as well as other pages and sessions in the application that need to use cache entries. Note that by locking the Cache object, you prevent access to the whole Cache contents, not just to the entries you need to protect from concurrent access.
The best way out seems to be using a synchronizer: an intermediate but global object that you lock before entering a piece of code that is sensitive to concurrency:
lock (yourSyncObject)
{
    // Access the Cache here.
}
This pattern must be replicated for each access to the cache that requires serialization. The synchronizer object must be global to the application. For example, it could be a static member defined in the global.asax file and initialized in the Application_Start event.
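Putting the pieces together, the arrangement could look like the sketch below. This is one possible layout, not a prescribed one: the class and member names (Global, CacheSyncRoot, CounterHelper, UpdateCounter) are hypothetical, and the synchronizer is initialized here with a field initializer, which runs before Application_Start and serves the same purpose.

```csharp
// In global.asax (or its code-behind class)
using System;
using System.Web;

public class Global : HttpApplication
{
    // The application-wide synchronizer. Being static, a single instance
    // is shared by all pages, modules, and handlers in the AppDomain.
    public static readonly object CacheSyncRoot = new object();

    protected void Application_Start(object sender, EventArgs e)
    {
        // Seed the cached counter once, when the application starts.
        HttpRuntime.Cache["Counter"] = 0;
    }
}

// In any page or HTTP handler that updates the counter
public static class CounterHelper
{
    public static int UpdateCounter()
    {
        // Serialize the whole read-increment-write sequence on the shared
        // synchronizer rather than locking the Cache object itself.
        lock (Global.CacheSyncRoot)
        {
            int counter = (int)HttpRuntime.Cache["Counter"];
            counter++;
            HttpRuntime.Cache["Counter"] = counter;
            return counter;
        }
    }
}
```

Every piece of code that updates the "Counter" entry must go through the same synchronizer; a single code path that bypasses the lock reintroduces the race.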
Dino Esposito is Wintellect's ADO.NET and XML expert, and a trainer and consultant based in Rome, Italy. Dino is a contributing editor to Windows Developer Network and MSDN Magazine, and the author of several books for Microsoft Press including Building Web Solutions with ASP.NET and ADO.NET and Applied XML Programming for .NET. Contact Dino at [email protected].