
Measuring Application Performance In SOAs


Where To Measure Performance

There are two aspects to measuring and evaluating the performance of an application composed of one or more web services. The first is the throughput of the web service itself: its ability to accept a request and return a response within the required performance parameters. On a larger scale, it is also the service's ability to sustain the transaction throughput it was designed to meet. We know that as "scalability," although true scalability is difficult to test before all application components and functionality have been integrated.

The second aspect is evaluating the performance of the application in the aggregate, including the web service. This covers both the web service's ability to respond and how that response is coordinated with the responsiveness of the application as a whole. Even when the focus is specifically on the performance of the web service component, it must be analyzed within the context of the entire application.

Why? First, because that's the kind of performance the end user will experience. While the profiling data collected from developer tests doesn't map directly onto the end-user experience, it can be representative of how the application will behave in practice. Second, aggregate testing can expose bottlenecks that aren't apparent when the web service is examined apart from the client application. Although the two are loosely coupled, dependencies between them can still affect overall performance, especially when the application works with multiple web services within the SOA.

To accomplish these goals, it's necessary to measure the performance of all of the application components simultaneously, during the same testing run, and to correlate those disparate measurements into an integrated view. While the web service may appear to perform acceptably within its own context, resource or processing issues, synchronization problems, network or data-throughput limits, and other bottlenecks may prevent it from reaching its potential. One way to correlate such measurements is sketched below.
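As a rough illustration of the correlation idea, the following sketch (in Python rather than .NET, with a hypothetical endpoint URL and header name) tags each request with a correlation ID and records the client-observed round-trip time. If the service logs its own per-request timing under the same ID, the two data sets can be joined after the run to separate time spent inside the service from network and queuing time.

import time
import urllib.request
import uuid

# Hypothetical endpoint; substitute the URL of the service under test.
SERVICE_URL = "http://localhost:8080/OrderService.asmx"

def timed_call(payload: bytes) -> tuple[str, float, int]:
    """Send one request tagged with a correlation ID and record the
    client-observed round-trip time in milliseconds."""
    correlation_id = str(uuid.uuid4())
    request = urllib.request.Request(
        SERVICE_URL,
        data=payload,
        headers={
            "Content-Type": "text/xml; charset=utf-8",
            # Assumed custom header; the service would log its own
            # processing time keyed on this same ID.
            "X-Correlation-ID": correlation_id,
        },
    )
    start = time.perf_counter()
    with urllib.request.urlopen(request) as response:
        status = response.status
        response.read()  # drain the body so transfer time is included
    elapsed_ms = (time.perf_counter() - start) * 1000
    return correlation_id, elapsed_ms, status

Joining each (correlation_id, elapsed_ms) pair against the service's own log for the same run yields the integrated view described above.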

Unit Testing During Development

Unit testing a web service presents the first significant challenge. As a practical matter, the service requires an external stimulus to initiate execution, so invoking it isn't as simple as calling a DLL or linking in a library.

Fortunately, the web services wizard in Visual Studio .NET has the side effect of creating a web-page front end for functional testing, and that page works well enough to initiate unit testing as well. Otherwise, you would have to write your own call into the service, as sketched below.
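If no generated test page is available, writing your own call is straightforward. The sketch below posts a raw SOAP envelope over HTTP; it uses Python rather than .NET, and the endpoint, operation name, and http://tempuri.org/ namespace (Visual Studio .NET's default) are hypothetical stand-ins for the service under test.

import urllib.request

# Hypothetical endpoint and operation; adjust to match the service's WSDL.
SERVICE_URL = "http://localhost:8080/QuoteService.asmx"
SOAP_ACTION = "http://tempuri.org/GetQuote"

SOAP_ENVELOPE = """<?xml version="1.0" encoding="utf-8"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <GetQuote xmlns="http://tempuri.org/">
      <symbol>MSFT</symbol>
    </GetQuote>
  </soap:Body>
</soap:Envelope>"""

def call_service() -> str:
    """Invoke the GetQuote operation with a hand-built SOAP POST."""
    request = urllib.request.Request(
        SERVICE_URL,
        data=SOAP_ENVELOPE.encode("utf-8"),
        headers={
            "Content-Type": "text/xml; charset=utf-8",
            "SOAPAction": SOAP_ACTION,
        },
    )
    with urllib.request.urlopen(request) as response:
        return response.read().decode("utf-8")

if __name__ == "__main__":
    print(call_service())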

Unit testing should cover both functional and performance testing. Functional testing involves exercising the operations that compose the service to ensure they behave as specified. Whether you employ an automated test-management system or conduct these unit tests manually, tracking code coverage is important for determining what code you've tested and how thoroughly. Many developers already do this, and while web services introduce certain challenges, the process is largely familiar.
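As a sketch of what such functional unit tests might look like, the following uses Python's unittest in place of a .NET test framework and assumes the call_service() helper above has been saved as quote_client.py. Running the service-side code under a coverage tool while these tests execute shows which operations and branches have actually been exercised.

import unittest

# Assumes the call_service() helper sketched earlier was saved as quote_client.py.
from quote_client import call_service

class GetQuoteFunctionalTests(unittest.TestCase):
    """Exercise the hypothetical GetQuote operation and check that it
    behaves as specified."""

    def test_response_is_a_soap_envelope(self):
        body = call_service()
        self.assertIn("Envelope", body)

    def test_response_contains_a_result_element(self):
        body = call_service()
        self.assertIn("GetQuoteResult", body)

if __name__ == "__main__":
    unittest.main()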

The goal of performance testing is to identify slow code and potential bottlenecks. For developers experienced with monolithic or tightly coupled applications, this is often simply a matter of noticing that the application doesn't perform as expected, isolating the component responsible, and fine-tuning the code. It isn't quite that straightforward when building a loosely coupled application from web services.

Profiling a .NET web service is nevertheless a necessary part of the development process, even if you're only a consumer of an existing service. And since you're not yet testing the performance or scalability of the entire application, profiling can begin early in the development cycle.

Figure 1: Compuware DevPartner Studio performance analysis lets you see the execution times associated with each line of code, as well as the number of times that line was executed.

At a minimum, profiling should collect two types of information: execution time and the number of times code executes (see Figure 1). The reason for the first is readily apparent: it lets you quickly identify parts of the code that execute more slowly than others. Slow execution isn't necessarily a sign of poorly performing code, since an operation may simply be computationally intensive with no room for improvement, but the timing data can justify a closer look at specific operations.
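Tools such as DevPartner report these measures per line. There's no exact standard-library equivalent, but as a rough stand-in, Python's cProfile collects both measures at function granularity, as in this self-contained sketch (the workload is invented for illustration):

import cProfile
import pstats

def expensive_lookup(symbol: str) -> float:
    """Stand-in for a service operation whose cost we want to measure."""
    return sum(ord(c) ** 2 for c in symbol * 10_000) / 1e6

def handle_request() -> None:
    # Simulate one service request that performs several lookups.
    for symbol in ("MSFT", "IBM", "ORCL"):
        expensive_lookup(symbol)

profiler = cProfile.Profile()
profiler.enable()
for _ in range(100):
    handle_request()
profiler.disable()

# 'tottime' is the time spent inside each function; 'ncalls' is the
# execution count discussed above.
pstats.Stats(profiler).sort_stats("tottime").print_stats(5)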

The number of times code is executed can also indicate poor performance, but in a different sense. A single line or method may execute in an acceptable amount of time yet still be inefficient, because each call does too little work for the overhead it incurs. The trick is to find the amount of data per call that can be processed most efficiently.
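A toy demonstration of the overhead effect, with an invented fixed per-call cost standing in for serialization, dispatch, and network latency: processing items one per call pays that cost thousands of times, while batching pays it only a handful of times.

import time

CALL_OVERHEAD_S = 0.001  # invented fixed per-call cost (serialization, dispatch, network)

def process(items: list[int]) -> int:
    """Stand-in for a service operation: fixed overhead plus per-item work."""
    time.sleep(CALL_OVERHEAD_S)
    return sum(i * i for i in items)

data = list(range(2_000))

# One item per call: pays the fixed overhead 2,000 times.
start = time.perf_counter()
for i in data:
    process([i])
per_item = time.perf_counter() - start

# Batches of 200 items: pays the overhead only 10 times.
start = time.perf_counter()
for j in range(0, len(data), 200):
    process(data[j:j + 200])
batched = time.perf_counter() - start

print(f"per-item: {per_item:.2f}s  batched: {batched:.2f}s")

Varying the batch size in such a harness is one way to find the optimum amount of data per call.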

