Business Scenario-Based Performance Diagnostics Procedures


Internal

Overview

Assumptions

Load Generator

We need a load generator that is capable of the following:

  • Generate load on behalf of multiple application users concurrently, in such a way that each user performs an interactive session setup, interacts repeatedly with the application by executing a set of arbitrary business scenarios in a loop, and then ends the session.
  • Insert arbitrary data into the requests it has previously recorded and is replaying as part of the load tests. At a minimum, the load generator must be able to insert arbitrary constant strings as custom header values in specific requests of the run sequence (see the sketch below). More sophisticated analysis requires additional capabilities, such as inserting unique identifiers per request or iteration, inserting request sequence numbers, etc.

NeoLoad is one such product.
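
Purely as an illustration, and not tied to NeoLoad or any other specific product, the Python sketch below shows the shape of the two capabilities: concurrent virtual users that set up a session, replay a recorded scenario in a loop while injecting a constant custom header value into a specific request, and then end the session. The URLs, the header name and its value are made-up examples.

  import threading
  import requests

  # Hypothetical recorded scenario: an ordered list of requests, as captured by
  # the recording step described below. The URLs, the custom header name and
  # its value are made-up examples of the "constant string in a specific
  # request" capability.
  RECORDED_SCENARIO = [
      {"method": "GET",  "url": "http://app.example.com/login",
       "headers": {"X-Custom-Marker": "TypeA"}},
      {"method": "GET",  "url": "http://app.example.com/account/balance",
       "headers": {}},
      {"method": "POST", "url": "http://app.example.com/logout",
       "headers": {}},
  ]

  def virtual_user(iterations):
      # Interactive session setup: one HTTP session (cookie jar) per user.
      session = requests.Session()
      for _ in range(iterations):
          # Replay the recorded scenario in a loop, injecting the custom
          # header values stored with each request.
          for req in RECORDED_SCENARIO:
              session.request(req["method"], req["url"], headers=req["headers"])
      # End the session.
      session.close()

  def run_load(concurrent_users=10, iterations=100):
      # Generate load on behalf of multiple application users, concurrently.
      threads = [threading.Thread(target=virtual_user, args=(iterations,))
                 for _ in range(concurrent_users)]
      for t in threads:
          t.start()
      for t in threads:
          t.join()

  if __name__ == "__main__":
      run_load()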

Business Scenario Recording

This step consists of recording the business scenario whose performance will be measured under load. It is highly dependent on the tool used to record the interaction with the application, and it usually consists of intercepting the traffic with a proxy. The load generation tool records the HTTP request details, stores them, and uses the stored data to replay the application traffic.
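
The storage format is tool-specific. Purely as a hypothetical illustration, the details recorded for a single request could be reduced to something like the following:

  # Hypothetical, tool-independent representation of one recorded request.
  # Real load generators store considerably more detail (timings, correlation
  # rules, extracted dynamic values, etc.).
  recorded_request = {
      "method": "POST",
      "url": "http://app.example.com/account/transfer",
      "headers": {"Content-Type": "application/x-www-form-urlencoded"},
      "body": "from=1234&to=5678&amount=100",
  }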

Business Scenario "Annotation"

Mark the Beginning and the End of the Interesting Scenarios

The scenarios to be measured must be "annotated" by configuring the load generator to send specific HTTP headers with the first and the last request of the scenario.

Business-Scenario-Start-Marker

The first request of the scenario must include a custom HTTP header named Business-Scenario-Start-Marker. The value of the header must be a string designating the scenario type, for example "Read bank account value". The load test will consist of sending multiple scenarios of the same type, in a loop, and after data processing we will be able to tell whether the specific scenario is within the required performance limits. Shorter values (e.g. "TypeA", "TypeB", etc.) are preferred, because they ultimately translate into sending fewer bytes over the network, though the amount of traffic generated by the marker headers is most likely going to be insignificant relative to the amount of application traffic.
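
For illustration only (the URL and the scenario type string are made up), the first request of a scenario sent with plain Python would carry the header like this:

  import requests

  session = requests.Session()

  # First request of the scenario: the constant marker value identifies the
  # scenario type, so the data processing utilities can group measurements.
  session.get("http://app.example.com/account/balance",
              headers={"Business-Scenario-Start-Marker": "TypeA"})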

Business-Scenario-Stop-Marker

The last request of the scenario must include a custom HTTP header named Business-Scenario-Stop-Marker. The value of the header must be the same string used when marking the start of the scenario with Business-Scenario-Start-Marker.

If the end of the scenario is not explicitly marked, the data processing utilities will assume a scenario ends when the next Business-Scenario-Start-Marker header carrying the same scenario type string is encountered. Assuming each iteration contains just one scenario of a specific type, this default behavior will in most cases produce invalid data, effectively equating the scenario with the whole iteration.
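
The sketch below is not the actual data processing utility; it only illustrates, under the assumption that each processed entry exposes a timestamp and the request headers (a hypothetical format), how the start and stop markers can be paired, including the default close-on-next-start behavior described above.

  # entries: iterable of (timestamp_ms, headers) tuples, in arrival order.
  def extract_scenario_durations(entries):
      open_scenarios = {}   # scenario type -> start timestamp
      durations = []        # (scenario type, duration in ms)
      for timestamp, headers in entries:
          start_type = headers.get("Business-Scenario-Start-Marker")
          stop_type = headers.get("Business-Scenario-Stop-Marker")
          if stop_type is not None and stop_type in open_scenarios:
              # Explicitly marked end of the scenario.
              durations.append((stop_type,
                                timestamp - open_scenarios.pop(stop_type)))
          if start_type is not None:
              if start_type in open_scenarios:
                  # Default behavior: no explicit stop marker was seen, so the
                  # previous scenario of this type is closed here, effectively
                  # equating it with the whole iteration.
                  durations.append((start_type,
                                    timestamp - open_scenarios.pop(start_type)))
              open_scenarios[start_type] = timestamp
      return durations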

Target Environment Preparation

Component Life Cycle

In order to eliminate artifacts introduced by state modified as a result of previous load tests, the environment should be capable of stopping and re-starting all of its active elements (load balancers, application servers, databases, caches).

If the tests are executed in an environment managed by em, the best way of restarting the environment is to restart the environment's VMs. This way, all services are restarted and re-initialized.

Alternatively, the VMs can be restarted manually.

If restarting the VMs is not practical, at least the component application servers must be restarted.

Instrumentation and Data Collection

As a one-time operation, the active elements of the environment must be instrumented to collect performance data.

Load Test and Data Generation

Data Collection and Pre-Processing

Data Processing and Report Generation