
HP Application Lifecycle Management 12.00
Benchmark

ALM PCoE
Document Release Date: May 2014
Software Release Date: May 2014
Legal Notices

Warranty

The only warranties for HP products and services are set forth in the express warranty statements
accompanying such products and services. Nothing herein should be construed as constituting an additional
warranty. HP shall not be liable for technical or editorial errors or omissions contained herein.
The information contained herein is subject to change without notice.

Restricted Rights Legend

Confidential computer software. Valid license from HP required for possession, use or copying. Consistent
with FAR 12.211 and 12.212, Commercial Computer Software, Computer Software Documentation, and
Technical Data for Commercial Items are licensed to the U.S. Government under vendor's standard
commercial license.

Copyright Notices

© Copyright 2014 Hewlett-Packard Development Company, L.P.

Trademark Notices

Microsoft® and Windows® are U.S. registered trademarks of Microsoft Corporation.


Microsoft SQL Server is a trademark of the Microsoft Corporation.
Oracle® is a registered US trademark of Oracle Corporation, Redwood City, California.
UNIX® is a registered trademark of The Open Group.
Red Hat Linux is a trademark of Red Hat, Inc.
Solaris is a registered trademark of Sun Microsystems, Inc.
WebLogic is a trademark of Oracle Corporation.
WebSphere® is a trademark of IBM Corporation.
JBoss is a trademark of JBoss Inc.

Contents

1 Introduction
1.1. Audience
1.2. Disclaimer
1.3. Purpose and scope
1.4. Terminology

2 Server Side Regression Test
2.1. Executive summary
2.2. Environment settings
2.3. Load profile and test scenario
2.4. Test results

3 User Experience Test
3.1. Executive summary
3.2. Environment settings
3.3. Automation test scenario for ALM desktop client testing
3.4. Test scenario for ALM web client
3.5. KPIs
3.6. Results

4 Stability Test
4.1. Executive summary
4.2. Environment settings
4.3. Environment configuration
4.4. Load profile and test scenario
4.5. Test results

Appendix 1: Project Data Profile

1 Introduction

1.1. Audience
This document is for those who plan, size, set up, and maintain the HP ALM 12.00
environment.

1.2. Disclaimer
The results published in this document are accurate only for the tests below, and only when they are
run on the specific environments described. Different ALM configurations or environments may produce
different performance results for the same tests.
Factors that commonly affect performance include: active project size, structure, and quantity; user
flows; network configuration; JVM configuration; ALM configuration (caches); database tuning; and
hardware configuration (RAM, CPU, disk size).
The aim of this document is to provide an analysis of ALM performance as reflected in the benchmark
tests described below. The document should be used as a general guideline. For more specific capacity
planning, conduct load tests directly on your actual environment.

1.3. Purpose and scope


This document summarizes the three main performance tests that were done for ALM 12.00:
1. Regression. Verified there are no regressions in Transaction Response Time between
versions.
2. Single user. Verified there are no regressions in client performance.
3. Stability. Tested system endurance for 3 days under very high load.

1.4. Terminology

Transaction: A measurement of one user action defined in the load script.

OTA Transaction: A transaction that triggers one or more OTA requests, simulating work with the ALM desktop client.

ALM Web Client REST Transaction: A transaction that triggers one or more REST requests, simulating work with the ALM web client.

Transaction Response Time (TRT): The time measured by a LoadRunner generator as the response time of the transaction. This is defined by LoadRunner as the time taken for all Web/OTA requests belonging to the transaction to return their responses. It is also called Transaction Latency.

Overall TRT: Weighted average of all TRTs during test execution (see the formulas after this list).

Performance counters: Counters are used to provide information about how well the operating system or an application, service, or driver is performing. The counter data can help determine system bottlenecks and fine-tune system and application performance. The operating system, network, and devices provide counter data that an application can consume to give users a graphical view of how well the system is performing.

Server Side TRT: TRT of server and network (excluding UI time).

User experience: TRT from a user operation in the client until the response is received, rendered, and displayed. Response time includes client, network, file system, database, and server time.

JVM Throughput: A JVM's throughput is the percentage of total time during which garbage collection does not take place. For instance, 80 percent throughput implies that garbage collection consumes 20 percent of the JVM's processing time, while your application consumes the remaining 80 percent (see the formulas after this list).

AUT: Application under test.

LoadRunner (LR): HP LoadRunner is an automated performance testing product. It examines system behavior and performance while generating real load.

CRUD: Create, Read, Update, and Delete of an entity.
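
The two derived measures above can be written as simple formulas (a worked sketch of the definitions in this list; the symbols t_total, t_GC, n_i, and TRT_i are introduced here only for illustration, and the Overall TRT weighting is assumed to be by the number of executions of each transaction):

\[ \text{JVM throughput} = \frac{t_{\mathrm{total}} - t_{\mathrm{GC}}}{t_{\mathrm{total}}}, \qquad \text{e.g. } 1 - 0.20 = 0.80 = 80\% \]

\[ \text{Overall TRT} = \frac{\sum_i n_i \cdot \mathrm{TRT}_i}{\sum_i n_i} \]

where t_total is the total elapsed JVM time, t_GC is the time spent in garbage collection, n_i is the number of executions of transaction i during the test, and TRT_i is its measured response time.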

2 Server Side Regression Test

2.1. Executive summary


Server-side regression testing for ALM 12.00 consists of running the same load on ALM 12.00 and on older versions, and
comparing their server performance.
The current ALM version (12.00) was tested and compared to ALM 11 patch 16 and ALM 11.52 patch 2.
All tests were conducted on the recommended hardware and with the software configuration detailed in
section 2.2, over an IPv4 network.
Comparison of ALM 12.00 to ALM 11:
Results show improved response times for most transactions.
This improvement is a result of ALM application platform changes such as a JDBC driver update, increased
cache sizes, query optimization, and other internal platform enhancements.
Comparison of ALM 12.00 to ALM 11.52:
Results show improved response times for most transactions.
This improvement is a result of ALM application platform changes such as increased cache sizes, query
optimization, and other internal platform enhancements.

2.2. Environment settings


The following diagram illustrates the hardware structure of the test environment:

[Diagram: load testing tools (LR Controller and Load Generator) driving the AUT, which consists of an Application Server, a Database, a PC Server, and a File Server]
Application server software

Version | OS | Database | Web server | JVM Heap Size
ALM 11 | Microsoft Windows Server 2008 Enterprise SP2 64-bit | Microsoft SQL Server 2008 SP1 | JBoss 5.1 | 10 GB
ALM 11.52 | Red Hat Enterprise Linux 6.2 64-bit | Oracle 11.2.0.3 | Apache 2.2 | 10 GB
ALM 12.00 | Microsoft Windows Server 2008 R2 Enterprise SP1 64-bit / Red Hat Enterprise Linux 6.4 64-bit | SQL Server 2008 R2 SP2 / Oracle 11.2.0.3 | Microsoft IIS 7.5 / Apache 2.2 | 10 GB

Application Server Hardware

Environment | Model | CPU | RAM | Controller
ALM 11 | ESX 5.1 VM version 8 | Xeon X5670 2.93 GHz (using 4 cores) | 14 GB | Smart Array P410i
ALM 11.52 | ESX 5.1 VM version 8 | Xeon X5670 2.93 GHz (using 4 cores) | 14 GB | Smart Array P410i
ALM 12.00 | ESX 5.1 VM version 8 | Xeon X5670 2.93 GHz (using 4 cores) | 14 GB | Smart Array P410i

DB Server Hardware

Environment | Model | CPU | RAM | Controller
ALM 11 | ESX 5.1 VM version 8 | Xeon X5670 2.93 GHz (using 8 cores) | 20 GB | Smart Array P410i
ALM 11.52 | ESX 5.1 VM version 8 | Xeon X5670 2.93 GHz (using 8 cores) | 20 GB | Smart Array P410i
ALM 12.00 | ESX 5.1 VM version 8 | Xeon X5670 2.93 GHz (using 8 cores) | 20 GB | Smart Array P410i

2.3. Load profile and test scenario
- 1000 active users were loaded.
- 300 active projects:
  - 20 ALM projects which contain real-world data and structure. The project structure is based on selected enterprise ALM customers. See Appendix 1 for more details.
  - 280 empty projects, used only for login-logout during the load (for job and cache testing).
- 5-6 transactions per second sent to the server. See the "Project load profile" section for more details.
- Load execution duration: 5 hours.

2.3.1 Project load profile


To perform a full regression test, we used a scenario that covered the main
functionality of the common modules. The simulated usage profile (users' distribution among
modules and number of events per hour per user) was based on a workload analysis of
selected ALM customers, representing different deployment sizes and configurations.
The load profile for the test was based on the four main ALM modules and the most common
and important business processes that exist in all tested versions. The following table shows
the number of transactions executed during one hour of load:

Generic Transactions | Transactions per hour
File operations | 498
Login | 495

Module | Transactions per hour
Requirements | 1978
Test Plan | 4808
Test Lab | 6978
Defects | 5634
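
As a quick consistency check (an illustrative calculation based on the tables above, not additional measurement data), the hourly transaction counts sum to roughly the stated 5-6 transactions per second:

\[ \frac{498 + 495 + 1978 + 4808 + 6978 + 5634}{3600\ \mathrm{s}} = \frac{20391}{3600} \approx 5.7\ \text{transactions per second} \]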

2.4. Test results
- ALM 11 to ALM 12.00 (Windows)
Overall TRT has improved in ALM 12.00. Most transactions improved by 5% to 25%.
- ALM 11.52 to ALM 12.00 (Red Hat Linux)
Overall TRT has improved in ALM 12.00. Most transactions improved by 5% to 20%.

3 User Experience Test

3.1. Executive summary


The user experience test covered both the ALM desktop client and the ALM web client.
For the ALM desktop client, testing was done using automation that executed 40 of the most common
user operations and compared their response times on ALM 11.52 and ALM 12.00.
When comparing the response times of the 40 automated operations in ALM 12.00 to ALM 11.52,
we found that all operation response times improved or remained the same.
For more details about the tested scenario see section 3.3.
For the ALM web client tests, a set of operations and KPIs was defined. Each operation was
executed several times on a loaded environment and verified to meet its KPI.
Comparing the ALM desktop client and the ALM web client shows that for most of the operations,
the ALM web client has the same or better response times than the ALM desktop client.
For more details about the tested scenario see section 3.4, for the KPI list see section 3.5, and for
the measured results see section 3.6.

3.2. Environment settings


All the tests were conducted on the recommended hardware and with the software
configuration described below.

Client machine

Software version | Hardware version
Operating System: Windows 7 Professional SP1 32-bit; Internet Explorer 10 / Chrome 32 / Firefox 26 | Processor: Intel® Xeon® 5160 (using 2 cores); Memory: 4 GB RAM

Application server software


Version | OS | Database | Web server | JVM Heap Size
ALM 11.52 | Red Hat Enterprise Linux 6.2 64-bit | Oracle 11.2.0.3 | Apache 2.2 | 10 GB
ALM 12.00 | Microsoft Windows Server 2008 R2 Enterprise SP1 64-bit / Red Hat Enterprise Linux 6.4 64-bit | SQL Server 2008 R2 SP2 / Oracle 11.2.0.3 | Microsoft IIS 7.5 / Apache 2.2 | 10 GB

Application Server Hardware

Environment | Model | CPU | RAM | Controller
ALM 11 | ESX 5.1 VM version 8 | Xeon X5670 2.93 GHz (using 4 cores) | 14 GB | Smart Array P410i
ALM 11.52 | ESX 5.1 VM version 8 | Xeon X5670 2.93 GHz (using 4 cores) | 14 GB | Smart Array P410i
ALM 12.00 | ESX 5.1 VM version 8 | Xeon X5670 2.93 GHz (using 4 cores) | 14 GB | Smart Array P410i

DB Server Hardware

Environment | Model | CPU | RAM | Controller
ALM 11 | ESX 5.1 VM version 8 | Xeon X5670 2.93 GHz (using 8 cores) | 20 GB | Smart Array P410i
ALM 11.52 | ESX 5.1 VM version 8 | Xeon X5670 2.93 GHz (using 8 cores) | 20 GB | Smart Array P410i
ALM 12.00 | ESX 5.1 VM version 8 | Xeon X5670 2.93 GHz (using 8 cores) | 20 GB | Smart Array P410i

A WAN between the client and the server was simulated using a WAN simulator placed between the
client and the application server.
The latency was set to 160 ms in each direction (320 ms round trip), simulating a very high-latency
connection between client and server.

3.3. Automation test scenario for ALM desktop client testing
Each operation was repeated 5 times. The test was performed on a large, synthetically generated
project used for all performance testing. Database caches were cleared between operations.
The operations were run with no background load, on a completely isolated environment.
See Appendix 1 for details about the project structure.
Here are examples of some operations on requirements and their details:

Operation Name | Details
Create requirement | Create one requirement on the 4th level of the tree
Filter requirement – Tree view | In tree view, filter requirements by: req parent = "Requirements\Requirement_1-1", priority = "5-Urgent" or "4-Very High", type = Testing
Update requirement | Update 3 fields on a requirement in the 4th level of the tree
Move requirement | Move a folder containing 10 requirements to another folder, both folders on the 4th level of the requirements tree
Delete requirement | Delete a requirements folder containing 10 requirements on the 4th level of the tree
Copy paste requirement | Copy and paste a folder with 10 requirements from the 4th to the 5th level of the tree

3.4. Test scenario for ALM web client


Here is the list of operations that were tested for Defects and Requirements:
- CRUD
- Filters
- Group by
- Sorting
- Author mode
- Entity Details view
- Children view
- Entity tabs switching
- Tree/Grid view

3.5. KPIs

Transaction Category | KPI | Comments
In-page (micro) | 50-150 milliseconds | Examples: scrolling, opening a list, filling data in a text box
High usage | Up to 1 second | Expand tree (navigation)
High-medium usage | Up to 2 seconds | Entity filters, modify entity, etc.
Medium usage | Up to 3 seconds | Create entity, login/logout, etc.
Low usage | Up to 6 seconds | Delete entity, relations
Relatively heavy operations | 6-12 seconds | Examples: small report, copy/paste

3.6. Results
No regressions were found in ALM desktop client operations.
For the ALM web client, all tested operations met their KPIs.
Comparing the ALM desktop client and the ALM web client shows that for many of the operations,
the ALM web client has the same or better response times than the ALM desktop client.
Here are some examples of response times measured on the same environment:

Operation | ALM web client | ALM desktop client
Open Requirements module first time | 1700 ms | 2100 ms
Go to 1 requirement | 2400 ms | 6300 ms
Go to 1 defect | 1500 ms | 2700 ms
Add 1 defect | 2000 ms | 2000 ms
Expand requirement with 2 sub-requirements | 1000 ms | 1100 ms
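
To put these figures in perspective (an illustrative calculation based on the table above, not additional measurement data), the largest difference is on the "Go to 1 requirement" operation, where the web client's response time is about

\[ \frac{6300 - 2400}{6300} \approx 0.62 = 62\% \]

lower than the desktop client's.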

4 Stability Test

4.1. Executive summary


The stability test measured the endurance of the full ALM system under high load.
The test was performed for 72 hours. The application server was up for the entire test time
frame.
During the entire run, the system was stable – no increases or peaks in TRT or system
resource usage.

4.2. Environment settings


The stability test was run on 2 environments, both with the same structure described in the
diagram below. The environments differ in the application servers' hardware and software
configuration. The application servers used for the tests are described in the tables below.

[Diagram: load testing tools (LR Controller and two Load Generators) driving the AUT, which consists of Application Servers 1+2, a Database, a PC Server, a File Server, 50 PC hosts, and 100 Lab hosts]

Environment Software Versions

Software type | Version
App server OS | Windows 2008 R2 EE & Red Hat Linux 6.3
Database | SQL Server 2008 R2 SP1 & Oracle 11.2.0.3
Web reverse proxy | IIS 7.5 & Apache 2.2

Environment Hardware Versions

Application Server Hardware

Environment | Model | CPU | RAM | Controller
Windows | ProLiant BL460c G7 | 2 x Xeon X5650 (6 cores), HT enabled | 16 GB | Smart Array P410i
Windows | ProLiant BL460c G7 | 2 x Xeon X5650 (6 cores), HT enabled | 16 GB | Smart Array P410i
Linux | ProLiant BL460c G7 | 2 x Xeon X5650 (6 cores), HT enabled | 16 GB | Smart Array P410i
Linux | ProLiant BL460c G7 | 2 x Xeon X5650 (6 cores), HT enabled | 16 GB | Smart Array P410i

Database Server Hardware

Environment | Model | CPU | RAM | Controller
Windows | ProLiant DL380 G7 | 2 x Xeon X5650 (6 cores), HT enabled | 32 GB | Smart Array P410i
Linux | ProLiant DL380 G7 | 2 x Xeon X5650 (6 cores), HT enabled | 64 GB | Smart Array P410i

4.3. Environment configuration
This section summarizes the main parameters that were modified to support the described load.

Site Admin EHCache

Change | Description
Changed the 'TD Tables Struct Dir' cache to contain a maximum of 300 projects | Changes the customization cache that contains table metadata so that it holds all projects that users log in to during the load
Changed the 'Fields Properties Directory' cache to contain a maximum of 300 projects | Changes the customization cache that contains project field properties so that it holds all projects that users log in to during the load
Changed the site configuration parameter CUSTOMIZATION_CACHE_MAX_ELEMENTS value to 300 | Changes the site configuration to support the above changes

JVM configuration
The JVM heap size in each application server node was increased to 10 GB.

Oracle database configuration

Parameter | Value
MEMORY_TARGET | 14 GB
MEMORY_MAX_TARGET | 16.5 GB

MSSQL database configuration

Maximum server memory was configured to 28 GB.

4.4. Load profile and test scenario
- Load of 1000 active users.
- 1000 active projects in the site:
  - 20 ALM projects which contained real-world data and structure. The project structure was based on selected enterprise ALM customers. See Appendix 1 for more details.
  - 980 empty projects, used only for login-logout during the load (for job and cache testing).
- 22-23 requests per second across both nodes (~11 per node): 18 OTA requests per second and 5 ALM web client REST requests per second (see the quick check after this list). See the "Project load profile" section for more details.
- Load execution duration: 72 hours. The full ALM system includes 2 ALM servers, a PC server, 50 PC hosts, and 100 Lab hosts.
- 50 PC users performing login and logout, test editing, test running, and viewing online reports.
- 40 concurrent Lab runs.
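
As an illustrative check of the request rates above (simple arithmetic on the stated figures, not additional measurement data):

\[ 18\ \text{OTA req/s} + 5\ \text{REST req/s} = 23\ \text{req/s}, \qquad \frac{23\ \text{req/s}}{2\ \text{nodes}} \approx 11.5\ \text{req/s per node} \]

which is consistent with the stated ~11 requests per second per node.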

4.4.1 Project load profile


The stability test provides information about the product's stability by covering the main
functionality of the common modules. The simulated usage profile (i.e., users' distribution
among modules and number of events per hour per user) was based on a workload analysis of
selected ALM customers, representing different deployment sizes and configurations.
The load profile used for the test is based on the four main ALM modules and the most common
and important business processes.
The following table shows the number of OTA transactions that ran on each module during
one hour of the system load:

Generic Transactions | Transactions per hour
File Operations | 986
Login | 1,800

Module | Transactions per hour
Releases | 967
Requirements | 3,852
Test Plan | 9,573
Test Lab | 13,991
Defects | 8,058

The following table shows the number of REST transactions that ran on each module during
one hour of the system load:

Generic Transactions | Transactions per hour
Login | 996

Module | Transactions per hour
Releases | 96
Requirements | 1,283
Defects | 2,727
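
For orientation, the hourly totals in the two tables above correspond to roughly the following transaction rates (an illustrative calculation, not additional measurement data):

\[ \text{OTA: } \frac{986 + 1800 + 967 + 3852 + 9573 + 13991 + 8058}{3600\ \mathrm{s}} \approx 10.9\ \text{transactions/s} \]

\[ \text{REST: } \frac{996 + 96 + 1283 + 2727}{3600\ \mathrm{s}} \approx 1.4\ \text{transactions/s} \]

Because each transaction triggers one or more OTA or REST requests (see the terminology in section 1.4), these transaction rates are consistent with the higher request rates (18 OTA and 5 REST requests per second) quoted in section 4.4.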

4.5. Test results


The test was performed 3 consecutive times, each run lasting 24 hours (72 hours in total). The application server was up for the
entire test time frame.
During the entire run, the system was stable – no increases or peaks in TRT or system resource usage.
The following are some measurements taken from the 72 hour run:

Category | Measurement | Avg. Value | Max Value
Transaction Response Time (TRT) | Add defect | 0.111 sec | 1.638 sec
 | Filter on defects | 0.290 sec | 1.91 sec
 | Filter with group by on defects | 0.345 sec | 1.84 sec
 | Create requirement | 0.107 sec | 1.373 sec
System resources | % total CPU time | 5.5 % | 35 %
 | JVM throughput | 99.38 % | -
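
Applying the JVM throughput definition from section 1.4 to the figure above (an illustrative calculation), a throughput of 99.38% means that garbage collection consumed only about

\[ 100\% - 99.38\% = 0.62\% \]

of the JVM's processing time over the run, which supports the stability conclusion.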

The following graph shows TRT during the entire load. All transactions are stable over time:

The following graph shows the stable % processor time of application server machine:

The following graph shows JVM heap usage of one node:

Appendix 1: Project Data Profile
The following table contains the details of the profile used for each ALM entity in
the project:

Module | Entity | Amount | Structure
Release | Folders | 36 | 1 root folder -> 5 folders -> 2 folders -> 10 releases -> 10 release cycles
 | Releases | 200 | 
 | Cycles | 2000 | 
Requirements | Requirements | 30692 | 1 root requirement -> 30 requirements -> 2 requirements (9-10 levels)
Business Components | Component Folder | 107 | 1 root folder -> 100 folders -> 50 components -> 100 steps
 | Component | 5210 | 
 | Component Step | 506000 | 
Test Plan | Test Folder | 41152 | 1 root folder -> 70 folders
 | Test | 185261 | 1 root folder -> 19 folders -> 2 folders with 4 tests (9 levels) -> 5 tests; 1 root folder -> 50 folders -> 20 tests
 | BP Test to component | 20210 | 1 root folder -> 50 folders -> 20 tests -> 20 components
 | BP Parameter | 202100 | 1 root folder -> 50 folders -> 20 tests -> 20 components -> 10 parameters
Test Lab | Cycle Folder | 2802 | 1 root folder -> 50 folders -> 5 folders -> 2 folders -> 4 folders -> 10 test sets -> 20 tests
 | Cycle | 20002 | 
 | Run | 400100 | 
Defects | Defect | 60000 | 60000 defects
 | Asset Relation | 337656 | 
ERI | Asset Repository Items | 414396 | 1 root folder -> 10 resource folders -> 2 resource folders -> 5 resource folders -> 5 resource folders -> 2 resource folders -> 9 files (5 Lib, 3 Object Rep., 1 recovery scenario)
LAB | Time slots | 7000 | 
 | Hosts | 100 | 
PC | Hosts | 50 | 

