HP OSMS: Tomcat Sizing Guide for HP ProLiant
HP Part Number:
Published: May 2008
Edition: 1.0
© Copyright 2008
Legal Notice
Confidential computer software. Valid license from HP required for possession, use or copying. Consistent with FAR 12.211 and 12.212, Commercial Computer Software, Computer Software Documentation, and Technical Data for Commercial Items are licensed to the U.S. Government under vendor's standard commercial license.
The information contained herein is subject to change without notice. The only warranties for HP products and services are set forth in the express warranty statements accompanying such products and services. Nothing herein should be construed as constituting an additional warranty. HP shall not be liable for technical or editorial errors or omissions contained herein.
Acknowledgments
Intel and Itanium are trademarks or registered trademarks of Intel Corporation or its subsidiaries in the United States and other countries.
RED HAT READY Logo and RED HAT CERTIFIED PARTNER Logo are trademarks of Red Hat, Inc.
Introduction
HP Open Source Middleware Stacks (OSMS) offer building-block applications such as the Web Server, technical blueprints, and documents such as this Tomcat Sizing Guide, which describes the maximum Tomcat workload that HP ProLiant servers can support.
Executive Summary
The HP Open Source Integrated Portfolio (HP OSIP) comprises a range of products and services designed to ensure that customers can successfully realize the cost and feature benefits of adopting open source software in their IT environments. HP Open Source foundation components include the base components of an open source software stack.
The results published in this document were obtained using the Apache benchmarking tool (ab) and Apache JMeter on each of the HP ProLiant servers tested.
Intended Audience
The intended audience for this document is anyone who is interested in determining the Tomcat user connection workload that can be supported on a given HP ProLiant server.
Scope and Purpose
This document presents the results of a series of benchmark tests performed using the Apache Benchmarking tool and Apache JMeter. The tests were conducted against a Tomcat application server to evaluate system performance when running a web application under heavy load. The following HP ProLiant servers were used in this test: BL460c, BL480c, and BL465c. The benchmark data provided in this sizing guide can assist customers in determining which HP ProLiant server best meets their workload requirements.
HP provides quality assurance through extensive integration testing of open source software with HP hardware, so that you can confidently deploy the complete stack. Once you have completed a successful evaluation, you have the flexibility to "do it yourself" or get assistance from HP to incorporate open source stacks into your existing IT infrastructure.
HP Services
HP Open Source Consulting Services can help you build and integrate open source and commercial software across multiple operating system (OS) environments. Additionally, HP Open Source Support Services provide industry leading technical support for all the products HP sells, including hardware, operating systems, and open source middleware.
To learn more about HP Open Source Consulting and Support Services, contact your local HP sales representative or visit the HP Business and IT Services website at:
For the location of the nearest sales office, call:
• In the United States: +1 800 637 7740
• In Canada: +1 905 206 4725
• In Japan: +81 3 3331 6111
• In Latin America: +1 305 267 4220
• In Australia/New Zealand: +61 3 9272 2895
• In Asia Pacific: +852 2599 7777
• In Europe/Africa/Middle East: +41 22 780 81 11
Typographic Conventions
This document uses the following typographic conventions.
Publishing History
The document publishing date and part number indicate the current edition of the document. The publishing date changes when a new edition is printed. Minor changes might be made without changing the publishing date. The document part number changes only when extensive changes are made. Document updates might be issued between editions to correct errors or document product changes. For the latest version of this document online, see the HP Technical Documentation website at:
HP Encourages Your Comments
HP encourages your comments concerning this document. We are committed to providing documentation that meets your needs. Send any errors found, suggestions for improvement, or compliments to:
Include the document title, manufacturing part number, and any comment, error found, or suggestion for improvement you have concerning this document.
Hardware Test Environment
This benchmark was performed within an HP BladeSystem enclosure.
Table 1 Test Hardware Environment
(ext3, no LVM): Disk0: /, /boot, swap
RHEL5 AS u1 SMP kernel
(ext3, no LVM): Disk0: /, /boot, swap
RHEL5 AS u1 SMP kernel
The following server and configuration were used as the client from which ab and JMeter were run during the test:
Machine Model: HP ProLiant BL460c
CPU:
Memory: 2GB
OS: RHEL5 AS u1
Tomcat Installation
The Tomcat software is available on the Apache Software Foundation website. The file to download for Linux servers is the compressed *.gz archive.
NOTE: Prior to installing Apache Tomcat, verify that a JDK is installed and configured on the system.
1. Unzip the *.gz file to the directory where you want to install Tomcat; this directory is used in the example that follows. (A consolidated example of the installation steps is shown after this procedure.)
2. To start Tomcat, enter the command:
The following messages display:
Using CATALINA_BASE:
Using CATALINA_HOME:
Using CATALINA_TMPDIR:
Using JRE_HOME: /usr/local/jdk
3. Verify the Tomcat installation by launching a browser and navigating to the following URL: http://<YOUR_TOMCAT_SERVER_IP>:8080/
The Tomcat web page displays as shown in Figure 1.
Figure 1 Tomcat Successful Installation Verification
NOTE: If the iptables firewall is enabled on the system, the Tomcat server cannot be accessed from other machines unless the firewall is configured to allow connections to port 8080.
4. If necessary, stop Tomcat by entering the following command:
The following messages display:
Using CATALINA_BASE:
Using CATALINA_HOME:
Using CATALINA_TMPDIR:
Using JRE_HOME: /usr/local/jdk
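The individual commands, archive name, and installation path are not reproduced in this guide. The following consolidated sketch shows how the preceding steps are typically performed on a Linux system; the archive name, version, and installation directory are assumptions rather than values taken from the tested configuration.

# Point JAVA_HOME at the installed JDK (see the NOTE above); /usr/local/jdk matches
# the JRE_HOME shown in the startup messages
export JAVA_HOME=/usr/local/jdk
# Step 1: unpack the downloaded *.gz archive into the chosen installation directory
tar xzf apache-tomcat-<version>.tar.gz -C /usr/local
cd /usr/local/apache-tomcat-<version>
# Step 2: start Tomcat
bin/startup.sh
# If iptables is enabled (see the NOTE in step 3), allow remote access to port 8080
# before verifying from another machine; the exact rule is an assumption
iptables -I INPUT -p tcp --dport 8080 -j ACCEPT
# Step 3: verify from a browser at http://<YOUR_TOMCAT_SERVER_IP>:8080/
# Step 4: stop Tomcat when necessary
bin/shutdown.sh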
Tomcat Configuration
Setting JVM Options for Tomcat
Edit the Tomcat startup script (typically catalina.sh in the Tomcat bin directory) to set the JVM options listed in Table 2.
Table 2 JVM Configuration
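The JVM options themselves are listed in Table 2. As an illustration only, such options are commonly exported before starting Tomcat; the heap values below are assumed example values, not the tested settings.

# Example JVM options for Tomcat (heap sizes are assumptions, not the tested configuration);
# add lines like these to the Tomcat startup script or to a bin/setenv.sh file
JAVA_OPTS="-server -Xms1024m -Xmx1024m"
export JAVA_OPTS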
Setting Tomcat Connector Attributes
Edit the server.xml file in the Tomcat conf directory to set the Connector attributes, for example:
<Connector port="8080" protocol="HTTP/1.1"
maxThreads="3000" acceptCount="200" ...
Table 3 Tomcat Configuration
enableLookups: set to "false" to skip the DNS lookup and return the IP address in String form instead (thereby improving performance). By default, DNS lookups are enabled.
Static Web Page Benchmark
This section provides information on running the static web page benchmark on the BL460c, BL480c, and BL465c servers. The Apache Benchmarking tool was run against web pages of different sizes to generate two types of data: 1) the number of requests served per second and 2) the response time in milliseconds. A basic analysis of these results is included to help you interpret the benchmark data.
Benchmark Software
The Apache HTTP benchmarking tool, ab, is a command-line tool bundled with the standard Apache package for benchmarking Hypertext Transfer Protocol (HTTP) web servers. It is free and open source software distributed under the Apache license. It can simulate large numbers of connections by issuing a series of requests against given pages on the HTTP web server. The tool then reports performance statistics, in particular the number of requests per second that the server is capable of serving and the average response time for all requests.
Table 4 Main ab Options
For more information about ab, see the following web site:
http://httpd.apache.org/docs/2.0/programs/ab.html
Benchmark Results
In the static web page testing, several web pages were used with different sizes that varied from 5KB to 200KB. For each web page, ab was run with a varying number of concurrent connections and total requests to determine the maximum requests per second and the maximum response time for 80% of the fastest requests. For this test, the concurrent connections varied in number from 100 to 2500 and the total number of requests was set to 100 times the number of concurrent connections.
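The exact ab command lines are not listed in this guide. A representative invocation consistent with the parameters above (100 concurrent connections and 100 times as many total requests against a 5KB static page; the page name is hypothetical) might look like the following.

# 10,000 total requests (-n) at 100 concurrent connections (-c)
ab -n 10000 -c 100 http://<YOUR_TOMCAT_SERVER_IP>:8080/page_5k.html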
The benchmark results for the testing are presented in the following sections and show the maximum number of requests per second and maximum response time for 80% of the fastest requests that each system can handle, depending on the number of concurrent connections and the size of pages.
The following tables show that, for requests per second, a larger number of connections can reduce throughput for small pages such as the 5KB page but has little influence on larger pages. For response time, the larger the number of concurrent connections, the longer the response time. For relatively small pages such as the 5KB page, both the HP ProLiant BL460c and BL480c servers, configured with 2
HP ProLiant BL460c Server
The HP ProLiant BL460c server has features that are equal to standard 1U rack-mount servers.
Table 5 Static Web Page Test Results for the HP ProLiant BL460c Server
Figure 2 Static Web Page Test Results for the HP ProLiant BL460c Server - Requests per Second
Figure 3 Static Web Page Test Results for the HP ProLiant BL460c Server - Average Response Time
HP ProLiant BL465c Server
The HP ProLiant BL465c server is a server blade based on AMD Opteron processors.
Table 6 Static Web Page Test Results for the HP ProLiant BL465c Server
Figure 4 Static Web Page Test Results for the HP ProLiant BL465c Server - Requests per Second
Figure 5 Static Web Page Test Results for the HP ProLiant BL465c Server - Average Response Time
HP ProLiant BL480c Server
The HP ProLiant BL480c server is a full-height server blade.
Table 7 Static Web Page Test Results for the HP ProLiant BL480c Server
Figure 6 Static Web Page Test Results for the HP ProLiant BL480c Server - Requests per Second
Figure 7 Static Web Page Test Results for the HP ProLiant BL480c Server - Average Response Time
Application Scenario Benchmark
To evaluate the performance of a Tomcat application server, scenario testing that simulates real-world operations on an application is an important complement to single-page benchmarking: it establishes the maximum number of concurrent users a Tomcat server can support while still meeting the required response time.
JPetStore was used as the test application deployed on the Tomcat Application server. This section provides a detailed description of the test scenario for JPetStore and explains how to run Apache JMeter against the test application. The benchmark data includes the number of requests the Tomcat Application server can process per second and the average response time for all requests on each ProLiant blade server.
JMeter Installation and Configuration
Apache JMeter is an open source Java desktop application designed for load testing functional behavior and measuring performance of static and dynamic resources such as servlets, Perl scripts, Java objects, and so on. It can be used to simulate a heavy load on a Tomcat server to test its capabilities, and it can also provide a graphical analysis of performance under concurrent load. JMeter version 2.3.1 was used in the following tests. For more information, visit the JMeter site located at:
http://jakarta.apache.org/jmeter/
The steps for installing and configuring JMeter are as follows:
1. Verify that the computing environment meets the JMeter test requirements.
JMeter requires a fully compliant JVM 1.4 or later. JMeter Version 2.2 and later no longer support Java 1.3. Make sure you have the correct version of JRE/JDK installed and set the JAVA_HOME environment variable.
2. Download the latest JMeter version from the website and unzip the file to the directory where you want to install JMeter.
3. To run JMeter in GUI mode, run the jmeter file in the JMETER_HOME/bin/ directory.
NOTE: You can edit the JMETER_HOME/bin/jmeter file to modify the JMeter parameters or Java Virtual Machine (VM) options. For example, to increase the memory allocated to JMeter from the default 256MB to 3GB, you can set the following line in the jmeter file:
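The modified line itself is not shown above. In JMeter 2.3.x the bin/jmeter startup script defines the Java heap in a HEAP variable, so the change would typically look like the sketch below (the 3GB value assumes the client machine has sufficient memory).

# In JMETER_HOME/bin/jmeter, replace the default heap setting
# HEAP="-Xms256m -Xmx256m"
# with a larger allocation, for example:
HEAP="-Xms3072m -Xmx3072m"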
For more detailed information, see the JMeter documentation located at:
http://jakarta.apache.org/jmeter/usermanual/index.html
JPetStore Installation and Configuration
JPetStore is a sample application based on the Struts and iBATIS frameworks. It is a completely rewritten Pet Store application based on Sun's original J2EE Pet Store. For more information on iBATIS and JPetStore, see:
http://ibatis.apache.org/javadownloads.cgi
JPetStore can be deployed in Tomcat and other Java web servers. It supports MySQL, PostgreSQL and other databases. In this document, MySQL is used as the database for JPetStore. Figure 8 displays the architecture of the test environment.
Figure 8 Application Scenario Test Environment Architecture
Before performing the JPetStore installation on the Web server, verify that the following installations and configurations have been completed.
• Tomcat has been installed on the Tomcat server node and configured correctly.
• MySQL has been installed and configured on the MySQL server node. Make sure that MySQL is running correctly.
• Download the MySQL JDBC Driver file.
Perform the following steps to install and configure JPetStore.
1. Download the latest version of JPetStore from the iBATIS website.
2. Unzip the file.
In the
•
•
•
3. On the MySQL node, run the following commands (an illustrative example of this step is shown after this procedure):
mysql
4. Copy the JPetStore WAR file to the Tomcat webapps directory.
Tomcat automatically deploys the JPetStore application if it is running.
5. Edit the JPetStore database properties file so that it points to your MySQL server:
Driver=org.gjt.mm.mysql.Driver
Url=jdbc:mysql://<YOUR_MYSQL_SERVER>:3306/JPETSTORE
Username=jpetstore
Password=ibatis9977
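The MySQL commands for step 3 and the JPetStore SQL script names are not reproduced in this guide. The sketch below shows one way the database could be prepared; the GRANT statement, script file names, and driver location are assumptions, while the database name, username, and password match the properties shown in step 5.

# Create the JPetStore database and account (statement details are assumptions)
mysql -u root -p -e "CREATE DATABASE JPETSTORE;"
mysql -u root -p -e "GRANT ALL ON JPETSTORE.* TO 'jpetstore'@'%' IDENTIFIED BY 'ibatis9977';"
# Load the schema and sample data scripts shipped with JPetStore (file names are assumptions)
mysql -u jpetstore -pibatis9977 JPETSTORE < jpetstore-mysql-schema.sql
mysql -u jpetstore -pibatis9977 JPETSTORE < jpetstore-mysql-dataload.sql
# Copy the downloaded MySQL JDBC driver where Tomcat can load it (path is an assumption)
cp mysql-connector-java-<version>-bin.jar <TOMCAT_INSTALL_DIR>/lib/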
Application Scenario Configuration
A JMeter test plan must be created to simulate the requests sent to the server in the scenario. Two approaches are generally used to create a test plan: JMeter's Proxy and Badboy. For JMeter, see the JMeter Reference Manual located at:
http://jakarta.apache.org/jmeter/usermanual/index.html
For JMeter's Proxy and recording tests, see the JMeter tutorial located at:
http://jakarta.apache.org/jmeter/usermanual/jmeter_proxy_step_by_step.pdf
For Badboy, see the Badboy website located at:
To simulate a customer's shopping behavior, a scenario was created from the following typical visiting steps:
1. Visit the index page, and then perform a login action.
2. Browse one kind of fish and add it to the cart.
3. Perform a search, select an item from the search results, and add it to the cart.
4. Switch to the cart page and update the item quantities.
5. Check out and then log out.
Figure 9 displays a JMeter test plan that was created in accordance with the previously described scenario.
Figure 9 Typical JMeter Test Plan
Running the Scenario
Select the Thread Group element in the JMeter tree and increase the Number of Threads (users) from the default value to the value you want to test. Next, enter a value in the Loop Count field. For instance, if the thread number is 1000 and the loop count is 100, JMeter creates 1000 users at the same time and each user runs the test plan 100 times. Before starting the test, select Summary Report in the tree. While the test is running, you can watch the statistics on that page until the test ends. At the end of the test, a Summary Report is generated, similar to the example shown in Figure 10.
Figure 10 JMeter Summary Report
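The documented tests were run from the JMeter GUI as described above. As an alternative for unattended runs (not used for the results in this document), the same test plan can be executed in non-GUI mode; the file names below are hypothetical.

# Run the saved test plan without the GUI and write the results to a file for later analysis
bin/jmeter -n -t shopping_scenario.jmx -l scenario_results.jtl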
Benchmark Results
Table 8 shows the requests per second and average response time for the scenario described in the previous section. The response time is defined as the time it takes the Tomcat server to send the object of an HTTP request back to the client. The requests per second value is the average number of requests that Tomcat is capable of handling per second in this particular scenario.
Table 8 JMeter Application Test Results
Figure 11 JMeter Application Scenario Test Results - Requests per Second
Figure 12 JMeter Application Scenario Test Results - Response Time
Summary
The performance of the Tomcat Application server is affected by several factors, such as network throughput, system processing capability, and the number of concurrent user connections. The data shown in this document were obtained under the precondition that network throughput was stable. However, it is not generally the case that all users request service from the application server at the same time. The ratio of concurrent users to total users differs from application to application, but usually varies between 10% and 30%. For example, if a server sustained 1,000 concurrent connections in these tests (a hypothetical figure), it could serve roughly 3,300 to 10,000 total application users at concurrency ratios of 30% and 10%, respectively. Another factor is the interval between each user's requests: the longer the interval, the more users the Tomcat server can serve. In the benchmark testing described in this document, the interval between requests was not considered. Therefore, for applications in the real world, the number of concurrent user connections that the Tomcat Application server can process on these servers is likely to be higher than the benchmark results suggest.
Resources
For additional information on the hardware and software used in the Tomcat tests, see the following websites:
HP Open Source Middleware Stacks (OSMS)
HP BladeSystem
http://www.hp.com/go/bladesystem/
http://h18004.www1.hp.com/products/blades/components/bladeservers.html
Apache Tomcat
http://tomcat.apache.org/index.html
Apache JMeter
http://jakarta.apache.org/jmeter/