Test Infrastructure : Domino 8.5.3 on iSeries V7R1 & SLES 11
Added by ~Ted Umfootherader | Edited by ~Judy Fezresaburynds on October 28, 2011 | Version 3
Tags: 8.5.3

Test Configuration


Product version: 8.5.3


1 Overview


The objective of the IBM System Verification Test (SVT) is to execute a set of test scenarios against a test configuration that contains the key requirements and components, creating load in a Domino domain consisting of two SLES 11 servers and two iSeries servers. Each server runs two Domino partitions.

This testing used the test scripts currently employed by the Domino System Test team.

Perceived system quality is largely governed by overall system reliability. A widely accepted definition of software reliability is the probability that a computer system performs its intended purpose without failure over a specified time period within a particular execution environment. This execution environment is known formally as the operational profile, defined as the set of possible input values together with their probabilities of occurrence. An operational profile is used to drive a portion of the system testing. Software reliability modelling is therefore applied to data gathered during this phase of testing and then used to predict subsequent failure behaviour during actual system operations.
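
For illustration only (the source does not name a specific model), reliability modelling of this kind often starts from a constant failure rate $\lambda$ estimated from the failures observed while driving the operational profile:

$$
R(t) \;=\; \Pr\{\text{no failure in } [0,t]\} \;=\; e^{-\lambda t},
\qquad \mathrm{MTTF} = \frac{1}{\lambda}
$$

Under this simple exponential model the predicted probability of failure-free operation decays exponentially with the length of the period of interest, and the fitted $\lambda$ is what the modelling phase carries forward to predict failure behaviour in production.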

A reliability test is one that focuses on the extent to which the feature or system provides its intended function without failing. The goal of all such testing is to improve reliability and to support specific statements about it through reliability-specific tests. Reliability reflects the impact of failures, malfunctions, errors and other defect-related problems encountered by customers; it is a measure of the continuous delivery of correct service (and of the time to failure).

SVT's purpose in running reliability tests was to ascertain the following:


· Data population of all parts of the infrastructure, forcing set limits to be reached and exceeded

· Running sustained reliability scripts at >100% of maximum capacity, assessing:


· Breakpoints

· System stability pre- and post-breakpoint

· Serviceability


· Forcing spikes and anti-spikes in usage patterns

· Exposing NRPC, IMAP and DWA services to 110% of their maximum load

· Filling the database table spaces to their maximum, proving the maximum and proving the ability to recover to a good state when the maximum limits have been exceeded

· Proving that serviceability errors and warnings are raised when thresholds are hit

2 Configuration for SLES 11 & iSeries


The configuration used by the Domino system test team comprised four servers: two SLES 11 and two iSeries. Each server had two Domino partitions installed.


The environment hosts 11,400 registered mail files: each reliability Domino partition on iSeries has 3,500 registered users, each SLES 11 reliability Domino partition has 1,000 registered users, and the IVT Domino partitions on the other iSeries server held 400 users for integration verification testing (2 x 3,500 + 4 x 1,000 + 400 = 11,400). IVT involved testing the functionality of Lotus Notes 8.5.3 on multiple operating systems, and of Lotus iNotes 8.5.3 in multiple browsers, against Domino 8.5.3 on iSeries with Sametime 8.5.1 integration. The specific testing covered by IVT is not covered in this document.

It is generally recommended to use a separate file system for transaction logging.

The design task was run prior to the start of the test to upgrade the templates. The update task runs for the entire period of the test (until the server is brought down).

Details of System under Test (SUT)

    System:            Virtual system (LPAR) running on a 9119-FHB POWER7
    Processor:         Six shared CPUs, 4 GHz clock speed, POWER7 IPL mode, uncapped
    Memory:            16 GB
    Model of machine:  9119-FHB
    Disk drive:        6.7 TB of disk space; 192 x 35 GB SAN LUNs on a DS8700
    Operating system:  IBM i 7.1
    Domino server:     Domino 8.5.3 production build for OS/400

Table 1


Table 1 shows the resources available in the physical iSeries environment.


Details of System under Test (SUT)

    System:            2 x IBM System x3650 (each with 2 Domino partitions)
    Processor:         One Intel Xeon 5160 CPU @ 3.00 GHz (dual core) per physical server
    Memory:            8 GB per physical server
    Model of machine:  x3650 Type 7979
    Disk drive:        132 GB of disk space; 4 TB of SAN storage
    Operating system:  SUSE Linux Enterprise Server 11
    Domino server:     Domino 8.5.3 production build for Linux (32-bit)

Table 2


Table 2 shows the resources available in the physical SLES 11 environment.




2.1 Evaluation Criteria


The performance of Domino 8.5.3 is evaluated under the following criteria:

· Server CPU: The overall CPU utilisation of the server is monitored over the course of the experiment. The aim is for server CPU not to exceed 75%, so that the server can continue to function appropriately. It is acceptable for the CPU to spike to this level occasionally for short periods, but it must return to a lower level. High CPU results from the server being stressed by processes such as compact, fixup or replication, by user load, or by other third-party programs.

· Domino Processes CPU: The previous metric monitors the overall CPU of the server; in addition, the CPU consumption of Domino-specific processes is monitored individually, so that each process's consumption can be evaluated on its own.

· Server Memory: The server memory metric represents the amount of physical memory available on the server. If the available memory becomes low the server performance could be compromised.

· Server Disk I/O: The disk is a source of contention when a server is under load and performing a high number of read and write operations. The disk queue length is measured to determine if the disk I/O operations are resulting in a bottleneck for the system performance.

· Network I/O: These metrics monitor the network utilization to ensure the bandwidth consumption is acceptable and that the network is not overloaded.

· Response Times from the End-user Perspective: The server response times for user actions represent how long a single user must wait for a given transaction to complete; this metric captures the user's experience of the application. Response times will at times be longer when a server is under load. When response times increase over an extended period or persist at high levels (e.g. when a database or view takes longer than 30 seconds to open), detailed analysis must be performed to determine the source of the slowdown and seek remediation.

· Open Session Response Times: In addition to monitoring the individual action response times, the Open session response times will also be evaluated in order to ensure the server remains responsive over the course of the experiment.
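
As an illustrative sketch of how the server CPU criterion above might be checked (the sample format, threshold, and spike window below are assumptions, not part of the published test plan):

    # Hypothetical sketch: evaluate sampled CPU utilisation against the 75%
    # target above. A short spike above the threshold is tolerated, but it
    # must return to a lower level; the window length is an assumption.
    def cpu_within_target(samples, threshold=75.0, max_spike_samples=3):
        run = 0  # consecutive samples above the threshold
        for cpu in samples:
            if cpu > threshold:
                run += 1
                if run > max_spike_samples:  # spike persisted too long
                    return False
            else:
                run = 0  # CPU returned to a lower level
        return True

    # A brief spike (two samples) is acceptable; sustained high CPU is not.
    print(cpu_within_target([60.1, 71.9, 80.2, 78.5, 62.0]))  # True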




2.2 Tools


In order to simulate user activity and capture the evaluation metrics discussed in section 2.1, a number of tools were used:


· Testnsf: Testnsf is a capacity-planning tool that is used to run a variety of different loads against a targeted Domino server.

· Domino showstats data: Domino showstats captures important server metrics. A Testnsf client driver may be used to execute the showstats console command at regular intervals against each server in the configuration, providing Domino-specific data. The resulting data is logged to a text file and may be graphed for analysis.

· Open session: The Open session tool measures mail file request/response times. It opens a view of a mail database at a set time interval and records the response time in milliseconds. A server slowdown may therefore be identified by analysing the resulting response times.
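
As a sketch of how the logged showstats text might be turned into graphable data (the "Name = Value" line format matches how the Domino console renders statistics, but the parsing code itself is illustrative):

    # Hypothetical sketch: parse one logged "show stat" snapshot into a dict.
    # Assumes lines of the form "Statistic.Name = Value"; anything else
    # (console banners, blank lines) is skipped.
    def parse_show_stat(text):
        stats = {}
        for line in text.splitlines():
            name, sep, value = line.partition(" = ")
            if not sep:
                continue  # not a statistic line
            try:
                stats[name.strip()] = float(value.replace(",", ""))
            except ValueError:
                stats[name.strip()] = value.strip()  # non-numeric statistic
        return stats

    snapshot = "Server.Users = 512\nMem.Availability = Plentiful"
    print(parse_show_stat(snapshot))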




2.3 Evaluation Process


The Testnsf tool was used to place load on the Domino server. In order to simulate realistic load, a total of 16 client drivers running Testnsf were used.

Within the different test cycles, loads were applied to the configuration to simulate user activity. During each cycle on SLES 11, 500 mail users were directed at each DPAR along with 250 DWA users. On iSeries, 500 mail users were directed at each DPAR along with 300 DWA users and 300 IMAP users. The tests ran concurrently, 24 hours per day for 7 days, with a ramp-up and ramp-down of one hour, replicating a realistic company scenario.

In order to isolate the performance of the Domino server under load from a single user perspective for standard Notes mail, a client driver will execute a “single user” open session script.

The results represent a single user experience of how the application will perform at busy times of the day when the server is heavily loaded.
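
As an illustrative sketch (the log format and helper below are assumptions), the open-session results can be summarised and checked against the 30-second indicator from section 2.1:

    # Hypothetical sketch: summarise open-session response times, logged in
    # milliseconds as described in section 2.2, and flag opens that exceed
    # the 30-second indicator from section 2.1.
    def summarise_open_session(times_ms, slow_ms=30_000):
        slow = [t for t in times_ms if t > slow_ms]
        return {
            "samples": len(times_ms),
            "average_ms": sum(times_ms) / len(times_ms),
            "worst_ms": max(times_ms),
            "slow_opens": len(slow),  # candidates for detailed analysis
        }

    print(summarise_open_session([420, 510, 800, 31_200, 460]))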





2.4 Scenario: Online Mode

The scenario evaluates the performance of Lotus Notes Clients in online mode. Online mode means that the user mail files are stored and maintained on the Domino server. Every time a user performs an action the request is sent to the server and the mail file is modified and updated on the server side.

N85Mail Script with attachment size modification

    Workload action                     Count/hour/user    Count/24 hours/user
    Refresh inbox                       4                  96
    Read message                        20                 480
    Reply to all                        2                  48
    Send message to one recipient       4                  96
    Send message to three recipients    2                  48
    Create appointment                  4                  96
    Send invitation                     4                  96
    Send RSVP                           4                  96
    Move to folder                      4                  96
    New mail poll                       4                  96
    Delete two documents                4                  96
    Total messages sent                 16                 384
    Total transactions                  52                 1248

Table 3

Table 3 shows the action workload of the built-in N85Mail script, with modifications to the attachment size. The script reflects the workload expected of a single user over the course of a day.
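
Scaling the per-user totals in Table 3 to one DPAR is simple arithmetic; the sketch below (illustrative, not a published result) uses the 500 NRPC mail users per DPAR from section 2.3:

    # Hypothetical sketch: scale Table 3's per-user totals to one DPAR's
    # hourly load, using the 500 NRPC mail users per DPAR from section 2.3.
    USERS_PER_DPAR = 500
    TRANSACTIONS_PER_USER_HOUR = 52  # "Total transactions" row, Table 3
    MESSAGES_PER_USER_HOUR = 16      # "Total messages sent" row, Table 3

    print(USERS_PER_DPAR * TRANSACTIONS_PER_USER_HOUR)  # 26000 transactions/hour
    print(USERS_PER_DPAR * MESSAGES_PER_USER_HOUR)      # 8000 messages/hour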

Message Distribution in N85Mail Script

    Message size distribution    Percent of messages sent    Attachment size (if any)
    0 < size <= 1 KB             5.9%                        N/A
    1 KB < size <= 10 KB         66.0%                       N/A
    10 KB < size <= 100 KB       25.0%                       50 KB
    100 KB < size <= 1 MB        2.8%                        N/A
    1 MB < size <= 10 MB         0.3%                        10 MB

Table 4


The resulting mail distribution is shown in Table 4.
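
As an illustrative sketch of how a load generator might weight its sends to reproduce this distribution (the bucket labels and sampling code are assumptions; Testnsf's actual mechanism is not described here):

    # Hypothetical sketch: draw message-size buckets with the Table 4
    # probabilities, as a load generator might when choosing what to send.
    import random

    BUCKETS = [                    # (size bucket, percent of messages sent)
        ("0-1 KB",       5.9),
        ("1-10 KB",     66.0),
        ("10-100 KB",   25.0),     # these carry a 50 KB attachment
        ("100 KB-1 MB",  2.8),
        ("1-10 MB",      0.3),     # these carry a 10 MB attachment
    ]

    def sample_bucket(rng=random):
        labels = [b for b, _ in BUCKETS]
        weights = [w for _, w in BUCKETS]
        return rng.choices(labels, weights=weights, k=1)[0]

    print([sample_bucket() for _ in range(5)])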

N85DWA Script with attachment size modification

    Workload action                     Count/hour/user    Count/24 hours/user
    Refresh inbox                       4                  96
    Read message                        20                 480
    Reply to one message                4                  96
    Send message to one recipient       4                  96
    Send message to three recipients    4                  96
    Create appointment                  4                  96
    Send invitation                     4                  96
    Send RSVP                           4                  96
    Move to folder                      4                  96
    New mail poll                       12                 288
    Delete two documents                4                  96
    Total messages sent                 20                 480
    Total transactions                  68                 1632

Table 5

Table 5 shows the action workload of the built-in N85DWA script, with modifications to the attachment size. The script has been heavily modified to load attachments into DAOS, so it is not truly indicative of standard user activity.

Message Distribution in N85DWA Script

    Message body size    Percent of messages sent    Attachment size (if any)
    64 KB                70%                         64 KB
    64 KB                15%                         12 KB
    12 KB                6%                          64 KB
    64 KB                5%                          584 KB
    640 KB               2%                          64 MB
    12 KB                2%                          584 KB

Table 6

The resulting mail distribution is shown in Table 6.

N85IMAP Script with attachment size modification

    Workload action                     Count/hour/user    Count/24 hours/user
    Refresh inbox                       4                  96
    Read message                        20                 480
    Reply to all                        2                  48
    Send message to one recipient       4                  96
    Send message to three recipients    2                  48
    Move to folder                      4                  96
    New mail poll                       4                  96
    Delete two documents                4                  96
    Total messages sent                 16                 384
    Total transactions                  40                 960

Table 7


Table 7 shows the action workload of the built-in N85IMAP script, with modifications to the attachment size. The script reflects the workload expected of a single user over the course of a day.

Message Distribution in N85IMAP Script

    Message size (KB)    Percent of messages sent    Attachment size (if any)
    500                  10%                         N/A
    10,000               30%                         N/A
    50,000               40%                         N/A
    50,000               10%                         50 KB
    150,000              9.5%                        N/A
    1,000                0.5%                        10 MB

Table 8

The resulting mail distribution is shown in Table 8.


3 Test drivers


The workload is generated by 16 “driver” workstations.

The iSeries workload used 9 drivers for IMAP, DWA and NRPC.

The SLES 11 workload used 5 drivers for DWA and NRPC.

Two additional clients were used for the Lotus Notes Administration client, for running statistics collection, and for routinely monitoring delays in opening databases on each Domino partitioned server.


4 Conclusion and Summary


The test results demonstrate that iSeries and SLES 11 machines configured as described in this report were able to support up to 1,000 concurrent, active Notes 8.5.3 users per Domino server with an average response time below 2 seconds.

The addition of other application workloads will affect the number of users supported as well as the response time.

Achieving optimum performance in a customer environment is highly dependent upon selecting adequate processor power, memory and disk storage as well as balancing the configuration of that hardware and appropriately tuning the operating system and Domino software.


5 Configuration settings


Server Notes.ini

The following notes.ini variables were added to each of the Domino servers:

DAOSDeferredDeleteInterval=30 (The deletion of unreferenced NLO files is known as "pruning" and occurs at the specified deferred deletion interval, in days.)

DAOSBasePath=DAOS (The DAOS base path. If left as the relative value DAOS and the data directory is C:\Lotus\Domino\Data, the full path to the repository is C:\Lotus\Domino\Data\DAOS.)
DAOSMinObjSize=1048576 (iSeries) (The minimum size, in bytes, for an attachment to make use of DAOS; 1048576 bytes = 1 MB.)
DAOSMinObjSize=64000 (SLES 11)
DAOSEnable=1 (Enables DAOS.)
DAOSCatalogState=1 (State of the DAOS catalog.)
CREATE_R85_DATABASES=1 (Makes ODS 51 the default for new databases.)
CREATE_R85_LOG=1 (Enables the creation of transaction logs in 8.5 format.)
NSF_DOCCACHE_THREAD=1 (Helps to prevent memory problems associated with profile documents.)
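
As an illustrative sketch only (the file path and helper are hypothetical; notes.ini is normally edited directly or via server configuration documents), the settings above could be merged into a notes.ini file idempotently:

    # Hypothetical sketch: merge the settings above into a notes.ini file,
    # updating a line if the key is already present, appending it otherwise.
    # The path is an example; use the server's actual data directory.
    SETTINGS = {
        "DAOSEnable": "1",
        "DAOSMinObjSize": "64000",           # SLES 11 value; iSeries used 1048576
        "DAOSDeferredDeleteInterval": "30",
        "CREATE_R85_DATABASES": "1",
    }

    def merge_notes_ini(path, settings):
        with open(path, "r") as f:
            lines = f.read().splitlines()
        seen = set()
        for i, line in enumerate(lines):
            key = line.split("=", 1)[0].strip()
            if key in settings:
                lines[i] = key + "=" + settings[key]
                seen.add(key)
        lines += [k + "=" + v for k, v in settings.items() if k not in seen]
        with open(path, "w") as f:
            f.write("\n".join(lines) + "\n")

    merge_notes_ini("/local/notesdata/notes.ini", SETTINGS)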
