HP “Route to market”

From Sage 500 to 1000
Performance Testing myths exposed
Richard Bishop
• Introduction
• Trust IV & Richard Bishop
• Project Background
• 8 myths of (non-functional) software testing
• What we did… breaking myths
• Conclusion
Introduction
Who are we?

Trust IV
• Founded in 2005
• Testing consultancy, specialising in automated non-functional testing

Richard Bishop, Trust IV Ltd
• IT consultant with 20 years’ experience
• Specialising in Microsoft platforms and performance engineering / testing
• HP specialist, UK leader of Vivit (HP software user group in UK)

Colleagues
• Mixture of consultants and contract resources
• Primarily HP LoadRunner specialists
• On customer sites and working remotely
• Skills in multiple test tools, platforms etc.
Non-Functional Testing
What on earth is NFT?
Non-functional, automated testing specialists

In a nutshell… usability, reliability and scalability, including:
Compatibility testing
Compliance testing
Security Testing
Backup and Disaster Recovery Testing
Load Testing
Performance Testing
Scalability Testing
Stress Testing
Project Background
University Hospital Birmingham

• Sage 500 to Sage 1000 migration
• Concerns re: the number of concurrent users supported
• Required a performance test to validate the potential maximum user load
• Test to include a single “user journey”, simulating a requisition process
• Objective: increase the user load until system failure
8 Myths
(of non-functional software testing)

1. The vendor/developer has already tested this so we don't need to
2. NFT not required if functional testing and UAT is OK
3. You can / can’t test in a live environment
4. Anyone can test…
5. Web applications are easy to test using low-cost / open source tools
6. If a test fails “it must be the test tool / tester's fault”
7. Testing is too expensive / time consuming
8. If it's slow we can “throw kit at it”
Myth 1
Vendor/developer already tested - we don’t need to…
Myth 2
NFT not required if functional testing and UAT is OK
[Diagram: single-user vs. multi-user load across the load balancer, IIS servers, Forms Server and DB Server]
Myth 3
You can/can’t test in a live environment
…accidents happen
…results can be unreliable
Myth 3
You can/can’t test in a live environment
You can…
…prior to launch
…or with extremely careful planning
…or by mistake
Myth 3
Live environments… a cautionary tale
Myth 4
Anyone can test an application.
Source: http://www.pixar.com/short_films/Theatrical-Shorts/Lifted
Myth 5
Web apps are easy to test using low-cost/no-cost tools
Myth 6
If a test fails “it must be the test tool / tester's fault”
Myth 7
Testing is too expensive / time consuming
“The money spent with Trust IV was the best money spent on the whole project”
Myth 8
If it's slow we can "throw kit at it"
What we did….
Our standard test approach:
POC → Scripting → Low-volume tests → Performance tests → Analysis
POC – January 2013
We had a “steroid ferret”

NOT “just” a web app
POC – January 2013
Not a “web only app”
Needed specialist skills
“Not just a web app”

Myths already under pressure:
• Can use low cost / open source tools to test
• Anyone can test
Digging deeper…
SAGE 1000 uses two communications protocols
Had to convert displayed “human readable” text to legacy formats
to allow SAGE 1000 to interpret our simulated user input…
Digging deeper…
Complex test data requirements
SUBMIT POST "http://sagetest:80/webclient/jcsp.dll?"
"Comms&"
"__CS3SessionID2621355832230" IDENTIFIER 136
BODY "MfcISAPICommand=Comms&__CS3SessionID2621355832230\x00\..............
<Some data removed for brevity>
………….."\x00\xed\x00\r\x00F\x00O\x00R\x00C\x00E\x00P\x00S\x00\x00\x00\x00\x00\x01
\x04\x00\r\x00\x00\x00\x00\x01\x1d\x00\r\x00"………….."\
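A minimal sketch of that conversion, assuming (from the captured body above, where "FORCEPS" appears as \x00F\x00O\x00R\x00C\x00E\x00P\x00S) that the legacy format is big-endian UTF-16, i.e. each character preceded by a null byte; Python is used here purely for illustration:

# Sketch only: convert human-readable text to/from the null-prefixed
# (big-endian UTF-16) form seen in the captured request body above.
text = "FORCEPS"
legacy = text.encode("utf-16-be")          # b'\x00F\x00O\x00R\x00C\x00E\x00P\x00S'
assert legacy.decode("utf-16-be") == text  # round-trip back to readable text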
Digging deeper…
Correlation
SUBMIT POST PRIMARY "http://atum:80/webclient/jcsp.dll" IDENTIFIER 49
BODY "MfcISAPICommand=Application&__CS3SessionID2121358346714&135532676\x00";
SYNCHRONIZE REQUESTS;
SUBMIT POST http://atum:80/webclient/en-gb/NVPApplicationClient.asp? IDENTIFIER 50
BODY "MfcISAPICommand=Application&__CS3SessionID2121358346714&135532676\x00";
SYNCHRONIZE REQUESTS;
135532676 = 0x08141084 = 0x0A 0x08 0x14 0x10 0x84 0x0D
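The arithmetic behind that correlation, as a small illustrative Python sketch (the real conversion was done inside the load-test script): the decimal identifier captured from the server response (135532676) is 0x08141084 in hex, and it appears in the request body as those four big-endian bytes framed by what look like 0x0A / 0x0D delimiters.

session_ref = 135532676
print(f"0x{session_ref:08X}")              # 0x08141084
wire = bytes([0x0A]) + session_ref.to_bytes(4, "big") + bytes([0x0D])
print(wire.hex(" "))                       # 0a 08 14 10 84 0d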
What we did
Our plan…
• Script simulating login and requisition
• 1000 user accounts, 30,000 products (a data-preparation sketch follows below)
BUT
• Application update in January meant re-coding and added delay
• Initial tests showed problems with scalability… before the requisition step was fully scripted
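Purely as an illustration of preparing test data at that scale, a hypothetical sketch that writes user-account and product parameter files for the load-test scripts (file names, field layout and naming pattern are assumptions, not taken from the project):

import csv

# Sketch only: generate 1000 hypothetical user accounts and 30,000 product
# codes as CSV parameter files for data-driven load-test scripts.
with open("users.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["username", "password"])
    for i in range(1, 1001):
        writer.writerow([f"loadtest{i:04d}", "Password1"])

with open("products.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["product_code"])
    for i in range(1, 30001):
        writer.writerow([f"PROD{i:05d}"])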
Problems @ 20 user load
• Blank screens / HTTP 500 errors
• Spent time proving the test tool
• Manual tests with volunteers & network traces to prove the simulation was equivalent to real users
Iterative testing began…
POC → Scripting → Low-volume tests → Performance tests → Analysis
…modified the test approach as we found defects
Initial tests
• Very high thread count: > 2300 threads associated with the JCSP process
• Very high context switch rate: > 17,000 switches / sec
Initial Tests
Key observations

                         System Idle   Manual test   Automated test
CPU utilisation          <2%           <2%           <6%
Available memory         30.1 GB       28.73 GB      27.3 GB
JCSP thread count        82            290           371
Total thread count       797           1932          2446
Context switches/sec     275           2000          2500
Processor queue length   <1            <1            <1

High thread count and context switching are key issues
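These counters would typically be gathered with Windows Performance Monitor on the application server. Purely for illustration, a rough equivalent of the same observations can be sampled with Python's psutil (the "jcsp" process-name match and the one-second sample are assumptions):

import psutil

# Sketch only: sample the counters reported in the table above.
ctx_before = psutil.cpu_stats().ctx_switches
cpu_pct = psutil.cpu_percent(interval=1)                    # CPU utilisation (%), ~1 s sample
ctx_per_sec = psutil.cpu_stats().ctx_switches - ctx_before  # context switches during that second
avail_gb = psutil.virtual_memory().available / 1024 ** 3    # available memory (GB)
procs = list(psutil.process_iter(["name", "num_threads"]))
total_threads = sum(p.info["num_threads"] or 0 for p in procs)
jcsp_threads = sum(p.info["num_threads"] or 0 for p in procs
                   if "jcsp" in (p.info["name"] or "").lower())
print(cpu_pct, avail_gb, jcsp_threads, total_threads, ctx_per_sec)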
Initial Tests
User experience
Response times > 20 seconds
Next steps
Reconfiguration & re-test

We made recommendations to improve performance
We asked Datel to check:
• Heap size
• Application pool size
• Timeout values

Datel reconfigured the application server:
• Encrypted login credentials within the application
• Altered TCP/IP timeout values and keep-alives
• Set the lifetime session limit to 30 minutes
• Registry changes
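One concrete example of the kind of setting involved: TCP keep-alive behaviour on a Windows server is controlled by registry values under Tcpip\Parameters. A minimal sketch for checking the current value (KeepAliveTime is a standard Windows value; whether this is exactly what Datel changed is an assumption):

import winreg

# Sketch only: read the TCP keep-alive interval (milliseconds).
# If the value is absent, Windows defaults to 7,200,000 ms (2 hours).
path = r"SYSTEM\CurrentControlSet\Services\Tcpip\Parameters"
with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, path) as key:
    try:
        keep_alive_ms, _ = winreg.QueryValueEx(key, "KeepAliveTime")
    except FileNotFoundError:
        keep_alive_ms = 7_200_000
print(f"KeepAliveTime = {keep_alive_ms} ms")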
Next Steps
Re-test

Re-tested, but….

Re-test observations:
• Response times consistent, despite the load balancer problem
• No degradation over time
Further tests
Test stats
1. 230 users: 1289 logins / hr
2. 250 users: 1383 logins / hr
3. 500 users: 2715 logins / hr
4. 500 users: 5412 logins / hr
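For context, those throughput figures translate into per-user login rates roughly as follows (simple arithmetic on the numbers above, not additional measured data):

# Rough arithmetic on the test stats above.
runs = [(230, 1289), (250, 1383), (500, 2715), (500, 5412)]
for users, logins_per_hr in runs:
    per_user = logins_per_hr / users   # logins per user per hour
    gap_min = 60 / per_user            # average minutes between logins for each user
    print(f"{users} users: {per_user:.1f} logins/user/hr (one every {gap_min:.1f} min)")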
Final report
….some caveats

We noticed large numbers of HTTP 404 errors

Still missed some “asynchronous” traffic

Hadn’t tested complex application flows, due primarily to time constraints
Conclusion
What have we learned?

You probably need to test
• Reduced response times for SAGE 1000 login from > 20 seconds (and timeouts) to ≈ 3 s
• The application worked, just not in our particular configuration

Testing needn’t be expensive
• Thanks to UHB for the endorsement

Don’t trust vendors (or developers) to test
Q&A
[email protected]
@richardbishop @TrustIV
0844 870 0301