
Performance in the New Millennium
Speaker: Mark Tomlinson
April 2012
Twin Cities Quality Assurance Association
First, a canine metaphor…
Old Dogs
New Dogs
What can we learn?
Old dogs:
• Take their time
• Consistent
• Predictable
• They know themselves
• Trusting and Loyal
• Obedient
New dogs:
• Anxious and Busy
• Ferocious Learning
• Eating and Growing
• Making mistakes
• Breaking the rules
• Unpredictable
Reactions to Novelty
Neophobic
Neophilic
…stop, look around
There are new things happening faster around us:
• Increase in global interactivity
• Pervasive mobile use, dependency, addiction
• Era of the “connected worker”
• Power and control of information
• Accelerated cycles of novelty
“The world is shifting quite literally under our feet.”
– Roi Carmel
SO, WHAT’S NEW?
Web Performance Optimization
Why you should add #WPO to your approach:
• You can start earlier in the lifecycle
• You don’t need load; optimize with single-session data
• The tools are already there – automate them (GUI Virtual User?)
• Influence performance closer to the end-user
• It’s really popular – with meetup groups and social media hangouts
Web Performance Optimization
80-90% of the end-user response time is spent on the
frontend. Start there.
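To make the single-session idea concrete, here is a minimal Python sketch (assuming the requests library; the URL is a placeholder) that splits one page fetch into a rough backend share (time-to-first-byte) and the remainder spent on download. Real WPO tools like YSlow! and PageSpeed go much further, analyzing every asset on the page.

```python
# Minimal single-session sketch: rough TTFB vs. total fetch time.
# Assumes the `requests` library; URL is a placeholder, not from the talk.
import time
import requests

URL = "https://example.com/"  # hypothetical target page

start = time.perf_counter()
resp = requests.get(URL, stream=True)   # returns once headers are parsed
ttfb = time.perf_counter() - start      # rough time-to-first-byte
body = resp.content                     # force the full body download
total = time.perf_counter() - start

backend_pct = ttfb / total * 100
print(f"TTFB: {ttfb*1000:.0f} ms  total: {total*1000:.0f} ms  "
      f"backend ~{backend_pct:.0f}% / download ~{100-backend_pct:.0f}%")
print(f"status: {resp.status_code}, bytes: {len(body)}")
```

This measures only the HTML document, not rendering or third-party assets, but it shows how far you can get with a single session and no load tool at all.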
Web Performance Optimization
On Twitter follow @souders or search for #WPO
stevesouders.com – Head Performance Engineer at Google
Steve spurred the creation of numerous performance
tools and services including YSlow!, HTTP Archive,
Cuzillion, Jdrop, ControlJS, and Browserscope.
Author of several books…
Cloud Performance
Tips on cloud-based performance testing:
• Cloud-savvy Ops teams are already open to this
• Consider your cloud provider carefully
Load generators (LG’s) vs. system under test:
From Cloud → To Cloud: self-service and on-demand; use a different cloud from the SUT; repeatable but not consistent
From Cloud → To On-prem: security and firewall issues; data privacy issues; traffic routing issues
From On-prem → To Cloud: put LG’s outside the firewall; use secure channels to LG’s; emulate real-world noise
From On-prem → To On-prem: test results stay on prem; can’t find bugs outside the FW; match production scale %
Mobile Performance
“The mobile insanity is here to stay…good apps are
being punished by bad performance every day.”
Consider new techniques for mobile performance:
• Injecting mobile latency, switching, and loss into tests (sketched below)
• Inspecting mobile device performance (@JAHarrison)
• Analyzing mobile user behavior
• Testing for traffic segmentation
• Testing for application tolerances
• Knowing your devices & users
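As a sketch of the first technique: on a Linux load generator you can shape outbound traffic with tc/netem to resemble a congested mobile link. The interface name and the delay/loss values below are assumptions (root privileges required); adjust them to your own network profile.

```python
# Minimal sketch: inject mobile-like latency, jitter, and loss with
# tc/netem on a Linux load generator. Requires root; "eth0" is an
# assumed interface name.
import subprocess

IFACE = "eth0"  # hypothetical interface on the load generator

def add_mobile_profile(delay_ms=300, jitter_ms=100, loss_pct=1.0):
    """Shape outbound traffic: mean delay with jitter, plus packet loss."""
    subprocess.run(
        ["tc", "qdisc", "add", "dev", IFACE, "root", "netem",
         "delay", f"{delay_ms}ms", f"{jitter_ms}ms",
         "loss", f"{loss_pct}%"],
        check=True)

def clear_profile():
    """Remove the netem qdisc, restoring normal networking."""
    subprocess.run(["tc", "qdisc", "del", "dev", IFACE, "root"], check=True)

if __name__ == "__main__":
    add_mobile_profile()
    try:
        pass  # run your load or single-session test here
    finally:
        clear_profile()
```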
Early Performance Testing
Old Performance Tester:
“As usual, performance testing is always left until the end of the project.”
New Performance Tester:
“Why not learn what we can as soon as we can? Shift left.”
Characteristics of early performance testing efforts:
– Automation must “work around” unstable components
– Service Virtualization and “stubbing” is a necessity (see the stub sketch after this list)
– Results are always “preliminary” and not conclusive
– Limited availability of test lab resources – shared infrastructure
– Business visibility is much higher (agile, pairing, scrum, reviews)
– Embraces the core principles of Exploratory Testing
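As a sketch of what “stubbing” can mean in practice: a stand-in HTTP service that returns a canned payload with injected latency, so early performance tests can run before the real dependency is stable. The port, payload, and latency below are illustrative assumptions, not part of the talk.

```python
# Minimal service-virtualization sketch: a stub for an unstable
# downstream dependency. Payload, latency, and port are assumptions.
import time
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

CANNED = b'{"campaign": "demo", "clicks": 1234}'  # hypothetical payload
LATENCY_S = 0.150  # emulate the dependency's typical response time

class StubHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        time.sleep(LATENCY_S)  # inject a realistic, repeatable delay
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(CANNED)))
        self.end_headers()
        self.wfile.write(CANNED)

if __name__ == "__main__":
    # Threading server so concurrent virtual users don't serialize.
    ThreadingHTTPServer(("0.0.0.0", 8081), StubHandler).serve_forever()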
Performance Testing in Production
Tuning in production – that’s old. Ops knows it.
Testing in production – that’s new. Ops is scared.
Real-world Performance Testing: bringing the real-world workload, noise, data, configuration, and monitoring into the lab. Enhancing accuracy.
Testing in Production: bringing the test into the production environment. Simply easier and more cost-effective than building a lab.
Performance Testing in Production
Testing in Production is becoming popular with new companies who never had a lab or never knew how to test properly the “old” way.
SOASTA has a great slide comparing the issues you can find with traditional, in-lab performance testing against the broader continuum of issues you can find if you enhance your approach to include performance testing in production.
PE vs. PT
Considering the titles:
• “…which has a more defined career path.”
• “…influences the software design.”
• “…requires developer skills.”
• “…drives the business objectives.”
• “…can save money for the company.”
• “…optimizes the end-user experience.”
• “…is exchangeable with the other title.”
The title “Performance Engineer” won 6 of 7
New Performance Testers
Developer PE
• Application Development
• Application Profiling
• Method/Query Tuning
• Architectural Review
• Component Performance
• Pre-release Load Testing
• Building Performance Tools
• International Micro-brews
Operational PE
• Capacity Planning
• APM and RUM
• Tuning and Root-cause
• Triage and Escalation
• Infrastructure
• Workload Analytics
• Scalability
• Single-malt Scotch
Software Developers in Performance
Once upon a time in the dev/test team…
Software Development Engineer in Performance:
• likes automation, but only to keep learning how to code
• a stepping stone to becoming a real developer
• knows the automation code – still learning the app & users
• not interested in testing, generally
• responsible for running tests
• accountable for running tests
Network Performance Optimization
Consider that load testing without emulating the cache settings on the client can now result in total inaccuracy.
If your systems in production leverage network
accelerators or optimizers – consider this:
• Your response time measures are wildly pessimistic
• The bugs you seek are not in the application layers
• You have a new customer on the Network team
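A minimal sketch of cache emulation in a scripted test, assuming the Python requests library (the URL is a placeholder): the repeat request carries the conditional headers a warmed browser cache would send, so a well-configured server can answer 304 Not Modified instead of resending the full asset.

```python
# Minimal sketch: emulate a warmed client cache with conditional requests.
# Assumes the `requests` library; the asset URL is a placeholder.
import requests

URL = "https://example.com/static/app.js"  # hypothetical cacheable asset

first = requests.get(URL)                  # cold fetch primes the "cache"
etag = first.headers.get("ETag")
last_mod = first.headers.get("Last-Modified")

headers = {}
if etag:
    headers["If-None-Match"] = etag        # what a browser cache would send
if last_mod:
    headers["If-Modified-Since"] = last_mod

repeat = requests.get(URL, headers=headers)
print(first.status_code, len(first.content))    # e.g. 200 with full body
print(repeat.status_code, len(repeat.content))  # 304, empty body, if cacheable
```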
Profiling and Diagnostics
“If you aren’t monitoring the system under test while
you are running a load test – you should just give up
and go home. You’re causing more harm than good.”
Combine monitoring and profiling in your load tests:
• dig into the root-cause and recommend the solution
• correlate infrastructure usage to the application code
• dissect the system under test – follow the slow code
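As a sketch of “monitor while you load”: sample the system under test alongside the test and keep the samples for later correlation against response times. This assumes the psutil library runs on the SUT host; the interval and filename are placeholders.

```python
# Minimal sketch: sample the SUT host during a load test with `psutil`.
# Interval and output filename are assumptions.
import csv
import time
import psutil

with open("sut_monitor.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["epoch", "cpu_pct", "mem_pct",
                     "disk_read_mb", "disk_write_mb"])
    while True:  # stop with Ctrl-C when the load test ends
        cpu = psutil.cpu_percent(interval=5)   # 5-second sample window
        mem = psutil.virtual_memory().percent
        io = psutil.disk_io_counters()         # cumulative since boot
        writer.writerow([int(time.time()), cpu, mem,
                         io.read_bytes / 2**20, io.write_bytes / 2**20])
        f.flush()                              # survive an abrupt stop
```

Correlate the timestamps in this CSV with the load tool’s transaction log to tie infrastructure usage back to application behavior.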
Agile Performance Stories
Functional User Story:
So that I can report to my management
As a campaign manager
I want to see the results of my active campaign on a single web page
Agile Performance Stories
Performance User Story:
So that campaign managers can report accurately and timely to their managers
As an Operations Manager
I want the data on the active campaign page to render in less than 2 secs, when 10,000 users are logged in to the website
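As a sketch of how that story could become an executable check, here it is in Locust, an open-source Python load tool. The host, path, and script name are assumptions; the 2-second assertion comes straight from the story.

```python
# Minimal sketch of the performance story as a Locust check.
# Path and think times are assumptions; the 2 s SLA is from the story.
from locust import HttpUser, task, between

class CampaignManager(HttpUser):
    wait_time = between(5, 15)  # think time between page views

    @task
    def view_active_campaign(self):
        with self.client.get("/campaigns/active",
                             catch_response=True) as resp:
            # Fail the sample if the page misses the 2-second SLA.
            if resp.elapsed.total_seconds() > 2.0:
                resp.failure("active campaign page exceeded 2 s")
            else:
                resp.success()
```

Run it with something like `locust -f campaign.py --host https://example.com --users 10000 --spawn-rate 100` to scale toward the story’s 10,000 logged-in users.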
Listening for Performance
Active listening is a communication technique that requires the listener to feed back what he hears to the speaker, by restating or paraphrasing what he has heard in his own words, to confirm what he has heard and, moreover, to confirm the understanding of both parties.
• nothing is being assumed or taken for granted
• reduces misunderstanding and conflicts
• strengthens cooperation
• it is proactive, accountable and professional
What's this DevOps stuff?
DevOps:
• Self-proclaimed cultural movement (is it a fad?)
• Embraces patterns for collaboration and…
• Leads next-generation engineering principles
• Not grounded in top-down methods or mgmt
• The “ops” part is performance savvy
• The “dev” part…not so much
[Diagram: Continuous App Delivery (DEV) and Continuous App Optimization (OPS), with PERF between them – performance binds Dev & Ops]
Application Performance Index
A new way to measure the end-user experience – a score for tolerance:
• Measures response times with a calculation, on a scale between 0 and 1
• Infers a level of customer happiness or satisfaction based on the calculation
• Depends on two settings:
• T = the threshold between satisfied and tolerating
• F = the threshold between tolerating and frustrated
Check out: apdex.org
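The calculation itself is simple. Here is a minimal sketch per the formula published at apdex.org (score = (satisfied + tolerating/2) / total samples); the sample data and default T below are illustrative assumptions.

```python
# Minimal sketch of the Apdex calculation as defined at apdex.org.
# Sample data and the default T are assumptions for illustration.
def apdex(response_times_s, t=0.5):
    """Return an Apdex score in [0, 1] for a non-empty list of
    response times (seconds)."""
    f = 4 * t  # the standard Apdex spec derives F = 4T
    satisfied = sum(1 for rt in response_times_s if rt <= t)
    tolerating = sum(1 for rt in response_times_s if t < rt <= f)
    return (satisfied + tolerating / 2) / len(response_times_s)

# Example: mostly fast samples with a few slow outliers.
print(apdex([0.3, 0.4, 0.8, 1.9, 5.0], t=0.5))  # -> 0.6
```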
Application Performance Index
The default sets T to 6 seconds and F to 26 seconds.
It is not transaction-specific.
You still need to know your customer and set SLA’s
Application Performance Index
APDEX is still just a number – a product of a calculation.
To use APDEX correctly you must:
1) determine the settings for T and F
2) explain the meaning of the APDEX score to the business
3) relate the APDEX score to revenue
Exploratory Performance Testing
Performance is inherently exploratory
• Has over-arching goals for performance
• Includes iteratively adapted tactics
• Fueled by self-direction and curiosity
• Requires dynamic test configurations
• Places a priority on creativity and improvement
Exploratory Performance Mapping
[Diagram: a quadrant map with axes Physical ↔ Logical and Distributed ↔ Centralized]
Exploratory Performance Mapping
[Diagram: the same quadrants populated with tiers – CLIENT (BROWSER on OS), WEB (HTTPd on OS), APP (FX on OS), DATA API (RDBMS on OS)]
Lifecycle Virtualization
“Using virtualization technologies along the path
of the entire application lifecycle is now an
essential part of continuous, automated
software development and release.”
• Platform virtualization is the foundation – the virtual OS
• Service virtualization eliminates application dependencies
• Device and network virtualization can emulate the real world
• Virtual Lab Management (VLM) allows agility in the test lab
New Load Testing Tools
Established:
• HP LoadRunner and PC
• IBM/Rational RPT
• JMeter
• Microfocus/Performer
• Webload
• Compuware
• Microsoft VS
• Oracle/Empirix
Newcomers:
• AgileLoad
• Web Performance LT
• Neotys NeoLoad
• SOASTA CloudTest Lite
• Telerik
• Load Complete
New Cloud Perf Tools
Established:
• SOASTA CloudTest
• Keynote
• Compuware/Gomez
• HP LoadRunner Cloud
• HP PC SaaS
Newcomers:
• Blazemeter
• LoadZen
• LoadStorm
• Blitz.io
• Apica
• Cloud Assault
New Client Diagnostics
Established:
• Fiddler
• YSlow!
• Firefox Firebug
• Shunra
• Google PageSpeed
• Akamai Mobitest
• WebSiteOptimization
Newcomers:
• Charles Proxy
• WebPageTest.org
• Google PageSpeed
• WebWait.com
• GTMetrix.com
• BenchJS
New App Diagnostics Profiling
Established :
• JProfiler
• CA Wily
• HP Diagnostics
• Dell/Quest PerformaSure
• Intel V-Tune
• Microfocus DevPartner
• Microsoft VS – IntelliTrace
Newcomers:
• Yourkit
• NewRelic
• dynaTrace
• Google Perf Tools
New Monitoring Tools
Established :
• IBM Tivoli TPV
• HP APM & BAC
• HP SiteScope
• Microsoft SCOM
• CA APM & Wily
• Dell/Quest Foglight
• Compuware Vantage
Newcomers:
• AppDynamics
• Nagios/Cacti
• Zyrion Traverse
• NewRelic
• Confio Ignite
• Gomez 360
• Splunk
• Blue Triangle
CONCLUSION
Our reactions to novelty are shaped as follows:
“The likeliness for an individual to be included in one of
those categories depends both on what it has learned
from its socialization with its own species and from
experiences from exploring its environment (as range
expansion often brings animals into contact with
novelty), especially the maternal influence of the
individuals’ experiences, and the individual’s genetics
influencing its likeliness to explore or focus on
remaining within a safe and familiar space.”
Source: “We All Like New Things, Or Do We? A comparison of human and non-human primate reactions to novelty.” (Wiley)
Mark Tomlinson
West Evergreen Consulting, LLC
[email protected]
+1-773-775-8780
mtomlins.blogspot.com
@mtomlins & @perfbytes