We released the final version of our patterns & practices Performance Testing Guidance for Web Applications. This guide provides an end-to-end approach for implementing performance testing. Whether you’re new to performance testing or looking for ways to improve your current performance-testing approach, you will gain insights that you can tailor to your specific scenarios. The main purpose of the guide is to be a relatively stable backdrop to capture, consolidate and share a methodology for performance testing. Even though the topics addressed apply to other types of applications, we focused on explaining from a Web application perspective to maintain consistency and to be relevant to the majority of our anticipated readers.
Download the guide
Read the guide online
Stay tuned for a link to purchase the print version due to be available in early Oct. (via "softwaretesters" via Rosie Sherry in Google Reader)
I never really understood why so many people external to Microsoft seem to be against the Microsoft strategy to increase the amount of automation we rely on to test our products. Test automation has become sine qua non at Microsoft for many valuable reasons… (read more)
I previously had a list of software-testing-related videos which I had been building up over time. That has now changed: I have recently been posting videos to the Software Testing Club, and there has been a bit of teamwork going on, as I’m not the only one who has added videos.
If you’re looking for software testing videos or would like to add some to our growing collection the Software Testing Video Club (!!!) is the place to go.
Development houses have a right to expect a lot from a freelance tester.
Firstly, development houses often run on shorter development times and smaller budgets. It’s essential for the tester to understand your requirements from day one.
Secondly, you have recognised the advantage of having an independent review of the product and that in its own right deserves to be appreciated.
Understanding testing requirements isn’t exactly common knowledge in IT. So, I’ve created a short questionnaire to help development houses clarify what they want from testing.
1) What do you value most? Consistency, quality, or scope of testing?
2) What specifically do you want tested in cross browser testing?
3) What technology are you using?
4) What testing has already been performed?
5) Do you have any specs of any sort?
6) Is this new software, or updated software?
7) What sort of feedback do you want? Defect reports, results?
Note: These questions have been created with web testing in mind, but can be changed for any type of testing
Having this information upfront helps everyone because:
1) we have an agreed understanding of the scope of testing
2) I am able to validate my quote.
3) The more information you provide upfront, the greater your return on investment, as I can prioritise and focus on your key areas (not what I think should be your key areas!)
4) It gives you an insight into the software testing process
This is the third installment of a currently unknown number of posts about heuristics and mnemonics I find valuable when teaching and conducting performance testing.
Other posts about performance testing heuristics and mnemonics are:
For years, I have championed the use of production logs to create workload models for performance testing. During the same period, I’ve been researching and experimenting with methods to quickly create “good enough” workload models without empirical data, in a way that still increases the value of the performance tests. I recently realized that these two ideas are actually complementary, not mutually exclusive, and that with or without empirical usage data from production logs, I do the same thing, I:
FIBLOTS. (via "softwaretesters" via Rosie Sherry in Google Reader)
My approach to any rapidly changing software development is to make some basic assumptions:
1) The code is going to change quickly
2) The person who coded won’t necessarily be there for the next release
3) The person who tested the last time, won’t necessarily be there for the next release
4) Test documentation helps to provide structure and is excellent as a guide to ensure adequate coverage
5) Testers are able to think outside the square and also fill in the gaps in a test
I use one spreadsheet (if possible) to track the following information:
a) test script number
b) test link (if available) - a reference to requirements, etc.
c) test purpose - a clear concise description of what I am planning to test
d) test results - Pass or Fail
e) defect number - I assign a number or use the defect tracking system
f) test estimate - the time I anticipate it will take to test this functionality
I use the document to first scope out what I want to test, a quick and dirty overview of what I am planning. I also use it to estimate how long testing is going to take. This is helpful if I am ever asked to substantiate my estimates.
Once all stakeholders are in agreement on the scope, I create a concise and descriptive purpose for each test. This is in essence my test script. I use the same spreadsheet to enter results, defect numbers, and to calculate simple metrics such as the pass rate.
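As a rough sketch, the spreadsheet described above could be modelled as a simple table of rows, which makes the estimate and pass-rate calculations concrete. All field names and sample rows here are illustrative assumptions, not the author’s actual spreadsheet:

```python
# Hypothetical model of the one-spreadsheet tracker: each row mirrors the
# columns listed above (script number, link, purpose, result, defect, estimate).
# Sample data is made up for illustration.
tests = [
    {"id": 1, "link": "REQ-01", "purpose": "Login with valid credentials",
     "result": "Pass", "defect": None, "estimate_hours": 0.5},
    {"id": 2, "link": "REQ-02", "purpose": "Login with invalid password",
     "result": "Fail", "defect": "D-101", "estimate_hours": 0.5},
    {"id": 3, "link": "REQ-03", "purpose": "Password reset email is sent",
     "result": "Pass", "defect": None, "estimate_hours": 1.0},
]

# Total estimate: useful when asked to substantiate the testing quote.
total_estimate = sum(t["estimate_hours"] for t in tests)

# Pass rate: one of the "simple metrics" the spreadsheet can calculate.
executed = [t for t in tests if t["result"] in ("Pass", "Fail")]
pass_rate = sum(t["result"] == "Pass" for t in executed) / len(executed)

print(f"Estimated effort: {total_estimate} h")
print(f"Pass rate: {pass_rate:.0%}")
```

In a real engagement this would live in the spreadsheet itself (a SUM and a COUNTIF formula do the same job); the point is only that keeping results and estimates in one structured place makes these numbers free to produce.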