Testing Best Practices

Test Case Creation
Once requirements have been created and approved, and while coding is in progress, it is time to create test cases. The goal is to have the full list of test cases ready before coding is complete, so that you do not waste time during the QA process.
Use the Test Type field (a choice list) to identify the type of each test (a minimal data model is sketched after the list):
• Negative
• Positive
• Regression
• Smoke Test
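Here is one hypothetical way such a record might be modeled in Python; the class and field names are illustrative, not taken from any particular tool:

    from dataclasses import dataclass
    from enum import Enum

    class TestType(Enum):
        """Mirrors the Test Type choice list above."""
        NEGATIVE = "Negative"
        POSITIVE = "Positive"
        REGRESSION = "Regression"
        SMOKE_TEST = "Smoke Test"

    @dataclass
    class TestCase:
        """A hypothetical test case record with the fields this article uses."""
        title: str
        test_type: TestType
        status: str = "Active"    # e.g. Active or Outdated
        smoke_test: bool = False  # the Smoke Test flag

    # Example: a smoke test case for a project-management system.
    tc = TestCase("Create a new project plan", TestType.SMOKE_TEST, smoke_test=True)
    print(tc.test_type.value)  # -> Smoke Test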
Release Preparation / Regression Test Cases
Each new release has the potential to break existing features of your software. It is a good idea to maintain a set of regression test cases that are run to verify that all existing features still work as originally designed.
Before testing begins, the existing test cases should be updated to ensure a high-quality test process for the release. Below are the items that should be done in preparation (a scripted sketch follows the list):
• Clear the Smoke Test flag for all existing test cases, since new smoke tests should be designed for this release. Do a fast edit to change the value to No.
• Review all the regression test cases (those with a Test Type of Regression). If any are no longer applicable to the regression set, change their Status to Outdated.
• Review the regression set for gaps. If any test cases are missing (for example, major functionality added in the prior release that is not yet covered), add them to the regression set.
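As a sketch of that preparation pass, reusing the hypothetical TestCase records from the earlier example (a real test-management tool would do this through its own fast-edit or bulk-update feature):

    def prepare_for_release(test_cases, outdated_titles):
        """One preparation pass before release testing begins.

        outdated_titles: titles the team flagged as no longer applicable
        during the regression review (a hypothetical input).
        """
        for tc in test_cases:
            # Clear the Smoke Test flag everywhere (the "fast edit" to No);
            # new smoke tests will be designed for this release.
            tc.smoke_test = False
            # Mark flagged regression cases as Outdated.
            if tc.test_type is TestType.REGRESSION and tc.title in outdated_titles:
                tc.status = "Outdated"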

Smoke Tests
If you want to ensure that your testing process will be fruitful, it is common practice to create smoke tests. Smoke tests are simply a list of about 20 to 30 test cases that cover the major areas being tested. For example, if you are creating a new Project Management system, the smoke tests would include basic test cases to ensure that you can create a new project plan, add tasks to it, rename it, make changes to it, and delete it. These are not "over the top" test cases that really tax the functionality; they are simply tests that should work because they are very basic.
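As a sketch, the project-management example above might translate into smoke tests like these, written in pytest style against a tiny in-memory stand-in for the system under test (the class and its methods are hypothetical):

    import pytest

    class ProjectPlanApp:
        """Tiny in-memory stand-in for the system under test (hypothetical)."""
        def __init__(self):
            self._plans = {}  # plan name -> list of tasks
        def create_plan(self, name):
            self._plans[name] = []
        def add_task(self, plan, task):
            self._plans[plan].append(task)
        def rename_plan(self, old, new):
            self._plans[new] = self._plans.pop(old)
        def delete_plan(self, name):
            del self._plans[name]
        def plans(self):
            return list(self._plans)

    @pytest.fixture
    def app():
        return ProjectPlanApp()

    def test_create_project_plan(app):
        app.create_plan("Widgets 1.0")
        assert "Widgets 1.0" in app.plans()

    def test_add_task_to_project_plan(app):
        app.create_plan("Widgets 1.0")
        app.add_task("Widgets 1.0", "Design the widget")
        assert app._plans["Widgets 1.0"] == ["Design the widget"]

    def test_rename_project_plan(app):
        app.create_plan("Widgets 1.0")
        app.rename_plan("Widgets 1.0", "Widgets 2.0")
        assert "Widgets 2.0" in app.plans()

    def test_delete_project_plan(app):
        app.create_plan("Widgets 1.0")
        app.delete_plan("Widgets 1.0")
        assert "Widgets 1.0" not in app.plans()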


If the smoke tests fail, the QA process may need to stop until they are fixed, or testing may be able to proceed with the requirements that did not fail (so as not to impede progress). The idea is that if a requirement cannot pass the basic tests, it is not ready for a rigorous QA testing process, so all smoke tests for a specific requirement must pass before the QA team spends valuable time testing it.
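That gating rule is easy to express in code. A minimal sketch, assuming each smoke test result records the requirement it covers:

    def ready_for_full_qa(requirement_id, smoke_results):
        """True only if every smoke test for this requirement passed.

        smoke_results: list of (requirement_id, passed) tuples.
        """
        relevant = [ok for req, ok in smoke_results if req == requirement_id]
        return bool(relevant) and all(relevant)

    # Example: REQ-12 has a failing smoke test, so QA should wait on it.
    results = [("REQ-12", True), ("REQ-12", False), ("REQ-7", True)]
    print(ready_for_full_qa("REQ-12", results))  # False
    print(ready_for_full_qa("REQ-7", results))   # True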
When creating the smoke tests, use these guidelines:
• Create a Test Set for the smoke tests.
• Assign the smoke test cases to the person merging the code.

Review the smoke test set as a team. Here are the key things to look for (a coverage-check sketch follows the list):
• Is every requirement for the release covered by at least one smoke test item?
• Are the smoke test items clear enough to run?
• Are any smoke test cases missing that are needed to ensure a good start to testing?
• Are any smoke test items too in-depth, such that they should really be standard test cases instead?
• Do we have a manageable set of smoke test cases (20 to 30)?
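The first review question, requirement coverage, can be checked automatically. A sketch, assuming each smoke test is tagged with the requirement it covers:

    def uncovered_requirements(requirements, smoke_tests):
        """Return the release requirements no smoke test covers.

        requirements: set of requirement IDs in the release.
        smoke_tests:  mapping of smoke test title -> requirement ID covered.
        """
        return sorted(requirements - set(smoke_tests.values()))

    release_reqs = {"REQ-7", "REQ-12", "REQ-15"}
    smoke = {"Create a project plan": "REQ-7",
             "Rename a project plan": "REQ-12"}
    print(uncovered_requirements(release_reqs, smoke))  # ['REQ-15']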
Positive Testing
Positive testing is testing the software in exactly the way it was designed to be used. To create these test cases, follow the requirements document to ensure that all features of the software are tested. In positive testing, you are not trying to "trick" the system; you are simply testing every feature per the design.
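For example, a positive test simply exercises the documented happy path. A minimal sketch, using a hypothetical date-field validator:

    from datetime import datetime

    def parse_due_date(text):
        """Hypothetical field validator: accepts MM/DD/YYYY dates."""
        return datetime.strptime(text, "%m/%d/%Y").date()

    def test_valid_due_date_is_accepted():
        # Positive test: valid input, entered exactly as designed.
        assert str(parse_due_date("01/15/2006")) == "2006-01-15"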
Negative Testing
Negative testing tries to break the software in ways that a typical user might, by entering erroneous data. Here are some examples of negative testing (a test sketch follows the list):
• Date formats – Try entering invalid dates (like 02/30/2006), alphabetic dates (like Jan 1, 2006), and totally bogus information (xxxxxxx).
• Numeric formats – If you know a field must allow only numeric entry, try entering character data (abcdefg). Try entering commas, decimals, hyphens, and dollar signs.
• Field sizes – If you know the size of a field (say, up to 20 characters), try entering a large amount of data (like 100 X's); it should be handled gracefully, but many times this will cause a crash. I like to go to every field, enter tons of x's, and see what happens.
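The same ideas, turned against the hypothetical date validator from the positive-testing sketch above:

    import pytest
    from datetime import datetime

    def parse_due_date(text):
        """Hypothetical field validator: accepts MM/DD/YYYY dates."""
        return datetime.strptime(text, "%m/%d/%Y").date()

    @pytest.mark.parametrize("bad_input", [
        "02/30/2006",   # invalid calendar date
        "Jan 1, 2006",  # alphabetic date
        "xxxxxxx",      # totally bogus information
        "x" * 100,      # far more data than the field expects
    ])
    def test_bad_dates_are_rejected(bad_input):
        # Each bad input should raise a clean error, not crash the system.
        with pytest.raises(ValueError):
            parse_due_date(bad_input)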
Relational Testing
Relational testing is where you try to break the software through the relationships between its records. Here are some examples (a test sketch follows the list):
• Duplicate records – Try to create duplicate records. For example, if we are testing the setup of a Project Plan, create a new project plan named Widgets 1.0, then add another project plan with the same name. Allowing that may be a business flaw; perhaps you should never have two project plans with the same name. This depends on your actual requirements, but it should always be considered.
• Renaming items to cause duplicate records – In the prior example, we tried creating two project plans named Widgets 1.0. In this case, create a project plan named Widgets 1.0 and another named Widgets 1.1, then try to rename Widgets 1.1 to Widgets 1.0; the system should recognize that this creates a duplicate.
• Deleting records should delete dependent records – If you delete data that has dependent records, both the parent and the child records should be deleted. For example, if you delete a Project Plan that has 100 tasks, it should delete all 100 tasks and the Project Plan itself.
• Deleting records with dependencies – In other scenarios, you may want to prevent a person from deleting a parent item if it has child records. For example, assume a test case is related to a defect. Deleting the defect would lose the relation to the test case, so you would not allow the deletion of the defect unless the test case was deleted first. This is just an example; it will depend on your business rules.
• Auditing – Always test to ensure that all actions are audited (if the system is designed to do so). Run audit reports to verify that this is the case.
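Two of these relational checks, sketched against a hypothetical store that enforces the rules discussed above (your own business rules may differ):

    import pytest

    class PlanStore:
        """Hypothetical store with unique plan names and cascading deletes."""
        def __init__(self):
            self._tasks = {}  # plan name -> list of tasks

        def create(self, name):
            if name in self._tasks:
                raise ValueError(f"duplicate project plan: {name}")
            self._tasks[name] = []

        def rename(self, old, new):
            if new in self._tasks:
                raise ValueError(f"rename would create a duplicate: {new}")
            self._tasks[new] = self._tasks.pop(old)

        def delete(self, name):
            # Cascading delete: the plan and all of its tasks go together.
            del self._tasks[name]

    def test_duplicate_names_are_rejected():
        store = PlanStore()
        store.create("Widgets 1.0")
        with pytest.raises(ValueError):
            store.create("Widgets 1.0")

    def test_rename_into_duplicate_is_rejected():
        store = PlanStore()
        store.create("Widgets 1.0")
        store.create("Widgets 1.1")
        with pytest.raises(ValueError):
            store.rename("Widgets 1.1", "Widgets 1.0")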
Performance Testing
It is a good idea to design some performance test cases for each release to ensure that performance remains adequate. Create a spreadsheet for performance testing with the following sections (a timing sketch follows the list):
• Before Release / After Release – Record timings both before the code is merged and after, so you can determine whether the release is faster or slower than before.
• Preparation – To get accurate timings, clear your cache, reboot your machine, etc., before taking measurements.
• Area Testing – Test each critical area of your software, logged in with different accounts (accounts with small amounts of data, accounts with large amounts of data, etc.).
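A small timing harness can feed the Before Release / After Release columns of that spreadsheet. A sketch; the operation under test is any callable:

    import time

    def time_operation(label, operation, runs=3):
        """Run an operation several times and report the average in seconds."""
        timings = []
        for _ in range(runs):
            start = time.perf_counter()
            operation()
            timings.append(time.perf_counter() - start)
        avg = sum(timings) / len(timings)
        print(f"{label}: {avg:.3f}s average over {runs} runs")
        return avg

    # Example stand-in workload; record this before and after the merge.
    time_operation("Open large project plan", lambda: sum(range(1_000_000)))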

Automated Testing
Automated testing has many benefits, including the following (a sketch follows the list):
• Quicker Releases – With your regression test cases running automatically, your software quality team can concentrate on testing new features and spend less time regressing existing ones.
• Higher Quality Releases – Your software releases will have fewer bugs and require less customer support because they will be of higher quality.
• Happier Customers – Your customers will be happier and more willing to serve as testimonials for future prospects.
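One common way to wire this up is to tag regression test cases with a pytest marker and have the build server run them on every merge. A minimal sketch:

    import pytest

    @pytest.mark.regression
    def test_rename_keeps_tasks():
        # Hypothetical regression case: renaming a plan must not drop tasks.
        tasks = {"Widgets 1.0": ["Design", "Build"]}
        tasks["Widgets 2.0"] = tasks.pop("Widgets 1.0")
        assert tasks["Widgets 2.0"] == ["Design", "Build"]

Run only the regression set with pytest -m regression; registering the marker in pytest.ini keeps pytest from warning about an unknown mark.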
