I am looking for a white paper or some process / past-experience /
'how they did it' type of write-up on how a company whose product is
data (NOT manufacturing), similar to Lexis-Nexis, does full
integration testing. NOT QA, QC, QT, or unit testing.
I found a good definition of integration testing on Microsoft's web
site (go figure!):
Integration testing is a logical extension of unit testing. In its
simplest form, two units that have already been tested are combined
into a component and the interface between them is tested. A
component, in this sense, refers to an integrated aggregate of more
than one unit. In a realistic scenario, many units are combined into
components, which are in turn aggregated into even larger parts of the
program. The idea is to test combinations of pieces and eventually
expand the process to test your modules with those of other groups.
Eventually all the modules making up a process are tested together.
Beyond that, if the program is composed of more than one process, they
should be tested in pairs rather than all at once. Integration
testing identifies problems that occur when units are combined.
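
To make that definition concrete for a data product, here is a
minimal Python sketch of what I understand it to mean. parse_record
and to_xml are made-up stand-ins for two units that have already
passed their own unit tests; the integration test exercises the
interface between them:

# Two units, each already covered by its own unit tests.
def parse_record(line):
    """Unit 1: split a pipe-delimited line into named fields."""
    name, state, zip_code = line.split("|")
    return {"name": name, "state": state, "zip": zip_code}

def to_xml(record):
    """Unit 2: render a parsed record as an XML fragment."""
    return "<rec><name>%(name)s</name><zip>%(zip)s</zip></rec>" % record

# Integration test: feed the REAL output of unit 1 into unit 2 and
# check the combined result, rather than testing each in isolation.
def test_parse_then_render():
    xml = to_xml(parse_record("Smith|OH|44101"))
    assert xml == "<rec><name>Smith</name><zip>44101</zip></rec>"

test_parse_then_render()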
The largest hurdle I have run into is that there is a TON of material
on unit testing and integration testing for a manufactured physical
product, but nothing for a non-physical product. My product (data)
goes through several systems, can be changed by several systems, and
is converted into different formats across systems, and several fields
are dependent on each other. (In other words, if field A is changed it
could affect fields B, C, and D.) What processes or procedures can be
used to do solid integration testing? How do I know that all of these
systems are 'playing well in the sandbox'?
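
To illustrate what I mean by dependent fields, here is a rough Python
sketch (the field names and the rule are invented, just to show the
shape of the problem). The invariant is the kind of thing an
integration test would have to re-check after every system that
touches the record:

# Hypothetical dependency rules: if field A ("country") changes,
# fields B, C, and D ("currency", "tax_code", "postal_format") must
# change consistently, no matter which system touched the record.
EXPECTED = {
    "US": {"currency": "USD", "tax_code": "T1", "postal_format": "NNNNN"},
    "CA": {"currency": "CAD", "tax_code": "T7", "postal_format": "ANA NAN"},
}

def check_dependent_fields(record):
    """Return the dependent fields that violate the rules."""
    rules = EXPECTED[record["country"]]
    return [f for f, want in rules.items() if record.get(f) != want]

# An integration test would push the same record through each pair of
# systems, pull it back out, and assert there are no violations:
record = {"country": "CA", "currency": "CAD",
          "tax_code": "T7", "postal_format": "ANA NAN"}
assert check_dependent_fields(record) == []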