This InformationWeek article (Sept. 15) reports that testing is the neglected child of Y2K. Yet it is crucial that systems be tested. There are no tools that will significantly reduce the time and costs of testing. And just about everyone is saying that his firm will be ready to test at the same time: late 1998. The article reports: "Many facilities have little flexibility in their schedules, particularly during the heaviest period of testing, which is anticipated to be next year. At IBM, for example, the fully staffed multivendor recovery centers are booked through 1999, and already some customers have come for validation tests."
I ask: Where will they get the spare mainframe capacity? They won't. They will not run the necessary tests. Remember also that the United States has only 20% of the world's code. If U.S. organizations correct and test 100% of all their code, this will not save them. They are tied into the world's noncompliant systems.
* * * * * * *
In fact, tools will reduce the total year 2000 testing effort by just 2%, according to a study performed for General Motors Corp. Also, while replacing older computer systems with new ones that are year 2000 compliant will eliminate some code renovation, it won't erase the need to test the new systems for business functionality and year 2000 compliance.
"Very little of the testing effort can be automated," warns Michael Yudkin, VP of corporate systems and architecture at Chase Manhattan Bank in New York. In fact, most automation is done in the remediation phase, which accounts for just 10% of the total effort, he adds. . . .
One reason for the huge chore: The number of systems that must work together is awesome. Winkelman's Sanwa unit, for example, is 10 years old, but its year 2000 inventory turned up 215 software components, 95 hardware components, and 115 hardware and software vendors. "We've done other inventories for assessment purposes," says Winkelman, "but not to this extent."