When hurricanes come barreling toward land, people, airplanes and ships are relocated to avoid destruction. Now, some of the world’s biggest technology companies are making advances in figuring out how to quickly move large amounts of critical data and software out of harm’s way.

AT&T Inc., International Business Machines Corp. and Applied Communications Sciences have created a prototype technology to reduce the time needed to connect remote networks from days to seconds. Part of a seven-year program from the U.S. government’s Defense Advanced Research Projects Agency, the latest innovation from the partnership could allow large amounts of information and software to be more quickly shifted between private and public data centers, as well as between cloud services from different providers.

Moving data and software to another location could be crucial in coping with events like natural disasters and terrorist attacks, as governments and companies have become increasingly reliant on servers, both on-premises and in remote locations, to handle their information. The U.S. Department of Defense has expressed interest in the research, IBM said in an email.

“The key idea here is to have a highly dynamic backbone network,” said Adel Saleh, a research professor in electrical and computer engineering at the University of California, Santa Barbara, and a former DARPA project manager. “If the network is under physical attack or cyber attack, you can recover from this quickly.”

Remaining Hurdles

While many companies and governments already keep data remotely to maintain redundant systems, the companies’ prototype would let them reconfigure connections on the fly as needs, such as computing bandwidth, change.

The ability to use capacity only when needed could also make cloud computing less expensive, Saleh said in a phone interview.

The partnership between AT&T, IBM and ACS is part of a DARPA program called Coronet that was started in 2007 to protect networks against catastrophic failures in order to help keep the Internet and government services up and running.

The technology is still in the proof-of-concept phase, said Dave McQueeney, vice president of computing as a service at IBM Research. Changes would have to be made to optical fiber networks—including the addition of new switches—for it to be adopted more broadly, he said. Work also needs to be done within cloud systems to accommodate the connections, McQueeney said.

If these changes are made, the technology could be used in military infrastructure and, eventually, commercially.

“Every client I talk to doesn’t want to put all of their IT in one place,” McQueeney said. “One of the long poles of the tent that needed to get solved was very, very rapid provisioning of huge amounts of bandwidth. Now we can have a vision of a multiple-cloud, hybrid environment where the networking is likely not the main bottleneck.”