By Mike Deliman
This may sound a bit funny, but in the space industry, we're constantly playing catch-up. We're either looking at or for things that happened millions or hundreds of millions of years ago, sending rockets off to get to where something will have been just in time to take a picture of it or bore a hole into it, or designing new rockets for flight five years from now with computer parts that would have been considered top-of-the-line five or ten years ago. When we're recovering data from and sending commands to deep space probes, we point our antennae to where the probe is supposed to be 30 minutes from now and start transmitting immediately; the idea is that by the time the signal actually gets there, the craft will be where we expected it, ready to receive the commands and send its data back to us.
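That 30-minute lead is just the one-way light time to the spacecraft. As a rough sketch (the distance below is illustrative, not tied to any particular mission), the arithmetic looks like this:

```python
# One-way light time for a deep space link: how far ahead of the
# spacecraft's current position you have to "aim" your transmission.
# The example distance is illustrative only.

SPEED_OF_LIGHT_KM_S = 299_792.458  # km per second

def one_way_light_time(distance_km: float) -> float:
    """Seconds for a radio signal to cover distance_km."""
    return distance_km / SPEED_OF_LIGHT_KM_S

# A probe about 540 million km out (roughly the near edge of the
# Earth-Jupiter range) gives the ~30-minute delay from the text.
delay_s = one_way_light_time(5.4e8)
print(f"one-way delay: {delay_s / 60:.1f} minutes")  # ~30.0 minutes
```

Double that for a round trip, which is why a command-and-response exchange with the outer solar system can take an hour or more.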
This process I just described – anticipating where a craft is, transmitting before it's there, and it transmitting back – is pretty much how the Deep Space Network is currently used. It takes huge amounts of planning, all done in advance, to set up the multiple sessions that allow one successful exchange like that to work out. People consult tables of times when a craft will be "visible", and tables of one-way light distances, to work out transmission times. When they think they know when they need time and how much data they expect to exchange (that is, how much time they need), they contact the folks who run the big antennas. If the time slot is available, arrangements are made, and that one set of transmissions can take place. The folks who run the Deep Space Network and their customers do this sort of thing all the time; it's how they try to make the most efficient use of their giant antennae.
Thinking about this whole process, it would seem that you should be able to automate much of it, if not all of it. There may be some circumstances where manual intervention is necessary, but if much of the process were automated, automation might also be able to help bring data back indirectly. A given satellite or probe could maintain a table of information about the DSN and the other contact points it regularly connects with. Other probes and satellites could maintain similar tables. This would enable deep space probes, satellites, orbiters, and craft for planet exploration (landers, crawlers, flying / roving automata) to plan and create connections as needed to trade data with each other. On top of these communication connections you could layer protocols and methods, and from this, you could create a sort of network.
That is exactly what the article titled "Dot Mars" is about. Delay Tolerant Networking, or DTN, is an experimental protocol suite meant to extend the internet off of the Earth. JPL's Interplanetary Overlay Network (ION) implementation was tested last summer between the Deep Impact / EPOXI spacecraft and the Deep Space Network. The tests were a success.
DTN does more than just create another way to move data back from space. Its potential reaches into terrestrial applications as well as to other planets. The gains it could bring to spacecraft could also be leveraged here on Earth.
In space, DTN could be a key factor in project design. If a team knew for sure that the DTN internet was available and would offload data at regular intervals, they could take advantage of that fact. If you knew you could send back a set amount of data every day, and you'd only ever have to store perhaps a week's worth of backlog, you could leverage this in your craft's design. You might be able to design in less long-term storage, making the craft lighter. With more bandwidth available, you might decide to use more sensitive sensors or take more samples than you otherwise would have planned on. Returning more data, and possibly better data, should mean a higher likelihood of a successful mission. Overall, incorporating DTN into a space project could mean reduced costs and better returns.
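The storage trade-off is simple arithmetic. Assuming, purely for illustration, a guaranteed daily return and the week-long worst-case backlog mentioned above:

```python
# Back-of-the-envelope onboard buffer sizing: if a fixed daily
# downlink volume is guaranteed, storage only needs to cover the
# worst-case backlog. All figures are made-up examples.

daily_return_bytes = 250e6           # assume 250 MB/day guaranteed downlink
worst_case_backlog_days = 7          # the "week's worth" from the text

buffer_bytes = daily_return_bytes * worst_case_backlog_days
print(f"required buffer: {buffer_bytes / 1e9:.2f} GB")  # 1.75 GB
```

A bounded buffer like that is what lets a designer trade mass (storage hardware) against confidence in the network's delivery schedule.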
Bringing the paradigm back down to Earth, DTN could be deployed in a large number of terrestrial applications. For example, remote sensors currently either wait to be connected to and commanded to offload data, or they attempt to send their data back at regular intervals. If they miss a specific connection, it could mean lost data. If they could organize into and tap a DTN-style network, they could offload their data to some alternative trusted node and avoid any data loss.
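In spirit, that is DTN's store-and-forward idea: hold the data until *some* trusted node is reachable instead of dropping it when the usual uplink is missed. A toy version (node names and the reachability set are hypothetical stand-ins for real link state):

```python
# Toy store-and-forward sensor: readings queue up locally and are
# handed to the first trusted node that comes into range.
# All node names here are hypothetical.

from collections import deque

TRUSTED = {"base-station", "relay-7", "neighbor-sensor"}

class SensorNode:
    def __init__(self) -> None:
        self.outbox: deque[str] = deque()   # unsent readings

    def record(self, reading: str) -> None:
        self.outbox.append(reading)         # store until a forwarding chance

    def try_forward(self, reachable: set[str]) -> list[tuple[str, str]]:
        """Hand every queued reading to one trusted node in range, if any."""
        targets = TRUSTED & reachable
        if not targets:
            return []                       # no trusted node in range: keep storing
        relay = sorted(targets)[0]
        return [(relay, self.outbox.popleft()) for _ in range(len(self.outbox))]

node = SensorNode()
node.record("temp=21.4")
node.record("temp=21.9")

a = node.try_forward(set())                 # no contact: nothing sent, data kept
b = node.try_forward({"relay-7"})           # both readings handed to relay-7
print(a)
print(b)
```

The key property is that a missed window costs latency, not data: the outbox survives until any trusted neighbor appears.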
It could even affect your home life. Sensors in your car, for example, could store event-related data indicating you need the spark plugs replaced. Right now, if you disconnect the battery before a mechanic reads your OBD-II data, you lose that information. With a DTN-enabled system, the car's onboard computer could store that data and tag it as important. The next time it's near a trusted node – at home, at work, or perhaps even while filling up at a gas station – it could email that data to the owner, perhaps alerting them before a breakdown occurs. Having your car break down at just the wrong time could be an intolerable delay.