Thursday, November 5, 2009

Preventing a Columbia-class disaster with photogrammetry and high-flying projectile ballistic analysis

Technology Review
arXiv Blog

When and if the Large Hadron Collider finally rumbles into action, it will produce a firehose of data like nothing physicists have ever seen. This data will consist of the tracks of debris from roughly a billion collisions per second, as measured by particle detectors clustered around the collision sites.

That's far too much data to analyse in detail, so most of it will simply be discarded by a filtering system that looks for trajectories of interest and stores them. That process should leave roughly a hundred events per second for later detailed analysis. And all this must be done in real time, since any delay would rapidly overwhelm whatever buffering capacity the accelerator has.
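In software terms, a trigger of this kind is just a cheap predicate applied to a stream: test each event as it arrives, keep the rare interesting ones, and throw the rest away immediately so storage never grows without bound. Here is a minimal sketch of that idea (the names and the predicate are illustrative, not the LHC's actual trigger code):

```python
from collections import deque

def trigger(event_stream, is_interesting, buffer_size=1000):
    """Minimal real-time filter sketch: scan a stream of events,
    keep only those passing a cheap predicate, and discard the
    rest on the spot so the buffer stays bounded.
    """
    kept = deque(maxlen=buffer_size)  # bounded store for later analysis
    for event in event_stream:
        if is_interesting(event):
            kept.append(event)
        # uninteresting events are dropped immediately, never buffered
    return kept
```

The essential property is that the per-event test is cheap enough to keep pace with the stream; anything expensive is deferred to the offline analysis of the few events that survive.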

So what's all this got to do with the space shuttle? It turns out that a group of engineers at NASA want to use a similar mechanism to analyze the trajectory of debris around the space shuttle as it takes off. Their goal is to use the trajectory of these debris particles to work out their mass and density and also to trace their origin. With the right kind of analysis, it ought to be possible to flag up potentially damaging trajectories as they occur.

There's no need to spell out why that's important, but here goes. In 2003, the impact of debris with the space shuttle Columbia during launch so damaged the vehicle that it was unable to survive re-entry. A better analysis of that incident might have identified the extent of the damage and so prevented the loss of the shuttle.

Philip Metzger at the Kennedy Space Center and buddies have built the first stage of a filtering system that could do that job in real time, using a pair of cameras that take high-resolution footage of the launch from different angles. Together, this footage gives a 3D view of the launch, allowing a computer to reconstruct the trajectory of any debris. That's not rocket science but, strangely, it has never been used to analyse launches.
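Recovering a 3D position from two camera views is standard photogrammetry: if the cameras are calibrated, each sighting of the same debris particle gives two linear constraints on its 3D position, and the two views together pin it down. A minimal sketch of this triangulation, using the textbook linear (DLT) method rather than whatever pipeline the NASA team actually runs:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3D point from two views.

    P1, P2: 3x4 camera projection matrices (assumed calibrated).
    x1, x2: (u, v) image coordinates of the same debris particle
            seen in each camera.
    Returns the 3D point in world coordinates.
    """
    # Each view contributes two linear constraints on the
    # homogeneous point X, e.g. u * (P_row3 . X) - (P_row1 . X) = 0.
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # Best solution: right singular vector of the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # de-homogenise
```

Applying this frame by frame to the matched debris images turns the two video streams into a time-stamped 3D trajectory, which is the input the ballistic analysis needs.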

Metzger and co have put their idea through its paces by analysing a piece of debris thrown up during the launch of STS-124, in May 2008. At the time, NASA engineers worried that this debris was a brick from a flame trench beneath the shuttle. A brick hitting the shuttle during launch could have caused significant damage.

The new technique, however, shows that the debris particle is low-density foam, almost certainly from the solid rocket booster throat plug. This would have posed little threat to the shuttle.

Of course, coming to that conclusion a year later is of little use to the shuttle crew, who need to assess the condition of their vehicle almost immediately and certainly before they embark on re-entry.

That's where the LHC-like filtering mechanism comes in. Metzger and co say the data is easy to collect using their two cameras; the trouble is combing through it for interesting and useful insights. An LHC-like filtering system would comb through the footage during the launch and pick out only those debris tracks that are dense and massive enough to pose a threat.
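The physics behind such a filter is that a track's deceleration through the air reveals its ballistic coefficient, beta = m / (Cd * A): foam bleeds speed quickly, a brick barely slows. A rough sketch of how that estimate and threshold test could work (the air-density constant, still-air assumption, and threat threshold are illustrative guesses, not figures from the paper):

```python
import numpy as np

RHO_AIR = 1.2  # kg/m^3, sea-level air density (assumed)
G = np.array([0.0, 0.0, -9.81])  # gravity, m/s^2, z up

def ballistic_coefficient(times, positions):
    """Estimate beta = m / (Cd * A) from a reconstructed 3D track.

    times: (N,) sample times in seconds.
    positions: (N, 3) positions in metres, e.g. from stereo
               photogrammetry. Assumes still air.
    """
    v = np.gradient(positions, times, axis=0)  # velocities
    a = np.gradient(v, times, axis=0)          # accelerations
    # Drag is whatever acceleration remains after removing gravity.
    a_drag = np.linalg.norm(a - G, axis=1)
    speed = np.linalg.norm(v, axis=1)
    # Drag model: a_drag = rho * v^2 / (2 * beta), solved for beta.
    beta = RHO_AIR * speed**2 / (2 * a_drag)
    return np.median(beta)  # median resists noisy endpoint estimates

def is_threat(times, positions, beta_threshold=50.0):
    """Flag tracks whose ballistic coefficient suggests a dense object.
    The threshold (kg/m^2) is an illustrative guess, not NASA's value."""
    return ballistic_coefficient(times, positions) > beta_threshold
```

A filter like this is cheap enough to run per-track during ascent, which is exactly the trigger-style discard-early, analyse-later pattern borrowed from the LHC.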

That could save lives. And although the shuttle is due to be retired by this time next year, the process could easily be applied to future rocket launches anywhere in the world.

Ref: arxiv.org/abs/0910.4357: Photogrammetry and Ballistic Analysis of a High-Flying Projectile in the STS-124 Space Shuttle Launch
