This year’s Hollywood Professional Association (HPA) Tech Retreat, held last week in the Palm Springs area, offered its usual array of top-notch media-technology presentations, provocative panels and intriguing demonstrations. Curated and led by the well-known “engineer and explainer” Mark Schubin, the Retreat has become synonymous with the cutting edge of the media production industry, and this year’s sold-out event did not disappoint.
The HPA itself took the opportunity to announce a name change (from the former “Hollywood Post Alliance”) and unveil its new logo (shown above). A pre-conference seminar of interest to broadcasters was “ATSC 3.0: Content Creation and Distribution,” which examined in detail the ecosystem for advanced content creation and service delivery enabled by the developing ATSC 3.0 DTV standard. This well-attended event kicked off what ATSC plans to be an ongoing dialog with the creative community on feeding the pipeline that ATSC 3.0 will provide for new and enhanced television delivery to consumers in the timeframe of 2018 and beyond.
The conference offered a number of other excellent presentations geared toward broadcasting, but one in particular captured the imagination of many attendees and was instantly hailed as a standout. Entitled “Deep Space Exploration: Technology Challenges for Motion Imagery Acquisition and Distribution,” it was presented by Rodney Grubbs, NASA’s Imagery Experts Program Manager, who chairs the NASA Digital Television Work Group. He pointed out that one of NASA’s key goals in its exploration of space is to “virtually take everyone on Earth along for the ride” by sharing captured imagery, but that the environment poses some serious challenges for today’s commercially available imaging technologies.
The three primary hazards of that environment are high radiation levels, the vacuum of space, and extreme ambient temperatures. Regarding radiation, the high-resolution sensors used by today’s cameras are damaged by ionizing radiation, causing dead pixels (on the order of 7 to 10 per day in a typical camera). CMOS sensors are more resistant to this effect than CCDs, but Grubbs pointed out that the cameras on the International Space Station are replaced annually, and inspection of the retired cameras often reveals what looks like a starfield image even when the lens cover is on, due to the high number of dead pixels. He also noted that the rate of pixel loss increases in certain orbital zones, such as the not-yet-fully-understood “South Atlantic Anomaly” (where the inner Van Allen radiation belt dips closest to the Earth’s surface, generally intersecting the orbital altitude of most manned missions). High radiation levels can also cause computer crashes, recording losses, and lens damage or discoloration.
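To put that dead-pixel figure in perspective, a quick back-of-the-envelope calculation (purely illustrative, assuming the 7 to 10 per day rate stays roughly constant over a year in orbit) shows why annual camera replacement is a reasonable policy:

```python
# Illustrative arithmetic only (not NASA data): cumulative dead-pixel count
# at the per-day rates cited in the presentation, assuming a constant rate.

DAYS_PER_YEAR = 365

for rate_per_day in (7, 10):
    dead_after_one_year = rate_per_day * DAYS_PER_YEAR
    print(f"{rate_per_day} dead pixels/day -> about {dead_after_one_year:,} after one year")
```

A few thousand dead pixels scattered across a sensor is more than enough to produce the “starfield” effect Grubbs described.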
Operation in a vacuum is challenging for numerous reasons. First, fans cannot be used for cooling, so all thermal control must be accomplished through heat sinking alone. Any exposed moving parts must also be designed to work properly in a vacuum, complicating device development and testing. Finally, in any on-planet environment with little or no atmosphere, dust particles have never been eroded by wind. Unlike the relatively smooth dust particles found on Earth, extraterrestrial dust can therefore have sharp, claw-like surfaces, which quickly jam mechanisms that operate well even in dusty or sandy conditions on Earth.
The ambient temperature in which space-borne equipment must operate ranges from 250°F to -250°F (roughly +121°C to -157°C), an extremely arduous span made even more demanding by the often rapid rate of temperature change (e.g., when a spacecraft rotates from sunlight to shadow). This environment is particularly taxing to gaskets, seals and lubricants, so imaging elements such as focus mechanisms must be hardened for such operation.
Interestingly, one earlier hazard that is less problematic today is extreme vibration (primarily during the launch phase of a mission). Newer launch vehicles, including those provided by commercial partners, produce substantially less vibration than earlier systems. Some of those older systems remain in use (e.g., the Russian Soyuz spacecraft), but they are generally avoided, and the newer vehicles preferred, for carrying sensitive imaging systems into space.
Other constraints for any equipment brought onboard spacecraft include minimizing its mass, size, heat output and power consumption. For high-resolution motion imagery, transmission bandwidth efficiency is yet another concern, as are link integrity and latency. The latter issues render the use of IP connectivity problematic, given the long distances and single-path nature of most space-to-Earth links. For this reason, NASA is establishing multiple links wherever possible, in order to provide redundant transmission paths. NASA is also working with Vint Cerf (one of the “fathers of the Internet”) to develop delay-tolerant networking that has been dubbed “Internet for the Solar System.” An example of the difficulty posed by such latency is attempting to track a moving subject with a camera mounted on a spacecraft but remotely controlled from Earth. Due to the link distances involved, any pan/tilt/zoom commands can take several seconds or longer to be received by the camera, by which time the object to be photographed may have long since moved out of frame.
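To give a sense of scale for that latency, the propagation delay is bounded by the speed of light. The short sketch below is an illustration added here, not NASA code, and uses round-number distances to show the one-way delay for a few representative links:

```python
# Illustrative only: one-way, speed-of-light propagation delay over a few
# representative space-to-Earth distances (round figures, not NASA data).

C_KM_PER_S = 299_792.458  # speed of light in vacuum, km/s

distances_km = {
    "ISS (low Earth orbit)":   400,
    "Moon (average)":          384_400,
    "Mars (closest approach)": 54_600_000,
    "Mars (near conjunction)": 401_000_000,
}

for target, km in distances_km.items():
    delay_s = km / C_KM_PER_S  # seconds, one way
    print(f"{target:<24} ~{delay_s:,.2f} s one way")
```

A remote pan/tilt/zoom command and the returning video each incur that delay, so even a lunar link imposes a round trip of a few seconds, and a Mars link can run from several minutes to more than 20 minutes each way.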
Some solutions that have been—or are being—developed to address these issues include:
- Radiation-hardened camera sensors
- High-resolution 360° cameras (having no moving parts, and providing the ability to capture wide-view images in which moving objects can be selectively tracked via pan-and-scan techniques applied in post-production)
- “Smart” cameras capable of autonomous operation (e.g., for recognizing patterns or tracking moving objects on their own)
- Large video buffers (for managing link outages and high or variable latencies; a toy store-and-forward sketch follows this list)
- Auto-rebooting devices (for self-healing recovery after system crashes) and FPGAs that can update themselves in-mission
- Detachable/disposable cameras (for third-person views, spacecraft inspections after suspected damage, or other emergency operations)
- More efficient image coding systems (to reduce link bandwidth and on-board CPU demands)
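As a concrete illustration of the large-buffer idea, here is a minimal store-and-forward sketch: it simply holds encoded frames while the downlink is unavailable and drains them once the link returns. All names, policies and sizing numbers are hypothetical, chosen for illustration rather than drawn from NASA’s actual flight systems.

```python
from collections import deque

class StoreAndForwardBuffer:
    """Toy store-and-forward video buffer (hypothetical illustration only):
    holds encoded frames during a link outage, then drains them when the
    downlink comes back."""

    def __init__(self, capacity_frames: int):
        self.capacity = capacity_frames
        self.frames = deque()

    def push(self, frame) -> None:
        # One possible policy: when full, drop the oldest frame so the most
        # recent imagery survives an extended outage.
        if len(self.frames) >= self.capacity:
            self.frames.popleft()
        self.frames.append(frame)

    def drain(self, link_up: bool, max_frames: int):
        """Return up to max_frames buffered frames whenever the link is up."""
        sent = []
        while link_up and self.frames and len(sent) < max_frames:
            sent.append(self.frames.popleft())
        return sent

# Example sizing: ride out a 10-minute outage of 30 frame/s video.
buffer = StoreAndForwardBuffer(capacity_frames=10 * 60 * 30)
```

In practice such buffering would sit behind delay-tolerant networking protocols of the kind mentioned above, but the basic ride-through behavior is the same.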
Grubbs concluded with an appeal to the industry assembled at the Retreat for possible partnerships. Much as NASA currently works with external corporate launch-vehicle partners such as SpaceX, there could be imaging partnerships in which vendors and NASA work out new, mutually beneficial business models that reduce NASA’s reliance on taxpayer funding alone for image capture and distribution. He envisioned a relationship more akin to PBS than to NASCAR, one in which content rights might be shared, rather than sponsor decals affixed to launch vehicles and space suits.
For examples of current NASA imagery and other information, see http://www.nasa.gov.