The papers in the 2026 NAB Broadcast Engineering and Information Technology (BEIT) Conference Proceedings offered here were presented at the 2026 BEIT Conference at NAB Show. The program was developed by the NAB Broadcast Engineering and Information Technology Conference Committee, a rotating group of senior technologists from NAB members and partner organizations.
The content available in the 2026 BEIT Conference Proceedings is covered under copyright provisions.
2026 Proceedings Topics
- Capture Anywhere + Produce Anywhere: 8K Immersion + Deep-Sea UHD and Multi-Vendor Live Workflows
- Broadcast-Ready Innovation: Practical AI + Secure IP Links and NextGen Emergency Wake-Up
- Securing and Scaling ST 2110: Cybersecurity + JPEG XS and DPU-Accelerated IP Workflows
- AI-Driven Live Production: Real-Time Metadata + Workflow Automation and Accessible Audio
- Smarter Media Ecosystems: Authenticity + Intelligent Metadata and Modern Networked Production
- Better Pixels & Better Sound: Solving HDR Interoperability and Measuring What Viewers Actually Experience
- Media Infrastructure + Compute and Network Control: Enabling Cloud MCR and Real-Time Workflows
- Visual Radio at Scale: Automation + Connected Car Platforms and Smarter Channel Regionalization
- Broadcast Reliability Under Pressure: Uptime Engineering + Precision Timing and Wireless Spectrum for Mega-Events
- AI for Future Media: Smarter QA + Inclusive Experiences and Live Sports Streaming Innovation
- RF That Works: Antenna Efficiency + UAV Signal Mapping and Building the Broadcast Engineering Pipeline
- AI That Elevates Broadcast: Accessibility + Rights and Quality
- ATSC 3.0 Everywhere: SFN Coverage + Hybrid Broadcast-OTT Delivery and 5G Core Integration
- Where ATSC 3.0 Goes Next: Brazil’s TV 3.0 & Hybrid Reliability and O-RAN Broadcast Futures
- Resilient Broadcast Distribution: Hybrid Connectivity + Open Infrastructure and the Path to 100% Cloud
- Broadcast Positioning System (BPS): ATSC 3.0 Timing + Monitoring + SFN Deployment and PNT Coverage at Scale
Beyond Boundaries: Hybrid Broadcast Distribution – Multipath Last-Mile Connectivity Across Satellite, Fiber, 5G and LEO - $15
Date: April 10, 2026
Topics: 2026 BEITC Proceedings, Resilient Broadcast Distribution: Hybrid Connectivity + Open Infrastructure and the Path to 100% Cloud
The shift from satellite-based distribution to terrestrial and hybrid models is reshaping the technical, operational, and commercial approaches used for distributing linear video services. This paper evaluates key C-band distribution alternatives, Ku-band and terrestrial, and the need for multi-path last-mile connectivity using dual-diverse fiber, 5G Standalone and LEO satellite, assessing how each can match or complement the reliability historically associated with C-band. It explores the operational challenges introduced, such as variable last-mile performance, redundancy engineering, monitoring scale, and protocol selection across MPEG-TS, DASH-CMAF, and emerging Media over QUIC (MoQ) workflows. The paper also contrasts the fixed-capacity economics of satellite with dynamic, consumption-based IP cost models that enable more flexible regionalization, channel deployment, and cloud alignment. Together, these factors form a practical framework for organizations planning resilient, scalable distribution architectures that maintain broadcast-grade availability while enabling new capabilities not achievable through traditional satellite systems.
Kenelm Deen | Synamedia | London, United Kingdom
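The multi-path last-mile approach the abstract describes ultimately comes down to continuously ranking candidate paths by measured health. As a rough illustration only, the following sketch (hypothetical path names and thresholds, not taken from the paper) scores fiber, 5G Standalone, and LEO paths by packet loss and latency and picks a primary:

```python
from dataclasses import dataclass

@dataclass
class PathStats:
    name: str
    loss_pct: float    # measured packet loss, percent
    latency_ms: float  # measured round-trip latency
    available: bool    # link currently up

def select_primary(paths, max_loss=0.5, max_latency=150.0):
    """Pick the healthiest eligible path; prefer lower loss, then lower latency."""
    eligible = [p for p in paths if p.available
                and p.loss_pct <= max_loss and p.latency_ms <= max_latency]
    if not eligible:
        return None  # all paths degraded: hold last-good path or raise an alarm
    return min(eligible, key=lambda p: (p.loss_pct, p.latency_ms))

# Example: diverse fiber wins while healthy; 5G SA takes over if fiber drops.
paths = [
    PathStats("fiber-A", 0.0, 12.0, True),
    PathStats("5G-SA", 0.3, 45.0, True),
    PathStats("LEO", 0.8, 60.0, True),
]
primary = select_primary(paths)
```

A real deployment would feed this from continuous probes and hysteresis rather than single samples, but the ranking idea is the same.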
Beyond Compliance: AI-Powered Multisensory Content Adaptation for Inclusive Media Experiences - $15
Date: April 10, 2026
Topics: 2026 BEITC Proceedings, AI for Future Media: Smarter QA + Inclusive Experiences and Live Sports Streaming Innovation
This research addresses critical gaps in digital media accessibility for individuals with disabilities, where current solutions provide fragmented experiences that fail to capture the holistic storytelling essence. We introduce a novel approach leveraging generative AI, computer vision, audio analysis, and video-scene-parsing technologies to create a comprehensive multisensory content interpretation framework.
Our methodology transcends traditional accessibility solutions by analyzing and reconstructing media content through visual characteristics, color palette, emotional expression mapping, dialogue context, and ambient sound elements. The system enables dynamic conversion of multimedia experiences across sensory modalities, including innovative translations to braille and other adaptive formats. For example, when testing with Shakespeare’s Macbeth, large language models were able to flag that famous dialogue, such as ‘Birnam wood trees are moving’, might be difficult for audiences with intellectual disabilities to follow. Similarly, the depiction of Lady Macbeth washing her hands (blood on her hands) was identified as potentially unsuitable for viewers with Generalized Anxiety Disorder (GAD) or obsessive-compulsive disorder (OCD). By reimagining accessibility as a comprehensive sensory experience rather than merely technical compliance, this framework revolutionizes inclusive media technologies, ensuring individuals with disabilities experience content with unprecedented emotional fidelity and artistic integrity.
Maheshwaran G, Punyabrota Dasgupta | Amazon Web Services India | Mumbai, Maharashtra, India
Broadcast Positioning System Deployment in a Single Frequency Network - $15
Date: April 10, 2026
Topics: 2026 BEITC Proceedings, Broadcast Positioning System (BPS): ATSC 3.0 Timing + Monitoring + SFN Deployment and PNT Coverage at Scale
The Broadcast Positioning System (BPS) encounters complications when utilized in a Single Frequency Network (SFN) environment, as each node of the SFN must transmit the time and location data. In a traditional Multi-Frequency Network (MFN) setup this would not present an issue, but an SFN presents a greater challenge: the nodes all share the same frequency, and so can potentially interfere with one another. Because BPS is designed to be as receivable as possible, it is difficult to isolate these unique transmissions in the same manner as LDM for local content insertion, and so an alternative approach must be utilized. This paper outlines the background of these issues, describes a solution to resolve them, and provides the results of both real-world and lab tests for a multi-node SFN with BPS enabled.
Liam Power | Edgebeam Wireless | Boston, Mass., United States
Nicholas Hottinger | ONE Media Technologies | Hunt Valley, Md., United States
Broadcasting from the Deep: Engineering a UHD Imaging System for Extreme Deep Sea Environments - $15
Date: April 10, 2026
Topics: 2026 BEITC Proceedings, Capture Anywhere + Produce Anywhere: 8K Immersion + Deep-Sea UHD and Multi-Vendor Live Workflows
The MxD SeaCam is a broadcast imaging system rated to 7,000 meters of seawater, developed through a collaboration between DeepSea Power & Light, the Monterey Bay Aquarium Research Institute, and Fathom Imaging. Built around a Sony HDC-P50 camera module and a Canon CJ15ex4.3B UHD super-wide zoom lens, the system produces dual 12G-SDI outputs with HDR video in HLG/BT.2020 at 59.94 progressive frames per second and transports that signal over a single-mode fiber via a four-channel CWDM architecture carrying dual video, camera control, and telemetry over links exceeding 10 kilometers. A custom multi-element optical corrector compensates for the field curvature, chromatic aberration, and geometric distortion introduced by the hemispherical dome port air-to-water interface, maintaining sub-pixel spot sizes across the full zoom and focus range. The titanium pressure housing is validated to 7,000 meters through 5,500 pressure cycles, with a closed-loop thermal design using the surrounding ocean as the ultimate heat sink. The first operational deployment occurred during the NA176 expedition aboard E/V Nautilus in October 2025, reaching depths of 5,199 meters in the Cook Islands over 199 hours of ROV operations. This paper reports the system architecture, design validation methodology, and field trial results — including a power system interaction requiring in-field hardware modification, optical alignment sensitivity revealed during a maintenance event, and the HDR color workflow developed with OET’s video engineering team. The engineering lessons apply directly to any broadcast deployment where studio-grade signal quality must be maintained from a camera that cannot be reached, serviced, or repositioned.
Aaron Steiner, Jason Buss, Mikhail Rossoshanskiy | DeepSea Power & Light | San Diego, Calif., United States
Mark Chaffey, Dr. Paul Roberts | Monterey Bay Aquarium Research Institute | Moss Landing, Calif., United States
Dr. Paul Remijan | Fathom Imaging | Brimfield, Mass., United States
David Robertson | Ocean Exploration Trust, E/V Nautilus | Anchorage, Alaska, United States
Compact 8K × 8K Camera: Imaging the Immersive Era - $15
Date: April 10, 2026
Topics: 2026 BEITC Proceedings, Capture Anywhere + Produce Anywhere: 8K Immersion + Deep-Sea UHD and Multi-Vendor Live Workflows
Best Paper of the 2026 NAB Broadcast Engineering and IT Conference
The demand for wide-field-of-view immersive video production is increasing. To address this demand, this work develops a compact 8K × 8K camera system. The camera incorporates a square-format image sensor with a compact 5/6-type footprint and ultra-high throughput (59-megapixel 8K × 8K resolution, 60 fps, 14-bit depth), achieving superior performance in high-mobility shooting scenarios and vertically extended image acquisition compared with recent large-format high-resolution sensors. The camera system employs a modular design in which the lightweight camera head is separated from the signal-processing unit by a fiber link, enabling flexible deployment on gimbals, vehicles, and drones. Production trials were conducted using this system, including the parallel production of immersive VR content and 4K IMAX content derived from the same 180° fisheye 8K × 8K footage, and multi-region-of-interest (ROI) extraction for 2K broadcasting studio applications. Through these trials, the system proved effective for multiplatform content production and helped simplify conventional broadcast workflows. The system not only broadens broadcast capabilities but also shows adaptability for the emerging immersive era.
Kodai Kikuchi, Akira Honji, Kohei Tomioka, Tetsuya Hayashida, Takenobu Usui,
Toshie Hiroshima, Kazuya Kitamura, Takayuki Yamashita | NHK (Japan Broadcasting Corporation) | Tokyo, Japan
Hiroshi Shimamoto | NHK Foundation | Tokyo, Japan
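The multi-ROI extraction the trials describe — pulling several 2K broadcast windows out of a single 8K × 8K frame — is, at its core, windowed cropping of one large sensor image. A minimal sketch (illustrative only, with a downscaled stand-in frame and hypothetical window positions, not NHK's implementation):

```python
import numpy as np

def extract_rois(frame, rois):
    """Crop multiple regions of interest from one square frame.

    frame: H x W x 3 array; rois: list of (x, y, w, h) in pixels.
    Returns one crop per ROI, in order.
    """
    h_max, w_max = frame.shape[:2]
    crops = []
    for x, y, w, h in rois:
        # Guard against windows that fall off the sensor.
        assert 0 <= x and 0 <= y and x + w <= w_max and y + h <= h_max
        crops.append(frame[y:y + h, x:x + w])
    return crops

# Stand-in frame (the real sensor is 7680 x 7680; scaled down 10x here).
frame = np.zeros((768, 768, 3), dtype=np.uint8)
# Two hypothetical 16:9 broadcast windows at different sensor positions.
windows = extract_rois(frame, [(0, 0, 192, 108), (200, 300, 192, 108)])
```

In a live system each crop would then be scaled and framed independently, which is what lets one square sensor feed several 2K studio outputs at once.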
Delivering Reliable, Complex Connectivity in Remote Productions - $15
Date: April 10, 2026
Topics: 2026 BEITC Proceedings, Broadcast-Ready Innovation: Practical AI + Secure IP Links and NextGen Emergency Wake-Up
The concept of using IP and the public internet to provide backhaul and distribution circuits for professional media applications is now firmly established. Even the largest broadcasts and events routinely rely on it.
Initially, IP replaced conventional video circuits over leased lines, microwave, or satellite. Now remote production – where only the acquisition kit goes to site, and the production hardware remains in a central base – is commonplace. This is generally a good thing: it hugely increases the productivity of expensive switchers and servers, and slashes travel costs and carbon footprint by moving fewer trucks and people.
In practice, most events are now tending towards hybrid production, where some switching happens on site as well as in the Master Control Room (MCR). This is useful for managing local destinations like commentator monitors, telestrators, and in-vision screens, as well as providing feeds for other uses like scoreboards.
Add to this the need for return feeds so commentators can see the replays being generated, and the number of individual circuits rapidly begins to grow. The direct corollary is increased demand for a resource – internet bandwidth – whose availability is not growing at the same rate.
It is widely recognized that video traffic makes up more than 80% of internet data, and that proportion is continuing to grow. According to Forbes, YouTube alone takes 16% of fixed-connection downloads; Netflix another 12%. Professional users – houses of worship, AV and events, as well as broadcast – have to fight for their share of the bits.
The advantage of fixed video circuits was that we got “broadcast quality” all the time. The pictures looked good, the sound was not distorted, and there were no interruptions. Audiences had no idea how any of this happened, but they grew accustomed to it. Losing a key moment from a major sporting event would be headline news.
Sergio Ammirata, Ph.D. | SipRadius | Parkland, Fla., United States
Driving Innovation Through Participation: Engineering Best Practices for In-Dash Visual Platforms - $15
Date: April 10, 2026
Topics: 2026 BEITC Proceedings, Visual Radio at Scale: Automation + Connected Car Platforms and Smarter Channel Regionalization
This paper examines the technical and strategic importance of active participation in digital ecosystems that power in-vehicle visual platforms in modern automobiles. For almost two decades, the industry prioritized enhancing RDS and HD Radio metadata quality; however, the focus must now include connected car platforms such as DTS AutoStage, which are rapidly gaining market share. These platforms are redefining radio’s presence in the dashboard, delivering advanced visual experiences that drive listener engagement, strengthen brand identity, enable new revenue models that increase client success, and provide actionable insights through return-IP connectivity.
In 2026, delivering a best-in-class experience across RDS, HD Radio, and connected car platforms is no longer optional. It is a baseline requirement for survival. Participation is not merely recommended; it is essential, and broadcast engineers are at the center of this transformation.
This paper shares best practices and current technical guidance informed by the Third Annual In-Vehicle Visuals Report, a comprehensive study of the top 100 best-selling new vehicle models in the U.S., and other emerging developments in the connected car ecosystem.
Joe Marshall, Alan Jurison | Quu | Cincinnati, Ohio, United States
Evaluating Cybersecurity Vulnerabilities in SMPTE ST 2110 Media Networks - $15
Date: April 11, 2026
Topics: 2026 BEITC Proceedings, Securing and Scaling ST 2110: Cybersecurity + JPEG XS and DPU-Accelerated IP Workflows
Best Student Paper of the 2026 NAB Broadcast Engineering and IT Conference
This paper presents a threat-lab environment for empirically evaluating cybersecurity risks in SMPTE ST 2110 and AMWA NMOS-based IP media workflows. A segmented testbed, comprising dedicated media, control, and management VLANs; a router VM; and Raspberry Pi nodes acting as sender, receivers, adversary, and monitors, enables controlled execution of representative attack scenarios. These include Layer-2 manipulation (ARP spoofing, MAC flooding), multicast/IGMP disruption (rogue querier, join/report floods), RTP spoofing and payload replacement, and NMOS control-plane interference. Synchronized multi-vantage PCAPs and logs provide temporally aligned measurements of RTP loss, jitter, skew, SSRC behavior, IGMP state transitions, and NMOS heartbeat stability. Across experiments, results consistently show that multicast media playout can appear visually stable even as control-plane signaling and timing degrade. Attacks such as ARP poisoning, IGMP floods, and NMOS HTTP saturation produced significant jitter excursions, registry instability, and forwarding anomalies, yet often with minimal immediate impact on perceived video quality. These findings highlight a critical gap between operational perception and underlying network health. The paper concludes by mapping observed failure modes to practical mitigations and emphasizing the need for robust telemetry and defense-in-depth designs as ST 2110 facilities scale.
Miles Katz | Emerson College | Boston, Mass., United States
