The papers in the 2024 NAB Broadcast Engineering and Information Technology (BEIT) Conference Proceedings offered here were presented at the 2024 BEIT Conference at NAB Show. The program was developed by the NAB Broadcast Engineering and Information Technology Conference Committee, a rotating group of senior technologists from NAB members and partner organizations.

The content available in the 2024 BEIT Conference Proceedings is covered under copyright provisions.

2024 NAB BEIT Conference Proceedings

  • Drone Measurements Validate the Accuracy of Simulation for FM Pattern Verification  - $15

    Date: April 3, 2024

    Extensive experience has been gained in both drone-based measurement of television broadcast antennas and data analysis using electromagnetic simulation. Repeated comparisons have shown that drone measurements and simulation predictions are accurate at UHF and VHF frequencies. Now that the FCC television channel Repack is complete, extending what has been learned to the FM market will provide new opportunities for FM broadcasters. This paper discusses the limitations of “old school” FM pattern range measurements and the power of computer simulation. It also includes case studies, one of which would be impractical for any far-field range and can only be realized through simulation validated by drone-based measurements.
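
    As a rough illustration of the kind of comparison described above (not taken from the paper), the following sketch compares a drone-measured azimuth pattern against a simulated one by interpolating the simulation onto the measured azimuths and reporting the RMS deviation in dB; the function name and the synthetic data are hypothetical.

      import numpy as np

      def pattern_rms_error_db(az_meas_deg, e_meas, az_sim_deg, e_sim):
          """Compare a drone-measured azimuth pattern with a simulated one.

          az_*_deg : azimuth angles in degrees
          e_*      : relative field amplitudes (linear)
          Returns the RMS deviation in dB over the measured azimuths.
          """
          # Normalize both patterns to their own peaks so only shape is compared
          e_meas = np.asarray(e_meas, dtype=float) / np.max(e_meas)
          e_sim = np.asarray(e_sim, dtype=float) / np.max(e_sim)

          # Interpolate the simulated pattern onto the measured azimuths
          # (period=360 handles the wrap-around at 0/360 degrees)
          e_sim_at_meas = np.interp(az_meas_deg, az_sim_deg, e_sim, period=360)

          # Convert to dB and compute the RMS difference
          diff_db = 20 * np.log10(e_meas) - 20 * np.log10(e_sim_at_meas)
          return float(np.sqrt(np.mean(diff_db ** 2)))

      # Synthetic example: a cardioid-like simulated pattern and noisy "measurements"
      az = np.arange(0, 360, 5)
      sim = 0.55 + 0.45 * np.cos(np.radians(az))
      meas = sim * (1 + 0.02 * np.random.default_rng(1).standard_normal(az.size))
      print(f"RMS deviation: {pattern_rms_error_db(az, meas, az, sim):.2f} dB")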

    John Schadler | Dielectric LLC | Raymond, Maine, United States
    Jason Schreiber | Sixarms | Gold Coast, Australia



  • Dynamic Ad Insertion through [Data Distribution as a Service System (DDaaS) – MMT Protocol – Broadcast Application] without Internet connection - $15

    Date: April 3, 2024

    Over-the-Air (OTA) advertising lacks features provided by its Over-the-Top (OTT) counterpart because of its one-way nature. The introduction of the Advanced Television Systems Committee (ATSC) 3.0 standard, with its Internet Protocol (IP)-based approach to broadcasting, narrows the gap between the two environments. While there are similarities in the mechanisms used to create regionally addressable advertisements over OTT and OTA (utilizing the ATSC 3.0 standard), they are not identical, primarily because the latter is a one-way broadcasting system. This paper introduces two potential solutions for providing targeted advertising through the timely insertion of advertisements into live TV programs. We also review a Data Distribution as a Service (DDaaS) platform and the ATSC 3.0 MPEG Media Transport (MMT) protocol, which are integral to making these solutions feasible.

    Sangsu Kim | One Media Technologies | Hunt Valley, Md., United States
    Niakam Kazemi | Sinclair Broadcast Group | Hunt Valley, Md., United States



  • Elevating Video Quality With the Video Compression Score Metric - $15

    Date: April 3, 2024

    In today’s media landscape, ensuring the delivery of high-quality content has become a top priority for service providers. Not only does it impact the viewing experience for an audience that is more discerning than ever, but it’s directly tied to customer satisfaction and retention, and therefore the bottom line. Compounding the issue is the ever-expanding volume of content being delivered. To effectively manage this enormous data flow, video content must be compressed before transmission, a process that can result in the degradation of content quality. The level of degradation depends on the temporal and spatial complexity of the content and encoding methods being adopted by the transcoders. With low-motion content like news programs, there are very small changes within frames (spatial domain) and across frames (temporal domain) that only require minor adjustments in encoding.

    With high-motion or highly textured content like action movies or car races, there are major changes in the spatial and temporal domains, which necessitate an increase in the bits required for encoding. To handle this increased bandwidth requirement, the video encoder must either supply the number of bits the content demands or be adaptive enough to change the encoding method. If not handled correctly, artifacts such as blockiness, blurriness, flickering, and more can occur, impacting the viewing experience. Compression degradation can be estimated with a metric known as the video compression score, using either reference-based or non-reference-based methods. The reference-based approach compares the compressed video with the original content, while the non-reference-based approach does not require this comparison.

    This paper will focus on a non-reference-based approach for calculating the video compression score that utilizes the encoded video parameters of the compression bitstream. Attendees will discover the advantages of this approach for a variety of scenarios. These will include real-time quality monitoring for live video streaming, video conferencing, or security applications in which a reference video is often not available, and providing an objective quality assessment for automation, quality control, and troubleshooting purposes. To achieve the best results, they will learn how an AI neural network can be trained to estimate the video compression score; categorize the transcoded video as unacceptable, marginal, acceptable, and excellent; and correlate the results with well-known video quality methods such as Netflix’s VMAF. The result is a highly accurate, efficient, and cost-effective metric that can be used in real-time applications across the IPTV, OTT, and post-production markets.
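
    Purely as an illustration of the non-reference approach (this is not the authors' implementation), the sketch below shows how a small neural network could map encoded-bitstream parameters to a compression score and a quality category. The feature set, architecture, and example values are assumptions; a real model would be trained on labels correlated with established metrics such as VMAF.

      import torch
      import torch.nn as nn

      # Illustrative bitstream-level features (assumed, not the authors' feature set):
      # average QP, bits per pixel, fraction of skipped blocks, mean motion-vector
      # magnitude, and frame rate.
      FEATURES = ["avg_qp", "bits_per_pixel", "skip_ratio", "mv_magnitude", "fps"]
      CATEGORIES = ["unacceptable", "marginal", "acceptable", "excellent"]

      class CompressionScoreNet(nn.Module):
          """Tiny MLP that regresses a compression score and predicts a quality
          class from encoded-bitstream parameters (no reference video needed)."""

          def __init__(self, n_features=len(FEATURES)):
              super().__init__()
              self.backbone = nn.Sequential(
                  nn.Linear(n_features, 32), nn.ReLU(),
                  nn.Linear(32, 16), nn.ReLU(),
              )
              self.score_head = nn.Linear(16, 1)                 # continuous score
              self.class_head = nn.Linear(16, len(CATEGORIES))   # 4-way category

          def forward(self, x):
              h = self.backbone(x)
              return self.score_head(h).squeeze(-1), self.class_head(h)

      # One (untrained) inference on a hypothetical feature vector
      model = CompressionScoreNet()
      x = torch.tensor([[34.0, 0.08, 0.45, 3.2, 50.0]])
      score, logits = model(x)
      print(score.item(), CATEGORIES[logits.argmax(dim=-1).item()])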

    Shekhar Madnani | Interra Systems | Cupertino, Calif., United States
    Yogender Singh Choudhary | Interra Systems | Cupertino, Calif., United States
    Muneesh Sharma | Interra Systems | Cupertino, Calif., United States



  • Giving Your FAST Channels a Leg Up Using SCTE Technologies   - $15

    Date: April 3, 2024

    SCTE has a set of Technical and Engineering Emmy Award-winning technologies that can be carried through to IP streaming manifest technologies, including DASH and HLS. Popular FAST services use a manifest-driven streaming approach whose ad revenue depends on a simple ad-replacement model. Demand for these services is high, but their ARPU is low compared with other services. To increase revenue, FAST services need to expand to other streaming platforms and adopt more sophisticated ad strategies. SCTE technologies can provide this and scale the deployment of these services by using the SPN information in SCTE 224 to create audience-based manifests that integrate with manifest manipulators/SSAI systems, while reducing the amount of decisioning those systems must perform to support syndication and more complex ad techniques.
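
    As a rough sketch of the idea (not the authors' system), the snippet below shows how audience-keyed policies, in the spirit of SCTE 224, might drive the choice of an alternate ad asset, and how an HLS manifest could be decorated with an ad-opportunity cue. All field names, asset identifiers, and the SCTE-35 payload are placeholders.

      from datetime import datetime, timedelta, timezone

      # Stand-in for SCTE 224 ESNI data: a media point whose viewing policies map
      # audiences to alternate (ad) content. Field names are simplified assumptions.
      media_point = {
          "match_time": datetime(2024, 4, 3, 20, 0, tzinfo=timezone.utc),
          "duration": timedelta(seconds=90),
          "policies": {
              "audience:zip:19103": "ads/philly_auto_90s",
              "audience:zip:21030": "ads/baltimore_retail_90s",
              "default": "ads/national_fill_90s",
          },
      }

      def select_alternate(audience: str) -> str:
          """Pick the asset a manifest manipulator / SSAI system would splice in
          for this audience, falling back to the default policy."""
          policies = media_point["policies"]
          return policies.get(audience, policies["default"])

      def cue_tag(cue_id: str) -> str:
          """Emit an HLS EXT-X-DATERANGE ad-opportunity marker for the media point
          (the SCTE35-OUT value here is a dummy placeholder)."""
          start = media_point["match_time"].isoformat().replace("+00:00", "Z")
          secs = media_point["duration"].total_seconds()
          return (f'#EXT-X-DATERANGE:ID="{cue_id}",START-DATE="{start}",'
                  f'PLANNED-DURATION={secs:.1f},SCTE35-OUT=0xFC302000000000')

      print(select_alternate("audience:zip:19103"))
      print(cue_tag("break-1"))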

    Yasser Syed | Comcast Cable, Comcast CTS | Philadelphia, Pa., United States
    Stuart Kurkowski | Comcast Cable, Comcast CTS | Philadelphia, Pa., United States



  • HDR-SDR conversion: Live HDR Single Master Production Conversion Interoperability Challenges - $15

    Date: April 3, 2024

    Live production workflows, particularly those for sports, employ complex pipelines to deliver HD, UHD, SDR, and HDR video streams. Because content originates as either HDR or SDR, effective and flexible conversions are required between formats. A growing number of live sports events are now produced in an HDR single-master workflow, where conversions are still required at the production stage and HDR-to-SDR down-conversion is necessary on the distribution side. An emerging challenge is building the operational knowledge needed to use conversion tools properly and to improve interoperability between different solutions, such as static and dynamic conversions. This paper discusses this challenge and the emerging evaluation consensus. A generic, metadata-based solution is proposed to improve workflow efficiency while delivering premium HDR and SDR content.
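
    For orientation only (this is not the conversion method evaluated in the paper), the sketch below walks through a simple static HDR-to-SDR down-conversion: decoding a PQ (SMPTE ST 2084) signal to absolute luminance, applying a Reinhard-style tone curve, and re-encoding with a plain power-law gamma. The peak-luminance values and the tone curve are illustrative assumptions; real converters use more elaborate, often metadata-driven, curves.

      import numpy as np

      # SMPTE ST 2084 (PQ) constants
      M1, M2 = 2610 / 16384, 2523 / 4096 * 128
      C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

      def pq_to_nits(e):
          """PQ non-linear signal (0..1) -> absolute luminance in cd/m^2."""
          e = np.clip(np.asarray(e, dtype=float), 0.0, 1.0)
          p = e ** (1 / M2)
          return 10000.0 * (np.maximum(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

      def tone_map_to_sdr(nits, sdr_peak=100.0, hdr_peak=1000.0):
          """Very simple static tone curve mapping HDR luminance into the SDR range."""
          x = nits / sdr_peak                   # luminance relative to SDR peak white
          k = hdr_peak / sdr_peak
          y = (x / (1.0 + x)) * (1.0 + k) / k   # Reinhard curve scaled so hdr_peak -> 1.0
          return np.clip(y, 0.0, 1.0)

      def sdr_gamma_encode(y, gamma=2.4):
          """Encode relative SDR luminance with a power-law OETF (a stand-in for a
          proper BT.709/BT.1886 transfer)."""
          return np.clip(y, 0.0, 1.0) ** (1.0 / gamma)

      # Example: a PQ code value of 0.58 (about 200 nits) down-converted to SDR
      print(sdr_gamma_encode(tone_map_to_sdr(pq_to_nits(0.58))))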

    David Touze | InterDigital R&D France | Rennes, France
    Frederic Plissonneau | InterDigital R&D France | Rennes, France
    Patrick Morvan | InterDigital R&D France | Rennes, France
    Laurent Cauvin | InterDigital R&D France | Rennes, France
    Valerie Allie | InterDigital R&D France | Rennes, France



  • How IP-based broadcast meets 5G for resilient and sustainable media distribution - $15

    Date: April 3, 2024

    The newest generations of technical standards for digital terrestrial television broadcasting have embraced IP-based approaches. At the same time, multicast and broadcast technologies have been included in the most recent releases of the global mobile telecommunications standard, with the availability of 5G Broadcast being one significant outcome, along with the integration of 5G Multicast/Broadcast capabilities within the 5G Media Streaming system. While true convergence between broadcast and mobile technologies remains unlikely, the preconditions for mutually beneficial interworking between the different systems now appear to be mostly fulfilled. This paper describes the main evolutions of both broadcast and mobile technical standards as each has moved closer to the other's domain, culminating most notably in the arrival of ATSC 3.0, DVB-I and DVB-NIP as game-changing systems, and the aforementioned new solutions from 3GPP. Having described the innovative aspects of the different systems, the paper highlights some of the collaborative initiatives that target interworking, whether at the system core, the radio frequency level, or the service layer, involving the standards developing organizations behind the systems.

    Emily Dubs | DVB Project | Geneva, Switzerland



  • Innovating Live Productions: Building Software-Centric Facilities on an Asynchronous Media Framework  - $15

    Date: April 3, 2024

    Modern media consumption habits require live production to be more adaptable, agile, and scalable. Hardware-centric, bespoke infrastructure cannot offer broadcasters and professional media producers the flexibility needed. The technological innovations in generic IT and cloud computing, on the other hand, look compelling as a means of addressing this through software-only facilities running on premises or in the cloud. However, the transition from traditional hardware-centric approaches to IT-based architectures presents challenges. Unlike broadcast, which relies heavily on clock-driven signal synchronization, IT equipment and cloud systems operate in an event-driven, asynchronous manner. This necessitates a fundamental re-evaluation of how live video is managed and presents opportunities to build low-latency, frame-accurate, and resilient systems that match or exceed the performance of hardware using synchronous interconnects such as SDI or SMPTE ST 2110.

    This paper delves into the intricacies of building agile software facilities in a complete IT environment using event-driven asynchronous processing for live media production, covering:
    • Foundational concepts of synchronous vs. asynchronous operations
    • System architecture, including framework design, media microservices deployment, remote provisioning mechanisms, and application control
    • Empirical measurements highlighting considerable time savings when processing streams asynchronously compared to real time (see the sketch after this list)
    • Benefits and implications for live production, such as scalability, reliability, agility, and composability
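
    A minimal sketch of the synchronous-versus-asynchronous contrast (not the framework described in the paper): the clock-driven loop below releases one frame per tick of a 25 fps "house clock", while the event-driven variant processes the same frames as fast as resources allow. Per-frame timings are simulated with sleeps and are purely illustrative.

      import asyncio
      import time

      FRAMES, FPS = 125, 25        # a 5-second clip at 25 fps
      PROC_TIME = 0.01             # simulated per-frame processing cost (seconds)

      async def process_frame(i):
          """Stand-in for a media microservice doing per-frame work."""
          await asyncio.sleep(PROC_TIME)
          return i

      async def clock_driven():
          """SDI/ST 2110-style pacing: one frame per tick of the house clock."""
          for i in range(FRAMES):
              tick = asyncio.sleep(1 / FPS)                # next frame boundary
              await asyncio.gather(tick, process_frame(i))

      async def event_driven():
          """Asynchronous pacing: frames are processed as fast as resources allow."""
          await asyncio.gather(*(process_frame(i) for i in range(FRAMES)))

      for runner in (clock_driven, event_driven):
          t0 = time.perf_counter()
          asyncio.run(runner())
          print(f"{runner.__name__}: {time.perf_counter() - t0:.2f} s")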

    Marwan Al-Habbal | Matrox Video | Montreal, Quebec, Canada



  • Is Synchronous Ethernet a Must Have or just a Gimmick for the Broadcasting Industry?  - $15

    Date: April 3, 2024

    Over the last few years, the Precision Time Protocol (PTP) has become the method of choice for accurate time transfer over Ethernet networks in virtually every application domain. PTP being an IEEE standard (IEEE 1588) has helped, but was by no means the only reason for this development. Semiconductor and device manufacturers alike have been adding PTP hardware support to their network products – a mandatory requirement for reaching sub-µs accuracies. Most importantly, PTP can be tailored to the specific requirements of an application domain via PTP Profiles – a feature many industries have made extensive use of. The All-IP Studio, for example, uses the PTP broadcast profile (SMPTE ST 2059-2) for accurate time transfer.

    As a physical transport medium, Ethernet has superseded the legacy solutions that were commonly used for many applications in the past. Ethernet is inherently asynchronous, with only two adjacent nodes being synchronized to each other. This greatly simplifies deployment and maintenance and was possibly the driving factor of its success. When it comes to time and frequency transfer, however, there is an obvious drawback. Accurate time can be transferred via a constant stream of packets, but frequency cannot be transferred directly: every end node must regenerate the frequency from the received time information. This method has proven sufficiently accurate for many applications and is widely deployed, yet it has its limits concerning overall accuracy. If the quality of the time information deteriorates, the quality of the regenerated frequency will suffer as well. Specifically optimized digital phase-locked loops can mitigate that effect, but only to a certain extent. If end devices require accurate as well as highly stable frequencies for their operation, this limitation must be carefully considered.
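
    To illustrate the limitation described above (the example is ours, not the authors'), the sketch below regenerates an oscillator's frequency drift purely from a series of PTP-style time offsets using a linear fit, and shows how timestamping noise degrades the recovered frequency. The drift value, message interval, and noise levels are assumptions.

      import numpy as np

      rng = np.random.default_rng(0)

      def regenerated_frequency_error(ts_noise_ns, interval_s=0.125, n=64):
          """Estimate a local oscillator's drift purely from noisy time offsets,
          as an end node without a synchronous physical layer has to do.
          Returns the error of the regenerated frequency in parts per billion."""
          true_drift_ppb = 50.0                            # actual oscillator drift
          t = np.arange(n) * interval_s                    # message times (s)
          offset_ns = true_drift_ppb * t                   # ideal offset growth (ns)
          offset_ns += rng.normal(0.0, ts_noise_ns, n)     # timestamping noise
          est_drift_ppb = np.polyfit(t, offset_ns, 1)[0]   # slope of a linear fit
          return est_drift_ppb - true_drift_ppb

      for noise_ns in (10, 100, 1000):
          err = regenerated_frequency_error(noise_ns)
          print(f"{noise_ns:>5} ns timestamp noise -> frequency error {err:+7.1f} ppb")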

    To circumvent this problem, the local synchronicity of Ethernet can be extended to provide a common frequency for the complete network. How can this be accomplished? Whenever two devices establish a communication channel via a physical medium, a transport frequency must be provided by one of the two nodes, to which the other must synchronize. In standard Ethernet, the selection of which device takes over that role is arbitrary. If, however, the selection process is made user-definable, a common frequency can be propagated through the complete network.
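
    As a toy illustration of that user-definable selection (not an implementation of the ITU-T specifications), the sketch below picks the port whose recovered frequency a node should lock to based on advertised quality levels, loosely mirroring the ESMC/SSM mechanism of ITU-T G.8264. The quality-level ranking and the port data are simplified assumptions.

      # Simplified quality-level ranking loosely based on the ITU-T G.781/G.8264
      # hierarchy (lower rank = better clock); ordering and values are illustrative.
      QL_RANK = {"QL-PRC": 0, "QL-SSU-A": 1, "QL-SSU-B": 2, "QL-EEC1": 3, "QL-DNU": 9}

      ports = {
          "eth0": "QL-SSU-A",   # quality level received in ESMC messages, per port
          "eth1": "QL-PRC",
          "eth2": "QL-DNU",     # "do not use", e.g. a downstream device
      }

      def select_synce_reference(port_ql):
          """Pick the port to lock the local frequency to, mirroring how SyncE makes
          the choice of frequency source deliberate instead of arbitrary."""
          usable = {p: ql for p, ql in port_ql.items() if ql != "QL-DNU"}
          if not usable:
              return None                        # fall back to the local oscillator
          return min(usable, key=lambda p: QL_RANK[usable[p]])

      print(select_synce_reference(ports))       # -> 'eth1', the PRC-traceable input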

    In this paper, we will describe synchronous Ethernet’s (SyncE’s) basic principles as specified by the ITU. We will highlight the prerequisites network devices must meet to comply with SyncE requirements. Furthermore, we will focus on the software and system aspects of deploying and maintaining a SyncE network. Special consideration will be given to how best to combine SyncE and PTP to improve both the accuracy and the resiliency of time and frequency transfer. Although SyncE was primarily designed to provide highly accurate time and frequency for modern telecom applications, we will analyze whether, and to what extent, the broadcasting industry can benefit from this technology. The paper concludes with real-world measurements in networks with SyncE and PTP support. We will highlight performance under different operating conditions, demonstrate the impact of different failure modes, and compare the performance of PTP with SyncE-assisted PTP.

    Nikolaus Kerö | Oregano Systems; Nvidia; European Broadcasting Union | Vienna, Austria; Geneva, Switzerland
    Thomas Kernen | Oregano Systems; Nvidia; European Broadcasting Union | Vienna, Austria; Geneva, Switzerland
    Ievgen Kostiukevych | Oregano Systems; Nvidia; European Broadcasting Union | Vienna, Austria; Geneva, Switzerland



  • Leveraging traditional GNSS time servers for resiliency and interoperability in Broadcast Positioning System (BPS)  - $15

    Date: April 3, 2024

    As a complement to existing Positioning, Navigation and Timing (PNT) services, the Broadcast Positioning System (BPS) requires proper time synchronization at each base station in order to operate correctly. The shared time reference across stations, combined with their known locations, enables the triangulation needed for positioning and provides a trusted time reference to users of the system. Several features supported by traditional GNSS time servers can help maintain traceable and accurate time synchronization, easing and expediting the deployment of these systems.
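
    As a worked illustration of why the shared time base matters (not the BPS algorithm itself), the sketch below recovers a two-dimensional receiver position from time-of-arrival measurements against stations at known locations, assuming the stations are synchronized and the transmit epoch is known. The station coordinates and the simple Gauss-Newton solver are illustrative.

      import numpy as np

      C = 299_792_458.0   # speed of light, m/s

      # Assumed transmitter (base station) locations in a local 2-D frame (meters)
      stations = np.array([[0.0, 0.0], [30_000.0, 5_000.0], [8_000.0, 25_000.0]])

      def locate(arrival_times_s, x0=(10_000.0, 10_000.0), iters=20):
          """Estimate a receiver position from time-of-arrival measurements.
          Gauss-Newton on the range residuals; purely illustrative."""
          x = np.array(x0, dtype=float)
          ranges_meas = np.asarray(arrival_times_s) * C
          for _ in range(iters):
              d = np.linalg.norm(stations - x, axis=1)          # predicted ranges
              J = (x - stations) / d[:, None]                   # d(range)/d(position)
              dx, *_ = np.linalg.lstsq(J, ranges_meas - d, rcond=None)
              x += dx
          return x

      # Simulate a receiver at (12 km, 7 km) and recover it from the arrival times
      truth = np.array([12_000.0, 7_000.0])
      toa = np.linalg.norm(stations - truth, axis=1) / C
      print(locate(toa))   # approximately [12000. 7000.]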

    Francisco Girela Lopez | Safran Electronics and Defense | Rochester, N.Y., United States
    Mark Corl | Triveni Digital, Inc. | Princeton, N.J., United States
    Alexander Babakhanov | Avateq Corp. | Markham, Ontario, Canada



  • Making the CIE Chart Indispensable for Color Grading! - $15

    Date: April 3, 2024

    With the advent of Wide Color Gamut (WCG), constraining colors to a smaller gamut while grading in a larger gamut is a common task in post-production. For example, one might set the grade to DCI-P3 while constraining the colors to ITU-R BT.709. During this process, colorists typically need to determine the extent of color excursions outside the gamut of interest and then decide whether to remap the colors or ignore the excursions and allow the colors to clip. Most colorists use a combination of traditional tools like waveform monitors, along with reference monitors and work experience, to make that determination. The CIE chromaticity chart provides a 2D view of the chromaticity content of the image, but the general feeling in the industry is that it is complex and difficult to use, so it has been mostly confined to textbooks and academic publications.

    This paper proposes a few innovations [1] that help demystify the CIE chart and enable it to provide instantaneous, useful information that helps colorists make quick decisions during color grading. The first step involves “linearizing” the CIE chart by effectively unrolling it and creating Gamut Excursion Measurements (GEMs), which provide a quantitative snapshot of the excursions outside a gamut of interest. These excursions can then be visualized as a false-color heat map to support quick determinations. Adding luminance qualification to the CIE chart further constrains the chromaticity visualization to luminance ranges of interest. Combined, these enhancements provide tools for effective and fast decisions during color grading.
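
    As a rough sketch of the kind of quantitative excursion measurement described (not the method of [1]), the snippet below converts linear P3-D65 pixels to BT.709 primaries and reports how far each pixel falls outside the target gamut. The conversion matrices are derived from standard chromaticity coordinates; the scalar excursion definition is a simplified stand-in for GEMs.

      import numpy as np

      def rgb_to_xyz_matrix(primaries, white):
          """Build an RGB->XYZ matrix from the xy chromaticities of the primaries
          and the white point (standard derivation)."""
          xy = np.asarray(primaries, dtype=float)
          P = np.column_stack([xy[:, 0] / xy[:, 1],
                               np.ones(3),
                               (1 - xy[:, 0] - xy[:, 1]) / xy[:, 1]]).T
          wx, wy = white
          w_xyz = np.array([wx / wy, 1.0, (1 - wx - wy) / wy])
          scale = np.linalg.solve(P, w_xyz)
          return P * scale

      BT709 = rgb_to_xyz_matrix([(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)], (0.3127, 0.3290))
      P3D65 = rgb_to_xyz_matrix([(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)], (0.3127, 0.3290))
      P3_TO_709 = np.linalg.inv(BT709) @ P3D65

      def gamut_excursion(rgb_p3_linear):
          """Per-pixel excursion of linear P3-D65 pixels outside BT.709:
          0 means in gamut; larger values mean further outside."""
          rgb709 = np.asarray(rgb_p3_linear) @ P3_TO_709.T
          return np.maximum(np.maximum(-rgb709, rgb709 - 1.0), 0.0).max(axis=-1)

      # A fully saturated P3 red lands well outside BT.709; a neutral grey does not
      print(gamut_excursion(np.array([[1.0, 0.0, 0.0], [0.5, 0.5, 0.5]])))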

    Lakshmanan Gopishankar | Telestream LLC | Beaverton, Oregon, United States