R1-2410216
Discussion
Discussion on specification support for beam management
From Sony
Summary
Sony's document presents 18 proposals addressing AI/ML-based beam management for 5G NR, focusing on model inference procedures, time synchronization requirements, and performance monitoring mechanisms for both UE-side and network-side AI models.
Position
Sony advocates FOR explicit time-related information reporting in BM-Case 2, dynamic adjustment of time windows for both data collection and prediction instances, probability-based reporting for Top K beam predictions, and event-triggered monitoring with reduced measurement overhead. They argue AGAINST using probability information as a self-evaluation metric of model performance and AGAINST one-shot performance calculations, favoring instead time-window-based evaluations with failure-detection mechanisms similar to existing beam failure procedures.
Key proposals
- Proposal 1 (AI/ML model inference): For BM-Case 2 and the NW-side model, the time-related information corresponding to the measurement results from different transmission occasions should be explicitly or implicitly reported
- Proposal 2 (AI/ML model inference): For BM-Case 2 and the NW-side model, the time window for input data collection and the maximum number of transmission occasions within that window should be indicated to the UE, both of which should be based on the format of the model input
- Proposal 3 (AI/ML model inference): For BM-Case 2, the time window size for data collection and the duration value for N future time instances can be dynamically adjusted and should be aligned between the NW and the UE
- Proposal 4 (AI/ML model inference): For the UE-side model, regarding the reporting of inference content, support beam information for the predicted Top K beams among a set of beams, along with probability-related information for these beams
- Proposal 5 (Model monitoring): For BM-Case 1 and BM-Case 2 with a UE-side AI/ML model, for Option 2 (UE-assisted performance monitoring), support Alt 2 and Alt 3
- Proposal 6 (Model monitoring): Support defining event(s) to trigger reporting for monitoring, such as when the measured RSRP differs from the predicted RSRP for the same beam during the inference stage in BM-Case 1
- Proposal 7 (Model monitoring): For BM-Case 2, further study how to reduce measurement overhead based on the RS resource set configured by the NW for model monitoring
- Proposal 8 (Model monitoring): For BM-Case 2, support defining event(s) to trigger monitoring reports, such as when multiple prediction results for the same future time instance vary significantly
- Proposal 9 (Model monitoring): Model failure should be determined from performance metrics calculated multiple times over a specified time period, rather than from a one-shot calculation
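To make Proposal 4 concrete, the sketch below shows one plausible way a UE-side model could attach probability-related information to a Top K beam report. The contribution does not specify how probabilities are derived; the softmax over predicted RSRP values, the function name, and the threshold-free ranking are all illustrative assumptions, not Sony's proposal.

```python
import math

def top_k_beam_report(predicted_rsrp_dbm, k):
    """Illustrative sketch (not from the contribution): rank beams by
    predicted RSRP and attach a softmax-derived probability per beam.

    predicted_rsrp_dbm: list of predicted L1-RSRP values (dBm), one
    per beam in the configured set; beam index = list position.
    Returns the Top-K (beam_index, probability) pairs, best first.
    """
    # Softmax over predicted RSRP, used here only as a stand-in
    # confidence measure; the actual metric is up to the model.
    m = max(predicted_rsrp_dbm)
    exps = [math.exp(r - m) for r in predicted_rsrp_dbm]
    total = sum(exps)
    probs = [e / total for e in exps]
    ranked = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    return [(i, probs[i]) for i in ranked[:k]]
```

For example, with predictions `[-80.0, -75.0, -90.0, -78.0]` and `k=2`, the report would list beam 1 first (strongest predicted RSRP) and beam 3 second, each with its probability score.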
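Proposals 6 and 9 can be read together as event-triggered reporting plus windowed failure detection. The following minimal sketch illustrates that combination under assumed parameters: the 6 dB RSRP gap, the window length, the failure count, and the class name are all hypothetical choices for illustration, not values from the contribution.

```python
from collections import deque

class BeamModelMonitor:
    """Hypothetical monitor combining two ideas from the proposals:
    - an event fires when measured and predicted RSRP for the same
      beam diverge beyond a gap (trigger for a monitoring report);
    - model failure is declared only when the event recurs within a
      sliding window, never from a single one-shot comparison.
    All thresholds below are illustrative assumptions.
    """

    def __init__(self, rsrp_gap_db=6.0, window_len=10, failure_count=4):
        self.rsrp_gap_db = rsrp_gap_db
        self.window = deque(maxlen=window_len)   # recent event outcomes
        self.failure_count = failure_count

    def on_measurement(self, predicted_dbm, measured_dbm):
        """Record one comparison; True means 'send a monitoring report'."""
        event = abs(predicted_dbm - measured_dbm) > self.rsrp_gap_db
        self.window.append(event)
        return event

    def model_failed(self):
        """Failure only if enough events accumulated within the window."""
        return sum(self.window) >= self.failure_count
```

A single large prediction error triggers a report but not failure; only repeated errors inside the window do, mirroring the time-window evaluation Sony favors over one-shot calculations.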