R1-2508637
Discussion
High-Level Considerations for the 6GR Air Interface Design
From AT&T
Summary
AT&T provides high-level considerations for the 6GR sync signal structure and overall air interface design, presenting 3 proposals and 8 observations. The document argues for a bottom-up design paradigm starting from massive IoT requirements, advocating that 6GR support 3 MHz minimum spectrum allocations, that UL/DL bandwidth asymmetry decisions be deferred, and that the study of the smallest maximum UE bandwidth for low-tier devices be narrowed to 3 or 5 MHz.
Position
AT&T proposes narrowing the study of the smallest maximum supported RF and BB UE BW for low-tier 6GR devices to 3 MHz and 5 MHz only, removing the 10 MHz and 20 MHz options (Proposal 3). They propose that RAN1 assume 6GR supports 3 MHz minimum spectrum allocations, as LTE and NR do, while explicitly decoupling this assumption from SSB design optimization (Proposal 1). They argue that any decision on UL/DL bandwidth asymmetry must be deferred until the smallest maximum supported UE BW has been agreed for the DL (Proposal 2). They present a bottom-up design paradigm in which a single 6G RAT scales from minimum IoT KPIs upward, and argue that massive IoT must be specified in 6G's first release to avoid the late-cycle IoT introduction seen in 4G and 5G (Observations 6-8). They insist that MRSS efficiency arguments reference real-world deployments rather than what Rel-15 specifications theoretically allow, and propose binding certification procedures with basic feature groups for global alignment (Observations 1-3).
Key proposals
- Proposal 1 (High-Level Considerations): RAN1 should assume that 6GR will support spectrum allocations of 3 MHz; this does not mean that the SSB design is optimized for 3 MHz.
- Proposal 2 (High-Level Considerations): RAN1 should not agree whether the UL bandwidth may be different to the DL bandwidth until the smallest maximum supported RF and BB UE BW without spectrum aggregation for at least one low-tier device type has been agreed for the DL.
- Proposal 3 (High-Level Considerations): Study the smallest maximum supported RF and BB UE BW without spectrum aggregation for at least one low-tier device type with only Opt1: 3 MHz and Opt2: 5 MHz, with FFS items on UL/DL bandwidth differences, SCS/duplex mode/band variations, and RF vs BB UE BW equality.
- Observation 1 (Conclusion): Any argument on the efficiency and feasibility of MRSS must be grounded in real-world deployments, not in what Rel-15 specifications theoretically allow 'on paper'.
- Observation 2 (Conclusion): A globally aligned and normatively specified testing and certification procedure should ensure that features are not merely declared mandatory, but that their deployment is tracked in 3GPP through a binding certification procedure for both the network and the user equipment.
- Observation 3 (Conclusion): Basic feature groups should be defined to ensure implementation, certification, and deployment of large numbers of features as components of a single feature group, rather than specifying a myriad of individual feature groups for a small number of components.
- Observation 4 (High-Level Considerations): TSG RAN #110 will revisit the minimum spectrum allocation in December.
- Observation 5 (High-Level Considerations): 4G LTE and 5G NR support spectrum allocations of 3 MHz.
- Observation 6 (High-Level Considerations): The 6G RAN and 6G Radio should be designed bottom-up, meaning a single RAT is scaled from a minimum set of KPIs, with resources allocated upward toward the most demanding ones.
- Observation 7 (High-Level Considerations): The 6G RAN and 6G Radio should enable a platform comprising one set of specifications that support all use cases with minimum customization and large degrees of economy of scale.
- Observation 8 (High-Level Considerations): Because crucial design questions can most efficiently be decided in the first release of 6G, 6G massive IoT must be specified in the first release of 6G; solutions should be designed to last across technology generations, with significantly improved efficiency compared to DSS-based coexistence between 5G and eMTC.