CCF: A Context Compression Framework for Efficient Long-Sequence Language Modeling
Wenhao Li, Bangcheng Sun, Weihao Ye, Tianyi Zhang, Daohai Yu, Fei Chao, Rongrong Ji
https://arxiv.org/abs/2509.09199
Scaling language models to longer contexts is essential for capturing rich dependencies across extended discourse. However, naïve context extension imposes significant computational and memory burdens, often resulting in inefficiencies during both training and inference. In this work, we propose CCF, a novel context compression framework designed to enable efficient long-context modeling by learning hierarchical latent representations that preserve global semantics while aggressively reducing …
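The abstract does not detail CCF's actual architecture, so the following is only a minimal sketch of the general idea of compressing a long context into a small set of latent representations, here via learned latent queries and cross-attention pooling (a Perceiver-style design, assumed for illustration, not taken from the paper; all names such as LatentContextCompressor and num_latents are hypothetical):

```python
import torch
import torch.nn as nn

class LatentContextCompressor(nn.Module):
    """Compress a long token sequence into a fixed, small set of latent vectors.

    Hypothetical sketch: `num_latents` learned query vectors cross-attend to the
    full context, producing a fixed-size summary regardless of input length.
    """
    def __init__(self, d_model: int = 512, num_latents: int = 32, num_heads: int = 8):
        super().__init__()
        # Learned latent queries, shared across all inputs.
        self.latents = nn.Parameter(torch.randn(num_latents, d_model) * 0.02)
        self.cross_attn = nn.MultiheadAttention(d_model, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(d_model)

    def forward(self, context: torch.Tensor) -> torch.Tensor:
        # context: (batch, seq_len, d_model), where seq_len may be very large.
        batch = context.size(0)
        queries = self.latents.unsqueeze(0).expand(batch, -1, -1)
        # Each latent query attends over the entire context.
        compressed, _ = self.cross_attn(queries, context, context)
        return self.norm(compressed)  # (batch, num_latents, d_model)

if __name__ == "__main__":
    compressor = LatentContextCompressor()
    tokens = torch.randn(2, 4096, 512)  # long input context
    summary = compressor(tokens)
    print(summary.shape)  # torch.Size([2, 32, 512]): a 128x shorter sequence
```

Downstream attention then operates over the 32 latent vectors instead of the 4096 original tokens, which is where the compute and memory savings in this style of compression come from; how CCF builds its hierarchical latents specifically is described in the paper itself.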