GN provides domain-adaptive compression that improves with conversation length, outperforming general-purpose compressors such as gzip and Brotli across several scenarios:
- Performance Against gzip: GN consistently beats gzip on all tested corpora.
- Performance Against Brotli:
  - Per-message: GN surpasses Brotli on all five public corpora (ShareGPT, WildChat, LMSYS, IRC, Claude conversations).
  - Ubuntu-IRC: GN shows a +47% improvement over Brotli.
- Latency: GN's p50 latency is a negligible 0.007 ms per chunk.
GN leverages conversation history as a compression resource by using prior turns as a preset dictionary, which general-purpose compressors cannot access effectively. This approach demonstrates the potential of context-aware compression in enhancing efficiency for large language model (LLM) conversations.
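The article does not publish GN's implementation, but the core idea of seeding a compressor with prior turns can be illustrated with zlib's preset-dictionary support. The function names and the sample conversation below are illustrative, not GN's actual API; this is a minimal sketch of the technique, not the system the article benchmarks.

```python
import zlib

WINDOW = 32 * 1024  # zlib can only match against the last 32 KiB of the dictionary


def compress_with_history(message: bytes, history: bytes) -> bytes:
    # Seed the compressor's window with prior turns so matches against
    # earlier conversation text cost only a back-reference, not literals.
    c = zlib.compressobj(level=9, zdict=history[-WINDOW:])
    return c.compress(message) + c.flush()


def decompress_with_history(payload: bytes, history: bytes) -> bytes:
    # The decompressor must be seeded with the same dictionary bytes.
    d = zlib.decompressobj(zdict=history[-WINDOW:])
    return d.decompress(payload)


# Hypothetical conversation: the new turn repeats phrasing from earlier turns.
history = (
    b"User: How do I sort a list in Python?\n"
    b"Assistant: Use sorted() or list.sort().\n"
)
msg = b"User: And how do I sort a list in Python in reverse order?\n"

with_dict = compress_with_history(msg, history)
without_dict = zlib.compress(msg, 9)

assert decompress_with_history(with_dict, history) == msg
assert len(with_dict) < len(without_dict)  # history-seeded stream is smaller
```

Because chat turns heavily reuse each other's vocabulary, the dictionary-seeded stream beats the plain one on this toy input; a general-purpose compressor invoked per-message sees none of that shared context.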
Read the full article at DEV Community
