Π-Attention: Periodic Sparse Transformers for Efficient Long-Context Modeling