Small-Language-Model-HybridNorm-Fourier-Formers
A compact language model implementing HybridNorm and Fourier-based attention. It combines CoLA-style low-rank projections, FANformer periodicity modeling, and hybrid normalization in an efficient decoder-only transformer, using gated residuals to improve performance while keeping the parameter footprint small.
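
The sketch below illustrates how these pieces can fit together in a single decoder block: CoLA-style low-rank projections in the feed-forward path, a FANformer-style sin/cos branch applied to the attention input, HybridNorm-style normalization on both the attention input and output, and a learned residual gate. It is a minimal illustration under assumed names and hyperparameters (`LowRankLinear`, `FourierFeatures`, `p_ratio`, `rank`, the gate placement), not the repository's actual implementation.

```python
# Minimal sketch, not the repo's code: one decoder block combining low-rank
# (CoLA-style) projections, FANformer-style periodic features, and
# HybridNorm-style normalization with a gated residual. All names and
# hyperparameters here are illustrative assumptions.
import torch
import torch.nn as nn


class LowRankLinear(nn.Module):
    """CoLA-style low-rank projection: W ~= up(down(x)) with rank r << d."""
    def __init__(self, d_in, d_out, rank):
        super().__init__()
        self.down = nn.Linear(d_in, rank, bias=False)
        self.up = nn.Linear(rank, d_out, bias=False)

    def forward(self, x):
        return self.up(self.down(x))


class FourierFeatures(nn.Module):
    """FANformer-style periodicity: mix a sin/cos branch into the features."""
    def __init__(self, d_model, p_ratio=0.25):
        super().__init__()
        d_p = int(d_model * p_ratio)
        self.proj_p = nn.Linear(d_model, d_p, bias=False)              # periodic branch
        self.proj_g = nn.Linear(d_model, d_model - 2 * d_p, bias=False)  # plain branch

    def forward(self, x):
        p = self.proj_p(x)
        return torch.cat([torch.sin(p), torch.cos(p), self.proj_g(x)], dim=-1)


class DecoderBlock(nn.Module):
    """Pre-norm on the attention input plus post-norm on its output
    (HybridNorm-style), with a learned gate on the attention residual."""
    def __init__(self, d_model, n_heads, rank):
        super().__init__()
        self.norm_in = nn.LayerNorm(d_model)
        self.norm_attn_out = nn.LayerNorm(d_model)
        self.norm_ffn = nn.LayerNorm(d_model)
        self.fourier = FourierFeatures(d_model)
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.ffn = nn.Sequential(
            LowRankLinear(d_model, 4 * d_model, rank),
            nn.GELU(),
            LowRankLinear(4 * d_model, d_model, rank),
        )
        self.gate = nn.Parameter(torch.zeros(d_model))  # gated residual weight

    def forward(self, x):
        T = x.size(1)
        causal = torch.triu(torch.ones(T, T, dtype=torch.bool, device=x.device), 1)
        h = self.fourier(self.norm_in(x))               # periodic features before attention
        a, _ = self.attn(h, h, h, attn_mask=causal)
        x = x + torch.sigmoid(self.gate) * self.norm_attn_out(a)  # gated, post-normed residual
        x = x + self.ffn(self.norm_ffn(x))
        return x


if __name__ == "__main__":
    block = DecoderBlock(d_model=256, n_heads=4, rank=32)
    tokens = torch.randn(2, 16, 256)                    # (batch, seq, d_model)
    print(block(tokens).shape)                          # torch.Size([2, 16, 256])
```

Stacking blocks like this, together with a tied embedding/output head, gives a small decoder-only model; the low-rank feed-forward projections are where most of the parameter savings would come from in this sketch.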