4x faster LLM inference (Flash Attention guy's company)

⚓ IT 📅 2025-10-12 👤 surdeus 👁️ 7


🏷️ IT_feed