High-performance WebAssembly attention mechanisms: Multi-Head, Flash, Hyperbolic, MoE, and CGT Sheaf Attention, with GPU acceleration for transformers and LLMs
@ruvector/attention-wasm · low health (58/100) · consider alternatives
Get this data programmatically — free, no authentication.
curl https://depscope.dev/api/check/npm/@ruvector/attention-wasm

First published · 2025-12-02T18:40:40.314Z
Last updated · 2026-04-20T18:12:28.027Z
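
For use from code rather than the shell, below is a minimal TypeScript sketch of the same check (Node 18+ or any runtime with a global fetch). The endpoint is the one shown above; the response shape is an assumption inferred from the health score displayed on this page, not a documented schema.

// A minimal sketch of the curl check above, assuming the endpoint returns
// JSON. The `health` field is a guess based on the 58/100 score shown on
// this page; the actual response schema is not documented here.
interface DepscopeCheck {
  health?: number;          // assumed: the 0-100 health score
  [key: string]: unknown;   // tolerate whatever else the API returns
}

async function checkHealth(pkg: string): Promise<DepscopeCheck> {
  // Same URL shape as the curl example; no authentication required.
  const res = await fetch(`https://depscope.dev/api/check/npm/${pkg}`);
  if (!res.ok) throw new Error(`depscope check failed: HTTP ${res.status}`);
  return (await res.json()) as DepscopeCheck;
}

checkHealth("@ruvector/attention-wasm")
  .then((data) => console.log("health:", data.health ?? "unknown"))
  .catch(console.error);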