High-performance WebAssembly attention mechanisms: Multi-Head, Flash, Hyperbolic, MoE, CGT Sheaf Attention with GPU acceleration for transformers and LLMs
ruvector-attention-wasm · low health (60/100); consider alternatives
Get this data programmatically — free, no authentication.
curl https://depscope.dev/api/check/npm/ruvector-attention-wasm
First published · 2025-11-30T21:04:07.306Z
Last updated · 2026-03-27T21:02:47.160Z
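The curl command above returns the check data as JSON. As a minimal TypeScript sketch of a scripted lookup: the response fields used here (name, health) are assumptions about the API's schema, not documented behavior, so inspect the actual JSON before relying on them.

// Minimal sketch: query the depscope.dev check endpoint (Node 18+ ships a global fetch).
// The fields `name` and `health` are assumed, not confirmed by any API documentation.
interface CheckResult {
  name?: string;   // assumed: package name echoed back
  health?: number; // assumed: health score out of 100
}

async function checkPackage(pkg: string): Promise<CheckResult> {
  const res = await fetch(`https://depscope.dev/api/check/npm/${encodeURIComponent(pkg)}`);
  if (!res.ok) throw new Error(`depscope check failed: HTTP ${res.status}`);
  return (await res.json()) as CheckResult;
}

// Usage: flag a package that scores below a chosen threshold.
checkPackage("ruvector-attention-wasm").then((r) => {
  if ((r.health ?? 0) < 70) {
    console.warn(`low health: ${r.health}/100; consider alternatives`);
  }
});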