When Nvidia [began to disclose details](https://www.tomshardware.com/pc-components/gpus/nvidia-shares-blackwell-ultras-secrets-nvfp4-boost-detailed-and-pcie-6-0-support) about its new 4-bit floating-point format, NVFP4, earlier this year, it stated that while the format is mainly designed for inference, it can also be used for AI training without significant loss of accuracy. The company recently released [a paper](https://arxiv.org/pdf/2509.25149) describing how it trained a 12-billion-parameter model on a 10-trillion-token dataset using NVFP4, with several supporting techniques, and achieved results that closely match those of an FP8 baseline.
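For readers unfamiliar with the format: NVFP4 stores each value as a 4-bit E2M1 float (one sign bit, two exponent bits, one mantissa bit) and attaches an FP8 scale factor to every 16-element block, with an additional FP32 scale applied per tensor. The Python sketch below illustrates the per-block quantization step only; it uses simple round-to-nearest (the paper pairs the format with further techniques, such as stochastic rounding for gradients), and the function names are illustrative rather than Nvidia's actual kernels.

```python
import numpy as np

# Magnitudes representable by E2M1 (1 sign, 2 exponent, 1 mantissa bit).
E2M1_GRID = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 3.0, 4.0, 6.0])

def quantize_nvfp4_block(block: np.ndarray) -> tuple[np.ndarray, float]:
    """Quantize one 16-element block to E2M1 values plus a shared scale.

    A real implementation would store 4-bit codes and an FP8 (E4M3)
    block scale; everything is kept in float here for readability.
    """
    assert block.size == 16, "NVFP4 uses 16-element micro-blocks"
    # Scale so the block's largest magnitude maps onto E2M1's max (6.0).
    amax = float(np.abs(block).max())
    scale = amax / 6.0 if amax > 0 else 1.0
    mags = np.abs(block) / scale
    # Round each magnitude to the nearest representable E2M1 value,
    # then restore the sign.
    idx = np.abs(mags[:, None] - E2M1_GRID[None, :]).argmin(axis=1)
    return np.sign(block) * E2M1_GRID[idx], scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover approximate real values from quantized codes and scale."""
    return q * scale

rng = np.random.default_rng(0)
x = rng.normal(size=16).astype(np.float32)
q, s = quantize_nvfp4_block(x)
print("max abs error:", np.abs(x - dequantize(q, s)).max())
```

Because the scale is shared by only 16 values, a single outlier inflates the quantization step for just its own block rather than the whole tensor, which is a key reason such a narrow format remains usable for training at all.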