News Product Releases

Nvidia GeForce RTX 4080 and GeForce RTX 4070 could consume less power than initially anticipated

A new leak suggests that the GeForce RTX 4080 will consume around 320 Watts of power and the GeForce RTX 4070 around 285 Watts. The latter is said to have a base clock of 2,310 MHz and a boost clock of 2,610 MHz.
New information about the Nvidia GeForce RTX 4070 and GeForce RTX 4080 has emerged online

So far, prolific leakers have all agreed that Nvidia’s upcoming Ada Lovelace-based RTX 4000 series of graphics cards will be quite power-hungry. The top-of-the-line RTX Titan (tentative name) is rumoured to draw up to 800 Watts at peak loads. Its younger siblings, the GeForce RTX 4080 and GeForce RTX 4070, could guzzle up to 420 Watts and 300 Watts, respectively. However, those figures could be a lot lower at launch.

Kopite7kimi, a reliable source of Nvidia leaks, now says that the GeForce RTX 4080 will have a TDP of 320 Watts and the GeForce RTX 4070 a TDP of 285 Watts. The leaker states that the performance of the cards should remain largely unaffected despite the lower TDPs. Moreover, both models have the potential to outperform the GeForce RTX 3090 Ti thanks to the improved TSMC N4 process, which is essentially a two-node leap over the Samsung 8N process used for Ampere.

Kopite has also provided more details about the Nvidia GeForce RTX 4070’s clock speeds. It will reportedly operate at 2,310 MHz by default and boost up to 2,610 MHz, with the boost clock going as high as 2,800 MHz under certain workloads. Other GeForce RTX 4070 specs could include a full AD104 GPU, 7,680 CUDA cores and 12 GB of 21 Gbps GDDR6 RAM on a 192-bit bus. These specs are identical to those of the previously rumoured GeForce RTX 4070 Ti, suggesting that the actual RTX 4070 Ti could be even more powerful, perhaps complete with shiny new GDDR7 VRAM.

Furthermore, tech-savvy users should be able to extract more performance out of the GeForce RTX 4080 and GeForce RTX 4070, thanks to their overclocking potential. Previous rumours speculated that some Ada Lovelace SKUs could easily operate at around 3 GHz under the right conditions. As things stand, the performance gap between the two models looks rather slim, so one can expect Nvidia to nerf or buff one of them to ensure consistent segmentation across the lineup.