Brazilian hardware enthusiasts Paulo Gomes, Jefferson Silva, and Ygor Mota have successfully converted an EVGA GeForce RTX 2080 "Turing" graphics card to 16 GB of video memory.
The method is to replace the card's 8 Gbit GDDR6 memory chips with double-density (16 Gbit) chips. On the GPU's 256-bit memory bus, eight such chips add up to 16 GB. The memory speed remains unchanged at 14 Gbps, as does the GPU clock.
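The arithmetic behind that configuration can be sanity-checked in a few lines. This sketch assumes the standard GDDR6 figure of a 32-bit interface per chip, which is not stated in the article:

```python
# Back-of-the-envelope check of the modded card's memory configuration.
# Assumption: each GDDR6 chip has a 32-bit interface (standard for GDDR6).
bus_width_bits = 256        # RTX 2080 memory bus
chip_interface_bits = 32    # per-chip interface width (assumed)
chip_density_gbit = 16      # double-density chips used in the mod
speed_gbps_per_pin = 14     # unchanged memory speed

chips = bus_width_bits // chip_interface_bits            # number of chips
total_capacity_gb = chips * chip_density_gbit / 8        # Gbit -> GB
total_bandwidth_gb_s = speed_gbps_per_pin * bus_width_bits / 8

print(chips)                 # 8
print(total_capacity_gb)     # 16.0
print(total_bandwidth_gb_s)  # 448.0 GB/s, same as the stock card
```

Since only chip density changes, bandwidth stays at the stock 448 GB/s; all the gains come from capacity.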
The modification process involves desoldering each of the eight 8 Gbit chips, cleaning the residual solder off the memory pads, reballing the new chips using a GDDR6 stencil, and then reflowing to solder the 16 Gbit chips onto the pads.
Beyond replacing the memory chips, a set of SMD jumpers (straps) near the BIOS ROM chip must also be adjusted so the GPU correctly identifies the 16 GB memory size. The TU104 silicon supports higher-density memory out of the box, because NVIDIA uses the same chip on professional cards with 16 GB of memory, such as the Quadro RTX 5000.
No software changes are needed: TechPowerUp GPU-Z simply reports the newly detected memory size. A benchmark run of "Resident Evil 4" shows the game using nearly 9.8 GB of video memory, versus only 7.7 GB on the stock card, with performance up 7.82%, from an average of 64 FPS to 69 FPS. That is a larger gap than the one between the 8 GB and 16 GB versions of the RTX 4060 Ti.
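The reported gain checks out against the rounded FPS figures; a quick calculation (the article's 7.82% presumably comes from unrounded averages) gives roughly the same number:

```python
# Verify the performance-gain percentage from the rounded FPS averages.
stock_fps = 64.0
modded_fps = 69.0

gain_pct = (modded_fps - stock_fps) / stock_fps * 100
print(round(gain_pct, 2))  # 7.81, close to the article's 7.82%
```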
Beyond gaming, the extra memory should also significantly improve the RTX 2080's performance in generative AI workloads, where 16 GB allows larger models to fit in VRAM.