NVIDIA has finally introduced the GeForce GTX 1630 as its new entry-level model. The card matches the features we expected from earlier leaks, targets lower budgets, and should be a somewhat more accessible option.
As hardware, it clearly lags far behind the RTX 30 series. So what exactly does this card offer?
What are the NVIDIA GeForce GTX 1630 Specifications?
The new graphics card is built on the Turing TU117-150 GPU, a previous-generation architecture, which is also why it carries the GTX rather than the RTX prefix.
The card offers 512 CUDA cores, a base clock of 1740 MHz, and a boost clock of 1785 MHz.
The memory configuration is the surprising part. Yes, 4 GB of GDDR6 is used, but the memory bus is only 64-bit wide.
That puts the theoretical maximum bandwidth at 96 GB/s. By 2022 standards, and given where graphics cards have reached, these values are low.
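As a quick sanity check, the 96 GB/s figure follows directly from the 64-bit bus if the GDDR6 runs at an effective 12 Gbps per pin (a data rate assumed here because it is the one consistent with the stated bandwidth, not a value from the announcement):

```python
# Theoretical bandwidth = bus width in bytes * effective data rate per pin
bus_width_bits = 64      # GTX 1630 memory interface
data_rate_gbps = 12.0    # assumed GDDR6 effective rate (Gbps per pin)

bandwidth_gb_s = (bus_width_bits / 8) * data_rate_gbps
print(f"{bandwidth_gb_s:.0f} GB/s")  # 96 GB/s
```

For comparison, the same GDDR6 chips on a 128-bit bus would double this to 192 GB/s, which is why the narrow bus is the main bottleneck here.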
Of course, we should not forget that this is an entry-level graphics card. One more frequently asked question: the TDP of this Turing-based card is announced at 75 W.
The card has no ray tracing support, and notably no Nvidia DLSS support either; of those headline technologies, only Nvidia Reflex is supported.
GeForce GTX 1630 Specifications
- Architecture: Turing
- CUDA Cores: 512
- Base Clock: 1740 MHz
- Boost Clock: 1785 MHz
- Memory: 4 GB GDDR6
- Memory Interface: 64-bit
- TDP: 75W
- DLSS: No
- Ray Tracing: No
- Reflex: Yes
With all these features, the card's price for the Chinese market has been announced: roughly $150.
We will see how pricing shapes up in other markets over time, but it certainly will not be an expensive card, since it is set to compete with the GTX 1050 Ti.