NVIDIA has announced a new budget graphics card, the GeForce GTX 1650. It replaces the GTX 1050, which itself succeeded the GTX 950, and comes in at $149.
Like the GTX 1660 and 1660 Ti before it, the GTX 1650 is based on NVIDIA's new Turing architecture and uses the TU117 GPU. NVIDIA claims 2x performance compared to the GTX 950 and 1.7x over the GTX 1050.
Compared to the 1,408 CUDA cores on the 1660, the 1650 has 896. It's clocked at 1485MHz base and 1665MHz boost. The 1650 comes outfitted with 4GB of GDDR5 memory on a 128-bit bus, running at 8Gbps for 128GB/s of memory bandwidth.
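The quoted 128GB/s figure follows directly from the bus width and the per-pin data rate. A minimal Python sketch of that arithmetic (the constant names are illustrative, not from NVIDIA):

```python
# Sanity-check the quoted memory bandwidth:
# bandwidth = bus width (in bytes) * effective data rate per pin.
BUS_WIDTH_BITS = 128   # GTX 1650 memory bus, per the announcement
DATA_RATE_GBPS = 8     # GDDR5 effective data rate, in Gbps

bandwidth_gbs = (BUS_WIDTH_BITS / 8) * DATA_RATE_GBPS
print(f"{bandwidth_gbs:.0f} GB/s")  # → 128 GB/s
```

The same formula explains why the 128-bit cards in this price bracket top out well below the 1660's 192-bit parts at the same memory speed.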
Being a non-RTX card, the 1650 does not support hardware-accelerated ray tracing, but it does support other Turing features, such as variable rate shading, the enhanced NVENC encoder, improved memory compression, and more.
GTX 1650 cards will be available starting today from board partners such as ASUS, Colorful, EVGA, Gainward, Galaxy, Gigabyte, Innovision 3D, MSI, Palit, PNY, and Zotac.
NVIDIA also announced a record 80+ new laptops coming to market with the new GTX 1650 and the GTX 1660 Ti GPU. These will be available from manufacturers such as Acer, ASUS, Dell and Alienware, Gigabyte, HP, Lenovo, MSI, and Samsung over the coming months.
And this will sell like hotcakes compared to the RX 570 or the upcoming Navi...
I recently bought an 8GB RX 570 ... it is good value for money. However, I was told that NVIDIA GPUs are better for machine learning work. I regret buying AMD now, as I need a good GPU for my machine learning education.
The RX 570 is better priced and has better performance.