ALPHABOY17, 02 Sep 2020: Well those users have 11GB of VRAM, no worries about VRAM for them. Just like GTX 1080 vs RTX …
Please tell me what games require all 8GB at 1080p, because I have yet to see a single one.
Going to upgrade from a 2080 to a 3080. The 3090 costs a bit too much; I'd rather buy another subwoofer with the money left over from buying the 3080 instead.
StuiWooi, 02 Sep 2020: Hey, story author, I don't know if you'll read this as there doesn't seem to be…
Well, if you're going that deep, the 'p' originally indicated progressive scanning. There was also an 'i' for interlaced.
1080p and 1080i were two different methods of scanning the pixels. So things evolve.
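To make the progressive/interlaced distinction concrete, here's a tiny illustrative sketch (toy NumPy arrays, nothing to do with any real video API; all the names are made up for the example):

```python
import numpy as np

# A toy 1080-row frame. Progressive scanning (1080p) transmits every
# row of the frame in order on each pass.
frame = np.arange(1080 * 1920, dtype=np.uint32).reshape(1080, 1920)

# Interlaced scanning (1080i) instead alternates between two fields,
# each carrying only half the rows, halving the bandwidth per pass.
top_field = frame[0::2]     # even-numbered rows (0, 2, 4, ...)
bottom_field = frame[1::2]  # odd-numbered rows (1, 3, 5, ...)

# A naive "weave" deinterlacer reassembles the two fields into one
# progressive frame (this only looks perfect for a static image).
rebuilt = np.empty_like(frame)
rebuilt[0::2] = top_field
rebuilt[1::2] = bottom_field
assert (rebuilt == frame).all()
```

The takeaway is just that 1080i sends half the rows per pass, which is why the 'p'/'i' suffix mattered back when both broadcast formats were common.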
Anonymous, 02 Sep 2020: Yeah man, same with phones: don't buy the latest model, save money, buy last gen.
We'll all be in the grave eventually anyway.
No name, 02 Sep 2020: That means you are still living under a rock 😂😂😂.
Nice joke. But I've used both, and I'm not kidding.
prasad-gsma, 02 Sep 2020: The 'p' designation has evolved over the past few years. Of course it started off as…
It's usually written 1080p144, not with a 'p' after the refresh rate. That tells you it's 1080p at 144Hz, with fewer letters.
StuiWooi, 02 Sep 2020: Hey, story author, I don't know if you'll read this as there doesn't seem to be…
The 'p' designation has evolved over the past few years. Of course it started off as a way to identify that the signal was progressive and not interlaced. These days, however, all signals are progressive, so we don't need to use it that way.
Recently, most companies have taken to using the 'p' as a replacement for either 'FPS' or 'Hz'. Camera companies often denote frame-rate options in their settings menus as 24p, 30p, 60p, etc. On the other hand, companies like Intel often use the 'p' to refer to refresh rate in their driver software, so if you go into the Intel Graphics Command Center, you will see the refresh rate listed as 60p, 120p, etc.
Of course, "120Hz" is the most technically correct term to use here, which is why I have now changed the article to reflect that, but "120p" wouldn't be inaccurate by modern standards.
The 3070 is a bargain if the claim is true.
Hey, story author, I don't know if you'll read this, as there doesn't seem to be the same level of engagement with the comments as I see on other tech websites, but...
"8K 60p and 4K 120p"
This is not the first time I've seen such notation used here. *p* in display terminology typically refers to the number of rows of pixels, e.g. 1080p being 1920x1080 in a 16:9 aspect ratio. *K* as shorthand for thousand (SI prefix kilo-), confusingly, refers to the approximate number of columns of pixels, e.g. 1920x1080 can also be called 2K.
Am I right to presume that what you're really going for in the quote is *Hz*, indicating the refresh rates these support, and not some truly outlandish 3840x120 (288:9) or 7680x60 (1152:9) resolutions?
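For what it's worth, the arithmetic behind those "outlandish" readings checks out; here's a quick sanity check (a throwaway snippet that just restates the math above):

```python
from math import gcd

def aspect_ratio(width: int, height: int) -> str:
    """Reduce a width x height resolution to its simplest ratio."""
    d = gcd(width, height)
    return f"{width // d}:{height // d}"

# Reading "4K 120p" and "8K 60p" literally as width x height:
print(aspect_ratio(3840, 120))   # 32:1, i.e. 288:9
print(aspect_ratio(7680, 60))    # 128:1, i.e. 1152:9
print(aspect_ratio(1920, 1080))  # 16:9, the sane interpretation
```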
Anonymous, 01 Sep 2020: A laptop or computer without an Intel processor and Nvidia graphics card is useless. No AMD for me.
That means you are still living under a rock 😂😂😂.
But that much is recommended for RESIDENT EVIL 2 and 3 Remake.
Waiting to see how AMD will answer this lineup with RDNA2: whether it can keep up on performance, or whether they will have to price $50-$100 below Nvidia's launch prices.
YUKI93, 02 Sep 2020: I'll just wait for AMD's new Navi 2.0 Radeon GPU card.
AMD is better in CPUs.
I'll just wait for AMD's new Navi 2.0 Radeon GPU card.
lottery248, 02 Sep 2020: but they ripped off the users who just bought a 2080 Ti.
Well, those users have 11GB of VRAM, so no worries about VRAM for them. Just like GTX 1080 vs RTX 2060: guess who has had a better time! ;) I bet my *** that Nvidia releases an RTX 3070 Super with 11GB in 3 months, just like they did with the RTX 2060! (RTX 2060 6GB and then RTX 2060 Super 8GB)
I really wanted to buy the RTX 3070, considering that my GTX 1060 is aging. But after 4 years, what do I get? 2GB more VRAM, and a much better framerate for what? I don't care about 144Hz, I want the maximum graphics possible; you're going to push the CPU to the edge all the time, except at 4K! Also, "suitable for 4K", are you serious!? At 1080p there are some games that require MORE than 8GB of VRAM, at 1080p!!! Seriously, has Nvidia lost it?
lottery248, 02 Sep 2020: but they ripped off the users who just bought a 2080 Ti.
You're never gonna have the best for long with technology. It's not like the cards are suddenly obsolete.