No Fan Boy...!, 12 Nov 2015If you are trying to say that how the Mali T880 MP12 will perform against the the PowerVR GT76... moreWell said, man. Finally someone who knows that benchmark scores depend on screen resolution: Apple uses 1080p and 720p screens while Samsung uses a QHD screen, so there are something like 5.88x more pixels to render. Some fools don't know this and keep saying Apple is great. Apple was once great, but not any more; Samsung is the one developing.
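For what it's worth, the resolution argument above can be checked with simple arithmetic. The panel sizes below are my own assumptions (iPhone 6s at 750x1334, 6s Plus at 1080x1920, Galaxy S6 at 1440x2560 QHD); note that the 5.88x figure quoted in the comment does not match any of these ratios:

```python
# Rough pixel-count comparison for the resolution argument.
# Panel resolutions are assumptions, not taken from the thread:
# iPhone 6s (750x1334), iPhone 6s Plus (1080x1920), Galaxy S6 (1440x2560 QHD).
panels = {
    "iPhone 6s": 750 * 1334,
    "iPhone 6s Plus": 1080 * 1920,
    "Galaxy S6 (QHD)": 1440 * 2560,
}

qhd = panels["Galaxy S6 (QHD)"]
for name, px in panels.items():
    # How many times more pixels the QHD panel pushes than each panel.
    print(f"{name}: {px:,} pixels; QHD renders {qhd / px:.2f}x as many")
```

Run as written, this gives roughly 3.68x versus the 6s and 1.78x versus the 6s Plus, so the general point (more pixels to drive) holds even if the quoted multiplier does not.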
If this is true, then I'm really proud of Samsung. I am glad that phone companies like Samsung can make faster phones without compromising battery life: a 30% faster, 10% more efficient chip, a 30-or-so megapixel camera, a 4K display, super-strong new glass. I think we can all agree the new S7 is looking like a beast and is going to be a huge leap for Samsung.
It really is going to be almost perfect. There's just one more thing, and that's for lithium-ion batteries to step aside and let one of those crazy new battery technologies come in.
No Fan Boy...!, 12 Nov 2015If you are trying to say that how the Mali T880 MP12 will perform against the the PowerVR GT76... moreThe PowerVR GT7600 is without doubt a powerful GPU, and the fact that the iPhone 6s and 6s Plus use 720p and 1080p screens may help it work its magic better and more efficiently. On the other hand, the 6s Plus suffers a lot of frame drops all over the UI, which the smaller 6s does not; maybe the 1080p screen is too much for this GPU to handle, maybe iOS is not optimised for bigger screens, I don't know for sure. What I do know for sure is that the GT7600's offscreen benchmark scores (where screen resolution does not matter) are much higher than any competitor's, and that says something. And I am really excited to see how the Mali-T880 MP12 will perform inside the upcoming S7.
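On the offscreen point above: offscreen benchmark runs render at a fixed resolution regardless of the device's panel, which is why they isolate raw GPU power, while onscreen results fall as pixel count rises. As a toy illustration only, here is a naive model, assuming a purely fill-rate-bound workload and a made-up 60 fps offscreen score (real GPUs are not purely fill-rate-bound):

```python
def estimated_onscreen_fps(offscreen_fps_1080p, width, height):
    """Naive estimate: assume fps scales inversely with pixel count.

    This is a deliberately simplistic model for illustration; real
    workloads are bound by more than fill rate.
    """
    pixels_1080p = 1920 * 1080
    return offscreen_fps_1080p * pixels_1080p / (width * height)

# Hypothetical GPU that manages 60 fps in a 1080p offscreen test:
print(estimated_onscreen_fps(60, 750, 1334))   # on an iPhone 6s panel
print(estimated_onscreen_fps(60, 1440, 2560))  # on a QHD panel
```

The same hypothetical chip looks far faster onscreen at 750p than at QHD, which is the commenter's point about why offscreen numbers are the fairer GPU comparison.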
The GPU is quite interesting. Anandtech claims it could potentially be 2.25x better, which would put it ahead of the A9 in some on-screen tests too. They may instead focus on efficiency; hopefully they don't go to 4K, or at least give an option to lower the resolution.
Nothing is definite yet, and we still don't know much about the improvements in the Adreno GPU. Plus they're only using 12 cores of the Mali-T880; we may get to see the 16-core version in the Note 6.
Anonymous, 14 Nov 2015Better than A9X in multicore. Slower in Single core. Most likely better in GPU.
Better than x... moreAccording to a review of the iPad Pro I have just read on CNET, the Geekbench scores for the A9X are as follows:
Single core: 3146
Multi core: 5287
And the 3DMark (Ice Storm Unlimited) score is 32,412.
As far as I can gather, the Geekbench scores for the X1 are:
Single core: 2599
Multi core: 4691
But its 3DMark score is 40,535.
So yeah, it looks as though the X1 is still the king GPU-wise, but the A9X handily beats it CPU-wise. Still looking for figures on how they stack up in power efficiency, but I would be surprised if the X1 wins in that regard.
But so far your guess looks to be on the money: X1 > A9X (GPU), A9X > X1 (CPU).
They care only about numbers, but when it comes to power efficiency, you only get 10%.
Who in hell needs a 4K display?! No one, dammit.
Seems like it's going to be on par with the 820, just maybe more power efficient and with better multithreaded performance (which is actually already the main thing in Android; check Anandtech's analysis). I think we have an interesting year ahead, with several mature, similarly performing chipsets. That's great for competition, as long as Qualcomm backs off its anti-competitive measures.
Lex79, 13 Nov 2015I don't know everything, for example I said I would like to see a comparison between the X1 an... moreBetter than A9X in multicore. Slower in Single core. Most likely better in GPU.
Better than the X1 in both single and multi core. Slower in GPU (likely) but much better performance per watt.
A 30% increase means nothing for the final customer; it's the same as other processors, with the difference being the money: more expensive. Crap marketing stuff.
[deleted post]Doze will help? What proof of that is available as of now? The right statement would be: Doze might help improve battery life.
[deleted post]Oh great, now you as well with the childish fanboy accusations.
Read the Anandtech review I linked to earlier: 20nm was a flawed process, and again, I am more inclined to go by their findings than your say-so.
And yes, I did say custom core designs are aimed at improving performance in key areas, but success in that regard is by no means automatically guaranteed.
And we will see what difference Doze makes to Samsung's Android-based devices when we actually see one. I would argue that until we do, your certainty that it will have the same effect it does on stock Android devices is based more on your own bias towards Samsung than anything else.
And I am getting sick of being called a fanboy over what amount to juvenile tantrums about what I say! I don't do bias in tech; I don't care for any one particular OEM, platform, or device. I leave that to you lot (and pity you for it).
[deleted post]I don't know everything; for example, I said I would like to see a comparison between the X1 and A9X. I don't know which would come out on top, and don't much care either way.
You, however, already somehow know who would win, so who's the know-it-all again? Who's the fanboy again?
And if you can't control yourself, as you say, then maybe just refrain from saying anything at all? Just a suggestion.
AnonD-465175, 13 Nov 2015apple fanboy see u r wrong again. i told u TEGRA X1 is 7 months old ... very old u cant compar... moreOf course you are sure; that's what being a fanboy is: absolute certainty that your favourite brand or product is automatically better than everything else, based on no evidence whatsoever, simply because it's your favourite!
Thank you for proving my point!
AnonD-465175, 13 Nov 2015do you remember what did i tell u ? TEGRA X1 is still on old 20nm that means more power consum... moreYes, the X1 is 20nm; we all know this, so stop going on about it!
Even at 14nm it would still draw too much power; that GPU is simply oversized for most mobile applications. I believe Nvidia knew this going in and instead tried to focus on smaller portable gaming consoles such as the Shield, larger tablets, and the like.
As for your "excuses giving you cancer" line, that's just more proof that you are indeed a nutbag.
Lex79, 13 Nov 2015He's not getting your point, he's not going to get your point, I think we should both give up.... moreApple fanboy, see, you are wrong again. I told you the Tegra X1 is 7 months old, very old; you can't compare it. But you can compare it with the next Tegra. (I'm sure the A9X will be the loser of the two.) :)
Muthu, 13 Nov 2015Hmm, you are still not getting my point. Tegra X1 should be compared against A9X, not A9 (or S... moreHe's not getting your point, he's not going to get your point, I think we should both give up.
But you are right in what you say, and I for one would be very interested to see a Tegra X1 v A9X comparison, done by Anandtech or someone credible.
Lex79, 13 Nov 2015Great, now compare the power draw between the two???? You might find that the X1 is a tad m... moreDo you remember what I told you? The Tegra X1 is still on the old 20nm process, which means more power consumption, full stop. You are giving me cancer with your excuses.
AnonD-465175, 13 Nov 2015Dude im not talking about Nvidia's marketing startegies ...etc im just talking about the level... moreGreat, now compare the power draw between the two.
You might find that the X1 is a tad more power-hungry, just enough that not one single vendor has decided to use it in a single smartphone!
Muthu, 13 Nov 2015Hmm, you are still not getting my point. Tegra X1 should be compared against A9X, not A9 (or S... moreBecause we have desktop SoCs and mobile SoCs :) Simple, so stop making excuses. Tegra is always one step ahead of its competition in the performance category (with a yearly release).
AnonD-465175, 13 Nov 2015Dude im not talking about Nvidia's marketing startegies ...etc im just talking about the level... moreHmm, you are still not getting my point. The Tegra X1 should be compared against the A9X, not the A9 (or the SD 820 or Exynos 8890), because the A9 is used in mobile phones. The Tegra X1 and A9X cannot be used in mobile phones due to limitations of the chipsets (read: heat in a smaller device), so that comparison is invalid in the first place. Why don't you compare the power of the Tegra X1 with an Intel i7?