You Know Me, 20 May 2014Stop dreaming because it will be reality in 15-20 years. We already have flexible scree... morePCs and notebooks will never disappear; mobile technology will improve, but it will never match desktop processing power.
uzi.m, 20 May 2014The funny thing is that it's not about the cores nor the clock speed. It's about the number of ... moreSee, this is a prime example of why not everyone is cut out for mobile technology. The amount of misinformation you're providing here is ridiculously funny. Lol.
Hope it's power-friendly. Waiting for the next-generation chips that run at terahertz speeds without producing heat.
Djzz, 20 May 2014Is it really necessary with 8 cores? Won't it drain the battery like hell?Ever heard of a word called... idle?
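For what it's worth, the "idle" point is easy to see for yourself. A rough sketch (assuming a Linux-based system where /proc/stat is readable, which covers Android under the hood): column 5 of each per-core line is that core's accumulated idle time in jiffies, and a core the scheduler has parked spends almost all its time there, drawing very little power. So 8 cores does not mean 8x the drain.

```shell
# Print each core's accumulated idle time from /proc/stat.
# Field 1 is the core name (cpu0, cpu1, ...), field 5 is idle jiffies.
grep '^cpu[0-9]' /proc/stat | awk '{print $1, "idle jiffies:", $5}'
```

Run it twice a few seconds apart: on a lightly loaded device the idle counters climb almost as fast as wall-clock time on every core.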
uzi.m, 21 May 2014I wouldn't talk about something I have no knowledge of but ever happen to notice that computer... moreComputer systems slow down (noticeably) because of the additional drivers, programs and overhead (such as the registry in the case of Windows) that get added to the system and grow over time, eating up more RAM and CPU cycles.
The computer either runs or, as the other commenter wrote, it becomes FUBAR because transistors don't do their job and instructions become corrupted.
I have an old Gateway 386/33 MHz system and an old Compaq 386 as well. They both still run at 33 MHz and are as fast (or, more appropriately, as slow) as they ever were.
If you notice your computer system is considerably slower and the hardware is the same, reinstall the OS and voilà! It runs like new.
Why doesn't Apple buy chips from other companies rather than ordering from Samsung, using old, antiquated American engineering for Apple chips... Well, Apple is still 640 pixels and one gig of slow RAM, so what else is new but a rehash of old tech sold at a platinum price hahaha. Go Samsung go hehehehehe
uzi.m, 21 May 2014I wouldn't talk about something I have no knowledge of but ever happen to notice that computer... moreEven though it is possible that H->L and L->H transition times become slower at end of life (EOL), it is highly unlikely this would be noticeable (to the end user) in that way.
There is a whole lot of logic built on top of any single transistor, logic which depends on meeting a certain (fixed) timing window. Should that window not be met, the system will simply misbehave (act abnormally, as in fail to calculate proper results).
It wouldn't slow down to the user, the whole logic would just become FUBAR.
I just don't believe one could live to see these effects happen just from using the device (on planet Earth, at least).
We are talking about something that would take, at a bare minimum, over XXX years of non-stop intensive use for there to even be a chance of it happening.
It's more likely the material will fail from corrosion than from use, and even then there are plenty of other things that will surely fail first.
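The timing-window point above can be sketched as a toy model (this is an illustration, not how real silicon is simulated): a register samples a combinational output once per clock period, and if the logic's propagation delay ever exceeds that period, the register latches a stale value. The machine doesn't "run slower", it miscomputes.

```python
# Toy model of a setup-time violation: a register latches the output
# of combinational logic (here, "x + 1") once per clock period.
def run(clock_period_ns, prop_delay_ns, inputs):
    latched = 0          # value currently held in the register
    results = []
    for x in inputs:
        correct = x + 1  # what the combinational logic computes
        if prop_delay_ns <= clock_period_ns:
            latched = correct  # output settled in time; latch it
        # else: timing violated -> the register keeps its stale value
        results.append(latched)
    return results

print(run(clock_period_ns=10, prop_delay_ns=8, inputs=[1, 2, 3]))   # [2, 3, 4]
print(run(clock_period_ns=10, prop_delay_ns=12, inputs=[1, 2, 3]))  # [0, 0, 0]
```

Note the failure mode: with the delay past the window, every result is garbage, not merely late, which is exactly the "FUBAR, not slower" behavior described above.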
Anonym, 20 May 2014It's actually "true" in a sense (which hardly is applicable to the real world). A... moreI wouldn't talk about something I have no knowledge of, but ever notice that computers, laptops and phones, whatever has millions of nanotransistors, eventually slow down? It's because of that. All of the products you purchase have a lifespan set on them. Better materials mean a longer life, but if things never stopped working, who'd buy the newer, faster, more expensive versions...
Julian, 20 May 2014Really... Transistors burn every time you power on your screen? I have seen stupid statements ... moreIt's actually "true" in a sense (which hardly is applicable to the real world).
A CMOS transistor depends on the integrity of the dielectric material that insulates its gate. He could be talking about some theory from the very complex field that studies the "wear-out" of this dielectric material, a field which, quite frankly, isn't on my plate...
Still, we are most certainly talking about a completely negligible wear which is HARDLY A PROBLEM IN THE REAL WORLD.
McKnull, 20 May 2014This is why Intel will make a difference. Once the core has been increased, the power consumpt... moreDo realize that you are talking about performance advantages in the x86 (x86-64) architecture, one whose origins gave very little consideration to power usage (those considerations only began very late, with the growth of laptops).
ARM is a totally different beast. It's questionable whether Intel has any advantage here (they walked away from it a few years back, selling all their ARM operations to Marvell, effectively conceding that they had failed to make any strides there).
uzi.m, 20 May 2014The funny thing is that its not about the cores nor the clock speed. It's about the number of ... moreReally... Transistors burn every time you power on your screen? I have seen stupid statements before, but congratulations man, you have them all beat.
I absolutely hope that it will open a new era of flagship smartphone graphics, speed, etc.!!! Even better than the NVIDIA Tegra K1!!! And if it does, then really, compliments to Qualcomm!!!
The funny thing is that it's not about the cores or the clock speed; it's about the number of transistors in the CPU. The Moto X has a dual-core processor but it can beat a lot of older quads. The number of cores helps with multitasking and power saving. Plus, every time you turn on your phone's screen, transistors burn out, so that's the reason why phones slow down after a year or so...
Ferrij F.L., 20 May 2014I have a dream that one day, people going to work bring no notebook but Smartphone. The txt in... moreStop dreaming because it will be reality in 15-20 years. We already have flexible screens, processing power improves every year, and a new form of a portable high-density energy source will be here soon.
This is why Intel will make a difference. Once the core count is increased, the power consumption will inevitably follow. Intel works a lot more effectively per core than any other CPU maker. Proof of that is the fact that AMD is going to use technology similar to Intel's in their future high-end processors.
I don't see how Qualcomm will maintain their momentum after the 810 and 805.