Intel’s 8th gen CPUs, known as Coffee Lake, are rumored to remain on the 14nm manufacturing process, and are expected to launch this year.
The Eve V is more future-proof than I expected
I guess Intel really is feeling the heat.
Source: https://mspoweruser.com/intel-promise-8th-generation-core-i7-chip-later-year/
Well, as far as I know, Coffee Lake is not a full refresh, as it includes neither desktop SKUs nor Y-series. So I doubt they will actually call it 8th gen… more like a 7th gen refresh. IIRC they did the same with Skylake - some chips were released later and just got new model numbers, but were not called a new generation. They do that from time to time.
Additionally, they showed off an actually working Cannon Lake tablet just a month ago, while moving the release date up from 2018 to 2017. They just can’t change their minds that fast.
So… I really don’t see what those guys are basing their rumors on, but their arguments don’t seem strong to me.
There are also ways other than miniaturisation to increase CPU performance. And at a scale of only a few nanometres, the top-down method will soon reach its limit. But I’ll ask my nanophysics prof.
(That’s why they do research in order to develop a quantum computer - but that is still far in the future).
I think 7nm is the limit, but if they switch from silicon to some other semiconductor, they can make it even smaller. It just depends on the material used. But of course, they will reach a point where no new material can help anymore…
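To put rough numbers on why node shrinks matter so much: the classic CMOS dynamic-power relation is P ≈ C·V²·f, so a shrink that lowers switching capacitance and lets voltage drop cuts power dramatically at the same clock. A minimal sketch (every number below is made up for illustration, not real process data):

```python
def dynamic_power(c_eff, voltage, freq_hz):
    """Classic CMOS dynamic power estimate: P = C_eff * V^2 * f."""
    return c_eff * voltage ** 2 * freq_hz

# Illustrative values only -- not measurements of any real process.
old_node = dynamic_power(c_eff=1.0e-9, voltage=1.2, freq_hz=4.0e9)  # hypothetical larger node
new_node = dynamic_power(c_eff=0.7e-9, voltage=1.0, freq_hz=4.0e9)  # hypothetical shrink: lower C and V

print(f"old: {old_node:.2f} W, new: {new_node:.2f} W")
print(f"power saved: {1 - new_node / old_node:.0%}")  # roughly half, at the same 4 GHz
```

Note that quadratic voltage term: it is why even a modest voltage reduction from a shrink buys more than the capacitance reduction alone.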
Results: Coffee Lake is indeed fast, simply unmatched in single-threaded performance, but come on: its power demands can easily match those of the 12-core Ryzen ThreadRipper 1920X, and that, coupled with an inferior thermal interface material between the CPU die and the heat spreader, makes it steaming hot (it’s Coffee Lake, after all).
One concern repeatedly discussed online is the 8700K’s thermals: the i7-8700K runs worryingly hot, and maintaining comfortable (sub-80-degree) operating temperatures at stock settings requires liquid cooling. Unlike Skylake-X, the Core i7-8700K is Intel’s mainstream flagship, so consumers and PC manufacturers who build with lesser, often compromised cooling solutions will run into thermal throttling, possibly thermal shutdown, and possibly even thermal damage. The worry about thermal damage is especially pointed because the i7-8700K runs hotter than the i7-7800X and other Skylake-X processors, and the X299/Skylake-X platform already has several documented cases of thermal damage.
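A back-of-the-envelope way to see why the cooler matters so much here: steady-state die temperature is roughly T_junction ≈ T_ambient + P·θ, where θ is the combined thermal resistance (°C/W) of the TIM, heat spreader, and heatsink. A minimal sketch, with purely illustrative numbers rather than measured 8700K values:

```python
def junction_temp(ambient_c, power_w, theta_c_per_w):
    """Steady-state estimate: T_junction = T_ambient + P * theta."""
    return ambient_c + power_w * theta_c_per_w

# Illustrative values only: theta lumps together TIM, IHS, and heatsink resistance.
budget_air = junction_temp(ambient_c=25, power_w=150, theta_c_per_w=0.45)   # cheap downdraft cooler
premium_liquid = junction_temp(ambient_c=25, power_w=150, theta_c_per_w=0.20)  # big AIO loop

print(budget_air, premium_liquid)  # 92.5 vs 55.0 degrees C
```

The point of the sketch: at ~150 W, the gap between a budget and a premium cooler is the gap between throttling territory and comfortable temperatures, and a poor TIM under the heat spreader adds to θ no matter how good the cooler on top is.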
The power consumption is much lower than that of 7th gen chips with equivalent performance…
Yeah yeah… The same applies to any processor in the world; it just depends on how you define “lesser”. Some people prefer fanless solutions, for example. They will have problems with the i7-7700K too… And liquid cooling is becoming more and more popular anyway.
Contrary to enthusiast belief, liquid cooling is still not mainstream (after all these years, it is still the grand old Cooler Master Hyper 212), and the i7-8700K is Intel’s mainstream flagship, not an exclusive HEDT product like X299. Like the 7700K before it, it will be going into mainstream consumer products that often use cheap, off-the-shelf heatsink-and-fan coolers. Unlike the incredibly hot 7700K before it, though, it draws even more power (sometimes double the wattage!) and runs even hotter (20 degrees hotter!), so this will pose a problem for most systems. You could get by with air cooling before; not here, realistically speaking.

I knew people who had issues with the 7700K hitting 90-degree temperatures in a variety of cases; now, seeing that reviewers with open-air, premium-cooled test benches are getting 90-degree-plus temperatures, you have to stop and wonder how well the 8700K will perform for ordinary users. Its unsatisfactory thermal performance might explain the low Cinebench score (around 1200 instead of the 1400 we see in the reviews) in the leaked benchmarks of the i7-8700K on an HP machine – again, likely thermal throttling.

Again, please look more closely at my images above. The 8700K at times draws practically as much power as Ryzen ThreadRipper (a 12- to 16-core processor with double the multi-threaded performance) and runs hotter, thanks to its inferior thermal interface material. ThreadRipper itself requires premium liquid cooling, so you can only imagine what kind of setup is required for the 8700K. Here are a few other charts to take a look at:
Liquid cooling has been the requirement for overclocking top of the line “mainstream”, as you call it, processors for quite a while now. The thing is, these processors are not really “mainstream”. They’re meant for enthusiasts. For those who are going to overclock it. And for those people, liquid cooling is pretty “mainstream”.
We’re going to be waiting a while for mainstream 10nm Intel chips that can outperform their 14nm counterparts, from my understanding.
As if it couldn’t get any worse, by Intel’s own admission, its first- and second-generation 10nm technologies – 10nm and 10nm+, respectively – will offer worse performance than its upcoming 14nm++ technology. Intel says the company’s 10nm technology won’t open up a clear performance lead over its 14nm++ technology until its third iteration – known as 10nm++ – which should go into production sometime in 2020.
Personally, I’m pretty excited for AMD/GloFo (on the desktop side) and Apple/Qualcomm/TSMC/Samsung (on the mobile side) to finally steal an edge from Intel. It’s possible for the first time in like, what, 15 years?
Umm, Qualcomm doesn’t make desktop processors. It’s a completely different market, and Intel never really competed with them, except for their Atom efforts, which failed quickly. And Qualcomm never really competed with Intel either - it’s not their market.
I specifically said “on the mobile side” after Qualcomm there.
But, since you mention it, Windows 10 will soon run on Qualcomm chips, with x86 apps running in emulated instances.
Qualcomm this year is directly competing with Intel (and others) in the server chip market where ARM is actually quite viable. Microsoft even ported Windows Server to Qualcomm’s platform.
Let’s be clear here: Microsoft and Apple are both hedging against Intel. The work they’re doing (in Microsoft’s case, decoupling Windows from its x86 dependency; in Apple’s case, basically buying up all the best semi talent in the world) isn’t, like, just for fun.
As for why I would mention mobile in the first place? Because it dwarfs the desktop market, and has for a while; Intel knew it, and they couldn’t compete. That’s why. The mobile chip designers are moving up, and Intel has already failed to move down. And this in a year when Apple has sold millions of devices with silicon spun on a more advanced process than Intel currently has. That’s never happened before, ever.
Firstly, I know it’s your shtick here, but can you try being less condescending?
Just a tad?
Or should I try being more condescending?
Let’s give it a go.
Uh, what? What exactly is your association between talent hunting and ARM architecture?
Hmm, I don’t know, why would Apple, who previously successfully navigated a complete architecture change on their desktop OS, go all-in on semi design? Not just on small form-factor chips but now on dies that are actually larger than Intel’s at this point, even though they have NO performance competition from Android on tablets?
Gosh, it’s a complete mystery why they would do that.
Yeah, and that’s the bullshit part. Emulation is not enough, and Intel is here to stay.
Great, tell the people at Microsoft and Qualcomm who are working on this that they are wasting their time and they should hire you as their CEO instead.
I believe it is more about securing their future. While they may not believe Intel will be sidelined in any major way, they need to hedge their bets. As of now, Intel does not have good processors for small devices, or for connected devices that have a larger form factor but need greater battery life. So a Qualcomm ARM device could be a niche worth exploring. They are not looking to replace Intel, but they know Intel doesn’t have all areas of future computing covered. While emulation is not perfect, though it is getting better all the time, for most people it will be an acceptable trade-off for the potential benefits. Also, who knows what Qualcomm could come up with in the future.
Please… Just explain it in human language. Previously you made a vague reference to talent hunting with no explanation, now you “explain it”… this way. Wat? Translate plz.
And for the record, every big player hunts talent. That’s their way of life, you know… crapple didn’t invent this.
This isn’t the first nor the last time. Microsoft has created more flops than successful products… Not surprising at all. Take a look at Windows Phone for a recent example.
What’s to explain, you know what I’m talking about:
Besides the purchase of Intrinsity, it’s generally understood that Apple has been hiring engineers fresh out of Intel, Qualcomm etc. In fact, they recently hired Qualcomm’s former VP of engineering iirc.
The acceleration of Apple’s performance lead, from about parity to almost doubling the performance of their nearest competitors at Qualcomm and Samsung kind of underlines how much money they’re putting into this and how serious they are. And not just on their main moneyspinner, which is the iPhone. The massive a*x chips exist for a reason, and it’s not because Apple needs to compete against Android tablets. It’s because they believe iPad Pro will eventually compete against Windows and Intel.
Speaking of which, re: “This isn’t the first time or the last time” - it’s the SAME time. The work that Microsoft did porting Windows to ARM for RT wasn’t thrown away. They’ve continued to build a translation layer for their APIs so that actual architecture emulation is kept to a minimum with these upcoming Qualcomm processors.
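For what it’s worth, the general mechanism behind this kind of emulation (dynamic binary translation with a translated-block cache, so each x86 code block is translated to native code once and reused) is easy to sketch. This toy Python is purely illustrative of the technique in general and has nothing to do with Microsoft’s actual implementation:

```python
# Toy sketch of dynamic binary translation with a translation cache --
# the general idea behind x86-on-ARM emulators, NOT Microsoft's actual code.

translation_cache = {}

def translate_block(x86_block):
    """Pretend 'translation': a real translator emits native ARM machine code."""
    return [f"ARM<{insn}>" for insn in x86_block]

def run_block(addr, x86_block):
    # Each guest block is translated once, then served from the cache,
    # so steady-state overhead is far below naive per-instruction emulation.
    if addr not in translation_cache:
        translation_cache[addr] = translate_block(x86_block)
    return translation_cache[addr]

first = run_block(0x1000, ["mov eax, 1", "ret"])
second = run_block(0x1000, ["mov eax, 1", "ret"])  # cache hit, no retranslation
print(second == first, len(translation_cache))  # True 1
```

Combine that caching with a layer that routes OS API calls straight to native libraries, and only the application’s own x86 code ever pays the translation cost.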
Microsoft are going to continue chiselling away at this because the alternative is to lose out on mobile computing altogether (where mobile computing includes lighter laptops etc.).
Eh… No, it’s not. There were dozens of completely different flops, unrelated to each other. Are you now telling me all of them were part of some giant plan? No, lol…
No, the alternative is bringing actual desktop computing to mobile devices. Nobody else is doing it. Not because it’s not worth it, but because they can’t. Nobody else has a desktop OS as strong as Microsoft’s that they could port to the mobile world.