Lenovo just announced a Surface Pro killer for November.
The specs look similar to the V (except the screen). What do you think of the Lenovo Miix 520?
Up to 8th Gen Intel® Core™ i7
Windows 10 Home
Graphics: Up to Intel® HD Graphics 620
Camera / Webcam
Front: 5 MP fixed focus
Rear: 8 MP auto focus WorldView Camera
Microphones 2 x front, 2 x side
Up to 16 GB LPDDR4
Up to 1 TB PCIe SSD
Stereo speakers with Dolby® Audio
Battery: Up to 8 hours of local video playback
12.2” Full HD+ (1920 x 1200)
Dimensions (W x D x H)
Tablet: 11.8" x 8.1" x 0.4" / 300 mm x 205 mm x 9.9 mm
With keyboard: 11.8" x 8.1" x 0.6" / 300 mm x 205 mm x 15.9 mm
Tablet - 1.98 lbs (900 g)
With keyboard - 2.65 lbs (1.25 kg)
1.5 mm travel keyboard with precision touchpad
1.5 mm travel backlit keyboard with precision touchpad
Precision touchpad (PTP) with glass cover
802.11ac Wi-Fi
This looks like what I wished Lenovo had available before I learned about the V. It has a 3D-scanning camera, which is interesting for those of us in the 3D-printing field. However, the lack of a Thunderbolt 3 port makes eGPU usage impossible. The 8th-gen CPU is the big selling point for medium-power users.
Lenovo’s watchband hinges are always popular and seem sturdy. Full HD+ is fine for a 12.2" screen, but a lot of competing 2-in-1s have much higher resolutions. An “up to 8 hours” battery claim is always a bad sign, since normal usage might bring that down below 5 hours with the screen brightness turned up.
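For perspective on the screen comment, here's a quick pixel-density comparison. The Surface Pro figures are its published 12.3" 2736 x 1824 panel; treat the output as back-of-the-envelope numbers.

```python
import math

# Pixel density (PPI) = diagonal pixel count / diagonal size in inches.
def ppi(w_px, h_px, diagonal_in):
    return math.hypot(w_px, h_px) / diagonal_in

# Miix 520: 12.2" 1920x1200; Surface Pro (2017): 12.3" 2736x1824.
print(f"Miix 520:    {ppi(1920, 1200, 12.2):.0f} PPI")
print(f"Surface Pro: {ppi(2736, 1824, 12.3):.0f} PPI")
```

Roughly 186 PPI vs 267 PPI, which is the resolution gap being discussed.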
It seems to be in a good price range for the competition, and it looks professional enough to fit into an office setting (as most Lenovo products do). The 3D scanner will appeal to a small group, but Lenovo often adds features like that to its portables (such as its tablets with built-in projectors). I wouldn’t expect it to be very good, but it may be serviceable enough to open a small object in Paint 3D. The big factors are going to be how good the screen looks (it could have washed-out colors or poor contrast depending on where they chose to save money) and how long the battery lasts under actual testing.
Not Thunderbolt 3. USB 3.0 over Type-C. Not even USB 3.1.
That means it doesn’t even get the 10 Gbps of USB 3.1, let alone the 40 Gbps (with PCIe tunneling) of Thunderbolt 3. It is locked to the 5 Gbps of USB 3.0, just delivered over the newest connector.
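To put those link speeds in perspective, here's a rough transfer-time comparison. These are theoretical maximums; real-world throughput is lower because of encoding and protocol overhead (e.g. 8b/10b on USB 3.0).

```python
# Theoretical time to move a 10 GB file over each interface.
GBIT = 1e9  # bits

links = {
    "USB 3.0 (5 Gbps)": 5 * GBIT,
    "USB 3.1 Gen 2 (10 Gbps)": 10 * GBIT,
    "Thunderbolt 3 (40 Gbps)": 40 * GBIT,
}

file_size_bits = 10 * 8 * GBIT  # 10 gigabytes = 80 gigabits

for name, bits_per_second in links.items():
    seconds = file_size_bits / bits_per_second
    print(f"{name}: ~{seconds:.0f} s")
```

So USB 3.0 is on the order of 8x slower than Thunderbolt 3 at moving the same data, before you even get to the PCIe tunneling an eGPU needs.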
I’m kind of over tablets now b/c no matter what, there is a compromise. Surface Book almost pulled it off and I would love to see a Surface Book 2 w/o the hinge, still a detachable, a better battery and port selection with the MX150 @ 4GB GDDR5. With tablets, I worry more about overheating and throttling because of the limited space and how many tablets go fanless. Surface Pro showed me that you shouldn’t be using a 15W processor w/o a fan. Dell’s Latitude throttled less than the new Surface Pro. I’m also over eGPU compatibility b/c of the bottlenecking.
I’ll have to wait and see, but I’m hoping Lenovo learned from the Miix 720. But no Thunderbolt 3 is a dealbreaker.
Well no, I would not expect a Core m5 paired with a 960M to give stellar results in AAA titles, especially when run from an Acer Aspire Switch 12S, which famously overheats and throttles, as it was the first test product for Acer’s liquid cooling loop system.
The Acer Graphics Dock (while technically an external GPU) barely qualifies for what people actually want a Thunderbolt 3 port for when they talk about eGPUs. It was designed to give some graphics power to an otherwise graphics-incapable device while remaining portable enough to carry in the same bag. Even though that test gives valuable insight into problems inherent to mobile performance, it is not comparable to what you should expect from a Core Y paired with a higher-end graphics card.
The Core Y is a rebranded Core M. And that wasn’t a Broadwell Core M, it was a Skylake Core M, just a generation behind. The single-core clock is higher, but it can’t typically sustain those higher clocks (not talking about the V here). I’ve even seen them run the test with a quad-core Core laptop (not a dual-core) to determine just how much performance would be lost. In some of the gaming benchmarks and FPS tests, the quad core with a 1070 eGPU lagged behind the laptop with a built-in 1060. That doesn’t spell value or purpose to me. Do what you wish, but I’m not getting a Core Y + eGPU setup when a Skylake or Kaby Lake H-series laptop with dedicated graphics will give me the performance I want without the eGPU. Enclosures are still expensive, and with the shortage of GDDR5 chips, expect graphics card prices to increase. That doesn’t spell a good price/performance ratio to me.
As for your article, the performance is pretty much identical to quad-core gaming laptops, as shown in the article itself. The TravelMate laptop has totally playable framerates even at ultra settings, so the CPU is not the problem. They also tested it with the Aspire, and the performance was shitty. So what? That just means that laptop had terrible cooling. We already ran benchmarks on Eve V prototypes, and our processor performs just as well as that TravelMate, even better in some cases. You just provided evidence yourself that this level of performance is enough for most games.
I didn’t mean to imply that the Core Y is vastly superior to the Core M processors, only that a mobile processor with a slow GPU shouldn’t set the performance expectations for a mobile processor with a stronger GPU.
In general an eGPU gives 5-15% less performance than a dedicated GPU (worse if you use the same TB3 connection to run video back to the native screen). A lot of the eGPU benchmarks were done on systems that have half host-to-device speeds due to an issue in software that has now been fixed. With twice the bandwidth available, speeds have improved considerably on these devices.
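A rough model of those overheads, with the 5-15% figure from above. The extra penalty for routing video back to the internal screen is an assumed illustrative number, not a measured one; actual losses vary by game, driver, and enclosure.

```python
# Sketch of eGPU performance loss relative to the card's native output.
def egpu_fps(native_fps, link_overhead=0.10, external_display=True):
    """Estimate framerate through a TB3 eGPU enclosure.

    link_overhead: fractional loss from the TB3 link (roughly 0.05-0.15).
    external_display=False models sending the image back to the laptop's
    own screen, which costs extra bandwidth (assumed ~10% more here).
    """
    fps = native_fps * (1 - link_overhead)
    if not external_display:
        fps *= 0.90  # assumed loopback penalty, illustrative only
    return fps

print(egpu_fps(100))                          # external monitor: 90.0
print(egpu_fps(100, external_display=False))  # internal screen: 81.0
```

The point of the "fixed bandwidth bug" remark is simply that the second factor used to be much worse on affected systems, and doubling host-to-device bandwidth shrank it.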
You are right though, that a laptop with a dedicated GPU is better performance-wise than a tablet with an external GPU. I don’t think anyone going for the eGPU setup on a 2-in-1 is doing it to be more powerful than available systems with GPU built in. I know I want it for portability’s sake since I travel a lot, and I only want to carry a small device with me when I do. I also find the concept interesting enough to warrant a try. If you don’t need/want a setup that requires more separate parts for less performance, then I wouldn’t expect you to get it.
It isn’t a perfect system, and the technology isn’t perfect yet either. Indeed, computer technology can never be perfect just because of how fast it changes. An eGPU isn’t intended to make an unusable system perfect though, it’s intended to take an unusable system for the application and make it good enough. With the added possibility of future upgrades, that makes it worthwhile to some of us.
The thing that is bothering me about it isn’t the concept, but the expense.
It is ridiculous. But we will be at 10nm chips soon. So we will see.
Qualcomm did it well, as multi-core Geekbench scores match or beat the Skylake Core M devices. I like the idea, but it’s the money. I hope a full setup becomes cheaper soon.
Never compare benchmark scores of different architectures. I don’t mean different iterations of the same architecture - Skylake vs. Kaby Lake is fine, for example. But don’t ever compare ARM to x86. There is no benchmark that can compare them directly.