The red spirit

Discussion starter · #1 ·
Finally, Zhaoxin chips are a reality. After the long-lasting duopoly of AMD and Intel, there's a third player (well, there was VIA/Cyrix, and VIA actually survived to some extent): Zhaoxin. Around 2016 they presented their new chips to the world and promised performance almost as great as Ryzen's (only 30% or 40% behind), and even that would have been rather impressive. However, after that presentation the chips were supposed to be made and nothing happened. Not a single computer, laptop or CPU was to be seen. There were only rumors that Lenovo had some laptops with them, plus one CPU-Z screenshot. But now more is known about Zhaoxin, and the chips have been benchmarked:
https://www.tomshardware.com/features/zhaoxin-kx-u6780a-x86-cpu-tested

And the hype died. The chips are nowhere close to Ryzen; in fact they lose to a gimped Bulldozer-based chip (the one without any L3 cache), sit closer to Athlon 64 in IPC (only about 30% behind Bulldozer's IPC), consume even more power than Bulldozer, and at best compete with the Athlon 3000G.

Still, it's their very first chip and the Chinese government is supporting them, so the results aren't too awful. It's also pretty impressive how they got an x86 license to make these chips. Turns out it's mostly from VIA, though AMD helped a bit too. You could speculate that this is VIA reborn. The chips even have VIA-based graphics inside, but their performance is... well, they don't even work properly all the time, and performance is awful regardless. They also lack DX12 support. It's even worse than Intel's UHD 630 graphics, so yeah, a completely hopeless video adapter.

And yet, at least for now, these chips are going to be for the Chinese market only, and Zhaoxin isn't going to stop. Their chips aren't completely hopeless and the team behind them is promising. One day Zhaoxin might be a truly meaningful third player in the CPU market. Only time will tell.
 
[The] Chinese government is supporting them.
This would be very problematic with regard to a debut in the Western market. I'm not interested in the CCP's backdoors.

I'm surprised they're bothering with x86 at all; wouldn't it be more prudent to develop an ARM chip instead? We're already moving away from CISC.
 
Discussion starter · #3 · (Edited)
This would be very problematic with regard to a debut in the Western market. I'm not interested in the CCP's backdoors.
That's not the point. These are only intended for the domestic market. Basically, China wants more global independence, so they set up the Made in China 2025 program. Under it they are supposed to make certain advancements, and their own CPUs are one of the goals. China also has a special Windows version without Microsoft's spyware.

And before you say anything about spyware inside the processor, you should be aware that every Intel CPU has had those backdoors for the US government for a long time, pretty much since the launch of the G45 chipset in 2008. Not sure about AMD. Intel CPUs carry a MINIX-derived OS that runs all the time; that part of the processor is called the Management Engine. I think it's used for managing the CPU's housekeeping, but it also has backdoors. Barely anything is known about the MINIX fork and what more it could do, as it's frequently updated and nobody wants to release information about it. Only the US government can get chips with it disabled, which might suggest it's only there to enable spying. AMD is most likely affected too and has its own equivalent. Certain ARM CPUs, especially Snapdragons, might be affected too. Welcome to America.


I'm surprised they're bothering with x86 at all; wouldn't it be more prudent to develop an ARM chip instead? We're already moving away from CISC.
I wouldn't say there's any move away from x86. On desktops it's irreplaceable, and ARM is at best a possibility. Once you see the massive library of software, and the even longer list of legacy software that only works on x86, you'll understand that x86 is here to stay. And why wouldn't it be? It's pretty good at the job; on desktops ARM wouldn't really bring anything new.

As for this situation, it's likely they needed an x86 chip so the Chinese edition of Windows 10 could keep support for critical software. Another reason is that ARM licensing requires going through the US or some other middleman, whereas VIA has been irrelevant for a long time and is at this point a pretty much dead company, so its x86 license was there to be used. Plus, with ARM you would need a license for every revision of the architecture, while with x86 you don't. Also, with ARM it's unknown how bigger chips would turn out and how they would perform; with x86 that's much better understood.

There may be many other reasons, but at this point it's far more important that they even made their own x86 chips.
 
I was under the impression that China makes its own software too, is that not the case? Besides, they're no strangers to blatantly stealing intellectual property. I think they should make desktop ARM happen. It's not like PowerPC never existed; this could work.

I'm just disappointed that they didn't take this chance to make progress.
 
Discussion starter · #5 ·
I was under the impression that China makes its own software too, is that not the case? Besides, they're no strangers to blatantly stealing intellectual property. I think they should make desktop ARM happen. It's not like PowerPC never existed; this could work.
There were ARM PCs in the past. I'm pretty sure some British home computers used ARM (Acorn's Archimedes machines did), and SGI workstations used RISC CPUs too (MIPS rather than ARM, and often many of them in one machine).


I'm just disappointed that they didn't take this chance to make progress.
It certainly would be interesting to see Windows on ARM actually become something more than just an elaborate technical preview, but then again, making an x86 CPU in 2020 is just something else entirely. We might even see the PowerPC architecture coming back, who knows.

But if ARM is of utmost interest to you, then there are some really cool single-board computers with ARM CPUs. Good examples of those:
Odroid XU4
Odroid C2
Odroid N2
Huawei HiKey 960

There are many more cool single-board computers like these, but I'm not sure of their origin; they all end up being manufactured in China anyway. It's frankly amazing that the Raspberry Pi isn't finished yet, because those boards often have really bad issues, while the clones are either cheaper, better value, or outright bulldoze it in performance.
 
Chinese chips will be a cheap knock-off of Intel or AMD, for sure. They'll have whole teams of people trying to reverse engineer or otherwise steal the intellectual property of other big players to outright copy the designs and claim them as their own.

They do it for everything, it's literally how Chinese business culture operates. They see a good thing and copy it. In the west we think of this method as fickle and unoriginal, but the Chinese see it as pragmatic. Sure, it makes sense to utilise the best ideas, but they will NEVER be able to innovate.

Chinese chips will never be as good as the best, not for at least a decade anyway, if ever. Plus, we're already pretty much at the end of Moore's Law, so there's not much left to compete for. I have no interest at all in Chinese chips.
 
Discussion starter · #7 ·
Chinese chips will be a cheap knock-off of Intel or AMD, for sure. They'll have whole teams of people trying to reverse engineer or otherwise steal the intellectual property of other big players to outright copy the designs and claim them as their own.

They do it for everything, it's literally how Chinese business culture operates. They see a good thing and copy it. In the west we think of this method as fickle and unoriginal, but the Chinese see it as pragmatic. Sure, it makes sense to utilise the best ideas, but they will NEVER be able to innovate.

Chinese chips will never be as good as the best, not for at least a decade anyway, if ever. Plus, we're already pretty much at the end of Moore's Law, so there's not much left to compete for. I have no interest at all in Chinese chips.
Well... knock-offs have been part of the CPU industry since its start. Do you know how AMD became a thing? They knocked off Intel's CPUs: literally examined an Intel chip, improved it and sold it. Only in the Pentium era did Intel finally sue them and achieve something. Many Silicon Valley engineers worked at the same companies, and it was common for them to start their own firms after getting fed up with their original employers. Motorola used to be a CPU-making powerhouse. The Zilog Z80 was a rip-off of Intel's 8080 at first, but vastly upgraded and improved. AMD was a copycat, yet was able to engineer its own chips to prove to the courts that they were also a legitimate CPU maker. Not everyone succeeded there, and many were driven out of business.

Intel is a vastly different company; at first they were just a memory manufacturer and weren't making CPUs at all. They only started after getting an order from a Japanese company to make a calculator, and an Intel engineer wanted to make the chip programmable. They sold it to them, and then, nicely speaking, "borrowed indefinitely" their own design back after understanding that the microprocessor had a ton of potential. After that they made sure to get the rights back from that Japanese company (which didn't survive for long), so that no one else would make microprocessors from it.

And even after they started, they improved their chips, but the chips were a pain to use and the manuals were a steaming pile of poo. That's a big reason why so many others wanted to join the game: they knew how to improve on the design, and they did. Once Intel understood the threat, they made sure to drive them out of business one way or another and to fight them with lawyers.

Later in the game they also stole from AMD and used lawsuits to try to kill them off, as well as exclusive business deals. That's why the Pentium 4 sold so well while the much faster Athlon 64 (and the Athlon XP before it) didn't. It was a fraud.

Let's not talk about who is stealing and who is innovating in this industry, because if you dig deep enough, pretty much every semiconductor company has stolen something, used bribery or done something dirty. And most of these incidents happened purely out of greed, an inability to admit that your parts stink, or some other not-so-noble reason.

I've heard that China has different copyright laws, laxer than Western ones, so technically many things aren't knock-offs there. And to be honest, copyrights have done very little to benefit humanity. Many of them just stagnate certain things and are counterproductive.

At least for now, Zhaoxin processors use their own in-house architecture. They made their own thing; it was written in Tom's article. For a Chinese product that's pretty original, way above their average standards. And this time they don't want knock-offs, since their goal is more independence from other countries. If you have to rip off others, you aren't independent.

As for Moore's Law, please stop using that phrase. It only talked about transistor count doubling every two years, nothing else. And it's not really a law, more like an observation about the semiconductor industry made back in the '60s and '70s. It no longer applies to the current industry and hasn't been correct for at least 20 years. You could also say that Intel's "we are at the limits of silicon" is just their PR way of saying they don't want to innovate anymore, because AMD has managed to get far more out of silicon than Intel at this point and is still moving to smaller lithography, while Intel isn't, and it seems Intel has abandoned the desktop market. It's not only AMD pushing the limits of silicon, but also Nvidia and ARM; only Intel is getting left in the dust.
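For reference, the whole quantitative content of the observation is just a doubling curve; written out (my own notation, not Moore's wording):

$$N(t) \approx N_0 \cdot 2^{\,t/2}$$

where $N_0$ is the transistor count in some reference year and $t$ is the number of years since then. The original 1965 paper actually said doubling every year; the every-two-years version came with Moore's 1975 revision.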
 
As for Moore's Law, please stop using that phrase. It only talked about transistor count doubling every two years, nothing else. And it's not really a law, more like an observation about the semiconductor industry made back in the '60s and '70s. It no longer applies to the current industry and hasn't been correct for at least 20 years. You could also say that Intel's "we are at the limits of silicon" is just their PR way of saying they don't want to innovate anymore, because AMD has managed to get far more out of silicon than Intel at this point and is still moving to smaller lithography, while Intel isn't, and it seems Intel has abandoned the desktop market. It's not only AMD pushing the limits of silicon, but also Nvidia and ARM; only Intel is getting left in the dust.
Do you know anything about semiconductor physics? I do. And I know Moore's Law isn't literally a law, FFS; thanks for lecturing me on the obvious.

The only innovation left for semiconductors is in cooling techniques, because that's how you mitigate quantum tunnelling effects between transistors, which are now so close together that the space between them cannot physically get any smaller. And that's why Moore's Law, or any transistor density trend, is over.
 
As for Moore's Law, please stop using that phrase. It only talked about transistor count doubling every two years, nothing else. And it's not really a law, more like an observation about the semiconductor industry made back in the '60s and '70s. It no longer applies to the current industry and hasn't been correct for at least 20 years. You could also say that Intel's "we are at the limits of silicon" is just their PR way of saying they don't want to innovate anymore, because AMD has managed to get far more out of silicon than Intel at this point and is still moving to smaller lithography, while Intel isn't, and it seems Intel has abandoned the desktop market. It's not only AMD pushing the limits of silicon, but also Nvidia and ARM; only Intel is getting left in the dust.
Intel did something similar with the Broadwell chips; they didn't really release any relevant desktop chips that generation, and I suspect that's the case with the current 10th-gen CPUs too. To be honest, I don't know what they're doing anymore, but it doesn't look good for them. We're really approaching the limits of silicon, and for Intel to remain relevant they ought to innovate with graphene or whatever it is that eventually supersedes silicon chips.
 
Discussion starter · #10 ·
Do you know anything about semiconductor physics? I do. And I know Moore's Law isn't literally a law, FFS; thanks for lecturing me on the obvious.
Then you might as well stop using that phrase.

The only innovation left for semiconductors is in cooling techniques, because that's how you mitigate quantum tunnelling effects between transistors, which are now so close together that the space between them cannot physically get any smaller. And that's why Moore's Law, or any transistor density trend, is over.
And yet there don't seem to be any plans by the big players to try anything that isn't silicon or quantum.
 
Discussion starter · #11 ·
Intel did something similar with the Broadwell chips; they didn't really release any relevant desktop chips that generation, and I suspect that's the case with the current 10th-gen CPUs too. To be honest, I don't know what they're doing anymore, but it doesn't look good for them. We're really approaching the limits of silicon, and for Intel to remain relevant they ought to innovate with graphene or whatever it is that eventually supersedes silicon chips.
From what I remember, Broadwell was okay, probably faster than Haswell or about the same but with much lower power consumption. It had limited appeal, and it didn't help that Haswell and Devil's Canyon were in stores and selling well. Many people just outright ignored Broadwell as if it never existed. I think it was a marketing failure.

I have no idea myself why Intel is in such a state. I've heard speculation that it might be abandoning the desktop market and will instead focus on mobile and HPC devices. Maybe it's working on some truly next-gen tech, maybe it's a mess internally, maybe they truly lack the competence to make anything better. It's all unclear now, and AMD is basically dominating on every front: servers, supercomputers, high-spec desktops, budget builds, laptops, consoles, integrated graphics. Pretty much everything except NUCs and compute sticks is getting crushed by AMD. And Intel says nothing about what's going on. They sort of made a preview GPU, but it sucked; not the first time Intel has pulled out some barely functional GPU. They seemingly made a new architecture for laptops, but they aren't reworking it for desktops. There has been no real news from Intel for two years at this point.

Well, the 10980XE, i5 9400F, i9 9900KS and similar rehashes happened, but those are just nearly identical Coffee Lakes with different clocks or no iGPU.
 
Discussion starter · #12 · (Edited)
More news about Zhaoxin:

And a little bit of info about Hygon:

So yeah, the Chinese CPU is really low-end. At least it barely consumes any power. Performance is straight-up awful, so awful that I think my old AMD Sempron Mobile 3000+ might actually be faster and just as efficient. Too bad he didn't run the old Cinebench; that Sempron managed a score of 35 points in R11.5.

BTW, I investigated AMD's super-budget A4 6300 APU a while ago and made a ton of benchmarks, which might be useful for comparison:
https://www.personalitycafe.com/science-technology/1279685-trs-lab-project-2-mild-velocity.html

I would still like to mention that the CPU in Tom's review is much faster and actually serviceable, while the CPU GN got is a lower-tier one. The Hygon chips are the fastest of the bunch, but they don't really have a unique design (they're based on AMD's Zen). The chip that Tom's Hardware investigated, the KX-U6780A, is faster than the AMD A4 6300, which is something.

The chip GN reviewed is probably similar in performance to low-end Atoms, AMD C-series or AMD E-series chips, which are the same kind of potatoes. So Zhaoxin is nothing out of the ordinary there. BTW, there are some Intel Celeron J-series and Pentium N-series boards with the CPUs embedded in them; they're just as crappy and low-end. It's such a weird market, but even inside it Zhaoxin is too awful.

@HAL Obviously I'm also not interested in buying Chinese x86 CPUs yet, or anything really low-end for that matter. It's just interesting that they exist and have an interesting story, and it would be fun to see a new player in the x86 market, as well as something bizarre, unknown and unpopular. For similar reasons I tested the AMD A4 6300: a fun little project on a budget, with a chip that barely anyone had or used. It's just that, simple curiosity.
 
Discussion starter · #13 ·
One comment from the GN video's comment section (by lordofduct) sums up a bit of background on these chips:
"Well the video stated that they have a relationship with Via and Cyrix. See back in the day Intel actually sold rights to manufacture x86 processors. This is how AMD got access to it, and later when Intel tried to revoke all licenses and started suing various manufacturers, various lawsuits came about. AMD one a royalty free license to the Am386 design and owned the rights to any design built from that base point. Other companies like Via and Cyrix started about reverse engineering versions of the architecture as well based on this base architecture back then as well. That's how they made and sole processors that were compatible through the 90's and even early 2000's (Via was big in the small form factor x86 PC market back in the mid 2000's, I remember building one myself).

Now here's the thing: that base x86 architecture is what Windows cares about! All the newer extensions of the x86 instruction set are tacked on at the end. Note that AMD and Intel have been independently developing the architecture ever since that split way back when. This means that through the '90s and into the early 2000s Microsoft was contending with two independent big players as well as a bunch of small ones, so design-wise the kernel has been dealing with this fact the entire time.

So it's designed in a way to deal with that. The kernel will use a newer instruction if it's available, otherwise it falls back to a software-based solution when the instruction doesn't exist. So as long as you have that base x86 architecture, Windows doesn't care; it'll keep trucking on forward.

At one cost... PERFORMANCE.

Hence why this CPU is dog-ass slow. It just doesn't have all those newer high-end math instructions, and that's why all the benchmarks they ran were god-awful: it was doing all that math in software! I bet it performs just fine as a general-purpose UI processor. It probably runs word processors just fine (which is what a government office computer is going to need).

This is why the guy in the video later talked about hitting the ground running. They don't have as much to make up from here; all they have to do is implement those newer, faster instructions.

And here's the nice thing: they don't actually have to reverse-engineer those.

The instructions are defined. We know their interface: the way to call them and the way to use their results. That's all that's really needed. And of course we know them, because if we didn't, the likes of Microsoft, Apple, the Linux community, and any other OS developer out there could not reliably create software that runs on these processors.

Furthermore, if you want a very stable OS that runs across decades of hardware (like Windows), you're not going to exploit undocumented behaviour of those functions/instructions. Like... OK, say f(x) is defined to take inputs in registers 1 and 2 and produce its output in register 0. Just because it MIGHT also leave a remainder in some other register, because it used that as a scratch register, you don't read that register for anything. It's not part of the definition, which means it's unreliable: if you did rely on it and the hardware manufacturer later changed the internal behaviour, your software would break! This concept is called encapsulation, and as a software engineer, even if you know how to break encapsulation, you do so at your own risk. This is how an OS like Windows operates (and of course Windows has become more stable over the years for various reasons, including respecting these things and purging the parts of the kernel that exploit hacks like this).

Of course in more embedded systems where you know the specific hardware, you can exploit it willingly, to get that extra edge. Say in things like video game consoles.

So the job of these companies is really to come up with their own implementations of those instructions, making sure the inputs and outputs are shaped the same as the definition. As long as they do that, they'll have nearly perfect stability (of course perfect stability is impossible; there are all sorts of mishaps that can affect it, but they're severely limited at this point because OSes compensate). The problem here isn't stability; it's creating performant hardware implementations of those instructions and packing them into the silicon. That's the intellectual property AMD and Intel hold on to. In simple terms, it's not knowing that they need to implement square root, that much they know; it's knowing how to build an algorithm in silicon that does square root as fast as AMD/Intel can in their chips (replace sqrt with a far more complicated instruction).

Of course, software that is designed to directly access specific instructions can become more unstable, because it has much higher expectations of the hardware it's running on. Software that accesses hardware more directly to perform time-sensitive work, like video games and renderers, needs those more modern math instructions, so it will be far more unstable, and that's why most of the tests that failed in this video were of that kind.

Mind you, things get a little more complicated from here with regard to x86-64, since that is technically a different set of instructions developed by AMD on top of x86, and AMD holds the licensing rights to it. But that is purely a legal licensing issue (hence the video's long rambling about the complicated business relationships and how AMD was involved but is no longer involved due to various legal/political hubbub). The hardware-side principle stays the same: the right to sell hardware that uses the instruction sets has nothing to do with how stable it will be.

TL;DR:
VIA and Cyrix are involved, and they already did most of the legwork of gaining access to the x86 architecture back in the '80s and '90s, plus more legwork in recent years with AMD to get the x86-64 base architecture. Tack on the fact that Microsoft explicitly designs its OS to be fault-tolerant because it has to support decades' worth of evolving hardware, and the result can be very stable as long as the core x86 instruction set exists, if only at the cost of performance."
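To make the "use the newer instruction if it's there, otherwise fall back to software" idea concrete, here's a minimal C sketch of that dispatch pattern. It's my own illustration, not code from the video or from Windows; the popcount example and the function names are just placeholders for the much bigger features (SSE4.2, AVX2 and so on) a real OS or runtime would probe for.

```c
#include <stdio.h>
#include <cpuid.h>   /* GCC/Clang wrapper around the x86 CPUID instruction */

/* Portable fallback: runs on any x86 core, even one without POPCNT. */
static int popcount_soft(unsigned x)
{
    int n = 0;
    while (x) { x &= x - 1; n++; }
    return n;
}

/* Hardware path: compiled so __builtin_popcount emits the POPCNT instruction. */
__attribute__((target("popcnt")))
static int popcount_hw(unsigned x)
{
    return __builtin_popcount(x);
}

int main(void)
{
    unsigned eax, ebx, ecx, edx;
    int (*popcount)(unsigned) = popcount_soft;   /* assume the slow path */

    /* CPUID leaf 1: ECX bit 23 advertises POPCNT support. */
    if (__get_cpuid(1, &eax, &ebx, &ecx, &edx) && (ecx & (1u << 23)))
        popcount = popcount_hw;                  /* the CPU has it, take the fast path */

    printf("bits set in 0xF0F0: %d\n", popcount(0xF0F0u));
    return 0;
}
```

A chip that only implements the base instruction set still runs this program correctly, it just ends up on the software path, which is exactly the "stable but slow" behaviour the comment describes.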
 