So there you have it. If you play the HiFi game, you have to face yourself honestly: if you can hear a difference, you can hear it; if it's there, it's there; if it isn't, it isn't. If you've never tried or even touched something, don't pass judgment on it, and don't just repeat what everyone else says. Otherwise, those with money will turn HiFi into a toxic mess, and those without will bleed themselves dry.

Author: dxdxdx    Time: 2019-11-18 17:45
HDMI is an analog signal.
In HDMI, a digital bit stream is encoded and driven onto the cable as an analog signal. Think of WiFi: it might seem digital, but I assure you, the radio signal is an analog waveform. HDMI is not Morse code; to pack billions of bits per second over each wire, a high-speed signaling scheme (TMDS) is used, and bits can most definitely be corrupted on the way through the cable. This data loss is measured as the Bit Error Rate (BER).
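To put a Bit Error Rate into perspective, here is a rough back-of-the-envelope sketch in Python (my own illustration, not anything quoted from the HDMI spec): it simply multiplies an assumed BER by the number of bits a 1080p60 signal pushes per second. The timing numbers and the BER values are assumptions chosen for the example.

```python
# Back-of-the-envelope sketch (my illustration, not from the HDMI spec):
# how many flipped bits a given Bit Error Rate works out to for a 1080p60
# signal. The timing numbers and BER values are assumptions for the example.

def expected_bit_errors(ber, bits_per_second, seconds=1.0):
    """Expected number of flipped bits over an interval, assuming
    independent errors occurring at the given Bit Error Rate."""
    return ber * bits_per_second * seconds

pixels_per_second = 1920 * 1080 * 60           # 1080p60 active pixels only
data_bits_per_second = pixels_per_second * 24  # 24-bit color
line_bits_per_second = data_bits_per_second * 10 / 8  # TMDS sends 10 line bits per 8 data bits

for ber in (1e-12, 1e-9, 1e-6):
    per_second = expected_bit_errors(ber, line_bits_per_second)
    print(f"BER {ber:.0e}: ~{per_second:.4f} flipped bits per second "
          f"(~{per_second * 3600:.0f} per hour)")
```

At the low end that is one flipped bit every few minutes; at the high end it is thousands per second, which is where a cheap cable starts to show.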
HDMI is designed so that video data travels over three separate differential pairs, carrying R, G and B, or Y, Cb and Cr; a fourth pair carries a clock signal. What’s interesting is that each pixel is assembled from information on all three pairs. Because of that, a single bit error is nearly impossible to see. In Y Cb Cr, for example, if one bit of Cr is corrupted, the luminance of that pixel is unchanged and only one color component of that single pixel shifts slightly. Such an error is almost impossible to spot unless the error rate is very high. Certified HDMI cables must pass compliance testing that requires a Bit Error Rate many, many times lower than what the best human eye could perceive under the best possible conditions. Cheaply made, uncertified cables, however, are often only “just good enough,” and much higher bit error rates can occur.
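To illustrate that point (again my own sketch, not part of the HDMI spec), the snippet below converts one Y Cb Cr pixel to RGB using an assumed full-range BT.601 matrix and flips a single Cr bit. A flipped low-order bit barely moves the color; only a flipped high-order bit turns that one pixel into the kind of off-color “sparkle” mentioned below.

```python
# Minimal sketch (my illustration): what one flipped Cr bit does to a single
# pixel. The full-range BT.601 YCbCr -> RGB matrix is an assumed choice;
# real HDMI video may use BT.709 and limited-range quantization instead.

def clamp8(v):
    """Round and clamp to the 0-255 range of an 8-bit component."""
    return max(0, min(255, round(v)))

def ycbcr_to_rgb(y, cb, cr):
    """Full-range BT.601 conversion of one pixel to 8-bit RGB."""
    r = y + 1.402 * (cr - 128)
    g = y - 0.344136 * (cb - 128) - 0.714136 * (cr - 128)
    b = y + 1.772 * (cb - 128)
    return clamp8(r), clamp8(g), clamp8(b)

y, cb, cr = 120, 100, 140  # arbitrary example pixel

print("original:      ", ycbcr_to_rgb(y, cb, cr))
print("Cr LSB flipped:", ycbcr_to_rgb(y, cb, cr ^ 0x01))  # R and G shift by ~1 step
print("Cr MSB flipped:", ycbcr_to_rgb(y, cb, cr ^ 0x80))  # one off-color "sparkle" pixel
```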
So, as you can see, the answer to your question is: there DEFINITELY is a difference between one HDMI cable and another, but you may have difficulty seeing it. If you see sparkles or horizontal lines on your TV, try a better HDMI cable.