
Wrong color encoding suspected on HDMI output

Izrodov

Oct 25, 2013 - 05:14am

  • Hello,

    I struggled with this problem for two weeks and I could not find any information on the internet, so I decided to ask for help here.

    I recently installed Ubuntu 13.10 Linux on my Haswell i5-4670K-based system, and compiled and installed the 3.11.6 kernel.

    The HDMI output of the motherboard is connected to the HDMI input of my LG full HD TV (I am not sure of the exact model). At first glance everything runs fine.

    On closer inspection, however, I noticed that text and thin lines are sometimes blurry. After some experimentation I found that blue vertical single-pixel lines are the blurriest, red ones are also blurry and look brownish, and green ones are almost OK. Grey and white lines are perfectly sharp. It does not matter whether X is running: the behavior is the same in the console, and even in the GRUB menu (I tried setting the GRUB menu colors to blue, and they are blurry too).

    I tried the DVI output - the situation is the same.

    I also tried the VGA output (my TV has a VGA input). In that case the image is perfect: all lines are perfectly sharp, with no blurriness whatsoever regardless of color, and the colors are clean and saturated.

    This led me to the conclusion that something is wrong with the color encoding selected for the HDMI output. I think that if the YCbCr 4:2:2 encoding is used over HDMI, exactly these defects would show up: 4:2:2 halves the horizontal chroma resolution while keeping luma at full resolution, so saturated single-pixel colored lines get smeared, while grey and white detail (which lives almost entirely in luma) stays sharp.
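
    To illustrate this suspicion, here is a minimal sketch (made-up pixel values, BT.601 conversion formulas used only for the arithmetic) of what 4:2:2 does: luma keeps full horizontal resolution while the two chroma channels are averaged over each pixel pair, so a single-pixel blue line bleeds into its neighbour while a white line survives untouched.

        def rgb_to_ycbcr(r, g, b):
            # BT.601 full-range conversion, for illustration only
            y  =  0.299 * r + 0.587 * g + 0.114 * b
            cb = -0.169 * r - 0.331 * g + 0.500 * b + 128.0
            cr =  0.500 * r - 0.419 * g - 0.081 * b + 128.0
            return y, cb, cr

        def ycbcr_to_rgb(y, cb, cr):
            r = y + 1.402 * (cr - 128.0)
            g = y - 0.344 * (cb - 128.0) - 0.714 * (cr - 128.0)
            b = y + 1.772 * (cb - 128.0)
            return tuple(max(0, min(255, round(v))) for v in (r, g, b))

        def subsample_422(row):
            # full-resolution luma, one shared chroma sample per pixel pair
            ycc = [rgb_to_ycbcr(*p) for p in row]
            out = []
            for i in range(0, len(ycc), 2):
                pair = ycc[i:i + 2]
                cb = sum(p[1] for p in pair) / len(pair)
                cr = sum(p[2] for p in pair) / len(pair)
                out.extend(ycbcr_to_rgb(p[0], cb, cr) for p in pair)
            return out

        blue_line  = [(0, 0, 0), (0, 0, 255), (0, 0, 0), (0, 0, 0)]
        white_line = [(0, 0, 0), (255, 255, 255), (0, 0, 0), (0, 0, 0)]
        print(subsample_422(blue_line))   # blue bleeds into its black neighbour
        print(subsample_422(white_line))  # white line stays intact

    Running this, the blue pixel comes out dimmed with a blue ghost next to it, while the white row is unchanged - exactly the pattern I see on screen.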

    After some research I learned that the color encodings a display supports are advertised to the source via its EDID. So I tried using a custom EDID by adding the relevant kernel command-line option (I used the 1920x1080 EDID built into the kernel). The custom EDID was read (as stated in Xorg.0.log), but there was no improvement in picture quality.
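
    (For reference, the option in question is, if I remember correctly, drm_kms_helper.edid_firmware=edid/1920x1080.bin. To see what a connector's EDID actually advertises, a small sketch like the following can be used; the sysfs path and connector name are assumptions for my machine, so check /sys/class/drm/ and adjust.)

        # Check whether the EDID the kernel sees advertises YCbCr.
        # Path and connector name are assumptions - adjust for your card.
        EDID_PATH = "/sys/class/drm/card0-HDMI-A-1/edid"

        with open(EDID_PATH, "rb") as f:
            edid = f.read()

        n_ext = edid[126]  # extension blocks after the 128-byte base block
        for i in range(1, n_ext + 1):
            block = edid[128 * i : 128 * (i + 1)]
            if block and block[0] == 0x02:  # CEA-861 extension tag
                print("YCbCr 4:4:4 advertised:", bool(block[3] & 0x20))
                print("YCbCr 4:2:2 advertised:", bool(block[3] & 0x10))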

    So I would like to ask: is there a way to force the color encoding used on the HDMI output?

     

    Regards, Dephlector.

  • I actually found a partial fix for this problem. One of my TVs has an option to assign labels to its inputs, chosen from a fixed list. If I choose the label "PC", the issue is fixed and RGB is used for the color encoding. I am not sure why it works; my guess is that the TV supplies a different EDID, which somehow makes the video card's firmware use RGB encoding.

    Anyway, most TVs on the market do not have such an option, or if they do, it does not work. There should be a driver option to force RGB encoding regardless of the EDID. My research led me to conclude that the color encoding features of the video card are not controlled by software at all; the decision about which color encoding to use is left to the video card's firmware. I guess I will have to wait until this is implemented by the driver developers, or implement it myself if I find the time and resources to do so.

    Another solution would be to construct a device that fakes the TV's EDID. That, I think, would be an easier but uglier solution to the problem.
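
    (A software variant of the same idea might be worth trying first: dump the TV's EDID, clear the two YCbCr support bits in the CEA-861 extension block, fix that block's checksum, and load the patched file with the EDID override mentioned above. A rough, untested sketch - the file names are placeholders:)

        # Patch an EDID dump so it no longer advertises YCbCr,
        # hoping the source falls back to RGB. Untested sketch.
        def fix_checksum(block):
            # every 128-byte EDID block must sum to 0 modulo 256
            block[127] = (-sum(block[:127])) & 0xFF

        with open("tv-edid.bin", "rb") as f:
            edid = bytearray(f.read())

        for i in range(1, edid[126] + 1):
            block = bytearray(edid[128 * i : 128 * (i + 1)])
            if block and block[0] == 0x02:  # CEA-861 extension
                block[3] &= ~0x30           # clear YCbCr 4:4:4 / 4:2:2 bits
                fix_checksum(block)
                edid[128 * i : 128 * (i + 1)] = block

        with open("tv-edid-rgb.bin", "wb") as f:
            f.write(edid)

    That said, since my earlier kernel-level EDID override changed nothing, the override probably happens too late to influence the card's firmware, which is why a device that sits in the cable and presents the patched EDID directly may indeed be necessary.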

    Regards

    Jan 27, 2014 - 12:19am
  • Topic locked