4k 60Hz signal problems with XD9305

gobblerjook
Member

Hi everyone,

 

I'm having loads of problems getting high quality signals from my PC to be accepted by my new XD93.

 

This is my setup:

  • PC with a GeForce GTX 970
  • Brand new HDMI 2.0 cable rated for 18 Gbps; the Amazon listing confirms it is capable of YCbCr 4:4:4 at 4K 60 Hz with 10-bit colour.
  • XD93, using HDMI port 2, which I have set to Enhanced signal format.

 

According to this Sony page, the TV should support YCbCr 4:2:2 at 3840 × 2160, 60 Hz, with 10- or 12-bit colour. I have tested this extensively and my TV simply does not register any signal with any of these settings.

 

I also tested YCbCr 4:2:2 at 50Hz; it still does not work. I have to go down to 30Hz to get it to work (most of the time; once the signal wasn't recognised and the TV then crashed), and it also works at 24Hz.
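For what it's worth, the pattern in these results lines up with HDMI 2.0's link budget. Here is a rough back-of-the-envelope check as a Python sketch; it assumes the standard CTA-861 4K timing (4400 × 2250 total pixels per frame including blanking) and the fact that HDMI carries 4:2:2 of any supported depth in the same 24-bit-per-pixel container as 8-bit 4:4:4:

```python
# Back-of-the-envelope HDMI 2.0 bandwidth check (a sketch; assumes the
# standard CTA-861 4K timing of 4400 x 2250 total pixels per frame).
TOTAL_PIXELS = 4400 * 2250                   # includes blanking intervals
EFFECTIVE_LIMIT_BPS = 600e6 * 3 * 8          # 14.4 Gbps of video payload
                                             # (18 Gbps raw minus 8b/10b coding)

def bits_per_pixel(subsampling: str, depth: int) -> float:
    """Average bits per pixel on the HDMI link.

    HDMI packs 4:2:2 (at 8, 10 or 12 bit) into the same 24-bit-per-pixel
    container as 8-bit 4:4:4, which is why a TV can list 4:2:2 12-bit at
    60 Hz while 4:4:4 10-bit tops out at lower refresh rates.
    """
    if subsampling in ("RGB", "4:4:4"):
        return 3 * depth
    if subsampling == "4:2:2":
        return 24                            # fixed container size on HDMI
    if subsampling == "4:2:0":
        return 1.5 * depth
    raise ValueError(subsampling)

for fmt, depth, hz in [("4:4:4", 8, 60), ("4:4:4", 10, 60),
                       ("4:2:2", 12, 60), ("4:4:4", 10, 30)]:
    rate = TOTAL_PIXELS * hz * bits_per_pixel(fmt, depth)
    verdict = "fits" if rate <= EFFECTIVE_LIMIT_BPS else "exceeds HDMI 2.0"
    print(f"{fmt} {depth}-bit @ {hz} Hz: {rate / 1e9:.2f} Gbps ({verdict})")
```

So 4:2:2 at 60 Hz should comfortably fit within the link budget, which supports the suspicion that the refusal is on the TV side rather than a bandwidth problem.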

 

My TV firmware and PC graphics card drivers are all up to date.

 

Can anyone else confirm whether they have been able to get high quality 10 or 12 bit colour 4k signals to work at 60Hz on this television?

 

I feel that this may be a firmware issue 😞

18 REPLIES
gobblerjook
Member

Do you mean the cable specs? It's this one that I'm using currently.

 

I'm considering one of these, which says it will cover extended colour spaces. It's still 5m though. I'm not sure I can make 3m work in my living space; we have house rabbits and cables have to go through certain routes. I could try a temporary 3m cable to test it, though. I need to decide which is the better use of money!

Anonymous
Not applicable

The cable you're considering - all my cables are that brand; I've had no trouble with them and have only heard good things too.

 

Sorry, my bad - you already posted what I was looking for 🙂

http://sony-eur-eu-en-web--eur.custhelp.com/app/answers/detail/a_id/112975

 

You see, I have this linked to your model of TV, and as with the link you provided, it's been updated previously via firmware.

_20160719_181127.JPG

gobblerjook
Member

Many thanks for your help, I'll try another cable soon (after pay day!) and report back.

gobblerjook
Member

Hi again...

 

So the new cable arrived. I got this one

 

Unfortunately it doesn't make any difference; I can still only output the same formats as before. I've been experimenting with a number of different colour formats, colour bit depths and refresh rates. The conclusion I have come to is that my TV will only recognise the formats shown in black on this Sony page. Clearly this is wrong, since my firmware is bang up to date, so I should be able to use the formats shown in red as well.

 

Perhaps the updated firmware is not working properly with this model? I'd really like to hear from someone who has some of the signal formats in the red text in the link above working on an XD93...

azx62p5
Explorer

My X930D cannot do more than 8 bit on RGB or YCbCr 4:4:4 either. Could it be a GPU issue? I wouldn't mind getting a new 1070 or 1080 to see if they do something different. If not, something is seriously wrong.

gobblerjook
Member

Interesting that you have the same problem, I assume you're using a cable that should be capable of doing the job?

 

To be honest I don't think it's a GPU issue; the 970 that I have is able to support these output signals, and the Nvidia drivers allow them to be selected.

 

Perhaps this needs to be raised directly with Sony?

xx4L0Mxx
Contributor

Is your TV a 2015 model? Then I bet it has an 8-bit panel. 

 

4k, 60Hz, 12-bit colour with YCbCr 4:2:0 over PC is managed by using dithering, which is cheating to achieve 12 bit; in a nutshell, it's compressed and not a real 12-bit output.
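The dithering idea mentioned above can be shown with a toy sketch (this is only an illustration of the principle; it is not the driver's actual algorithm): by randomising the rounding before quantising to 8 bits, the average over an area or over time lands on the finer value.

```python
import random

# Toy illustration of dithering: approximating a 10-bit level over an
# 8-bit link by randomising the rounding. Not the driver's real algorithm.
random.seed(42)

value_10bit = 517                  # some 10-bit code (0..1023)
target = value_10bit / 4           # ideal 8-bit level: 129.25, not representable

# Dithered quantisation: add uniform noise in [0, 1), then truncate.
samples = [int(target + random.random()) for _ in range(100_000)]

print(min(samples), max(samples))              # only 129s and 130s on the wire
print(round(sum(samples) / len(samples), 2))   # average lands near 129.25
```

Each individual sample is a genuine 8-bit value, but the mix of 129s and 130s averages out to the 10-bit target, which is why it looks convincing at normal viewing distance while still being, strictly, an 8-bit signal.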

 

I can happily output this with my 2x MSI GTX 970s in SLI to my 55" KDX8509c, but I don't, as the best colour output from PC to your Sony TV is 8-bit full RGB 4:4:4 at 60Hz.

 

The only game on the market that currently supports a 10-bit colour space is Alien Isolation; everything else is rendered in 8 bit.

 

Buying a GTX 1070 or 1080 over a GTX 970 (apart from the obvious boost in performance) will only add native HDR support on the graphics-card side, and as far as I'm aware there are zero games on the market that support HDR.

 

The HDR currently seen in games such as Half-Life 2: Lost Coast, Prepar3D etc. is a game-engine HDR output, not to be confused with your TV's HDR capabilities.

gobblerjook
Member

Hi, my TV is the XD93, which is a 2016 model with a 10-bit panel. It's rated as being able to accept 4K 60Hz 4:2:2 at 12 bit over HDMI, as long as the firmware has been updated.

 

Basically, if you look at this webpage, my TV should support all of the formats in red, but in reality it will only accept the formats in black, based on my experimentation so far. Even forgetting about 10 or 12 bit, it won't even accept full RGB 4:4:4 at 4K 60Hz 8 bit, as you mentioned you do with your model. That is currently only possible up to 30Hz.

 

It's interesting, but not surprising, to learn that games are still predominantly 8 bit, but that was not actually my primary use case. I have some 10 bit video on my PC that I would like to play on the TV over HDMI. I'd also like to know that my modern TV is actually providing the capabilities that it is advertised as having!

 

 

gobblerjook
Member

I completed a series of tests to explicitly show Sony support which signals work and which don't; I thought I would share the results here too:
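For anyone who wants to reproduce a test matrix like the one attached below, something along these lines generates the combinations to work through (the exact set of formats, depths and rates is an assumption, not necessarily the set in the attachment; fill in the results by hand as you go):

```python
from itertools import product

# Generate a checklist of 4K signal combinations to try on the TV.
# The formats/depths/rates below are an assumed test set, not the
# exact one from the attached results table.
formats = ["RGB 4:4:4", "YCbCr 4:4:4", "YCbCr 4:2:2", "YCbCr 4:2:0"]
depths_bits = [8, 10, 12]
refresh_hz = [24, 30, 50, 60]

for fmt, depth, hz in product(formats, depths_bits, refresh_hz):
    print(f"3840x2160 @ {hz:>2} Hz | {fmt:<12} | {depth:>2}-bit | result: ____")
```

Working through a fixed grid like this makes it much easier for support staff to spot the pattern than an ad-hoc description of what was tried.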

 

tests.JPG