Tuesday, June 22, 2010

Screen Resolutions & HDTV

Hello Mr. Sokol

I have on several occasions consulted your page on Video Format
Resolutions (http://www.videotechnology.com/0904/formats.html) when I
needed to find out about a specific resolution. I'm happy to say that
the page has never failed me. Until now.

I'm curious about the 1360x720 resolution of many (as I understand it)
entry-level TVs. Does this format have a name, and where does it come
from?

If you know the answer, I was hoping that you'd consider putting it on
your excellent webpage.

Yours
KB

1366 x 768 is the resolution of some of the low-end HDTVs. This is a side effect of the LCD panels: there are far fewer manufacturers of the raw panels than there are of the TVs made from them. These panels were made for computer displays, and it is easy to make them into HDTVs.

See also:
http://en.wikipedia.org/wiki/WXGA (I need to add this to my chart.)
 &
http://en.wikipedia.org/wiki/High-definition_television
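
To see where the odd 1366 number comes from, here is a quick Python calculation (my own illustration, not something from the chart) of the aspect ratios of the resolutions in question:

    # Quick arithmetic on the resolutions discussed here.
    modes = [(1280, 720), (1366, 768), (1360, 720), (1920, 1080)]
    for w, h in modes:
        print(f"{w}x{h}: aspect ratio {w / h:.4f} (16:9 is {16 / 9:.4f})")

    # 1366 is just the 16:9 width for 768 lines, rounded up:
    print(768 * 16 / 9)   # 1365.33..., which the panel makers rounded to 1366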

1280x720 is 720p resolution. Most HDTV from broadcast and cable is in this format, and these TVs rescale it to 1366 x 768, which takes a real hit in picture quality because it has the effect of low-pass filtering the image (removing detail). Most people will never notice the difference, as they're sitting far enough back, and it's still a huge improvement over the old analog NTSC we had.
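
To illustrate that low-pass effect, here is a minimal Python sketch, assuming the Pillow imaging library is installed. It scales a 1-pixel-wide test pattern from 1280x720 up to 1366x768 the way a TV's scaler would, and the blended pixel values show the lost contrast:

    # A sketch of why rescaling 720p to a 1366x768 panel smears fine
    # detail: the resampler has to blend neighboring pixels, which
    # acts like a low-pass filter.
    from PIL import Image

    # Build a 1280x720 test pattern of alternating 1-pixel black/white
    # columns, the finest horizontal detail the source can carry.
    src = Image.new("L", (1280, 720))
    src.putdata([255 * (x % 2) for y in range(720) for x in range(1280)])

    # Rescale to the panel's native 1366x768, as the TV's scaler would.
    scaled = src.resize((1366, 768), Image.BILINEAR)

    # In the source, adjacent columns swing the full 0-255 range; after
    # scaling, the blended pixels land in between, so contrast drops.
    px = scaled.load()
    print("first few pixels of a scaled row:", [px[x, 384] for x in range(8)])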

1360x720 seems to be a graphics card resolution, probably used to get 720p video to play; from my research, I don't think you can get a pixel-accurate display using it.
There are no real restrictions on the resolutions graphics cards can output. With many VGA chipsets you can program the card to output any arbitrary X-by-Y resolution, up to the limits of its memory and its video DAC (digital-to-analog converter). Older CRT displays would make a best effort to display whatever they were fed; LCDs, though, have to map the input to actual display pixels.
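
As a back-of-the-envelope sketch of those two limits, here's a bit of Python. The VRAM size, bytes per pixel, and blanking overhead are illustrative assumptions, not the specs of any particular card:

    # The two limits on an arbitrary mode: framebuffer memory and the
    # pixel clock the video DAC has to drive. Figures are assumptions.
    def fits_in_vram(width, height, bytes_per_pixel=4, vram=256 * 2**20):
        """Does one frame fit in video memory? (assumed 256 MiB, 32-bit color)"""
        return width * height * bytes_per_pixel <= vram

    def pixel_clock_mhz(width, height, refresh_hz=60, blanking=1.25):
        """Approximate pixel clock, padding ~25% for blanking intervals."""
        return width * height * refresh_hz * blanking / 1e6

    for w, h in [(1360, 720), (1366, 768), (1920, 1080)]:
        print(f"{w}x{h}: fits in VRAM: {fits_in_vram(w, h)}, "
              f"~{pixel_clock_mhz(w, h):.0f} MHz pixel clock at 60 Hz")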

I don't think anyone is making sets with a native 1360x720 resolution.
You should look up your display's native resolution and set your video card to match it. This will give the best possible image, because each screen pixel your OS sees will map to one display pixel on the actual LCD that you see.
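
In other words, pixel accuracy comes down to this simple test (the panel size here is only an example):

    # Pixel-accurate display means the mode the OS renders at equals the
    # panel's native grid, so no scaler sits between them.
    def pixel_accurate(os_mode, panel_native):
        return os_mode == panel_native

    panel = (1366, 768)   # example entry-level HDTV panel
    for mode in [(1280, 720), (1366, 768)]:
        verdict = "1:1 mapping" if pixel_accurate(mode, panel) else "rescaled (detail lost)"
        print(f"OS at {mode[0]}x{mode[1]} on a {panel[0]}x{panel[1]} panel: {verdict}")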

I myself have a 52" Aquos, which was the smallest set with a full 1920x1080-pixel LCD panel at the time.
I am driving it with an ATI Radeon HD4350 graphics card with HDMI out.

Using that, I was able to get pixel-accurate 1920x1080, although by default the driver insisted on scaling the image down.

Basically, the PC sees a 1920x1080 display, but the driver shrinks that bitmap down by 10% or so and then sends it to the TV in a 1920x1080 signal over HDMI. The result was black bars all the way around my image, and blobs rather than text on screen. This completely killed my pixel accuracy.
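
Here are the numbers behind that, in Python. The 10% shrink is my rough estimate from above, and the rest follows from it:

    # Back-of-the-envelope numbers for the underscan problem.
    w, h = 1920, 1080
    shrink = 0.10   # the driver's default underscan, roughly
    inner_w, inner_h = round(w * (1 - shrink)), round(h * (1 - shrink))
    bar_x, bar_y = (w - inner_w) // 2, (h - inner_h) // 2

    print(f"desktop rendered at {inner_w}x{inner_h} inside the {w}x{h} signal")
    print(f"black bars: {bar_x} px left/right, {bar_y} px top/bottom")
    # Every desktop pixel now covers ~0.9 of a panel pixel, so text gets
    # resampled into "blobs" instead of mapping 1:1 onto the LCD grid.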

With some effort I was able to find that setting and fix it.

John
