In Reply to: RE: Is 720 Better than 1080 in the Real World? posted by cdb on October 27, 2011 at 08:54:43:
> Don't get too hung up on the resolution numbers since the quality of the source material can easily be the determining factor.

Totally agree.
So many people are hung up on spatial resolution, while some broadcasters bit-starve their subchannels to the point of visible artifacts. The focus on resolution ignores the fact that the content is not transmitted as raw 1920x1080x30 or 1280x720x60 images, but as encoded, lossily compressed data that has *discarded* both spatial and temporal picture information.

Short of measuring the bit rate of the subchannel directly (e.g. with TSReader), I use the recorded file size as a gauge of the transmitted bit rate. Recordings of local CBS (1080i) and Fox (720p) programs are typically about 7 gigabytes per hour (close to the ~19.39 Mbps ATSC max), whereas NBC (1080i) and ABC (720p) programs run only about 5 gigabytes per hour. ABC seems to care the least about picture quality, since some primetime dramas go as low as 4.2 GB/hr.
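The file-size-to-bit-rate conversion is just arithmetic; here's a quick sketch (my own helper, not something from the post) that turns GB-per-hour recording sizes into average Mbps, assuming 1 GB = 10^9 bytes:

```python
def avg_bitrate_mbps(gb_per_hour: float) -> float:
    """Average transmitted bit rate implied by recording size.

    Assumes 1 GB = 10^9 bytes. This is an hour-long average; the
    actual stream bit rate varies moment to moment.
    """
    bits = gb_per_hour * 1e9 * 8   # bytes per hour -> bits per hour
    return bits / 3600 / 1e6       # -> bits per second -> Mbps

# Figures from the post:
print(avg_bitrate_mbps(7))    # CBS/Fox: ~15.6 Mbps, near the ~19.39 Mbps ATSC ceiling
print(avg_bitrate_mbps(5))    # NBC/ABC: ~11.1 Mbps
print(avg_bitrate_mbps(4.2))  # ABC primetime dramas: ~9.3 Mbps
```

So even the best local subchannels here are well under the ATSC ceiling once overhead and other subchannels take their share.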
> 1080 can also vary a little bit, but typically not much.

I find "30 Rock" (broadcast in 1080i) to have the worst picture quality of any show I watch; it's dark (or poor contrast) and soft. Then my eyes are shocked when a (bright and detailed) commercial comes on.