Originally posted by djpecker
Ah, that's better!
So bitrates can't be compared across codecs?
No. Think of it as two different vehicles with very different MPG ratings. x264 (an H.264 encoder) can deliver the same quality with a lot less bitrate, but it requires more CPU power to do so.
How does resolution play into this then?
Higher res = more pixels per frame = needs a higher bitrate for the same quality.
So if a DVD is 480, is 720 50% bigger?
DVD = 720 x 480 (NTSC) = 345,600 pixels
720P = 1280 x 720 = 921,600 pixels
720p programming has 2.6667 times as many pixels as a DVD.
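A quick bit of Python if you want to double-check the maths (the frame sizes are just the standard NTSC DVD and 720p dimensions from above):

```python
# Compare pixel counts between an NTSC DVD frame and a 720p frame.
dvd_pixels = 720 * 480      # NTSC DVD frame = 345,600 pixels
hd720_pixels = 1280 * 720   # 720p frame = 921,600 pixels

ratio = hd720_pixels / dvd_pixels
print(f"720p has {ratio:.4f}x the pixels of a DVD frame")  # ~2.6667
```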
If so, then the bitrate would need to be 50% more?
If 50% were the magic number then yes. The rule of thumb is to increase the bitrate in step with the resolution (pixel count),
i.e. 720p needs roughly 2 2/3 times the bitrate to maintain the same image quality.
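As a sketch only, assuming that linear pixel-count rule of thumb holds (real encoders don't scale perfectly linearly with resolution), and with a made-up 5,000 kbit/s starting bitrate rather than any real DVD spec:

```python
# Rough rule of thumb: scale bitrate linearly with pixel count to hold
# quality roughly constant within the same codec. Real-world scaling
# isn't perfectly linear, so treat this as a ballpark estimate.
def scaled_bitrate(src_bitrate_kbps, src_w, src_h, dst_w, dst_h):
    """Estimate the bitrate needed at a new resolution, same codec."""
    return src_bitrate_kbps * (dst_w * dst_h) / (src_w * src_h)

# Hypothetical 5,000 kbit/s DVD-resolution encode scaled up to 720p.
print(round(scaled_bitrate(5000, 720, 480, 1280, 720)))  # ~13,333 kbit/s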
But if DVD's are MPEG2, then you can't compare an equivalent bitrate?
Nope, because MPEG2 is an old format now and simply isn't as efficient: it needs a noticeably higher bitrate than H.264 to reach the same quality.
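To put a very rough number on the efficiency gap: H.264 is often quoted as needing somewhere around half the bitrate of MPEG2 for comparable quality. The 0.5 factor below is an assumed ballpark for illustration, not a measured value:

```python
# Very rough illustration of codec efficiency. The 0.5 factor is an
# assumed ballpark (H.264 is commonly said to need roughly half the
# bitrate of MPEG2 for similar quality), not a measured figure.
H264_VS_MPEG2_FACTOR = 0.5

mpeg2_bitrate_kbps = 8000   # hypothetical MPEG2 broadcast encode
h264_equiv = mpeg2_bitrate_kbps * H264_VS_MPEG2_FACTOR
print(f"~{h264_equiv:.0f} kbit/s in H.264 for similar quality")  # ~4000
```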
Hmmm... this is confusing and my maths isn't very good.
Are these things mostly judged by eye?
Sure. Some people will watch any old shit: out of aspect, clipping audio, etc. It's all about personal choice... if something looks good to you, watch it. If it doesn't, bitch about it.
I want to talk about sources (BD, HDTV, etc.) too, but I think I'll hold back in this post.
HDTV tends to be 720p with a limited bitrate, as there's not as much bandwidth over the air as there is from an optical disc in a Blu-ray player. It's pretty much the same situation as DVB-T/C/S compared to DVD (same res, same codec, just less bitrate). I'm not sure how many HDTV content providers use MPEG4; I'm guessing a lot are still on MPEG2, which needs a higher bitrate to hit the same quality, as discussed earlier. I'm not up to date with North American broadcasts.
Europe should know better though, as we adopted later. From memory it's a mixture over here... earlier adopters are MPEG2, later channels are MPEG4-based.