
View Full Version : 1080i / 1080p



vrlika
05-03-2008, 09:33 PM
What is the difference between them when playing a Blu-ray disc, 'cos my Panasonic Viera only goes up to 1080i?

SPX
05-03-2008, 09:40 PM
1080i = interlaced picture (alternate lines are interlaced to build up the picture)
1080p = progressive picture (the whole picture arrives at once; LCD and plasma prefer this)

HD1080
05-03-2008, 11:23 PM
1080p is better basically.

But out of all of them, I would say 720p is the best.

bonovox
06-03-2008, 01:56 AM
1080p is better basically.

But out of all of them, I would say 720p is the best.

Errrr, not quite correct. How can a 720p picture be better than a 1080p picture? That just doesn't make sense. A 1080p picture will be 1920 x 1080 pixels and so is designed to be viewed on a 1080p display. A 720p picture will be 1280 x 720 pixels. So 1080p gives more definition and more detail.

bob
06-03-2008, 10:18 AM
The Advanced Television Systems Committee (ATSC), which originated the HDTV standard, states that both 720p and 1080i qualify as true HDTV. So does the Consumer Electronics Manufacturers Association. But for various reasons other parties in the hi-def debate are choosing sides. Most television manufacturers have chosen to display HDTV in 1080i; among their supporters are CBS, NBC and HBO. Others claim 1080i is significantly flawed and that 720p is better looking, better suited to new display technologies, and therefore more in tune with the future of digital television; on this side are ABC, ESPN HD and Fox, among others. Still others state that on 'real world' sets the difference is minor.

Let's consider 1080i vs. 720p. Who's right, which looks better, and does it really matter? Let's start by looking at the numbers. Counting scan lines is how 1080i proponents make their case: if you count scan lines, the number 1080 is higher than the number 720, right? But the more accurate method is to count "pixels", or picture elements, the dots that make up the picture. It's simple maths, really: multiply the number of vertical pixels in each format by the number of horizontal pixels. In 1080i, 1080 times 1920 equals 2,073,600 dots. In 720p, 720 times 1280 equals 921,600 dots. When you count dots, 1080i has more than twice as many as 720p, and therefore, it seems, a picture that's more than twice as sharp. However, numbers aren't everything. As the (i) and (p) imply, there is another distinction: 1080i is an "interlaced" format while 720p is a "progressive" format. Each uses a different method to turn a succession of still images into moving pictures.
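(Purely as an illustration of that arithmetic, a few lines of Python reproduce the dot counts; the numbers, not the code, are the point.)

# Illustrative only: the pixel-count arithmetic from the paragraph above.
formats = {"1080i / 1080p": (1920, 1080), "720p": (1280, 720)}
for name, (width, height) in formats.items():
    print(f"{name}: {width} x {height} = {width * height:,} pixels")
# 1080i / 1080p: 1920 x 1080 = 2,073,600 pixels
# 720p: 1280 x 720 = 921,600 pixels
# Ratio: 2,073,600 / 921,600 = 2.25, i.e. a bit more than twice the dots.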

Difference between interlaced and progressive:

Interlaced scanning builds a still picture, or 'frame', from two sets of alternating lines ('fields'). Progressive scanning creates a frame in a single pass. If both are "refreshing" the screen at the same number of passes per second, that gives progressive scanning the advantage, because each pass delivers a complete picture 'frame', not half a picture 'field'. It produces fewer dots and lines per pass, but completes the picture in half the time. So now it's a question of timing. As ABC's FAQ touches on, the number of lines of resolution in progressive and interlaced pictures is not a clear-cut comparison: in the time it takes 720p to draw 720 lines, 1080i draws only 540 lines, and by the time 1080i does draw 1080 lines, 720p has drawn 1440 lines.
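(Again purely illustrative, a quick sketch of that timing comparison, assuming both formats make 60 passes per second; the same reasoning applies at 50Hz.)

# Rough illustration of the timing argument above (assumes 60 passes per second).
PASSES_PER_SECOND = 60
LINES_PER_PASS_1080I = 1080 // 2   # one 540-line field per pass
LINES_PER_PASS_720P = 720          # a full 720-line frame per pass

for passes in (1, 2):
    ms = 1000 * passes / PASSES_PER_SECOND
    print(f"after {ms:.1f} ms: 1080i has drawn {LINES_PER_PASS_1080I * passes} lines, "
          f"720p has drawn {LINES_PER_PASS_720P * passes} lines")
# after 16.7 ms: 1080i has drawn 540 lines, 720p has drawn 720 lines
# after 33.3 ms: 1080i has drawn 1080 lines, 720p has drawn 1440 lines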

The truth is that 1080i and 720p each look better in different situations. The 1080i format is better at producing fine detail in still frames and pictures with little or no motion. Regardless of how long it takes to produce a picture, that picture has more lines, more dots. But this works well only as long as nothing moves. Remember how two fields make a frame? If something moves, the path of the motion changes between the alternating fields. That introduces "motion artifacts," or visible distortion, such as stair-step patterns on a diagonal edge. The 720p format excels at reproducing motion, introducing little visible distortion regardless of the timing of moving objects.

What it all means:

So in the end both 1080i and 720p are high definition, but unless you're watching a lot of slow-moving transitional shots of lilies in a field, 720p for the most part is as hi-def as you need to get. What about 1080p, you say? Take all of 1080i's and 720p's attributes and combine them, and now you're really cooking.

Tmull
06-03-2008, 11:12 AM
Thanks, very interesting. TMULL

Micko50
06-03-2008, 12:20 PM
Thanks Bob.

Gone_Fishing
06-03-2008, 06:40 PM
Blu-ray also has the ability to output 1080p/24; in fact that's how it's stored on the disc. 1080p/24 is exactly what you
would see at a cinema: 24 progressive frames per second.

Not many affordable screens are able to display native 1080p/24, but most can accept 1080p and then process it for viewing.


TNT

r.maxwell
06-03-2008, 07:24 PM
1080p is better than 720p, but whether 1080i is worse than 720p depends on the equipment being used. If you can display 1080p then use it. 1080i is for those who want 1080 but can't display it in its full glory. 720p gives a better picture than 1080i on my set-up; I haven't got 1080p.

dubious
07-03-2008, 11:00 AM

Hmm, most of what you have copied and pasted is largely irrelevant here in Europe.

It's obvious from the first paragraph that the article is American-orientated. In the USA they only use MPEG-2 for broadcast HD, so whilst what is written there may be true for the USA, it isn't necessarily true for Europe.

Although broadcasters here in Europe have the choice of 1080i or 720p, they only use 1080i. They also use MPEG-4 for broadcast HD, which uses much better compression techniques than MPEG-2, and that in turn affects how good the HD picture is.

By their very nature all flat panel TVs are progressive, so no matter what signal you feed the panel, for example 1080i, the panel will scale the picture to its native progressive resolution. So, taking the OP's situation where his TV will accept a maximum of a 1080i signal, the panel itself will convert that 1080i signal and display it progressively, even though the TV can't accept a 1080p input. Which actually looks best on his particular panel, 720p or 1080i, therefore depends on how well his TV processes and converts the signal; I would suggest that in his particular case 1080i will give the better picture, as there is little processing of the signal to be done.

Still confused? Thought so. Let's clear something up: an HD Ready TV is a first-generation HDTV, and most if not all are not True HD; basically they do nothing more than take a 1080 or 720 signal and convert and display it at their native resolution.

Go here for more information about HD Ready _http://en.wikipedia.org/wiki/HD_ready

Go here for more information about Full HD _http://en.wikipedia.org/wiki/Full_HD

Take this example: let's suppose you have an HD Ready plasma with a native resolution of 1365 x 768. It's obviously neither 1080 nor 720, so what happens? Let's say you feed it a 1280 x 720p signal from an HD sat receiver; in this case the panel will upscale the signal to its native res of 1365 x 768p. Now let's say you feed it a 1920 x 1080i signal from the same sat receiver; in this case the TV will downscale the signal to its native res of 1365 x 768 and display it as progressive.
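(A minimal sketch of that upscale/downscale decision, purely for illustration; the 1365 x 768 panel is just the example above, and the function name is made up, not any real TV's firmware.)

# Illustrative sketch: does a 1365 x 768 panel upscale or downscale a given input?
def describe_scaling(source_w, source_h, interlaced, panel_w=1365, panel_h=768):
    direction = "upscale" if source_w * source_h < panel_w * panel_h else "downscale"
    steps = ["deinterlace"] if interlaced else []
    steps.append(f"{direction} to {panel_w} x {panel_h}")
    print(f"{source_w} x {source_h}{'i' if interlaced else 'p'} input: " + ", then ".join(steps))

describe_scaling(1280, 720, interlaced=False)   # 720p: upscale to 1365 x 768
describe_scaling(1920, 1080, interlaced=True)   # 1080i: deinterlace, then downscale to 1365 x 768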

So you can see there is a lot of processing of the original 1080i signal going on, by both the HD sat receiver and the TV itself. Which looks better on your own HDTV, 720 or 1080, depends on how well your own equipment does that processing; there can be no hard and fast rules for this.

Theoretically a downscaled higher-res signal should look better than an upscaled lower-res one, but again it depends which your panel handles best.

I would suggest that most people on here are using HD Ready HDTVs, but anyone looking to buy an HDTV today should really be looking for a True HDTV with a native res of 1920 x 1080 that will accept the full range: 24p for Blu-ray, 50Hz for European broadcast HDTV and 60Hz for most downloaded US material.

I've made this post in good faith in order to clarify a few misconceptions about HD that have come up on these forums lately. Make what you will of the information provided :)

bob
07-03-2008, 03:21 PM
Returned from the US with my cut and paste.
There seems to be some confusion in this thread about supported resolutions versus displayed resolutions, and the compression/resolution trade-off.
All HD Ready TVs accept both 720p and 1080i, but there is a big difference between the resolutions a TV accepts and the resolution it displays.

There are no TVs available that can display both 720p and 1080i natively. Flat panels (with the exception of ALiS panels) are inherently progressive devices. They have a fixed resolution, and all incoming signals must be scaled to the panel's native resolution, such as 720p or 768p, and deinterlaced if necessary.

One subject that rarely seems to be touched upon in discussions of HDTV is the importance of having a good quality deinterlacer. While some displays do a very good job of deinterlacing standard definition signals, they are almost uniformly bad at deinterlacing 1080i. Instead of deinterlacing to 1080p they deinterlace to 540p and then scale to the panel's native resolution. As a result you won't get to see the full resolution of a 1080i broadcast even if you have a 1080p display. The only current solution is to use an external scaler with high-definition capabilities, but they are not cheap.


http://www.bbc.co.uk/commissioning/production/docs/discovery_campus.pdf

Gone_Fishing
07-03-2008, 04:05 PM

I would say that's a pretty old cut and paste :)

There are quite a few 1920 x 1080 panels available now that do 1080p at 50Hz and 60Hz, but very few that do 24Hz.

Most people won't notice the judder produced by 50Hz and 60Hz, so they won't be bothered anyway.
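(For anyone wondering where that judder comes from: 24 frames per second doesn't divide evenly into a 50Hz or 60Hz refresh, so frames get shown for unequal lengths of time. A purely illustrative calculation:)

# Illustrative: why 24 fps film judders at 50Hz/60Hz but not at 24Hz.
for refresh_hz in (60, 50, 24):
    refreshes_per_frame = refresh_hz / 24
    cadence = "even (no pulldown judder)" if refreshes_per_frame.is_integer() else "uneven cadence (judder)"
    print(f"{refresh_hz} Hz: {refreshes_per_frame:.2f} refreshes per film frame -> {cadence}")
# 60 Hz: 2.50 -> frames shown for 3 then 2 refreshes (the classic 3:2 pulldown)
# 50 Hz: 2.08 -> needs a speed-up to 25 fps or an uneven cadence
# 24 Hz: 1.00 -> each frame shown exactly once, no judder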

Anyone looking for a 1080p panel that does it at 24Hz over HDMI should look at the following ;)

Pioneer PDP-LX508D
Pioneer PDP-LX608D

http://www.picturepile.net/files/i3knzjnyzkxkmdtryjrt.png


TNT

dubious
07-03-2008, 04:19 PM
Instead of deinterlacing to 1080p they deinterlace to 540p and then scale to the panel's native resolution. As a result you won't get to see the full resolution of a 1080i broadcast even if you have a 1080p display. The only current solution is to use an external scaler with high-definition capabilities, but they are not cheap.

Sorry, but that's just plain wrong.


To get all 1080 interlaced lines to appear on the screen at the same time on a progressive high-definition display, the processor within the HD set has to weave together both 540-line segments to form the full-resolution frame. It does so by holding the first field in its memory, receiving the next field, then electronically knitting the two fields together.

Source:_http://en.wikipedia.org/wiki/720p#720p_versus_1080i
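(To make that field-weaving concrete, a toy sketch follows; it's purely illustrative, nothing to do with any particular TV's processing, and the fields are just lists of rows.)

# Toy illustration of "weave" deinterlacing: hold one 540-line field, receive the
# other, then knit them together into a single 1080-line frame. Real deinterlacers
# also have to cope with motion between the two fields.
def weave(top_field, bottom_field):
    """top_field holds rows 0, 2, 4, ...; bottom_field holds rows 1, 3, 5, ..."""
    frame = []
    for top_row, bottom_row in zip(top_field, bottom_field):
        frame.append(top_row)
        frame.append(bottom_row)
    return frame

top = [f"row {2 * i}" for i in range(540)]         # 540 lines
bottom = [f"row {2 * i + 1}" for i in range(540)]  # 540 lines
print(len(weave(top, bottom)))  # 1080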