Why jaggies of interlaced video on non-interlaced monitors?


  • Why jaggies of interlaced video on non-interlaced monitors?

    Or actually, more specifically: why are jaggies visible when playing interlaced video on progressive displays, but not on interlaced displays?

    When an NTSC video is played on a TV, one field is displayed every 1/60th of a second, and jaggies are not really that noticeable. Is that because one field has faded out by the time the next one is displayed?

    When playing interlaced video on a computer monitor, is one field displayed every 1/60th of a second? If so, I would assume the previous field would be "written over" by the current one and jaggies should not be present.

    But if both fields are automatically combined into one frame and played back at one frame every 1/30th of a second, then jaggies would be especially obvious. I have a feeling this is what's going on, but I just wanted to be sure.
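
    A minimal sketch of that "combined into one frame" case (Python with NumPy; the field sizes and the moving edge are made up for illustration): weaving two fields that were captured 1/60th of a second apart puts a moving edge in different positions on alternate lines, which is exactly the comb/jaggy pattern.

    import numpy as np

    def weave(field_top, field_bottom):
        """Interleave two 540-line fields into one 1080-line frame."""
        h, w = field_top.shape
        frame = np.empty((2 * h, w), dtype=field_top.dtype)
        frame[0::2, :] = field_top      # lines 0, 2, 4, ... from the earlier field
        frame[1::2, :] = field_bottom   # lines 1, 3, 5, ... from the later field
        return frame

    # Hypothetical example: a vertical edge that moves 8 pixels between fields.
    field_a = np.zeros((540, 1440), dtype=np.uint8)
    field_b = np.zeros((540, 1440), dtype=np.uint8)
    field_a[:, 700:] = 255   # edge at x = 700 in the first field
    field_b[:, 708:] = 255   # edge at x = 708, 1/60th of a second later
    woven = weave(field_a, field_b)
    # Alternate lines now disagree around x = 700..708 -- shown at the same
    # instant on a progressive display, that disagreement is the jaggies.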

    Also, do any HD monitors play back interlaced video as interlaced video? That is, display one field every 1/60th of a second?
    Plasma
    LCD
    Front Projection
    Rear Projection

    - Mark

    Core 2 Duo E6400 o/c 3.2GHz - Asus P5B Deluxe - 2048MB Corsair Twinx 6400C4 - ATI AIW X1900 - Seagate 7200.10 SATA 320GB primary - Western Digital SE16 SATA 320GB secondary - Samsung SATA Lightscribe DVD/CDRW- Midiland 4100 Speakers - Presonus Firepod - Dell FP2001 20" LCD - Windows XP Home

  • #2
    That's why people use DScaler when playing back interlaced material on progressive-scan devices like monitors.

    Some media player software has options for deinterlacing, or you could get a DVD player which does deinterlacing, etc.

    Go to any HTPC forum and you will see a lot of info on the subject.

    Deinterlacing is almost a necessity when viewing normal content on any HD-capable device (projectors, etc.).



    • #3
      Thanks but I know all about interlacing and various methods to deinterlace. I am interested in answering the questions in my post. Believe me, I've been around and haven't seen these questions addressed.

      I'm sure you know that I've been posting here for about 5 years and have contributed reviews to the site. I wouldn't ask a question that is easily answered by having a look at a "deinterlacing" tutorial. I was looking to initiate discussion of the technical aspects of this subject, as I think it will be increasingly important as we begin to edit HDV, especially 1080i (the Sony format). I've started many similar threads over the years and they usually turn into a great source of information sharing.

      Thanks for taking the time to reply!

      - Mark




      • #4
        Re: Why jaggies of interlaced video on non-interlaced monitors?

        Originally posted by Hulk
        But if both fields are automatically combined into one frame and playback is 1/30th of a second then jaggies would be especially obvious. I have a feeling this is what is going on but I just wanted to be sure.
        - Mark
        That's correct. That's what adaptive de-interlacing (not motion compensation, whoops) is for: it attempts to stitch the fields back together in an intelligent way, as opposed to simple deinterlacing.
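
        To make "stitch the fields back together in an intelligent way" concrete, here is a rough sketch of the motion-adaptive idea (Python with NumPy; the function name, the grayscale fields and the threshold are all made up): keep the weave where the picture is static, and fall back to interpolating from one field where it moved.

        import numpy as np

        def adaptive_deinterlace(prev_top, cur_top, cur_bottom, threshold=12):
            """Build a 2H x W frame from the current field pair.

            prev_top and cur_top are successive top fields, compared only to
            detect motion; edge rows are handled sloppily in this sketch.
            """
            h, w = cur_top.shape
            frame = np.empty((2 * h, w), dtype=np.float32)
            frame[0::2, :] = cur_top
            frame[1::2, :] = cur_bottom          # start with a plain weave

            moved = np.abs(cur_top.astype(np.float32)
                           - prev_top.astype(np.float32)) > threshold
            # Where motion was detected, replace the bottom-field lines with the
            # average of the top-field lines above and below ("bob" fallback).
            interp = 0.5 * (cur_top.astype(np.float32)
                            + np.roll(cur_top, -1, axis=0).astype(np.float32))
            frame[1::2, :] = np.where(moved, interp, cur_bottom)
            return frame.astype(np.uint8)
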
        Last edited by Jon P. Inghram; 4 February 2005, 23:07.



        • #5
          Do you mean like frame tearing or "jitter"?
          (bad sync?)

          Edit to add: Hulk, I should have known better; a good percentage of what I know about deinterlacing has come from those same threads you have been posting in.
          Last edited by Marshmallowman; 4 February 2005, 00:53.



          • #6
            Your graphics card is the culprit. It refreshes your monitor at, say, 65-100 times/s. It is obvious that trying to display something interlaced at 29.97 or 25 frames/s would send it bananas, strobing all over the place, so it converts your video frame by frame to deinterlaced and repeats each frame n times until the next one comes due for display. Doing this interlaced would be horrendously difficult, although not impossible I would guess, but it would require a special card. I understand that the Ulead DVD Player (which works with most usual formats if you have the codec) uses software to send a compromise frame to the graphics card, giving a better appearance on a 'puter monitor. Certainly, it looks better than WMP.
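
            A tiny illustration of that frame repetition (plain Python; the refresh rates are just example numbers): each video frame is shown for however many monitor refreshes fall inside its display window, and when the refresh rate is not a clean multiple of the frame rate the repeat counts come out uneven.

            def repeat_pattern(refresh_hz, video_fps, frames=8):
                """How many refreshes each of the first few frames is displayed for."""
                counts = []
                for n in range(frames):
                    start = int(n * refresh_hz / video_fps)        # first refresh showing frame n
                    end = int((n + 1) * refresh_hz / video_fps)    # first refresh showing frame n+1
                    counts.append(end - start)
                return counts

            print(repeat_pattern(75, 25))      # [3, 3, 3, ...]  75 Hz is a clean multiple of 25
            print(repeat_pattern(60, 29.97))   # [2, 2, 2, ...]  an extra repeat only every ~500 frames
            print(repeat_pattern(60, 25))      # mix of 2s and 3s -> uneven cadence (judder)
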
            Brian (the devil incarnate)



            • #7
              Marshmallowman,

              No problem. I know you meant no ill will. Sometimes a post just rubs one the wrong way. I didn't mean to get defensive. It's been kind of dead in here lately and I thought we could have a little discussion.

              The bottom line is that I'm really curious as to why Sony went with an interlaced format for the FX1/Z1U camcorders. They are both anamorphic HDV 1080i, which is 1440x1080 with a PAR (pixel aspect ratio) of 1.333.
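
              Just to spell out the arithmetic behind those numbers (nothing here beyond the values quoted above): the 1440 stored pixels per line, stretched by the 1.333 (4:3) pixel aspect ratio, give the full 1920-wide 16:9 picture.

              stored_width, height = 1440, 1080
              display_width = stored_width * 4 // 3     # PAR of 4:3, the 1.333 quoted above
              print(display_width)                      # 1920
              print(display_width / height)             # 1.777... i.e. a 16:9 display aspect ratio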

              This raises the question: what, if any, HD display devices are natively interlaced? None that I know of.

              So, this means that the video signal WILL HAVE to be deinterlaced. I have worked with Z1U video and it is really good. Deinterlacing leaves a very good signal, perhaps with a perceived horizontal resolution of around 1280 pixels and the full 1080 lines vertically, of course. So the deinterlaced resolution is at least as good as 1280x720, i.e. HDV 720p. But still, why cripple the output in this way?

              I'm under the impression that it's a lot cheaper to produce 3 chips that have 960x1080 resolution than 1440x1080 and that's where the format originated.

              I have handled the camera and it is VERY nice, as is the output. But it just seems like there's always one big flaw in every camera or format. Why would a consortium spec HDV 1080 as interlaced?

              In many ways I'd rather have the lower resolution 720p format and have the following advantages:

              full progressive, no interlacing to worry about
              easier on the NLE editing-wise

              Unfortunately the Sony cams are so superior to the JVC offerings that there really is no competition for Sony in the HDV arena right now. Perhaps Canon has something up their sleeve...

              BTW, even with the Cineform conform, editing this HDV 1080i stream is a bear on my P4 3.06 system. And the only good previewing method is to use dual displays and "sit" the preview in the second display. Of course the second display must be calibrated, or a color correction can be placed on the final video output (global) stream. You also have to remember to remove it before the final render!

              I also forgot to mention that there are color space concerns: HDV cams are supposed to use the HD 709 spec; the Sony cams do, but the JVC cams use the SD 601 spec, causing more problems if your NLE doesn't recognize this properly.
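
              A tiny sketch of why that 601-vs-709 mismatch shows up on screen (plain Python; the pixel value is arbitrary): the two specs weight R, G and B differently when forming luma, so material decoded with the wrong matrix comes out with shifted colors.

              def luma_601(r, g, b):
                  return 0.299 * r + 0.587 * g + 0.114 * b      # SD / Rec. 601 weights

              def luma_709(r, g, b):
                  return 0.2126 * r + 0.7152 * g + 0.0722 * b   # HD / Rec. 709 weights

              # Same pure-green pixel, two different luma values -- decode with the
              # wrong matrix and the error ends up in the picture's color balance.
              print(luma_601(0.0, 1.0, 0.0))   # 0.587
              print(luma_709(0.0, 1.0, 0.0))   # 0.7152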

              Ah the growing pains of a new format!

              - Mark




              • #8
                I agree with Brian, especially if the refresh rate isn't a multiple of the video's frame rate.

