SandyKG opened this issue on Jun 21, 2010 · 10 posts
replicand posted Tue, 22 June 2010 at 5:00 PM
Fields. This discussion leans towards theory. Some of the information may be inaccurate, and I encourage those with more experience to chime in.
When television was first broadcast over the airwaves (1935?), bandwidth was pretty limited. For the sake of argument, NTSC video plays at 30 frames per second. Rather than overload the bandwidth, interlacing was introduced: interlaced fields reduce bandwidth by transmitting half of the picture twice as often. I know, that doesn't sound like it makes sense.
A standard-definition CRT picture has an effective resolution of roughly 720 x 480 pixels (NTSC). All the "odd-numbered" lines of pixels are transmitted, followed by all the "even-numbered" lines. For NTSC video, this happens at 60 fields per second; for PAL, at 50 fields per second. This can be illustrated by pausing a videotape. There will be (turbulent) noise because the two fields keep alternating, but you will also see that half the picture is missing. That's interlacing.
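To make the odd/even split concrete, here is a minimal sketch in Python, assuming numpy and using an array as a stand-in for a frame (the frame size and variable names are just for illustration):

import numpy as np

# A hypothetical 720 x 480 frame: each row is one scan line.
frame = np.arange(480 * 720).reshape(480, 720)

# One field carries the odd-numbered scan lines (lines 1, 3, 5, ...,
# i.e. rows 0, 2, 4 counting from zero), the other the even-numbered lines.
odd_field = frame[0::2]
even_field = frame[1::2]

print(odd_field.shape)   # (240, 720) -- half the lines, so half the data per field
print(even_field.shape)  # (240, 720)

Each field is half the height of the full frame, which is why sending fields twice as often costs no more bandwidth than sending whole frames.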
With the advent of HD content, progressive scan has become more common. Instead of two alternating fields, the frame is drawn in one pass from the screen's top left corner to the bottom right corner (and repeat). You can see that scan order in action watching ESPN when signal fidelity decreases: the picture breaks up into blocks that fill in from top to bottom. Those blocks trace the progressive scan.
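Going the other way, "weaving" two fields back together rebuilds one progressive frame. A minimal sketch under the same assumptions as above (note the weave is only clean when nothing moved between the two fields; with motion you get comb artifacts, which is why deinterlacing filters exist):

import numpy as np

# Rebuild a progressive frame by weaving two half-height fields together.
frame = np.arange(480 * 720).reshape(480, 720)    # hypothetical source frame
odd_field, even_field = frame[0::2], frame[1::2]  # the two interlaced fields

progressive = np.empty_like(frame)
progressive[0::2] = odd_field    # odd-numbered scan lines
progressive[1::2] = even_field   # even-numbered scan lines
assert (progressive == frame).all()  # clean only because nothing moved between fields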
Fields are necessary for broadcast TV and videotape, but (possibly) not for DVD.
[edit] After Effects does sound, transitions and everything else Premiere does. The key difference between them is how each handles the types of input files it accepts. Premiere leans towards video editing, though you can do rasterized motion graphics in it. After Effects leans towards motion graphics and can handle most standard-definition video content. I personally do not use Premiere for animated 3D renders. [/edit]