LCD TV
various parameters

I've been boning up on LCD TVs for the past few weeks in an effort to find an affordable yet acceptable solution to my design problem [mounting an LCD monitor on an arm so I can watch it in bed from about two feet away]. I've got a cheap ... really cheap ($75 and I'm waiting on a $30 rebate to make it $45) ... Viewsonic VB50HRTV TV tuner, scaler and de-interlacer to go with a Sceptre 17" 16ms monitor ($288 at Sam's - Chinese Outlet), and I got a Cyberhome progressive scan DVD player at Chinese Outlet for only $38. I won't be using it for computer display ... only for DVD and TV watching. I don't know if the outcome will be acceptable. Some comments about the VB50 are good and others bad. It may depend on the monitor used. Anyway, if it works, I'll have a 17" LCD TV on an extendable arm for less than $400. (Yes, I'd love to have a 24" LCD TV like I see in Best Buy and Fry's, etc. ... but I just can't justify spending another 1000 to 1500 bucks.)

While hunting up some kind of a system, I became aware of an entire breed of gizmos I hadn't known about. These are the "real" TV scalers/de-interlacers I would like to get but can't really afford ... like the Zinwell, AVT or DVDO line (~$400 to thousands). These thingies take your digital/analog signal and turn it into progressive scan (de-interlace) so it will look better and scale it up or down to be viewed on your TV or monitor with the optimum number of lines or pixels, i.e. resolution.

Here are a few links for explanations of such things:
Guide to Intelligent Video Scaling
Line Doublers and De-Interlacers
Audio Video 101

Interlacing (sending a video frame in one thirtieth of a second but dividing it up into odd (1,3,5,7 ...) and even (2,4,6,8 ...) scan line groups sent alternately every 1/60th of a second) had its origins in early TV development and is now becoming legacy stuff to be done away with ... except for 1080i which is there for the same reasons and will become "legacy" at some future date.
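
To make the odd/even splitting concrete, here's a toy sketch in Python (using numpy, with a one-column "frame" of numbered lines standing in for real video) showing how one frame gets carved into the two alternating fields. The array names are just for illustration.

```python
# Toy illustration of interlacing: one 480-line frame split into two fields.
# The "frame" here is just the line numbers 1..480, not real video data.
import numpy as np

frame = np.arange(1, 481).reshape(480, 1)   # stand-in frame: lines 1..480
odd_field  = frame[0::2]   # lines 1,3,5,... sent in the first 1/60th of a second
even_field = frame[1::2]   # lines 2,4,6,... sent in the next 1/60th of a second
print(odd_field[:3].ravel(), even_field[:3].ravel())   # [1 3 5] [2 4 6]
```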

The original interlacing was done because the phosphors on the CRT faded too quickly when progressive scanning (1,2,3,4,5 ...) was used, which produced "flickering", i.e. before the entire frame was rendered the first part had already started to disappear. So, they divided the frame in two parts (odd and even) and the mind assembled the two pieces as one with less perceived flicker. This is why greater refresh rates are desirable in CRT monitors ... to reduce this flicker.

The problem with interlacing

When you make a cartoon motion picture, you draw one frame at a time and flip through each frame quickly to simulate motion. But if you take pictures of the real world in real time you get a problem ... what do you do about objects which move during the exposure?

[Image: lcdtvpa1.gif]
Interlacing is just an extension of the same problem you get with any motion picture taking. You take the first scan of odd lines ... then ... go back for a second scan of the even lines ... but ... the subject has moved some distance in the intervening 1/60th of a second. The brain assembles the interlaced image and perceives a "comb" effect where the odd lines are the teeth of the comb and the even lines are the gaps left where the subject has moved. How to even out the appearance of the complete picture ???

Here's what the de-interlacer does. It processes the lines to even out the movement mess and sends them to the monitor as progressive lines 1,2,3,4,5,6,7 ... Supposedly, this makes for an overall better looking picture. Well, that's not all there is to it. After you've reassembled the two parts, de-interlacers send that picture twice, i.e. the signal 1,2,3,4,5 ... goes out twice in 1/30th of a second ... the same frame is repeated. This is what a line doubler does. This also helps to make the picture clearer. Then there's the line quadrupler, which makes four of the same lines in the same time so you get twice as many lines of resolution, i.e. 960 lines instead of 480 for instance. It's a cheesy way to make a bigger picture ... kinda like MS Paint stretching the bitmap larger by just adding some more copied lines to fill in the required space.
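
If it helps to see it spelled out, here's a minimal Python/numpy sketch of the two ideas in that paragraph: "weave" de-interlacing (lacing the odd and even fields back into one progressive frame) and crude line repetition of the sort a line doubler/quadrupler uses to bump 480 lines up to 960. The shapes and names are made up for illustration, not taken from any real scaler.

```python
import numpy as np

def weave(odd_field, even_field):
    """Lace the odd and even fields back together into one progressive frame."""
    rows, cols = odd_field.shape
    frame = np.empty((rows * 2, cols), dtype=odd_field.dtype)
    frame[0::2] = odd_field    # lines 1,3,5,...
    frame[1::2] = even_field   # lines 2,4,6,...
    return frame

def repeat_lines(frame, times=2):
    """Crude line repetition: copy each line 'times' times (480 -> 960 lines)."""
    return np.repeat(frame, times, axis=0)

odd  = np.zeros((240, 640), dtype=np.uint8)   # stand-in 240-line field
even = np.zeros((240, 640), dtype=np.uint8)   # stand-in 240-line field
print(repeat_lines(weave(odd, even)).shape)   # (960, 640)
```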

When you take a picture with a film (or digital) camera, you get all the data at once in each frame ... not line by line. So, it doesn't matter how the data is transmitted. Either progressive or interlaced will display the same thing without the comb effect. Today ... LIVE TV! ... means ... processed to some extent. If for nothing else, for foul language. Thus, all live TV is transmitted a second or two late. In pro-football games, they process the signal to put the yellow line in for the first down marker. So, there is no reason on earth to send a signal other than progressive.

Why then send an interlaced signal?

The only answer I can figure is that interlacing is a legacy problem. The early TVs were built this way, so broadcasters keep doing it so that the entire fleet of interlaced TVs doesn't have to be changed out for progressive sets all at once. They make interlaced output so that our TVs can display it, i.e. if they sent it as progressive, we'd get no picture.

Why make 1080i instead of 1080p? I assume that it's just to feed CRT monitors ... but CRTs are going the way of the dodo. Or, maybe it's the plasma screens? They may have the same flicker problem as CRTs. But plasma is going the way of the dodo too. It's too heavy, too expensive and suffers burn-in. The future is LCD and more probably OLED (organic light emitting diode).

You don't get burn-in or flicker from an LCD. In fact, the main problem with LCD is slow response time, i.e. the pixel doesn't respond fast enough. Unlike the CRT, which fades away too fast and produces flicker, the LCD fades too slowly, producing artifacts during fast motion. At a 25 millisecond response time you can show 40 distinct images per second. That should be enough to cover the roughly 30 frames per second of NTSC video (film itself runs at 24). However, the DVD player is sending out 60 fields per second (interlaced) for an acceptable (flicker-free) refresh rate of 60 Hertz on a CRT. So, if you have an LCD you need a response time of 16 milliseconds or better to keep up with the Joneses (1000/16 = 62.5 distinct images per second). At 40 per second, your display may be lagging behind when something moves too fast. (For NTSC you want 16 milliseconds, but PAL and SECAM get by closer to 25 milliseconds because they run at 25 frames, i.e. 50 fields, per second, which is part of why so many LCDs are pegged at 25 milliseconds) ...
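
Just to spell out the arithmetic in that paragraph, here's the response-time math as a few lines of Python. It treats 1000 divided by the response time as the number of distinct images a panel can show per second, which is the same idealized figure the paragraph uses (real panels vary with which gray-to-gray transition is measured).

```python
# 1000 ms per second / response time in ms = distinct images per second (idealized).
def max_updates_per_second(response_ms):
    return 1000.0 / response_ms

for ms in (25, 16, 12, 8):
    print(f"{ms} ms response -> {max_updates_per_second(ms):.1f} images/sec")
# 25 ms -> 40.0, 16 ms -> 62.5, 12 ms -> 83.3, 8 ms -> 125.0
```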

"Ah so, Jahpahneez TV for aw Amehwican dog!"

Horizontal and Vertical Resolution

When a signal is sent to your TV (regardless of the type), you get the picture one pixel or picture element at a time. You can't receive the entire picture at once as in a theater, or even an entire line at once. If you wanted a vertical line displayed at once instead of its being painted, you'd need, say, 480 electron guns firing simultaneously instead of one. Want to buy 479 more components (to say nothing of the complexity involved)? Similarly, in a computer monitor, to receive more than one pixel at a time, you'd need correspondingly more wires. To present the entire picture at once, with two dedicated wires (electron paths) for each of the three sub-pixels in every pixel, an SXGA monitor (1280x1024) would need about 1.3 million x 6 = ~8 million separate wires instead of (1280+1024) x 6 = 13,824 ... plus a two-order-of-magnitude increase in computing power or thereabouts. Want to pay for it? So, you get one pixel at a time.
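
For the curious, here's that wiring estimate as code. It's strictly back-of-the-envelope, following the paragraph's own assumption of two paths for each of the three sub-pixels in a pixel, not a real monitor design.

```python
pixels = 1280 * 1024                    # SXGA pixel count
wires_per_pixel = 2 * 3                 # two paths for each of R, G, B
all_at_once   = pixels * wires_per_pixel           # ~7.9 million wires
one_at_a_time = (1280 + 1024) * wires_per_pixel    # 13,824 wires (rows + columns)
print(f"{all_at_once:,} wires vs {one_at_a_time:,} wires")
```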

You know the vertical and horizontal resolution of a computer monitor but what does this mean in the CRT realm?

Well, imagine yourself with a machine gun hosing down the enemy who are charging your fixed position en masse. You'd want to put one bullet in each of them ... so ... you hose at a rate that will just spread the bullets to each enemy (or in our case ... a pixel). Clearly if you increase the bullet rate of the electron gun, you can hose the pixels at a much faster rate ... or ... hose more per pass. Let's say you want to fire 480 passes at the enemy, each of which contains 640 bullets. You'd need an ammo supply of 480 passes x 640 bullets per pass x 60 of these "frames" per second ... equals ... 18,432,000 bullets per second. That is, your machine gun must fire at the rate of around 9 megahertz or fall behind (two bullets for each cycle).
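
The same "bullet" arithmetic in code form, using the paragraph's own assumption of two pixels handled per clock cycle:

```python
width, height, refresh_hz = 640, 480, 60
pixels_per_second = width * height * refresh_hz   # 18,432,000 "bullets" per second
clock_hz = pixels_per_second / 2                  # two bullets per cycle
print(f"{pixels_per_second:,} pixels/sec needs a ~{clock_hz / 1e6:.1f} MHz gun")
```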

You can see why when computing power moved into the gigahertz ranges, many things became possible which were not possible when computers were running at 33 megahertz (when I got my first). There are plenty of cycles left over to do video processing ... even 3-D processing for gamers. We now have the electronics capacity to drive a high definition LCD monitor to deliver acceptable moving images with sound, color and many other taken-for-granted cool things.

Viewing Angles

[Image: lcdtvpa2.gif]
I've read a good deal about the fact that you can't see an LCD TV if you are too far off to the side ... usually it's about 70° off center when the picture pretty much disappears. This is very sad but true. Surely it's a terrible hangup as I'm really looking forward to seeing TV from that angle. In fact, I'm just gonna' give up altogether and kill myself.

Wait a minute! I just went to Best Buy this weekend and the LCD TVs are all (every one of them) comparable to plasma! The picture doesn't disappear ... unless ... you are buying a computer monitor. In that case, nothing is done to improve the viewing angle because nobody sits off to the side while using their computer; the better panels are sold as upgrades (for more cash) so multiple viewers can watch at wide angles. The above-mentioned issue no longer exists for flat panel LCD screens made for TV. Just observe the LCD and plasma screens down the same aisle from the end. It's obviously not an issue anymore.

Pixels in quantity

I've looked at hundreds of LCD outputs in stores in the past few months and no 640x480 set was ever acceptable to me. They are just too grainy. LCDs look good at 1280x720 and higher. Why? The signal sent doesn't have that sort of resolution. DVD has just the 480 horizontal lines of resolution. Why should the greater pixel count make for a finer detailed picture? Shouldn't it just be bigger and fuzzier? ... with no greater detail?

Here's where the scaler comes in.

[Image: lcdtvpa3.gif]
When you take a signal with 480 lines in it and want to make it twice that (960), you could do what MS Paint does (at left) and just add an extra line just like the one next to it. If you want the pic to be another third as large, you can add an extra copy of every third line in the original. This is the cheesy, cheap MS way to do the job. If you make the pic smaller, you lose lines. So if there is text, a lost line can mean the letter "E" ends up as three little horizontal bars and no vertical bar connecting them, i.e. your text gets destroyed. But this requires almost no processing.

Note: MS Paint has lost functionality over the years,
i.e. nobody at Microsoft works on it ... ever.
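
Here's what that cheesy scaling looks like as code: a nearest-neighbor resize in Python/numpy that just copies (or drops) whole rows and columns with no processing at all. It's only a sketch of the idea, obviously not MS Paint's actual code.

```python
import numpy as np

def nearest_neighbor(img, factor):
    """Resize by duplicating (factor > 1) or dropping (factor < 1) rows/columns."""
    h, w = img.shape
    new_h, new_w = int(h * factor), int(w * factor)
    rows = (np.arange(new_h) / factor).astype(int)   # which source row to copy
    cols = (np.arange(new_w) / factor).astype(int)   # which source column to copy
    return img[rows][:, cols]

img = np.arange(16, dtype=np.uint8).reshape(4, 4)
print(nearest_neighbor(img, 2.0).shape)   # (8, 8): every line copied twice
print(nearest_neighbor(img, 0.5).shape)   # (2, 2): half the lines thrown away
```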

The other option is to process the picture's pixels to make an intelligent decision about what to put in the extra space when you expand the picture. There must be rules (algorithms) for deciding what to do about each pixel. These rules require more intense processing and function at the capacity of today's video processors which can handle a lot more than just "add one line same as the one before ... skip one line ... add another line same as the one before, and so on".
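
The simplest of those smarter rules is plain linear interpolation, where each new pixel is a weighted average of its nearest original neighbors. Here's a sketch (assuming an 8-bit grayscale image in a numpy array); real scaler chips use fancier kernels such as bicubic, Lanczos or edge-adaptive filters.

```python
import numpy as np

def bilinear(img, factor):
    """Scale a grayscale image by 'factor' using bilinear interpolation."""
    h, w = img.shape
    new_h, new_w = int(h * factor), int(w * factor)
    out = np.empty((new_h, new_w), dtype=float)
    for y in range(new_h):
        for x in range(new_w):
            sy, sx = y / factor, x / factor          # map back into the source image
            y0, x0 = int(sy), int(sx)
            y1, x1 = min(y0 + 1, h - 1), min(x0 + 1, w - 1)
            fy, fx = sy - y0, sx - x0
            top = img[y0, x0] * (1 - fx) + img[y0, x1] * fx
            bot = img[y1, x0] * (1 - fx) + img[y1, x1] * fx
            out[y, x] = top * (1 - fy) + bot * fy    # blend the two rows
    return out

img = np.array([[0, 255], [255, 0]], dtype=np.uint8)
print(bilinear(img, 2.0).astype(int))   # the in-between pixels get in-between values
```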

What a video processor does to expand the picture into a larger number of pixels is sort of reverse JPEGing. Instead of removing pixels from a raster image (bitmap or tiff) and replacing them with mathematically estimated pixels, it adds pixels estimated mathematically to be what would have been there if the picture had been taken at a greater resolution.

Loosely speaking (real JPEG actually compresses 8x8 blocks with a discrete cosine transform rather than literally throwing pixels away, but the effect is similar), a JPEG takes the huge image from a 3 megapixel camera and keeps roughly every, say, tenth pixel's worth of information horizontally and vertically. The resulting image has about 1/100 the file size of the original at the loss of 99% of the information in the original picture. But then, when you display the JPEG image, the JPEG display program fills in the missing 99% with what it estimates to have been there originally. The resulting image looks somewhat fuzzy and if you had detail in the two pixel range it's likely to have been lost altogether. But ... you can still appreciate the image for what is there, i.e. you can still appreciate the "faked" resolution from the JPEG program.

I don't know the exact specs for JPEG nor for its motion counterpart MPEG-2, which is currently used in DVDs and satellite broadcasting. Without MPEG, movies would not fit on a single DVD (maybe not even on ten of them) and DirecTV couldn't exist. As processing power increases, so too does the ability of MPEG to encode data. File sizes may shrink while the ability to send data increases.

MPEG differs from JPEG in that even greater compression can be obtained by sending a frame of a movie only once if it doesn't change over time. Thus, if the movie starts out with a shot of a still desert for 10 seconds, you don't send 300 pictures all the same to display it. You just send the one. Now, if a tumbleweed blows across the picture, you don't need to send all 300 frames either ... just the part of the picture that changed. So, the frame may be divided into multiple zones. If a zone doesn't change, it doesn't need to be sent again. Similarly, if the camera pans across the scene, you just show the new piece added on one side and subtract what's no longer visible on the other side.
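
As a toy illustration of that "only send what changed" idea, here's a Python/numpy sketch that splits each frame into 16x16 zones and reports the zones that differ from the previous frame. Real MPEG-2 works with macroblocks, motion vectors and DCT coding; this is just the skeleton of the concept.

```python
import numpy as np

def changed_zones(prev, curr, size=16):
    """Yield (row, col, block) for every zone that differs from the last frame."""
    for r in range(0, curr.shape[0], size):
        for c in range(0, curr.shape[1], size):
            block = curr[r:r + size, c:c + size]
            if not np.array_equal(block, prev[r:r + size, c:c + size]):
                yield r, c, block

prev = np.zeros((480, 640), dtype=np.uint8)   # the still desert, frame N
curr = prev.copy()
curr[96:112, 192:208] = 255                   # a "tumbleweed" enters one zone
print(len(list(changed_zones(prev, curr))))   # 1 zone to send, not the whole frame
```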

Now, as for that scaler. You just take the information that's sent and fill in the expansion pixels according to some JPEG-style formula. Since JPEG can be scaled, you could conceivably create a reasonable picture with a faked resolution of 9,000 x 16,000 pixels. But you don't need to go nearly that far. The highest resolution in common use right now is about 1920 x 1200 and I don't expect to see much higher. There simply isn't any artistic need for such high resolution. In fact, I read an article in WIRED about an 18 minute film that was made at such high resolution (many terabytes) that those who view it become nauseous because it cannot be distinguished from reality.

Estimating what pixels to put into the expanded lines of resolution involves mostly getting the edges right. If two adjacent pixels are identical in color, it is easy to see what must be put in an expanded space between them ... the same color obviously ... because being identical is indicative of their being on the same surface of the same object. At least that would be the way to bet.

But, if one pixel is white and the other black, you would guess that it was an edge ... so ... you wouldn't want to do the democratic thing and make the intervening pixel gray, because that would tend to blur the supposed edge. You'd want to pitch the color over closer to the black (or the white, so long as the algorithm consistently picks the same side). That would define the edge rather than blur it. The same is true for radical, pixel-to-neighboring-pixel color changes. Such big differences would indicate a different body or edge or pattern. To preserve such edge detail, pitch the color value to one or the other globally. Only when the colors are less different would you suspect a gradual transition rather than an edge. In that case, you'd go for the average color.
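
That rule fits in a couple of lines of Python. The threshold and the choice to pitch toward the lighter pixel are arbitrary assumptions here, made only to match the example; a real scaler would tune both.

```python
def fill_between(a, b, threshold=64):
    """New pixel between two originals: average if similar, keep the edge if not."""
    if abs(a - b) < threshold:
        return (a + b) // 2   # gradual transition: split the difference
    return max(a, b)          # likely an edge: pitch toward the lighter side

print(fill_between(120, 130))  # 125 -> a smooth gradient stays smooth
print(fill_between(0, 255))    # 255 -> the edge stays crisp instead of going gray
```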

[Image: lcdtvpa4.gif]

Observe the above. Here we have a four pixel image. In MS Paint, you just double the rows and columns and get the identical picture ... just bigger. You don't lose any information in Paint when you expand it, though newer versions no longer work exactly this way and will put different colors into the edges. It looks like a programming artifact that nobody bothered to take out when they went to more "advanced" Windows OSes. Certainly, it couldn't have been done intentionally because the resulting image is much worse than just leaving it "primitive". :o)

In the third image, the row "D" and column "4" are not filled in because what goes there must be "processed" based on what would be in row E and column 5, which aren't shown. So, we have the pixels from the original spread apart by one row and one column. We then have five new pixels to decide upon based on the given information. Since A1 and C1 are very different in brightness, an edge is inferred, generating a color "pitched" to the lighter color for B1 (in this example). The same is true for pixel A2. Pixels B3 and C2 are easy to decide since they lie directly between two pixels of the same color, implying a solid body. But ... what to do about the diagonal, B2? Here, we can introduce an algorithm based on the four corners which surround it. Thus: "if two diagonals are identical (implying an edge), make that pixel the same as those pixels." In this way, a picture can be built up which is superior to the MS Paint style.
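
Coded up, the little example reads like this. Since the figure itself isn't reproduced here, the brightness values are assumptions picked to match the description (A1 light; A3, C1 and C3 dark), and the whole thing is only the toy rule described above, not real JPEG or MPEG scaling.

```python
def expand_pixel(a, b, threshold=64):
    """New pixel between two originals: average if similar, pitch lighter if not."""
    return (a + b) // 2 if abs(a - b) < threshold else max(a, b)

A1, A3 = 230, 20   # top corners of the original 2x2 (light, dark) -- assumed values
C1, C3 = 20, 20    # bottom corners (dark, dark) -- assumed values

B1 = expand_pixel(A1, C1)   # very different -> pitched to the lighter (230)
A2 = expand_pixel(A1, A3)   # very different -> pitched to the lighter (230)
B3 = expand_pixel(A3, C3)   # same color -> solid body (20)
C2 = expand_pixel(C1, C3)   # same color -> solid body (20)
# The diagonal B2: if two opposite corners match, copy that value
B2 = A1 if A1 == C3 else (A3 if A3 == C1 else expand_pixel(A1, C3))
print(B1, A2, B3, C2, B2)   # 230 230 20 20 20
```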

Of course, the real up-scaling rules used in MPEG and JPEG type processing may be much more sophisticated than what I have just shown. And ... the individual display monitor, which has a scaler employing JPEG-type rules built into it (if it can show multiple resolutions), may have its own way of doing the picture. At any rate, because of the sophistication of the algorithms and the power of the video processors and scalers, the "real resolution" may not be recognizably different from the resolution thus "faked". Hence, when we speak of resolution it is important to note that there is a gray area concerning what the true definition of "resolution" actually is. Is it the "appearance" of objects as we subjectively perceive them, or does it have to do with the objective information which is transmitted? For the purposes of viewing an artistic display, i.e. a movie ... clearly, we only have to worry about what the thing looks like subjectively ... not what it "is". If we were doing a scientific analysis of heat flow through a metal plate in false color ... then ... the true information is more important and the artistic merit of scaling is unimportant. See?

Now, when you go to Best Buy to get your LCD TV, you know what to look for ... the TV that "looks the best" ... generally this will be one with lots of pixels and a low response time (16ms or less). But you didn't need me to tell you that, did you?

