resolution question

  • 7 Responses
  • johnnnnyh

    Hi,
    I have to explain some basic concepts about resolution, pixels, etc. One thing I'm not 100% certain about is screen resolution and the number of pixels actually within a monitor.
    So for example, I have my LCD screen here with a display setting of 1024x768. If I set the resolution to 640x480, what is happening to the pixels that are now not being "used"?
    Monitors have a fixed number of pixels, presumably, so if one lowers the display resolution, what's the best way of understanding what those pixels are doing? Are they doubling up, or not displaying at all?
    The more I try to understand this, the more confused I get about the actual pixels on my monitor versus the resolution set by the graphics card.
    Any help would be appreciated!!

  • acescence

    Unlike CRT monitors, an LCD has a single native resolution: one pixel = one dot on the screen. If you lower the res, you're "stretching" each pixel across more than one dot. For instance, if you drop to 640 wide, 1024/640 = 1.6, so each image pixel has to cover 1.6 dots on the screen.
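    A minimal sketch of that dot counting, assuming the scaler does plain nearest-neighbour mapping (real hardware usually interpolates, but the arithmetic is the same):

    ```python
    # Map physical dots on a 1024-wide panel back to pixels of a 640-wide signal.
    NATIVE_W = 1024   # physical dots across the panel
    SIGNAL_W = 640    # pixels in the incoming frame

    scale = NATIVE_W / SIGNAL_W   # 1.6 dots per image pixel

    def source_pixel(dot_x: int) -> int:
        """Which image pixel lights up the physical dot at column dot_x?"""
        return int(dot_x / scale)

    for x in range(8):
        print(f"dot {x} shows image pixel {source_pixel(x)}")
    ```

    Run that and dots 0-7 come out as pixels 0, 0, 1, 1, 2, 3, 3, 4: some image pixels land on two dots, some on only one, which is why thin lines look uneven at non-native resolutions.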

  • version3

    johnnnnyh has never resized an image

  • monospaced

    If you send a lower-resolution image (640x480) to an LCD that is natively higher (say, 1280x1024), the software/video card will simply do its best to remap those pixels across the new ones. The result is often a little blurry, presumably in hopes of avoiding jagged edges, and it's handled in hardware/software.

    Every LCD has a "native" resolution, meaning it has a set number of pixels it would like to display (which is also its maximum). Pixels don't double up when they are "zoomed" in on; it's more like what happens when you blow up an image in Photoshop (there's a small sketch after this post). I hope this helps in any way.

    • basically, anti-aliasing is what happens when you try to spread one pixel across 1 1/2, or 2 1/3, or whatever
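    To see that "blow up an image in Photoshop" effect in code, here's a small sketch using the Pillow library (my choice purely for illustration; the panel's scaler does the equivalent in hardware). It stretches a 640x480 frame to 1024x768 two ways:

    ```python
    # Scale a 640x480 frame up to a 1024x768 panel, two ways.
    # Uses Pillow (pip install Pillow) purely for illustration.
    from PIL import Image

    frame = Image.new("RGB", (640, 480))
    # Draw a 1-pixel-wide white vertical line so scaling artifacts show up.
    for y in range(480):
        frame.putpixel((320, y), (255, 255, 255))

    # Nearest neighbour: each pixel lands on 1 or 2 dots -> jagged, uneven.
    jagged = frame.resize((1024, 768), Image.NEAREST)

    # Bilinear: each dot blends neighbouring pixels -> smooth but blurry.
    blurry = frame.resize((1024, 768), Image.BILINEAR)

    jagged.save("jagged.png")
    blurry.save("blurry.png")
    ```

    Open the two files: the line in jagged.png is a hard-edged 1 or 2 dots wide depending on where it falls, while in blurry.png it fades out through grey. That soft edge is the blur you see on an LCD run below its native res.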
  • johnnnnyh

    OK, that's great. You've all explained it the way I was assuming it might work: essentially there is a fixed number of pixels, but the video card remaps to the "new number". I can explain that OK.

    version3 - it may appear a simple question, but having worked with this kind of thing for so many years, I realised I didn't actually know the mechanics of what was happening. When demonstrating to beginners I will no doubt get "simple" beginner questions like this, and I really need to be sure I can give the technically right answer back, not just a "well, it's always worked like that for me" . . .

    . . . Now back to resizing those images . . . where was I?

  • monospaced

    This reminds me of my father, who buys this awesome Mac Pro and an external Apple Cinema Display (24" or thereabouts), and sets the screen resolution to 1280x800 because his eyes are bad. I sit down at his machine and all I can think is that a) this is too much machine for web browsing and tax software, and b) he has no care whatsoever for native resolution. I can't stand the blurry images and distorted icons, but he lives with it. Feh.

  • SteveJobs

    Here's more than you'll ever need to know about it:

    http://www.howstuffworks.com/mon…

  • Mojo

    This is incorrect. When you run a 1024x768 monitor at 640x480, the other 479,232 pixels are set to disabled, and the 640x480 pixels are aroused into level 4 proximity. The screen flickers black when you change, because that is the point of proximity refresh.

    If you set it higher than 1024x768 (and that is native), then the remaining pixels connect to arbitrary disabled pixels and subdivide into subpixels, which is the basis of an interpolated, rebiased pixel plane.