Thread: Brightness compensation for display monitors

  1. #1
    Registered User
    Join Date
    Mar 2011
    Posts
    596

    Brightness compensation for display monitors

    Is there a function to correct the nonlinearity of the brightness response of display monitors?

    The problem is that the perceived brightness of a pixel does not track the brightness value
    of the pixel, i.e., it is not linear.

    My plan right now is to create a parametric function to adjust the transfer curve of color
    intensity values to perceived brightness. This will work on my display, but not necessarily
    on others, especially if those others already have some type of correction.

    Does a general function for this already exist?

    -

  2. #2
    Registered User MutantJohn's Avatar
    Join Date
    Feb 2013
    Posts
    2,665
    Wait, are you trying to say that you're not able to turn up the brightness on your monitor? Or are you stuck in this Hell between too bright and not bright enough?

  3. #3
    Registered User
    Join Date
    Mar 2011
    Posts
    596
    No, I'm not having problems with my display or anything. All is working correctly.

    I am working on antialiasing lines and curves. There is a point along a line of single-pixel width
    where the line is rendered by two pixels of 1/2 intensity each. This occurs where the line passes
    between the two pixels; where the line passes directly through a pixel center, the pixel is at full
    intensity. The problem is that the two half-intensity pixels are not as bright as the single full-intensity
    pixel. So you get a line that periodically varies in brightness along its length.

    The reason for this is the nonlinearity of the pixel value to perceived brightness curve of LCD displays.
    Video cameras (and most light-sensitive and light-producing devices, I suspect) also have a nonlinearity.
    The function for this curve, and its correction, is normally called gamma.
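
    To illustrate, here is a rough sketch of the problem and the correction I have in mind (the 2.2
    gamma is only an assumption; real displays vary):

    Code:
    #include <math.h>
    #include <stdio.h>

    /* Convert a line's coverage of a pixel (0..1) into an 8-bit value,
       compensating for an assumed display gamma. */
    unsigned char encode_coverage(double coverage, double gamma)
    {
        return (unsigned char)(pow(coverage, 1.0 / gamma) * 255.0 + 0.5);
    }

    /* What the display actually emits for an 8-bit value, as a fraction
       of full brightness (power-law model of the display response). */
    double emitted(unsigned char value, double gamma)
    {
        return pow(value / 255.0, gamma);
    }

    int main(void)
    {
        double gamma = 2.2;  /* assumption; real displays vary */

        /* Naive half-intensity pixel: two of these together emit only
           about 2 * 0.22 = 0.44 of one full pixel's light. */
        printf("naive 128 emits     %.2f\n", emitted(128, gamma));

        /* Compensated: 0.5 coverage becomes ~186, which the display
           maps back to ~0.5, so two such pixels match one full pixel. */
        unsigned char c = encode_coverage(0.5, gamma);
        printf("corrected %3d emits %.2f\n", c, emitted(c, gamma));
        return 0;
    }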

    Display adaptors are often supplied with utilities for adjusting the gamma for the red, green, and blue
    intensities. Monitors also usually have a utility for adjusting the colors in one way or another, though
    not necessarily the actual gamma.

    I can always adjust my monitor for an optimum display, or make the corrections in my program for an
    optimum display. But I don't have access to many other displays to test that things look acceptably good
    on all of them.

    So I guess my question is actually several questions:

    Is it common for PC users to adjust their displays for optimum colors and contrast?

    Is it common for monitor manufacturers to incorporate gamma compensation in the monitor itself?

    Is there a gamma-type compensation curve in general use within programs?

    And where, if anywhere, is the best place to apply a gamma-type correction? Within my program?

    I don't want to correct things within my program, for my own display, only to make things look worse
    everywhere else.

    -
    Last edited by megafiddle; 04-19-2014 at 08:12 PM.

  4. #4
    Master Apprentice phantomotap's Avatar
    Join Date
    Jan 2008
    Posts
    5,108
    O_o

    1): Yes. You can find various implementations in the wild.
    2): Yes. You can find various implementations in the wild.
    3): Nope. Every display is different, and few people have the time, equipment, and eyesight to expend the effort.
    4): Yes. Video cards and drivers often also provide some adjustable behavior.
    5): Yes. Several. You can find these by searching for "gamma correction" followed by operating system/software.
    6): What are you trying to do? How are you trying to do it?

    As for the general theme of the questions: use generic defaults and provide a means of configuring the "scheme" to fit the monitor, which the user may use if they choose to change the "scheme".

    Soma
    “Salem Was Wrong!” -- Pedant Necromancer
    “Four isn't random!” -- Gibbering Mouther

  5. #5
    Registered User
    Join Date
    Mar 2011
    Posts
    596
    Quote Originally Posted by phantomotap View Post
    What are you trying to do? How are you trying to do it?
    Both my monitor and display adaptor are set to default values (gamma for R, G, B at 1.0).
    My antialiased lines can use some improvement. I can do this either by adjusting my display settings,
    or by adding a gamma correction in the program. Neither of those is a problem in itself. I can create a
    gamma correction for my own use or for incorporation into my program. I am just trying to avoid an
    overcorrection, in the event that gamma correction is already in common use.
    I should add that this is all limited to antialiased line drawing within my programs, and not to overall display
    of the program window, etc.

    Quote Originally Posted by phantomotap View Post
    As for the general theme of the questions: use generic defaults and provide a means of configuring the "scheme" to fit the monitor, which the user may use if they choose to change the "scheme".
    So is it safe to assume that most users will have their display gammas set to "flat" (uncorrected)? And that my
    programs will look optimum on most monitors if I correct the gamma within the program?

    Would it still be good to provide a utility for optimizing the display, anyway?

    -

  6. #6
    Officially An Architect brewbuck's Avatar
    Join Date
    Mar 2007
    Location
    Portland, OR
    Posts
    7,396
    You're getting into a complex area with this question. The fundamental problem is how to reproduce color intensities in a way that gives the intended impression to the viewer. There are a number of non-linearities in the system. The most important non-linearity is actually within the human visual system. Another big non-linearity occurs in the response of the display device. The non-linearity of the human visual system is mostly fixed from person to person; the response of the display, on the other hand, can vary significantly from one device to another.

    The high grade solution is to carry full color profile information through all stages of the imaging process. This is really the only way to ensure that the visual appearance is as it was intended on a given device. Unfortunately there is no way of verifying that the color profile in use at each step of the chain is actually the correct profile, i.e. things may not be properly calibrated. For printed material the process is even more complicated because it has to account for the likely illumination conditions.

    The ideal situation would be a properly calibrated monitor and a video card which can correctly interpret color profile information, as well as a color profile describing the image signal you are generating. Some of this is under your control but a lot of it is not.

    For your specific case where a linear stimulus doesn't look right, the best you can probably do is apply your own gamma correction and tune it until you like the results. It will look different on other displays, no real getting around that.
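
    For instance, a minimal sketch of that kind of tunable correction, applied while blending an
    antialiased pixel against its background (the function and the 2.2 starting value are
    illustrative, not a definitive implementation):

    Code:
    #include <math.h>

    /* Blend foreground over background in linear light, then re-encode
       for the display. 'gamma' is the user-tunable knob; 2.2 is only a
       reasonable starting guess. */
    unsigned char blend_gamma(unsigned char fg, unsigned char bg,
                              double coverage, double gamma)
    {
        double f = pow(fg / 255.0, gamma);   /* decode to linear light */
        double b = pow(bg / 255.0, gamma);
        double lin = coverage * f + (1.0 - coverage) * b;
        return (unsigned char)(pow(lin, 1.0 / gamma) * 255.0 + 0.5);
    }
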
    Code:
    //try
    //{
    	if (a) do { f( b); } while(1);
    	else   do { f(!b); } while(1);
    //}

  7. #7
    Master Apprentice phantomotap's Avatar
    Join Date
    Jan 2008
    Posts
    5,108
    Both my monitor and display [...] program window, etc.
    O_o

    Everything you've said is a rehash of what you've already posted, which is a description of your question.

    I asked about your problem.

    What are you doing drawing "lines and curves" "by hand"?

    Are you writing a game? Are you writing an SVG viewer? Are you just learning theory?

    If this is still your fractal viewer, I'd say leave the gamma correction to the display and video card drivers because near perfect reproduction across diverse hardware isn't a selling point of fractals.

    If you are doing something where near perfect reproduction is a target, you should consider investing the time to provide a means for "calibrating" your software.

    So is it safe to assume that most users will have their display gammas set to "flat" (uncorrected)?
    No.

    Of course, that is also irrelevant.

    Why?

    Would it still be good to provide a utility for optimizing the display, anyway?
    Because I'm not going to adjust my display for your software.

    I have equipment which I've used to calibrate my monitor.

    I have equipment and my own software which I've used to find a point of balance between my good monitors and my good printers so that while both are "off" they are mutually consistent.

    I am not going to change that for you or your software.

    If for whatever reason you do want to provide gamma correction for your software, learn the algorithms to bake adjustments into your rendering algorithms so that the user may fine tune your software to their display.

    If you have the extra time to also provide a utility to help optimize the display, have incredible knowledge of the issues, understand the relationships between various displays, and have access to numbers published by various standards bodies, feel free to provide such a utility. If any of these don't fit, you aren't the person to write that "optimizing" software.

    Soma
    “Salem Was Wrong!” -- Pedant Necromancer
    “Four isn't random!” -- Gibbering Mouther

  8. #8
    Registered User
    Join Date
    Mar 2011
    Posts
    596
    Quote Originally Posted by brewbuck View Post
    For your specific case where a linear stimulus doesn't look right, the best you can probably do is apply your own gamma correction and tune it until you like the results. It will look different on other displays, no real getting around that.
    Thanks, I am going to do that. Not really concerned with getting all displays to look good, just most of them.

    Quote Originally Posted by phantomotap View Post
    What are you doing drawing "lines and curves" "by hand"?
    Are you writing a game? Are you writing an SVG viewer? Are you just learning theory?
    If this is still your fractal viewer, I'd say leave the gamma correction to the display and video card drivers because near perfect reproduction across diverse hardware isn't a selling point of fractals.
    If you are doing something where near perfect reproduction is a target, you should consider investing the time to provide a means for "calibrating" your software.
    The program provides certain output in the form of a drafting "blueline"-type drawing, a white drawing on a dark blue background.
    The drawing is created by various math functions to provide a graphic representation of the various parameters that the program
    works with. This is not a general-purpose modeling program, but is dedicated to certain musical instrument parameters.

    The program is an assignment which I gave myself for the purpose of learning Windows (Win32) programming.
    I am not looking for perfection. However, a simple adjustment of the gamma from 1.0 to 2.0 in the display adaptor utility
    produced perfect-looking lines. I believe a simple power function is used there. And the improvement is great enough that I believe it is
    well worth implementing a correction in one way or another.

    Quote Originally Posted by phantomotap View Post
    Because I'm not going to adjust my display for your software.
    I was not suggesting that as a solution. And I wouldn't be asking the user to adjust their system settings. If I did add a utility
    to the program, it would only affect my own graphics output, and not be system-wide.

    Quote Originally Posted by phantomotap View Post
    If for whatever reason you do want to provide gamma correction for your software, learn the algorithms to bake adjustments into your rendering algorithms so that the user may fine tune your software to their display.
    That is exactly what I am considering doing. My question is "should I do that?" Are there enough displays out there that are uncompensated?
    And I believe you already answered that most are not compensated. And so that is indeed what I will do.

    I appreciate the help. Sorry if the questions weren't clear.

    -

  9. #9
    Master Apprentice phantomotap's Avatar
    Join Date
    Jan 2008
    Posts
    5,108
    My question is "should I do that?" Are there enough displays out there that are uncompensated?
    O_o

    I fear I did not make my point clear.

    That possibility is irrelevant.

    My development monitor is specifically tuned, yet your software may not look as good as it might.

    My television is specifically tuned, but calibrated for the games I play, which often have poor rendering of high contrasts, so your software may look like crap.

    My tablet has a great display, but factory--as in poorly--calibrated, yet despite such calibration your software may look great.

    The variability in technologies, purposes, and manufacturing runs leaves even properly gamma-compensated displays wonky in some areas.

    The only question, assuming you are willing, is one of reproduction quality. If you feel what you are doing would benefit by offering gamma correction so that the view presented by your software is accurately reproduced as you've designed, the answer is providing a means to bake adjustments into rendering.

    Soma
    “Salem Was Wrong!” -- Pedant Necromancer
    “Four isn't random!” -- Gibbering Mouther

  11. #11
    Registered User
    Join Date
    Mar 2011
    Posts
    596
    I may have misunderstood what you meant by "baked in". I understood that to mean that any corrections
    were being done in the line rendering process itself, and not as a separate display adjustment.
    And I also understood that to mean a fixed adjustment, not something that would adapt to the
    system running the program. Or is it more than that?

    If "baking in" is as I thought, then I understand that it is not going to give good results on all displays.

    -

  12. #12
    Master Apprentice phantomotap's Avatar
    Join Date
    Jan 2008
    Posts
    5,108
    I may have misunderstood what you meant by "baked in".
    O_o

    You did, but fear not; I could have explained it better.

    I use "bake" the way you might use "deserialized cache".

    I understood that to mean that any corrections were being done in the line rendering process itself, and not as a separate display adjustment.
    Correct. You only have one real place to do both gamma correction and anti-aliasing, and the place for those processes is while rendering your primitives before the final "pixel" is "set".

    And I also understood that to mean a fixed adjustment, not something that would adapt to the system running the program.
    Incorrect. I am not suggesting "static"--what you've called "fixed"--adjustments.

    I am suggesting an algorithm which uses adjustments.

    I am suggesting that the adjusted values be "baked".

    However, I am suggesting an approach that "bakes" the values for a given environment before the rendering process, allowing you to keep good performance while allowing the user to adapt the gamma correction to the hardware they have available.
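
    A minimal sketch of what I mean, in C (the gamma parameter stands in for whatever adjustment the user has configured):

    Code:
    #include <math.h>

    /* "Baking": sample the user's correction curve once, up front, so
       the inner rendering loop pays only for a table lookup. */
    static unsigned char g_lut[256];

    void bake_gamma(double gamma)   /* call whenever the user adjusts */
    {
        for (int i = 0; i < 256; i++)
            g_lut[i] = (unsigned char)
                (pow(i / 255.0, 1.0 / gamma) * 255.0 + 0.5);
    }

    /* In the renderer, per pixel: value = g_lut[raw_value]; */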

    If your designs--or the designs created by the program--legitimately benefit from accurate reproduction, you have little other choice.

    If "baking in" is as I thought, then I understand that it is not going to give good results on all displays.
    I'm sorry for the misunderstanding, but I would not suggest what you have understood.

    The adjustments you make with "static" values may actually be worse for any given monitor.

    If you do not wish to allow "dynamic" adjustments--let us continue by calling them "dynamic" instead of "baked"--you should prefer to make no adjustments, relying on the "dynamic" adjustments offered by hardware/drivers if the user so desires to tune for your software. Yes, the approach has problems, but "static" adjustments carry the same problems and require you to expend considerable effort implementing capable algorithms.

    [Edit]
    An example of "baking":

    I allow users (One of my many "hobby" projects is an SVG/printer widget with a great many adjustments with the aim of "near perfect" reproduction.) to specify various color corrections in several ways. One of the available adjustments may be expressed with a cubic Bézier curve. Sampling a cubic Bézier curve to determine gamma correction for every "pixel" while accounting for every primitive in a 600 DPI rendering would be insane. I allow the cubic Bézier curve because you may easily express slight adjustments to common curves. Under the hood, the implementation uses the same "code path" as for the provided 4-, 8-, 12-, and 16-bit correction arrays. The implementation simply "bakes" the complex "values" into something the algorithms may quickly consume.
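
    A rough sketch of that baking step (the control-point plumbing is illustrative, not my actual implementation):

    Code:
    #include <math.h>

    typedef struct { double x, y; } Pt;

    /* One coordinate of a cubic Bezier at parameter t. */
    static double bez(double p0, double p1, double p2, double p3, double t)
    {
        double u = 1.0 - t;
        return u*u*u*p0 + 3.0*u*u*t*p1 + 3.0*u*t*t*p2 + t*t*t*p3;
    }

    /* Bake a monotone transfer curve with endpoints (0,0) and (1,1),
       shaped by user control points c1 and c2, into a 256-entry table. */
    void bake_bezier_lut(Pt c1, Pt c2, unsigned char lut[256])
    {
        for (int i = 0; i < 256; i++) {
            double x = i / 255.0, lo = 0.0, hi = 1.0;

            /* Bisect for the parameter t where the curve's x matches. */
            for (int k = 0; k < 32; k++) {
                double mid = 0.5 * (lo + hi);
                if (bez(0.0, c1.x, c2.x, 1.0, mid) < x) lo = mid;
                else                                    hi = mid;
            }
            double y = bez(0.0, c1.y, c2.y, 1.0, 0.5 * (lo + hi));
            lut[i] = (unsigned char)(fmin(fmax(y, 0.0), 1.0) * 255.0 + 0.5);
        }
    }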

    I hope this description helps regardless of your choice.
    [/Edit]

    Soma
    “Salem Was Wrong!” -- Pedant Necromancer
    “Four isn't random!” -- Gibbering Mouther

  13. #13
    Registered User
    Join Date
    Mar 2011
    Posts
    596
    So "baking in" just refers to the corrections being incorporated into the rendering or drawing
    algorithm itself?

    I would like to have a compensation within the program for my own use at least. From there,
    it is only a little more work to make it a "feature" of the program. So that looks like a good
    solution.

    I like brewbuck's suggestion, but I was assuming that the default settings of most displays
    would be in a state that would typically benefit from a fixed correction. I don't want to get too far
    away from my original purpose of just adding a level of improvement. I think a simple gamma
    function will do that, based on experimenting with the display utility in my PC.

    -
