Brightness and Why Your Monitor Flickers

Introduction

Monitors have been flickering for many years. A CRT monitor basically created its picture with flicker, and if you found it uncomfortable to sit in front of one, you were not alone. But flicker can also occur on modern PC monitors. We are not talking about the visible flicker of a defective monitor, but "invisible" flicker that can still affect your experience in front of the screen.

Modern PC LCD monitors are not flicker-free, but to understand why, we need a short introduction. LCD monitors started out using CCFLs (cold cathode fluorescent lamps) as the backlight source, but in recent years manufacturers have shifted to LEDs (light-emitting diodes). If you have a thin monitor, you have an LCD monitor with an LED backlight. If you are unsure, check the model number on the back and Google it.

As you might have noticed, you can adjust the backlight on pretty much any monitor today via the buttons on the front. In a brightly lit room a higher brightness level is preferable, and in a dimly lit room, for example a cellar or studio, a much dimmer monitor is preferable. Some monitors adjust automatically based on the surroundings, but often you have to do it manually. Whether it is one or the other is irrelevant; the important aspect here is the actual method used to reduce brightness on a monitor with an LED backlight.


Why does your monitor flicker?

When a monitor is set to maximum brightness, the LEDs typically glow at full strength: 100%. If you reduce the brightness setting in the menu to, for example, 50%, the LEDs need to emit less light. This is done by inserting small "breaks", or pauses, in which the LEDs turn off for a very short time. The further you reduce the brightness setting in the menu, the longer these breaks become.

[Figure: PWM waveforms at 90%, 50%, and 10% duty cycle]
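
To make the on/off arithmetic concrete, here is a small sketch of our own (the 200 Hz PWM frequency is just an example value) that computes the on and off times for the three duty cycles in the figure:

```c
#include <stdio.h>

/* For a PWM backlight, one cycle lasts 1/frequency seconds and the LED
   is on for the duty-cycle fraction of it, off for the remainder. */
int main(void)
{
    double freq_hz = 200.0;                    /* example PWM frequency */
    double duties[] = { 0.90, 0.50, 0.10 };    /* the three cases above */
    double period_ms = 1000.0 / freq_hz;

    for (int i = 0; i < 3; i++)
        printf("%2.0f%% duty: on %.2f ms, off %.2f ms per %.2f ms cycle\n",
               duties[i] * 100.0, period_ms * duties[i],
               period_ms * (1.0 - duties[i]), period_ms);
    return 0;
}
```

At 200 Hz, a 10% duty cycle means the LED is dark for 4.5 ms out of every 5 ms cycle, which is why low brightness settings make the flicker more pronounced.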

This happens with both CCFL-based and LED-based backlight units, but CCFLs have a much longer afterglow than LEDs, which turn off almost instantly. The breaks are therefore much easier on the eyes with CCFLs than with LEDs, and so there is a greater risk of experiencing eye strain, tired eyes or, in the worst case, headaches when working in front of an LED-based monitor.

The use of LEDs obviously has numerous benefits, including much lower power consumption, far fewer toxic substances and some clear picture quality advantages, but here we are focusing only on the potential eye strain issues, as they are a separate problem that can be avoided on new monitors and even reduced on the monitor you own right now.

We need to emphasize that all eyes are different. Those who are affected never see the actual flicker; it is "indirect". Studies have shown that approximately 10% of people experience discomfort, while the rest experience either mild discomfort or no discomfort at all.


PWM (Pulse Width Modulation)

The method of introducing breaks to reduce the brightness level is called PWM (pulse-width modulation). It is a cheap and effective way of controlling the light output of an LCD monitor with LEDs, as it allows a huge range of brightness levels. But, as you can see, it also has drawbacks.

PWM itself is not the actual problem, and it is not necessarily a bad thing. The problem occurs when the blinking/flickering becomes indirectly perceptible to the human eye because the PWM is running at too low a frequency. The most common scenario is that you buy a new monitor that is far too bright out of the box and reduce the brightness to maybe 20-30%. Brightness is typically measured in cd/m2, and the brightest monitors hit around 450-500 cd/m2, but this is not necessarily an advantage, because no one can sit in front of a monitor that bright. The recommended brightness level is approximately 120 cd/m2 in a bright room without direct sunlight coming in.

The alternative to PWM is to lower or raise the electrical voltage supplied to the LEDs. This method can also be used for CCFLs, but they are not nearly as flexible as LEDs. The disadvantages are that it is more expensive and that it can be very hard to control the color temperature of the backlight. There is also a risk of burning out the LEDs very quickly.

[Figure: continuous (non-PWM) dimming at 90%, 50%, and 10% output]

The number of cycles (on/off periods) the LEDs go through can obviously be measured, and like so many other things it is measured in Hz. 100 Hz means that the cycle repeats 100 times per second.

But what frequency do LEDs run at, then? Most monitors use PWM with a frequency from around 90 Hz to over 400 Hz. Those with 90 Hz PWM are obviously the worst. PC monitors with CCFL backlights typically run at 175 Hz (but, as mentioned, the afterglow is very different). For comparison, the fluorescent ceiling lamps found in many offices used to operate at around 100-120 Hz, and they have been shown to cause headaches many times in the past. Newer installations are better, but we need to reach much higher frequencies for a perfect result. Again, it depends on the individual, and some will experience problems where others do not. We probably have to surpass 2000-3000 Hz before we can call it a safe zone.
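
To see why higher frequencies help, here is another small sketch of our own (a 50% duty cycle is assumed purely for illustration) comparing how long the LED stays dark per cycle at the frequencies mentioned above:

```c
#include <stdio.h>

int main(void)
{
    /* PWM frequencies from the ranges discussed above. */
    double freqs_hz[] = { 90.0, 175.0, 400.0, 2000.0 };
    double duty = 0.5;   /* a 50% duty cycle, assumed for illustration */

    for (int i = 0; i < 4; i++) {
        double period_ms = 1000.0 / freqs_hz[i];
        printf("%6.0f Hz: dark for %.2f ms in every %.2f ms cycle\n",
               freqs_hz[i], period_ms * (1.0 - duty), period_ms);
    }
    /* 90 Hz leaves the LED dark for ~5.6 ms at a stretch; at 2000 Hz
       the gap shrinks to ~0.25 ms, which is far harder to perceive. */
    return 0;
}
```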

If manufacturers want to use the PWM method in LCDs with LED backlights, there are two ways to eliminate the issues. The first option is to run the LEDs at a much higher frequency (combined with a decent brightness level). The second option is to use a combination of PWM and modulation of the electrical voltage.

But you can also help yourself. When you are looking to buy a new monitor, you can usually rule out the cheapest PC monitors (the same is true for displays in smartphones and tablets), as these have proven to be the most affected by the issue. Many of us work in front of a monitor for many hours daily, so is it really worth saving a few bucks?

Also look at the maximum brightness (the cd/m2 number). If it is around 400-500 cd/m2, you can be sure that you will need to lower the brightness level considerably, and then you risk increasing flicker. Many graphics monitors run at much lower brightness levels, and that can actually be an advantage. Higher is, ironically, not always better.


How Iris fixes PWM issues

You now understand why PWM is bad for your eyes and a little of the science behind LCD monitors. We thought a lot about how to decrease monitor brightness without causing eye problems. Today, with Iris, you can decrease your brightness without flicker on every monitor out there. "How exactly does this work?" you ask.

At the end of the graphics pipeline, just where the image leaves the computer to make its journey along the monitor cable, there is a small piece of hardware that can transform pixel values on the fly. This hardware typically uses a lookup table to transform the pixels. Iris controls this hardware and uses it to decrease your brightness, so you get flicker-free low brightness. Set your hardware brightness to the maximum and control it with Iris. Win-win.
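
Iris's exact implementation is not public, but on Windows this lookup-table hardware is reachable through the GDI gamma ramp API. The sketch below is our illustration of the general technique, not Iris's code: it scales an identity ramp to dim the picture while the backlight stays at full power (link against gdi32):

```c
#include <windows.h>
#include <stdio.h>

/* Minimal sketch: scale the display's gamma ramp to dim the image in
   software. The backlight stays at 100%, so no PWM flicker is added.
   level is 0.0..1.0, where 1.0 leaves the ramp unchanged. */
static BOOL set_software_brightness(double level)
{
    WORD ramp[3][256];
    HDC screen = GetDC(NULL);   /* device context for the whole screen */
    if (!screen)
        return FALSE;

    for (int i = 0; i < 256; i++) {
        /* identity ramp entry (i * 257 spans 0..65535), scaled down */
        WORD v = (WORD)(i * 257 * level + 0.5);
        ramp[0][i] = ramp[1][i] = ramp[2][i] = v;  /* R, G, B channels */
    }

    BOOL ok = SetDeviceGammaRamp(screen, ramp);
    ReleaseDC(NULL, screen);
    return ok;
}

int main(void)
{
    /* Example: dim to 50% in software. Some drivers reject ramps that
       deviate too far from identity, so check the return value. */
    if (!set_software_brightness(0.5))
        fprintf(stderr, "SetDeviceGammaRamp failed\n");
    return 0;
}
```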

Let’s see how you can actually do this. Open Iris by double-clicking its tray icon.

You can control brightness from either the Simple or the Advanced view.


You can also use the automatic brightness feature, which uses sun positioning and your location to gradually decrease brightness as night approaches.
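
The exact curve Iris uses is not documented. As a rough, hypothetical illustration of the idea, the sketch below linearly ramps brightness down across an evening window; all times and levels are made-up example values, and real sun-position math is not modeled:

```c
#include <stdio.h>

/* Hypothetical sketch only: Iris's real feature uses sun position and
   location, which this simple time-of-day ramp does not model. */
double auto_brightness(double hour, double day_level, double night_level,
                       double ramp_start, double ramp_end)
{
    if (hour <= ramp_start) return day_level;
    if (hour >= ramp_end)   return night_level;
    /* linear interpolation inside the evening window */
    double t = (hour - ramp_start) / (ramp_end - ramp_start);
    return day_level + t * (night_level - day_level);
}

int main(void)
{
    /* Example: ramp from 100% at 18:00 down to 40% at 21:00. */
    for (int h = 17; h <= 22; h++)
        printf("%02d:00 -> %3.0f%%\n", h,
               100.0 * auto_brightness(h, 1.0, 0.4, 18.0, 21.0));
    return 0;
}
```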


What brightness value to use?

The rule of thumb for the brightness value is to match your indoor lighting, which means the monitor should not look like a light source in the room. My advice is simply to set it to whatever feels good to your eyes.

Iris does not use PWM to control the screen brightness, so set it as low as you like.

2 thoughts on “Brightness”

  1. Here’s a question about brightness. I turned my monitor brightness up to 100% so that Iris could adjust the brightness (and I wouldn’t get the flicker). But even if Iris is keeping the brightness below 100% most of the time, I wonder if I’m squandering energy and battery power, since the monitor is set at 100% all the time (even if it appears dimmer because of how Iris is manipulating it). Or is it that when Iris lowers the brightness it also decreases energy usage (and thus conserves battery power)?

    1. Well, Iris may decrease the power usage a bit, but less than if you lower it with the monitor buttons.

      Aside from the headaches and eye pain, the whole point of PWM is to make the monitor more energy efficient, so it conserves battery power better than setting the hardware brightness to 100% and Iris to 50%.

      You would need to test this, but Iris never turns off all the pixels on the screen the way PWM does, so it seems logical that it uses a little more power.

      You can test for PWM and subpixel flicker here:
      https://iristech.co/3-ways-to-test/
