What is monitor response time? “Response time” is an important but ambiguous parameter of an LCD monitor

What is the response time of a computer monitor?

In dry scientific terms, the response time of a liquid crystal monitor is the shortest time a pixel needs to change its brightness, and it is measured in milliseconds (ms).

It would seem that everything is simple and clear, but if we consider the issue in detail, it turns out that these numbers hide several secrets.

A bit of science and history

The era of warm, tube CRT monitors with their honest hertz of frame scan and RGB color has passed. Back then everything was clear: 100 Hz was good, and 120 Hz was even better. Every user knew what these numbers meant: how many times per second the picture on the screen is refreshed. For comfortable viewing of dynamically changing scenes (films, for example), a frame rate of 25 Hz was recommended for TV and 30 Hz for digital video, based on the medical claim that human vision perceives an image as continuous if it is refreshed at least twenty-five times per second.

But technology evolved, and liquid crystal panels, also called LCD or TFT, took over the baton from the CRT (cathode ray tube). Although their production technologies differ, we will not dwell on the details in this article; the differences between TFT and LCD are a topic for another time.

What affects response time?

So, the principle of LCD operation is that the matrix cells change their brightness under the influence of a control signal; in other words, they switch. This switching speed, or response time, determines the maximum speed at which the picture on the display can change.

It converts to familiar hertz using the formula f = 1/t. That is, to obtain the required 25 Hz, the pixels must switch in 40 ms, and in 33 ms for 30 Hz.
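The formula above can be sketched as a tiny helper; the function name and the sample values are purely illustrative:

```python
# Convert a pixel response time (in milliseconds) into the maximum
# refresh rate it can support, using f = 1/t from the text.
# Since t is in milliseconds, we scale by 1000 to get hertz.

def max_refresh_hz(response_ms: float) -> float:
    """Highest frame rate (Hz) that a given response time allows."""
    return 1000.0 / response_ms

for t in (40, 33, 16, 8, 2):
    print(f"{t} ms -> {max_refresh_hz(t):.1f} Hz")
```

Running it confirms the numbers in the text: a 40 ms pixel tops out at 25 Hz, and 33 ms gives roughly 30 Hz.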

Is it a lot or a little, and which monitor response time is better?

  1. If the time is long, artifacts appear during sudden scene changes: where the image should already be black, the matrix still shows white, or an object is displayed that has already left the camera's field of view.
  2. When the eye is shown blurry pictures, visual fatigue increases and headaches may appear. This is due to the visual tract: the brain constantly interpolates the information coming from the retina, while the eye itself is busy constantly refocusing.

It turns out that less is better, especially if you spend most of your day at the computer. The older generation remembers how hard it was to sit through an eight-hour workday in front of a CRT, and those refreshed at 60 Hz or more.

How can I find out and check the response time?

Although a millisecond is a millisecond anywhere, many have probably noticed that different monitors with identical specifications produce images of different quality. This happens because there are different methods of measuring the matrix response, and it is almost impossible to find out which method the manufacturer used in any specific case.

There are three main methods for measuring monitor response:

  1. BWB, also known as BtB, short for “Black to Black” or “Black-White-Black”. Shows the time it takes a pixel to switch from black to white and back to black. The most honest indicator.
  2. BtW stands for “Black to White”: switching from an inactive state to one hundred percent luminosity.
  3. GtG is short for “Gray to Gray”: the time a pixel needs to change the brightness of gray from ninety percent to ten percent. Usually around 1-2 ms.
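The measurement idea behind these methods can be sketched in a few lines: record the pixel's luminance with a photosensor and time how long it takes to cross two brightness thresholds. The 10%/90% thresholds and the synthetic trace below are illustrative assumptions, not any vendor's actual procedure:

```python
# Sketch: extract a response time from a sampled luminance trace by
# finding when the signal crosses 10% and 90% of its full swing.

def response_time_ms(samples, dt_ms, lo=0.1, hi=0.9):
    """samples: luminance values for one black-to-white transition,
    dt_ms: time between samples. Returns the lo-to-hi rise time."""
    low, high = min(samples), max(samples)
    swing = high - low
    t_lo = t_hi = None
    for i, v in enumerate(samples):
        level = (v - low) / swing
        if t_lo is None and level >= lo:
            t_lo = i * dt_ms          # first crossing of the 10% level
        if t_hi is None and level >= hi:
            t_hi = i * dt_ms          # first crossing of the 90% level
            break
    return t_hi - t_lo

# Synthetic trace: a pixel ramping from black (0) to white (255),
# sampled once per millisecond
trace = [0, 10, 40, 90, 150, 200, 230, 245, 252, 255]
print(response_time_ms(trace, dt_ms=1.0))
```

The same function applied between two gray levels instead of black and white would give a GtG-style figure, which is why the two numbers are not comparable.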

So checking the monitor's response time by the third method gives a far more attractive number for the consumer than the second method would. The manufacturer writes 2 ms, and formally that is true. But in practice artifacts still appear on the monitor, and moving objects leave a trail. Only the BWB method shows the true state of affairs, because it measures the time a pixel needs to complete its full operating cycle through all possible states.

Unfortunately, the documentation available to consumers does not clarify the picture: it is hard to understand what, say, 8 ms actually means. Will it be comfortable to work with?

Laboratory measurements require a rather complex hardware and software setup that not every workshop has. But what if you want to check up on the manufacturer yourself?

At home, you can check the monitor's response time with the TFT Monitor Test program. Select the test icon in the program menu and specify the screen's native resolution, and a picture with a rectangle scurrying back and forth appears on the display. At the same time, the program proudly displays the measured time.

We used version 1.52, tested several displays, and concluded that the program does show something, and even in milliseconds; moreover, a poorer-quality monitor demonstrated worse results. But since pixel switching can only truly be recorded by a photosensor, which is nowhere in sight, a purely software method can be recommended only for a subjective comparative assessment: what exactly the program measures is clear only to its developers.

A much more visual empirical test is the “White Square” mode in TFT Monitor Test: a white square moves across the screen, and the tester's task is to observe the trail behind this geometric figure. The longer the trail, the more time the matrix spends on switching and the worse its properties.

That is about all you can do at home to answer the question “How to check the response time of a monitor.” We will leave methods using cameras and calibration tables for another time. A full check can only be performed by a specialized organization with the appropriate technical base.

Gaming monitor response time

If the main purpose of the computer is gaming, then you should choose a monitor with the shortest response time. In fast-paced shooters, even a tenth of a second can decide the outcome of a battle. Therefore, the recommended monitor response time for games is no more than 8 ms. This value corresponds to a frame rate of 125 Hz and is absolutely sufficient for any game.

At the next common value, 16 ms, motion blur will be noticeable in fast-paced matches. These statements hold if the declared time was measured as BWB, but manufacturers can craftily write 2 ms or even 1 ms. Our recommendation remains the same: the less, the better. Based on this approach, the response time of a gaming monitor should be no more than 2 ms, since 2 ms GtG corresponds roughly to 16 ms BWB.

How to change the response time in the monitor?

Unfortunately, almost nothing can be done without replacing the screen: response time is a characteristic of the image-forming layer itself and is fixed by the manufacturer's design decisions. There is, however, a small loophole, and engineers have partially solved the question of how to change the response time.

Monitor manufacturers call this feature OverDrive (OD) or RTC (Response Time Compensation): a higher voltage pulse is briefly applied to the pixel so that it switches faster. If the monitor boasts a “Gaming Mode” label or something similar, you can usually adjust it for the better. Let us repeat to make it completely clear: no software or video card replacement will help, because response time is a physical property of the matrix and its controller.

Conclusions

Buying a video card for a thousand or fifteen hundred conventional units to run your favorite games at a hundred FPS or more, and then sending the video signal to a monitor that can barely handle forty, is rather irrational. It is better to add a hundred to the display budget and enjoy the full dynamics of games and movies without disappointment: you will get no pleasure from a 40 ms matrix, and the joy of owning a powerful video adapter will not outweigh the poor image quality.

When you purchase some additional equipment for your computer, such as an LCD monitor, there are many factors to consider. Today we will talk about such a parameter as response time. Knowing how the response time affects the image reproduced by the monitor, you can easily make the right choice.

LCD monitors

The LCD monitor became the heir of the outdated CRT monitor, dramatically improving on its weight and size. CRT monitors were very large and heavy, while modern LCD monitors are light and compact. LCD monitors are also available in a wider range of models with different screen diagonals, from 14 to 28 inches. The operation of an LCD is characterized by a wide range of parameters, such as the maximum supported resolution, black depth, color purity, quality of color reproduction, and others, among which response time occupies a special place.

Response time

The response time of an LCD monitor is one of the key characteristics to pay attention to when choosing. Response time can be described as the time the monitor takes to change the color of each pixel. A high response time leads to an unpleasant image defect known as ghosting, or trailing: fast-moving objects such as an athlete, a vehicle, or a bird may leave a trail on the screen, which degrades dynamic scenes in films and computer games. Response time is measured in milliseconds; the lower the number, the better the picture you will get.

2ms or 5ms

Any response time less than 15 milliseconds is acceptable for LCD monitors and guarantees sufficient image quality, free from trailing motion and other artifacts. In general, an LCD monitor with a 2ms response time is considered better than a monitor with a 5ms response time. However, you should consider other parameters that affect the quality of video display. Thus, an LCD monitor with a response time of 2 ms may have weaknesses in other areas, for example, in the quality of color reproduction. And then it may turn out that a monitor with a response time of 5 ms is preferable for performing your tasks. If you are preparing to purchase a monitor, we recommend that you conduct a practical comparison of models with a response time of 2 or 5 ms.

What response time to choose

In general, if you only use your computer for watching videos and playing computer games, then be sure to choose a monitor with a response time of less than 12 ms. For many people, the difference between 2 and 5 ms response times is indistinguishable. They are more likely to pay attention to the fact that a monitor with a 5 ms response is cheaper than a monitor with a 2 ms response. In the end, the choice is yours - choose a monitor in the price range that suits you and with the necessary characteristics.

The monitor is designed to display information coming from the computer in graphical form. The comfort of working at a computer depends on the size and quality of the monitor.

The best price/quality ratio today is offered by the LG 24MP58D-P and 24MK430H.
Monitor LG 24MP58D-P

Monitor LG 24MK430H

There are also similar models Samsung S24F350FHI and S24F356FHI. They are no different in quality from LG, but perhaps some will like their design better.
Monitor Samsung S24F350FHI

Monitor Samsung S24F356FHI

But DELL S2318HN and S2318H are already significantly superior to monitors from Korean brands in terms of the quality of electronics, case materials and firmware.
Monitor DELL S2318HN

Monitor DELL S2318H

If you are not pleased with the DELL design, then pay attention to the HP EliteDisplay E232 and E242 monitors, they are of the same high quality.
HP EliteDisplay E232 Monitor

HP EliteDisplay E242 Monitor

2. Monitor manufacturers

The best monitors are made by Dell, NEC and HP, but they are also the most expensive.

Monitors from the major brands Samsung, LG, Philips, and BenQ are especially popular, but in the budget segment there are many low-quality models.

You can also consider monitors from the well-known brands Acer, AOC, and ViewSonic, which are of average quality across the entire price range, and the Japanese brand Iiyama, which produces both expensive professional and budget monitors.

In any case, carefully read reviews and testimonials, paying special attention to shortcomings (poor image and build quality).

3. Warranty

Modern monitors are not of high quality and often fail. The warranty for a high-quality monitor should be 24-36 months. The best warranty service in terms of quality and speed is offered by Dell, HP, Samsung and LG.

4. Aspect Ratio

Previously, monitors had screen width-to-height ratios of 4:3 and 5:4, which are closer to a square shape.

There are not many such monitors anymore, but they can still be found on sale. They have a small screen size of 17-19″ and this format is suitable for office or some specific tasks. But in general, such monitors are no longer relevant, and are generally not suitable for watching movies.

Modern monitors are widescreen and have aspect ratios of 16:9 and 16:10.

The most popular format is 16:9 (1920x1080) and it suits most users. The 16:10 ratio makes the screen a little taller, which is more convenient in some programs with a large number of horizontal panels (for example, when editing video). But at the same time, the screen resolution should also be a little higher in height (1920x1200).

Some monitors have an ultra-wide 21:9 format.

This is a very specific format that can be used in some types of professional activities that require simultaneous work with a large number of windows, such as design, video editing or stock quotes. Now this format is also actively moving into the gaming industry and some gamers note greater convenience due to the expanded visibility in games.

5. Screen diagonal

For a widescreen monitor, a 19″ screen diagonal is too small. For an office computer, it is advisable to purchase a monitor with a screen diagonal of 20″, since it will not be significantly more expensive than a 19″ one, and it will be more convenient to work with. For a home multimedia computer, it is better to purchase a monitor with a screen diagonal of 22-23″. For a gaming computer, the recommended screen size is 23-27″, depending on personal preferences and financial capabilities. To work with large 3D models or drawings, it is advisable to purchase a monitor with a screen diagonal of 27″ or more.

6. Screen resolution

Screen resolution is the number of dots (pixels) in width and height. The higher the resolution, the sharper the image and the more information that fits on the screen, but the text and other elements become smaller. In principle, problems with small fonts can be easily resolved by turning on scaling or increasing fonts in the operating system. Please also note that the higher the resolution, the higher the demands placed on the power of the video card in games.

For monitors with screens up to 20″, you can ignore this parameter, since they ship at the optimal resolution for their size.

22″ monitors can have a resolution of 1680×1050 or 1920×1080 (Full HD). Monitors with a resolution of 1680x1050 are cheaper, but videos and games will look worse on them. If you often watch videos, play games or do photo editing, then it is better to take a monitor with a resolution of 1920x1080.

23″ monitors generally have a resolution of 1920×1080, which is the most optimal.

24″ monitors generally have a resolution of 1920×1080 or 1920×1200. 1920x1080 resolution is more popular, 1920x1200 has a higher screen height if you need it.

Monitors 25-27″ and larger can have a resolution of 1920×1080, 2560×1440, 2560×1600, 3840×2160 (4K). Monitors with a resolution of 1920x1080 are optimal in terms of price/quality ratio and in terms of gaming performance. Higher resolution monitors will provide higher image quality, but will cost several times more and require a more powerful graphics card for gaming.

Ultra-wide screen monitors (21:9) have a resolution of 2560x1080 or 3440x1440 and will require a more powerful graphics card if used for gaming.
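Diagonal and resolution together determine how sharp the image looks. A small sketch using the standard pixel-density formula (diagonal pixel count divided by diagonal size in inches) makes the trade-off concrete; the specific sizes chosen are just examples:

```python
import math

# Pixel density (PPI) from resolution and diagonal size.
# math.hypot gives the diagonal length of the pixel grid.

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    return math.hypot(width_px, height_px) / diagonal_in

print(f'24" Full HD : {ppi(1920, 1080, 24):.0f} PPI')
print(f'27" Full HD : {ppi(1920, 1080, 27):.0f} PPI')
print(f'27" QHD     : {ppi(2560, 1440, 27):.0f} PPI')
```

This shows why Full HD that looks fine at 23-24″ starts to look coarse at 27″, where 2560x1440 restores the density.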

7. Matrix type

The matrix is the liquid crystal screen of a monitor. Modern monitors use the following types of matrices.

TN (TN+film) is a cheap matrix with average color rendering quality and clarity and poor viewing angles. Monitors with such a matrix are suitable for ordinary office tasks, but not for watching videos with the whole family, precisely because of the poor viewing angles.

IPS (AH-IPS, e-IPS, P-IPS) – a matrix with high quality color reproduction, clarity and good viewing angles. Monitors with such a matrix are perfect for all tasks - watching videos, games, design work, but they are more expensive.

VA (MVA, WVA) is a compromise option between TN and IPS type matrices, it has high quality color rendering, clarity and good viewing angles, but does not differ much in price from inexpensive IPS matrices. Monitors with such matrices are no longer very relevant, but they can be in demand in design activities, as they are still cheaper than professional IPS matrices.

PLS (AD-PLS) is a more modern, cheaper version of the IPS matrix, which has high color rendering quality, clarity and good viewing angles. In theory, monitors with such matrices should cost less, but they appeared not so long ago and their cost is still higher than their analogues with an IPS matrix.

Since monitors with IPS and PLS matrices are no longer much more expensive than those with TN, I recommend purchasing them for home multimedia computers. However, IPS and TN matrices also come in different qualities. Usually those called simply IPS or TFT IPS are of lower quality.

AH-IPS and AD-PLS matrices have a lower response time (4-6 ms) and are more suitable for dynamic games, but their overall image quality is lower than that of more expensive modifications.

The e-IPS matrix already has significantly higher image quality and is better suited for design tasks. Semi-professional monitors are equipped with such matrices, the best of which are produced by NEC, DELL and HP. Such a monitor will also be an excellent choice for a home multimedia computer, but it is more expensive than analogues on cheaper IPS, AH-IPS and PLS matrices.

The P-IPS matrix is the highest quality, but is installed only in the most expensive professional monitors. Also, some e-IPS and P-IPS monitors are color calibrated at the factory, ensuring accurate color reproduction out of the box without the need for professional tuning.

There are also expensive gaming monitors with high-quality TN matrices with low response times (1-2 ms). They are specially designed for dynamic shooters (Counter-Strike, Battlefield, Overwatch). But due to worse color reproduction and poor viewing angles, they are less suitable for watching videos and working with graphics.

8. Screen covering type

Matrices can have a matte or glossy finish.

Matte screens are more versatile, suitable for all tasks and any external lighting. They look duller but have more natural color rendition. High-quality matrices usually have a matte finish.

Glossy screens look brighter and tend to have clearer, darker tones, but are only suitable for watching videos and gaming in a dark room. On a glossy matrix you will see reflections of light sources (sun, lamps) and your own, which is quite uncomfortable. Typically, cheap matrices have such a coating to smooth out imperfections in image quality.

9. Matrix response time

The matrix response time is the time in milliseconds (ms) that the crystals need to rotate and the pixels to change color. The first matrices had a response of 16-32 ms, and on those monitors terrible trails were visible behind the mouse cursor and other moving elements. Watching movies and playing games on them was completely uncomfortable. Modern matrices have a response time of 2-14 ms, and trails on the screen are practically a thing of the past.

For an office monitor, in principle, this does not matter much, but it is desirable that the response time does not exceed 8 ms. For home multimedia computers, it is believed that the response time should be about 5 ms, and for gaming computers – 2 ms. However, this is not quite true. The fact is that only low-quality matrices (TN) can have such a low response time. Monitors with IPS, VA, PLS matrices have a response time of 5-14 ms and they provide significantly higher image quality, including movies and games.

Do not buy monitors with a response time that is too low (2 ms), as they will contain low-quality matrices. For a home multimedia or gaming computer, a response time of 8 ms is sufficient. I do not recommend purchasing models with higher response times. An exception may be monitors for designers, which have a matrix response time of 14 ms, but they are less suitable for games.

10. Screen refresh rate

Most monitors have a refresh rate of 60Hz. This is, in principle, enough to ensure flicker-free and smooth images in most tasks, including games.

Monitors that support 3D technology have a frequency of 120 Hz or more, which is necessary to support this technology.

Gaming monitors can have refresh rates of 144 Hz or higher. Thanks to this, the picture is extremely clear and does not blur in dynamic games such as online shooters. But this also places additional demands on the computer, which must deliver equally high frame rates.
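High refresh rates tie back to response time from the earlier part of this article. Under a deliberate simplification (a matrix keeps up with a refresh rate only if its response time does not exceed the duration of one frame), a quick check looks like this; the function names are illustrative:

```python
# Frame duration at a given refresh rate, and a rough check of whether
# a panel's response time fits inside one frame. A simplification:
# real panels have different transition times per color pair.

def frame_time_ms(refresh_hz: float) -> float:
    return 1000.0 / refresh_hz

def keeps_up(response_ms: float, refresh_hz: float) -> bool:
    return response_ms <= frame_time_ms(refresh_hz)

print(f"60 Hz frame lasts {frame_time_ms(60):.2f} ms")
print("8 ms panel at 60 Hz :", keeps_up(8, 60))
print("8 ms panel at 144 Hz:", keeps_up(8, 144))
```

An 8 ms panel fits comfortably inside a 16.7 ms frame at 60 Hz, but not inside the roughly 6.9 ms frame at 144 Hz, which is why high-refresh gaming monitors advertise 1-2 ms figures.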

Some gaming monitors support G-Sync frame synchronization technology, which was developed by Nvidia for its video cards and makes frame changes incredibly smooth. But such monitors are much more expensive.

AMD also has its own FreeSync frame synchronization technology for video cards of its own design, and monitors with its support are cheaper.

To support G-Sync or FreeSync, you also need a modern video card that supports the corresponding technology. But many gamers question the usefulness of these technologies in games.

11. Screen brightness

Screen brightness determines the maximum possible level of screen backlight for comfortable work in bright outdoor lighting conditions. This figure can be in the range of 200-400 cd/m2, and if the monitor is not placed under bright sun, then a low brightness will be enough. Of course, if the monitor is large and you will watch videos on it with the whole family during the day with the curtains open, then the brightness of 200-250 cd/m2 may not be enough.

12. Screen contrast

Contrast is responsible for the clarity of the image, especially fonts and small details. There is static and dynamic contrast.

The static contrast ratio of most modern monitors is 1000:1 and this is quite enough for them. Some monitors with more expensive matrices have static contrast ratios from 2000:1 to 5000:1.

Dynamic contrast is determined by different manufacturers according to different criteria and can be calculated in numbers from 10,000:1 to 100,000,000:1. These numbers have nothing to do with reality and I recommend not paying attention to them.

13. Viewing angles

The viewing angles determine whether you or several people at the same time can view the contents of the screen (for example, a movie) from different sides of the monitor without significant distortion. If the screen has small viewing angles, then deviation from it in any direction will lead to a sharp darkening or lightening of the image, which will make viewing uncomfortable. The screen with large viewing angles looks good from any side, which, for example, allows you to watch videos in a group.

All monitors with high-quality matrices (IPS, VA, PLS) have good viewing angles; those with cheap matrices (TN) have poor ones. You can ignore the viewing-angle values given in the monitor's specifications (160-178°), since they bear little relation to reality and only confuse you.

14. Screen backlight

Older monitors used fluorescent lamps (CCFL) to backlight the screen. All modern monitors use light-emitting diodes (LEDs) instead. LED backlighting is higher quality, more economical, and more durable.

Some modern monitors support Flicker-Free backlight flicker-free technology, which is designed to reduce eye fatigue and negative effects on vision. But in budget models, due to the low quality of the matrix, this technology does not have a positive effect and many users complain that their eyes still hurt. Therefore, support for this technology is more justified on monitors with the highest quality matrices.

15. Energy consumption

Modern monitors consume only 40-50 W when the screen is on, and 1-3 W when the screen is off. Therefore, when choosing a monitor, you can ignore its power consumption.

16. Connectors

The monitor may have the following connectors.

1. Power connector 220 V.
2. Power connector for monitors with external power supply or power supply for speakers.
3. VGA (D-SUB) connector for connecting to a computer with an old video card. Not necessary, as an adapter can be used for this.
4, 8. DisplayPort connectors for connecting to a modern video card. They support high resolutions and refresh rates above 60 Hz (for gaming and 3D monitors). Not required if you have DVI and the monitor does not support frequencies above 60 Hz.
5. The Mini DisplayPort connector is the same connector in a smaller format, and is likewise optional.
6. DVI connector for connecting to a computer with a modern video card. Necessary if there are no other digital connectors (DisplayPort, HDMI).
7. HDMI connector for connecting a computer, laptop, TV tuner and other devices, it is desirable to have such a connector.
9. A 3.5 mm audio jack for connecting audio to monitors with built-in speakers, external speakers or headphones is not necessary, but in some cases this solution may be convenient.
10. A USB connector for connecting the USB hub built into the monitor is not available everywhere and is not mandatory.
11. USB connectors in monitors with a USB hub for connecting flash drives, mice, keyboards and other devices are not mandatory, but in some cases it can be convenient.

17. Control buttons

Control buttons are used to adjust brightness, contrast, and other monitor settings.

Typically the monitor is set up once and these keys are rarely used. But if the ambient lighting is not constant, the settings may need adjusting more often. Buttons on the front panel with labels are the most convenient; unlabeled buttons on the side or bottom panel make it hard to guess which is which, though in most cases you can get used to them.

Some, mostly more expensive monitors, may have a mini-joystick to navigate to menus. Many users note the convenience of this solution, even if the joystick is located on the back of the monitor.

18. Built-in speakers

Some monitors have built-in speakers. They are usually quite weak and unremarkable in sound quality, which is fine for an office. For a home computer, it is advisable to purchase separate speakers.

19. Built-in TV tuner

Some monitors have a built-in TV tuner. Sometimes this can be convenient, since the monitor can also be used as a TV. But keep in mind that such a monitor itself will cost more and must support the required broadcast format in your region. As an alternative and more flexible option, you can buy a monitor with an HDMI connector and a separate inexpensive TV tuner suitable for your region.

20. Built-in webcam

Some monitors have a built-in webcam. This is absolutely not necessary, since you can purchase a separate high-quality webcam for a fairly reasonable price.

21. 3D support

Some monitors are specially adapted for 3D technology, although they still require special glasses. I would say this is all a matter of taste, and the technology is not yet mature enough. Usually it comes down to watching a few films in this format and realizing that in games 3D only gets in the way and slows down the computer. Besides, a similar effect can be achieved on a regular monitor using special 3D players and video card drivers.

22. Curved screen

Some monitors have a curved screen to provide a more immersive gaming experience. Usually these are models with a large screen (27-34″) elongated in width (21:9).

Such monitors are more suitable for those who use the computer mainly to play various story-based games. The image on the sides seems to be a little blurry, which, when the monitor is placed close in a darkened room, gives the effect of immersion in the game.

But such monitors are not universal, as they have a number of disadvantages. They are poorly suited for dynamic online shooters (wide and blurry screen), watching videos in a group (worse viewing angles), and working with graphics (image distortion).

In addition, not all games support the 21:9 aspect ratio and will not run on the entire screen, and higher resolutions place very stringent demands on computer performance.

23. Body color and material

As for color, the most versatile monitors are black or black-silver, as they go well with other computer devices, modern household appliances and interiors.

24. Stand design

Most monitors have a standard non-adjustable stand, which is usually sufficient. But if you want more room to adjust the position of the screen, for example, rotating it to watch videos while sitting on the sofa, then pay attention to models with a more functional adjustable stand.

Just having a quality stand is quite nice.

25. Wall mount

Some monitors have a VESA mount, which allows you to mount it on a wall or any other surface using a special bracket that is adjustable in any direction.

Take this into account when choosing if you want to realize your design ideas.

The VESA mount can be sized 75x75 or 100x100 and in most cases allows you to mount the monitor panel to any universal bracket. But some monitors may have design flaws that prevent universal brackets from being used and only require one specific bracket size. Be sure to check these features with the seller and in the reviews.


Paying attention to the “response time” parameter is good advice. But neither specialists nor ordinary users have an unambiguous understanding of this “response time”, and the companies that produce televisions sometimes add to the confusion.

The general definition of “response time” is the time, measured in milliseconds, required for a liquid crystal pixel cell to change from an active state to an inactive state. However, almost every manufacturer has its own view of how this value should be determined.

The one point on which all specifications and systems converge is that the smaller the figure in milliseconds, the faster the picture changes on the display and the clearer the resulting image. This is especially relevant for the first generations of LCD TVs and for products from no-name manufacturers: relatively young Chinese and Korean companies that do not invest in advanced technology.

A high “response time” figure means, first of all, a blurry picture. On screen it looks like this: fast-moving objects leave a trail behind them, or when frames change quickly, images overlap one another. This mainly affects action films, sports broadcasts, dynamic scenes, and computer games (when a console is connected or the TV is used as a monitor).

No matter how wide the range of contrast, brightness and resolution settings, a slow response time can ruin the entire viewing experience. Some manufacturers, unwilling to spend much on new LCD production technology, chose a path that can hardly be called anything but comical: they began introducing their own response time standards. The result is that there is no single system and no consensus.

Versions

For the first LCD TV models there was only one standard, called rise-and-fall response, or TrTf (Time rising, Time falling). It specifies, in milliseconds, the time a liquid crystal cell takes to change from the active state (black) to the inactive state (white) and back; in practice the transition is timed between the 10% and 90% brightness levels. This standard for televisions and monitors was once adopted by VESA, the well-known standards body for video electronics.

However, there are no hard and fast rules here yet. Despite the established state of affairs and VESA’s authority, many manufacturers began to maneuver within this framework. For example, a TV’s description may list only the black-to-white transition, which is just half of the true “response time”. Another trick manufacturers successfully use is number manipulation: quoting the maximum switching speed of the cells instead of the statistical average.

There is another way to measure response. GTG (Gray to Gray) measures not the change from black to white but the time to transition between gray levels, the so-called gray-to-gray gradations. Naturally, none of these specifications correlate with one another.

Moreover, the overwhelming majority of manufacturers, when stating the response time for a particular model, do not say which system it was measured by. Others do not state a response time at all — often because there is simply nothing worth stating.


"Canonical" version

Large world-famous companies use the TrTf (Time rising, Time falling) standard. It is considered to be the most accurate and widespread.

According to this system, a response time of 20–25 milliseconds is recommended as optimal. Many experts agree with this interpretation and argue that such a figure provides comfortable viewing of videos with fast scenes. But there is a nuance: some users can still see the trails stretching behind objects on screen at this standard, and a certain number of viewers can distinguish a trail even at twelve or eight milliseconds. Most likely this comes down to individual differences in how acutely people perceive the picture on screen. This view has a right to exist, since, by some estimates, the image of a CRT TV at a 50 Hz refresh rate is equivalent to 14–16 milliseconds on a liquid crystal monitor.
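These milliseconds can be tied back to refresh rates through the simple relation f = 1/t mentioned earlier. A minimal sketch using the article’s own reference figures (the CRT equivalence is an estimate, not a standardized value):

```python
def max_refresh_hz(response_ms: float) -> float:
    """Highest full-frame update rate a panel can sustain if every
    pixel needs response_ms to finish switching (f = 1/t)."""
    return 1000.0 / response_ms

print(max_refresh_hz(40))  # 25.0 Hz -- the TV minimum
print(max_refresh_hz(33))  # ~30 Hz -- digital video
print(max_refresh_hz(16))  # 62.5 Hz -- roughly a 50 Hz CRT's subjective speed
```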

Epilogue

In the current circumstances, it must be recognized that “response time” is an important and mandatory parameter to take into account when choosing an LCD TV or monitor. But it must be weighed together with other details, in particular the system by which the manufacturer performed the measurement. In the end, the manufacturer’s name can serve as a guide here.

VESA is currently working on unifying a common standard based on the good old TrTf. Let's hope that soon all structures involved in the production process will put it into use.

Speaking about the various parameters of LCD monitors - and this topic is regularly raised not only in our articles, but also on almost any hardware site that touches on the subject of monitors - we can distinguish three levels of discussion of the problem.

Level one, basic: isn’t the manufacturer deceiving us? In general, the answer at the moment is quite banal: serious monitor manufacturers do not stoop to outright deception.

Level two, more interesting: what do the stated parameters actually mean? In essence, this boils down to the question of the conditions under which manufacturers measure these parameters, and what practical limits those conditions place on the applicability of the results. A good example is response time measured according to the ISO 13406-2 standard, where it is defined as the sum of the matrix’s switching times from black to white and back. Research shows that for all matrix types this is the fastest transition, while on transitions between shades of gray the response time can be many times higher — meaning that in reality the matrix will not look as fast as it does on paper. Still, this example cannot be assigned to the first level of discussion, since the manufacturer is not deceiving us anywhere: if we set the monitor to maximum contrast and measure the “black-white-black” switching time, it will match the declared value.

However, there is an even more interesting level, the third: how are these parameters perceived by our eyes? Leaving monitors aside for a moment (we will get to them below), here is an example from acoustics: from a purely technical standpoint, tube amplifiers have rather mediocre parameters (high harmonic distortion, poor impulse response, and so on), and strictly speaking one cannot even talk about faithful sound reproduction with them. Nevertheless, many listeners prefer the sound of tube equipment — not because it is objectively better than transistor gear (as I said, it is not), but because the distortion it introduces is pleasant to the ear.

Of course, the conversation about the subtleties of perception comes when the parameters of the devices under discussion are good enough for such subtleties to have a noticeable impact. You can buy computer audio speakers for ten dollars - no matter what amplifier you connect them to, they won’t sound any better, because their own distortions obviously exceed any flaws in the amplifier. It’s the same with monitors - while the response time of the matrices was tens of milliseconds, there was simply no point in discussing the features of image perception by the retina; now, when the response time has been reduced to a few milliseconds, it suddenly turns out that the performance of the monitor - not the rated performance, but its subjective perception by a person - is determined not only by milliseconds...

In this article I would like to discuss both some of the specified parameters of monitors — how manufacturers measure them, how well they match reality, and so on — and some points related specifically to the characteristics of human vision. First of all, this concerns the response time of monitors.

Monitor response time and eye response time

For a long time, in many monitor reviews — I have been guilty of this myself — one could encounter the claim that as soon as the response time of LCD panels (the real response time, not the nameplate value, which, as we all know, when measured under ISO 13406-2 reflects reality rather loosely) drops to 2...4 ms, the parameter can simply be forgotten: further reduction will give nothing new, since we will stop noticing blur anyway.

And such monitors have appeared: the latest gaming models on TN matrices with response time compensation really do achieve an average (GtG) time on the order of a few milliseconds. Let’s set aside RTC artifacts and the inherent shortcomings of TN technology for now — what matters here is that these numbers are actually achieved. And yet, put such a monitor next to an ordinary CRT, and many people will notice that the CRT still looks faster.

Oddly enough, it does not follow from this that we need to wait for LCD monitors with a response of 1 ms, 0.5 ms... That is, you can wait for them, but such panels themselves will not solve the problem - moreover, subjectively they won't even be much different from modern 2...4 ms panels. Because the problem here is no longer in the panel, but in the peculiarities of human vision.

Everyone knows about such a thing as retinal inertia. It is enough to look at a bright object for one or two seconds, then close your eyes - and for a few more seconds you will see a slowly fading “imprint” of the image of this object. Of course, the print will be quite vague, actually a contour, but we are talking about such a long period of time as seconds. For about 10...20 ms after the disappearance of the actual picture, the retina of our eye continues to store its entire image, and only then it quickly fades away, leaving only the outlines of the brightest objects.

In the case of CRT monitors, retinal inertia plays a positive role: thanks to it, we do not notice the flickering of the screen. The afterglow of the phosphor in modern tubes lasts about 1 ms, while the beam takes 10 ms to travel down the screen (at a 100 Hz refresh rate). If our vision had no inertia, we would see a light stripe only one-tenth of the screen height running from top to bottom. This is easy to demonstrate by photographing a CRT monitor at different shutter speeds:


At a shutter speed of 1/50 sec (20 ms), we see a normal image that occupies the entire screen.


When the shutter speed is reduced to 1/200 sec (5 ms), a wide dark stripe appears in the image: in this time, at a 100 Hz refresh rate, the beam manages to cover only half the screen, while on the other half the phosphor has already gone dark.


And finally, at a shutter speed of 1/800 sec (1.25 ms), we see a narrow light strip running across the screen, followed by a small and quickly darkening trail, while the main part of the screen is simply black. The width of the light stripe is precisely determined by the afterglow time of the phosphor.
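The geometry of these three photographs follows directly from the timing. A rough sketch (a simplification that treats the lit region as beam travel during the exposure plus the afterglow band, and ignores blanking intervals):

```python
def lit_fraction(shutter_s: float, refresh_hz: float,
                 afterglow_s: float = 0.001) -> float:
    """Approximate fraction of the screen height that appears lit in a
    photo of a CRT, capped at 1.0 once a whole frame fits the exposure."""
    frame_period_s = 1.0 / refresh_hz
    return min(1.0, (shutter_s + afterglow_s) / frame_period_s)

print(round(lit_fraction(1/50, 100), 3))   # 1.0   -- the whole screen, first photo
print(round(lit_fraction(1/200, 100), 3))  # 0.6   -- about half the screen plus afterglow
print(round(lit_fraction(1/800, 100), 3))  # 0.225 -- a narrow stripe, third photo
```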

On the one hand, this behavior of the phosphor forces us to use high frame rates on CRT monitors, for modern tubes - at least 85 Hz. On the other hand, it is precisely the relatively short afterglow time of the phosphor that leads to the fact that any, even the fastest, modern LCD monitor is still slightly inferior in speed to the good old CRT.

Let's imagine a simple case - a white square moving across a black screen, say, as in one of the tests of the popular TFTTest program. Consider two adjacent frames, between which the square has moved one position from left to right:


In the picture I tried to depict four consecutive “snapshots”, the first and last of which occur when the monitor displays two adjacent frames, and the middle two demonstrate how the monitor and our eye behave in the interval between frames.

In the case of a CRT monitor, the square is duly displayed when the first frame arrives, but after 1 ms (the afterglow time of the phosphor) it begins to fade quickly and disappears from the screen long before the second frame arrives. However, due to retinal inertia we continue to see this square for roughly another 10 ms — by the start of the second frame it is only beginning to fade noticeably. At the moment the monitor draws the second frame, our brain therefore receives two images: a white square in the new place, plus its quickly fading imprint on the retina in the old place.


Active matrix LCD monitors, unlike CRTs, do not flicker: the picture is preserved throughout the entire interval between frames. On the one hand, this means the frame rate does not matter for flicker (there is none at any frequency); on the other hand, look at the picture above. During the interval between frames, the image on the CRT quickly went dark, but on the LCD it remained unchanged. When the second frame arrives, our white square is drawn in its new position, and the old one is extinguished within 1...2 ms (the pixel blanking time of modern fast TN matrices is in fact comparable to a CRT’s phosphor afterglow). The retina, however, stores a residual image that fades only about 10 ms after the real image disappears, and until then it is added to the new picture. As a result, for about ten milliseconds after the second frame arrives, our brain receives two images at once: the real picture of the second frame from the screen, plus the imprint of the first frame superimposed on it. Doesn’t that sound exactly like ordinary blurring? Only now the old picture is stored not by the monitor’s slow matrix but by the slow retina of our own eye.

In short, when the native response time of an LCD monitor drops below 10 ms, further reductions have less effect than might be expected - due to the fact that retinal inertia begins to play a noticeable role. Moreover, even if we reduce the monitor's response time to completely negligible amounts, it will still subjectively appear slower than a CRT. The difference lies in the moment from which the storage time of the residual image on the retina is counted: in a CRT this is the arrival time of the first frame plus 1 ms, and in an LCD this is the arrival time of the second frame - which gives us a difference of about ten milliseconds.

The solution is quite obvious: a CRT appears fast because its screen is dark for most of the interval between two successive frames, which lets the afterimage on the retina start fading just in time for the new frame to arrive. To achieve the same effect on an LCD monitor, additional black frames must be artificially inserted between the image frames.

This is exactly what BenQ decided to do when they introduced Black Frame Insertion (BFI) technology some time ago. It was assumed that a monitor equipped with it would insert additional black frames into the output image, thereby emulating the operation of a conventional CRT:


Interestingly, it was initially assumed that the black frames would be inserted by switching the image on the matrix itself, not by extinguishing the backlight. That approach is quite acceptable for fast TN matrices, but MVA and PVA matrices would have a problem with their long switching time to black and back: for modern TN it is a few milliseconds, while even for the best *VA matrices it hovers around 10 ms. For them, the time needed just to insert a black frame would exceed the frame period of the main image, making matrix-based BFI unusable. Moreover, the limit on the maximum duration of a black frame is imposed not so much by the frame period (16.7 ms at a standard 60 Hz LCD refresh rate) as by our eyes: if the black inserts last too long, the screen will flicker no less noticeably than a CRT scanning at the same 60 Hz. It is unlikely that anyone would enjoy that.
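The timing argument against matrix-based BFI on *VA panels can be written as a quick feasibility check. The switching times below are the article’s ballpark figures, and the minimum useful black interval is my own assumed placeholder:

```python
FRAME_PERIOD_MS = 1000 / 60  # 16.7 ms at a standard 60 Hz LCD scan

def matrix_bfi_feasible(to_black_ms: float, back_ms: float,
                        min_black_ms: float = 4.0) -> bool:
    """Matrix-driven black frame insertion only fits if switching to
    black, holding it usefully long, and switching back all complete
    within a single frame period."""
    return to_black_ms + min_black_ms + back_ms < FRAME_PERIOD_MS

print(matrix_bfi_feasible(2, 2))    # True  -- a fast TN panel
print(matrix_bfi_feasible(10, 10))  # False -- a typical *VA panel
```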

Let me note in passing that it is incorrect to speak of doubling the frame rate with BFI, as some reviewers do: the matrix’s own refresh rate does have to rise to accommodate the added black frames, but the image frame rate stays the same — from the video card’s point of view nothing changes at all.

As a result, when BenQ presented its FP241WZ monitor on a 24" PVA matrix, it turned out to contain not the promised matrix-based black frame insertion but a technology similar in purpose yet entirely different in implementation: the black frame is produced not by the matrix but by controlling the backlight lamps, which simply go out briefly at the right moment.

Of course, for the implementation of BFI in this form, the response time of the matrix does not play any role at all; it can be used with equal success both on TN-matrices and on any others. In the case of the FP241WZ, its panel behind the matrix houses 16 independently controlled horizontal backlight lamps. Unlike a CRT, where (as we saw in photographs with a short shutter speed) a light scanning stripe runs across the screen, in BFI, on the contrary, the stripe is dark - at any given moment in time, 15 out of 16 lamps are on, and one is off. Thus, when BFI is running, a narrow dark stripe runs across the FP241WZ screen for the duration of one frame:


The reasons for choosing this scheme (extinguishing one lamp at a time, rather than lighting one lamp at a time — which would seem to emulate a CRT exactly — or extinguishing and lighting all the lamps simultaneously) are quite obvious: modern LCD monitors run at a 60 Hz refresh rate, so an exact CRT emulation would produce severe flicker. A narrow dark stripe whose movement is synchronized with the monitor’s scan (that is, just before each lamp goes out, the section of the matrix above it is still showing the previous frame, and by the time the lamp comes back on, the new frame has been written to it) partly compensates for the retinal inertia effect described above, while not producing noticeable flicker.

Of course, with such backlight modulation the monitor’s maximum brightness drops slightly — but this is generally not a problem: modern LCD monitors have a generous brightness reserve (up to 400 cd/m² in some models).

Unfortunately, the FP241WZ has not yet made it to our laboratory, so for the practical side of the new technology I can only refer to the article “BenQ FP241WZ: 1rst LCD with screening” (in English) on the respected BeHardware website. As Vincent Alzieu notes there, the technology really does improve the subjective impression of the monitor’s speed; however, even though only one of the sixteen backlight lamps is off at any given moment, in some cases screen flicker can still be noticed — above all on large single-color areas.

Most likely this is due to the still-insufficient frame rate: as I wrote above, the switching of the backlight lamps is synchronized with it, so a full cycle takes 16.7 ms (60 Hz). The eye’s sensitivity to flicker depends on many conditions (recall, for example, that the 100 Hz flicker of an ordinary fluorescent lamp with an electromagnetic ballast is hard to notice when you look straight at it, but easy to notice in peripheral vision), so it seems reasonable to assume the monitor simply lacks vertical scan frequency. Still, using as many as 16 backlight lamps clearly helps: as we know well from CRT monitors, if the entire screen flickered at 60 Hz, no close inspection would be needed to notice it, and working at such a monitor would be thoroughly unpleasant.

The most reasonable way out of this situation seems to be a transition to a 75 or even 85 Hz refresh rate in LCD monitors. Some readers may object that many monitors already support 75 Hz — but, alas, I have to disappoint them: in the vast majority of cases this support exists only on paper. The monitor receives 75 frames per second from the computer, simply throws away every fifth frame, and continues to display the same 60 frames per second on its matrix. You can document this behavior by photographing an object moving quickly across the screen with a sufficiently long shutter speed (about 1/5 of a second, so the camera captures a dozen or so monitor frames): on many monitors, at 60 Hz the photograph will show the object moving uniformly across the screen, while at 75 Hz gaps will appear in its track. Subjectively this is felt as a loss of smoothness of motion.
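This “paper-only” 75 Hz mode can be modeled as simple rate resampling: the monitor accepts frames at 75 fps but refreshes its matrix at 60 Hz, so some input frames never appear. A sketch (the accumulator logic is my own illustration, not actual scaler firmware):

```python
def displayed_frames(input_fps: int, panel_fps: int, count: int) -> list:
    """Indices of the input frames that actually reach the matrix when
    the panel silently resamples the input rate down to its own rate."""
    shown, position = [], 0.0
    step = input_fps / panel_fps  # 75/60 = 1.25 input frames per refresh
    for _ in range(count):
        shown.append(int(position))
        position += step
    return shown

print(displayed_frames(75, 60, 8))  # [0, 1, 2, 3, 5, 6, 7, 8] -- frame 4 dropped
print(displayed_frames(60, 60, 4))  # [0, 1, 2, 3] -- nothing lost at 60 Hz
```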

In addition to this obstacle — which I am sure can easily be overcome if monitor manufacturers want to — there is another: raising the frame rate raises the required bandwidth of the interface connecting the monitor. In other words, to switch to a 75 Hz refresh rate, monitors with working resolutions of 1600x1200 and 1680x1050 will need two-channel Dual Link DVI, since the 165 MHz pixel clock of single-channel Single Link DVI will no longer be enough. This problem is not fundamental, but it does impose some restrictions on compatibility between monitors and video cards, especially older ones.
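The bandwidth claim is easy to sanity-check. Single Link DVI tops out at a 165 MHz pixel clock; the blanking overhead factor below is an assumed round figure (real timing standards vary around 20–40%), so the numbers are estimates, not exact CVT values:

```python
SINGLE_LINK_DVI_MHZ = 165.0  # maximum pixel clock of Single Link DVI

def pixel_clock_mhz(width: int, height: int, refresh_hz: int,
                    blanking_overhead: float = 1.25) -> float:
    """Rough pixel clock: visible pixels per second times an assumed
    blanking overhead."""
    return width * height * refresh_hz * blanking_overhead / 1e6

for mode in [(1600, 1200, 60), (1600, 1200, 75), (1680, 1050, 75)]:
    clock = pixel_clock_mhz(*mode)
    link = "Single Link" if clock <= SINGLE_LINK_DVI_MHZ else "Dual Link"
    print(mode, f"{clock:.1f} MHz -> {link}")
```

With this estimate, 1600x1200 fits Single Link at 60 Hz but not at 75 Hz, and 1680x1050 at 75 Hz lands just over the limit — in line with the paragraph above.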

Interestingly, increasing the frame rate by itself reduces image blur even at the same specified panel response time — and again the effect is tied to retinal inertia. Say the picture moves one centimeter across the screen during one frame period at 60 Hz (16.7 ms): after the frame changes, the retina records the new picture plus the shadow of the old one, shifted by a centimeter, superimposed on it. Double the frame rate, and the eye records frames at intervals of roughly 8.3 ms instead of 16.7 ms — so the shift between the old and new pictures is halved, and from the eye’s point of view the trail behind a moving image becomes half as long. Clearly, in the ideal case of a very high frame rate we would get exactly the picture we see in real life, with no additional artificial blur.
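The halving of the trail in this example is just the frame period shrinking. A one-line sketch:

```python
def trail_shift_cm(speed_cm_per_s: float, refresh_hz: float) -> float:
    """Offset between the new frame and the retinal afterimage of the
    old one: the distance the object moves in one frame period."""
    return speed_cm_per_s / refresh_hz

print(trail_shift_cm(60, 60))   # 1.0 cm at 60 Hz -- the article's example
print(trail_shift_cm(60, 120))  # 0.5 cm at 120 Hz -- the trail is halved
```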

Here, however, you need to understand that it is not enough to increase only the frame rate of the monitor, as was done in CRTs to combat screen flickering - it is necessary that all image frames be unique, otherwise there will be absolutely no point in increasing the frequency.

In games this leads to an interesting effect: since in most new titles even modern video cards consider 60 FPS quite a good result, raising the LCD monitor’s scan frequency by itself will not reduce blur until you install a video card powerful enough to run the game at a frame rate matching the monitor’s scan rate, or lower the game’s graphics quality far enough. In other words, on LCD monitors with a real 85 or 100 Hz refresh rate, image blur in games will — if only slightly — also depend on the speed of the video card, whereas we are accustomed to thinking of blur as depending solely on the monitor.

The situation with films is more complicated still: whatever video card you install, the frame rate of a film remains 25, at most 30 frames per second, so raising the monitor’s own frame rate will do nothing to reduce blur in films. In principle there is a way out: during playback, additional frames — each an average of two real frames — can be computed in software and inserted into the video stream. Incidentally, this approach reduces blur in films even on existing monitors, because their 60 Hz frame rate is at least twice the frame rate of a film, so there is headroom.
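A naive version of such interpolation — a plain per-pixel average of neighboring frames, with frames modeled as flat lists of grayscale values — might look like this. (Real DSPs use motion estimation rather than averaging; this is only a sketch of the idea.)

```python
def interpolate(frame_a: list, frame_b: list) -> list:
    """Intermediate frame as a per-pixel average of two real frames."""
    return [(a + b) // 2 for a, b in zip(frame_a, frame_b)]

def upsample(frames: list) -> list:
    """Insert one averaged frame between every pair: 25 fps -> ~50 fps."""
    out = []
    for a, b in zip(frames, frames[1:]):
        out += [a, interpolate(a, b)]
    return out + [frames[-1]]

# Two 4-pixel "frames" of a bright object moving one pixel to the right:
print(upsample([[255, 0, 0, 0], [0, 255, 0, 0]]))
# [[255, 0, 0, 0], [127, 127, 0, 0], [0, 255, 0, 0]]
```

The averaged middle frame smears the object across both positions, which hints at why aggressive interpolation can look unnatural.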

This scheme has already been implemented in the 100 Hz Samsung LE4073BD TV: it carries a DSP that automatically computes intermediate frames and inserts them into the video stream between the original ones. On the one hand, the LE4073BD really does show noticeably less blur than TVs without this function; on the other hand, the new technology has an unexpected side effect: the image begins to resemble cheap soap operas with their unnaturally smooth motion. Some may like this, but experience shows that most people prefer the slight blur of a regular monitor to the new “soap opera” effect — especially since, in films, the blur of modern LCD monitors is already at the edge of perception.

Of course, in addition to these problems, purely technical obstacles will arise - raising the frame rate above 60 Hz will mean the need to use Dual Link DVI on monitors with a resolution of 1680x1050.

To summarize briefly, three main points can be noted:

a) When the real response time of an LCD monitor falls below 10 ms, reducing it further gives a weaker effect than expected, because retinal inertia starts to play a role. In CRT monitors, the dark gap between frames gives the afterimage on the retina time to fade, while in classic LCD monitors there is no such gap: the frames follow one another continuously. Therefore, manufacturers’ further efforts to speed up monitors will be aimed not so much at reducing nominal response time as at combating retinal inertia. Moreover, this problem affects not only LCD monitors but any other active-matrix technology in which the pixel glows continuously.

b) The most promising approach at the moment seems to be short-term extinguishing of the backlight lamps, as in the BenQ FP241WZ: it is relatively easy to implement (the only downside is the need for many backlight lamps in a particular arrangement, a quite solvable problem for large-diagonal monitors), it suits all matrix types, and it has no intractable shortcomings. It may only be necessary to raise the scan frequency of new monitors to 75...85 Hz — though perhaps manufacturers will find other ways to solve the flicker noticeable on the FP241WZ, so for a final verdict it is worth waiting for other backlight-dimming models to reach the market.

c) Generally speaking, from the point of view of most users, modern monitors (on any type of matrix) are fast enough even without such technologies, so there is no need to hold out for backlight-dimming models unless the current ones definitely do not satisfy you.

Display Delay (Input Lag)

The topic of frame display delay in some monitor models, widely discussed on various forums of late, only superficially resembles the topic of response time — it is in fact a completely different effect. With ordinary blur, a frame received by the monitor begins to be displayed immediately but takes some time to render completely; with input lag, some time — a multiple of the monitor’s frame period — passes between the frame’s arrival from the video card and the start of its display. In other words, the monitor contains a frame buffer — ordinary RAM — holding one or more frames: a new frame from the video card is first written to the buffer, and only then shown on screen.

Objectively measuring this delay is quite simple - you need to connect two monitors (CRT and LCD or two different LCDs) to the two outputs of one video card in cloning mode, then run a timer on them that shows milliseconds, and take a series of photographs of the screens of these monitors. Then, if one of them has a delay, the timers in the photographs will differ by the amount of this delay - while one monitor shows the current timer value, the second will show the value that was several frames earlier. To obtain a reliable result, it is advisable to take at least a couple of dozen photographs, and then discard those that were clearly taken at the time of the frame change. The diagram below shows the results of such measurements for the Samsung SyncMaster 215TW monitor (compared to an LCD monitor that does not have any delay), the horizontal axis shows the difference in the timer readings on the screens of the two monitors, and the vertical axis shows the number of frames with such a difference:


A total of 20 photographs were taken, 4 of which were clearly caught at the moment of frame change (two values ​​were superimposed on each other in the timer images, one from the old frame, the second from the new), two frames gave a difference of 63 ms, three frames - 33 ms, and 11 frames - 47 ms. Obviously, the correct result for the 215TW is a latency value of 47ms, which is about three frames.

As a small digression, I note that you should treat forum posts whose authors claim abnormally low or abnormally high delay on their particular monitors with some skepticism. As a rule, they do not gather enough statistics but rely on a single photograph — and, as you saw above, an individual shot can accidentally “catch” a value both above and below the real one, the more likely the faster the shutter speed set on the camera. To get real numbers, you need to take a dozen or two shots and select the most frequent delay value.
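The procedure just described — discard the mid-frame-change shots, then take the most frequent value — is easy to automate. A sketch using the article’s own 215TW series (discarded shots marked as None):

```python
from collections import Counter

def input_lag_ms(measurements: list) -> int:
    """Most common latency among valid timer photos; None marks shots
    caught during a frame change and is discarded."""
    valid = [m for m in measurements if m is not None]
    return Counter(valid).most_common(1)[0][0]

# 20 shots: 4 discarded, 2 x 63 ms, 3 x 33 ms, 11 x 47 ms
shots = [None] * 4 + [63] * 2 + [33] * 3 + [47] * 11
print(input_lag_ms(shots))  # 47
```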

However, all this is beside the point for us as buyers — you are hardly going to photograph timers on a monitor in a store before purchasing it. From a practical standpoint the far more interesting question is whether this delay deserves any attention at all. As an example we will take the aforementioned SyncMaster 215TW with its 47 ms delay: I know of no monitors with higher values, so the choice is quite reasonable.

Viewed against the speed of human reaction, 47 ms is a fairly short interval, comparable to the time a signal takes to travel along nerve fibers from the brain to the muscles. Medicine uses the term “simple sensorimotor reaction time” (PSMR): the interval between the appearance of a signal simple enough for the brain to process (say, a light turning on) and the muscular response (say, pressing a button). On average, a person’s PSMR time is about 200...250 ms, which includes the eye registering the event and relaying it to the brain, the brain recognizing the event, and the command traveling from the brain to the muscles. Against that figure, a 47 ms delay does not look large.

During normal office work such a delay is simply impossible to notice. You can try as long as you like to spot the difference between the movement of the mouse and the movement of the cursor on screen, but the time the brain needs to process these events and link them together is far too great for 47 ms to matter. (Note that tracking a moving cursor is a much harder task than watching for a light in the PSMR test, so we are no longer dealing with a simple reaction, and the reaction time will be longer still.)

Nevertheless, on forums many users report that on a new monitor cursor movement feels sluggish and “cottony”, that they struggle to hit small buttons and icons on the first try, and so on — and blame it all on a delay that was absent on the old monitor but present on the new one.

Meanwhile, most people upgrade to the new, larger monitors either from 19" models with a resolution of 1280x1024 or from CRT monitors altogether. Take the move from a 19" LCD to the aforementioned 215TW: the horizontal resolution grows by about a third (from 1280 to 1680 pixels), which means that to move the cursor from the left edge of the screen to the right, the mouse itself must travel a greater distance — provided its resolution and settings stay the same. This is where the sluggish, “cottony” feel comes from: reduce the cursor speed by a third in the mouse driver settings on your current monitor, and you will get exactly the same sensation.

It is exactly the same with missed buttons after changing the monitor: our nervous system, sad as it is to admit, is simply too slow to register with the eyes the moment "the cursor has reached the button" and transmit the impulse to the finger before the cursor leaves the button. In reality, accuracy in hitting buttons is nothing more than precision of movement: the brain knows in advance which hand movement corresponds to which cursor movement, and with what lead time it must send the command to the finger so that the click lands exactly when the cursor is over the right button. Naturally, when both the resolution and the physical size of the screen change, all this learned precision becomes useless - the brain has to adapt to the new conditions, and at first, while it acts out of old habit, you will indeed sometimes miss. The delay introduced by the monitor has absolutely nothing to do with it. As in the previous experiment, the same effect can be reproduced by simply changing the mouse sensitivity: increase it, and at first you will overshoot the buttons; decrease it, and you will stop the cursor short of them. After a while the brain adapts to the new conditions, and you start hitting the buttons again.

Therefore, if you change your monitor to one with a significantly different resolution or screen size, do not be lazy: go into the mouse settings and experiment a little with its sensitivity. If you have an old mouse with low optical resolution, it is worth considering a new, more sensitive one - it will track more smoothly at high speed settings. Honestly, compared to the cost of a new monitor, an extra 20 dollars for a good mouse is hardly ruinous.

So, we have sorted out office work; next come films. In theory, a problem could arise here from desynchronization between the sound (which arrives without delay) and the image (which the monitor delays by 47 ms). However, a little experimenting in any video editor easily establishes that a person notices audio/video desynchronization in films only at a difference on the order of 200...300 ms, many times more than the monitor in question introduces. 47 ms is barely more than the period of one film frame (at 25 frames per second the period is 40 ms); such a small offset between sound and image is impossible to notice.
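The numbers compared in this paragraph, as a small sketch (the 200 ms perceptibility threshold is the lower bound quoted above, not a universal constant):

```python
# Comparing the monitor's 47 ms delay with film timing and A/V desync thresholds.
fps = 25
frame_period_ms = 1000 / fps      # one film frame at 25 fps
monitor_delay_ms = 47             # the measured display delay
desync_threshold_ms = 200         # lower bound at which people notice A/V desync

print(frame_period_ms)                          # 40.0 ms per frame
print(monitor_delay_ms < desync_threshold_ms)   # True: far below the threshold
```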

And finally, the most interesting part - games, the only area where, at least in some cases, the delay introduced by the monitor can matter. It should be noted, though, that many of those discussing the problem on forums tend to greatly exaggerate it: for most people and in most games, the notorious 47 ms plays no role. The exception is perhaps the situation in a multiplayer shooter where you and your opponent spot each other at the same moment - there, reaction speed really does decide the outcome, and an extra 47 ms can become significant. But if you notice the enemy half a second later than he notices you, no milliseconds will save the situation anyway.

Note that monitor delay affects neither aiming accuracy in FPS games nor cornering accuracy in racing games. In all these cases the same precision of movement is at work: our nervous system cannot react fast enough to press "fire" exactly at the instant the crosshair lands on the enemy, but it adapts superbly to all sorts of conditions - in particular, to the need to issue the "press!" command while the crosshair has not yet reached the enemy. Any small additional delay therefore simply forces the brain to readjust slightly; moreover, if a person accustomed to a monitor with a delay is moved to a model without one, he will have to get used to it in exactly the same way, and for the first quarter of an hour the new monitor will feel suspiciously uncomfortable.

And finally, I have several times seen forum stories about how the notorious latency makes games on a new monitor outright unplayable, which on closer inspection boiled down to the fact that a person, having moved from the 1280x1024 of the old monitor to the 1680x1050 of the new one, simply had not considered that his old video card would struggle at that resolution. So be careful when reading forums: as a rule, you know nothing about the technical literacy of those writing there, and you cannot tell in advance whether things that are obvious to you are obvious to them.

The discussion of monitor delays is aggravated by two further traits inherent, to one degree or another, in most people. First, many are prone to overly complex explanations of simple phenomena: they prefer to believe that a bright dot in the sky is a UFO rather than an ordinary weather balloon, that strange shadows in NASA's lunar photographs indicate not uneven lunar terrain but that people never flew to the Moon, and so on. Indeed, anyone familiar with the activities of ufologists and similar organizations will tell you that most of their so-called discoveries stem not so much from a lack of simple "earthly" explanations as from an unwillingness to look for simple explanations at all, jumping a priori to overly complex theories. Strange as the analogy between ufologists and monitor buyers may be, the latter, once on a forum, often behave the same way: for the most part they do not even consider that with a significant change in resolution and diagonal, the feel of working with a monitor will change completely regardless of any delays - they move straight to discussing how the objectively insignificant 47 ms delay affects the movement of the mouse cursor.

Secondly, people are prone to suggestion. Take two bottles of beer, one obviously cheap and one obviously expensive, and pour the same beer into both - the vast majority of people, having tasted it, will say the beer from the expensive-looking bottle tastes better. Cover the labels with opaque tape, and opinions split evenly. The problem is that our brain cannot fully abstract away external factors: seeing expensive packaging, we subconsciously begin to expect higher quality from its contents, and vice versa. To combat this, all serious subjective comparisons are carried out as blind tests: the samples under study are given arbitrary numbers, and none of the participating experts knows until the very end which number corresponds to which real brand.

Much the same happens with the topic of display delay. A person who has just bought, or is about to buy, a new monitor goes to a monitor forum, where he immediately finds multi-page threads about the delay, telling him about "sluggish mouse movements", about how impossible it is to game on such a monitor, and many other horrors. And, of course, a number of people there claim to see this delay with their own eyes. Having read all this, the person goes to the store and examines the monitor with the thought "there must be a delay here - people can see it!" Naturally, after a while he begins to see it himself - or rather, to believe that he sees it - whereupon he returns home and posts "Yes, I looked at this monitor, there really is a delay!" There are even more amusing cases, when people write outright: "I have been sitting at the monitor in question for two weeks now, but only now, after reading the forum, have I clearly seen the delay on it."

Some time ago, videos gained popularity on YouTube in which a window is dragged up and down with the mouse across two adjacent monitors (working in desktop-extension mode) - and you can supposedly see clearly how much the window lags on the monitor with the delay. The videos are pretty, but consider: a monitor with a 60 Hz refresh rate is filmed by a camera whose own sensor scans at 50 Hz, the result is saved to a video file at 25 frames per second and uploaded to YouTube, which may well re-encode it yet again without telling us... Do you think much of the original remains after all these transformations? In my opinion, not much at all. Viewing one of these videos frame by frame (after saving it from YouTube and opening it in a video editor) demonstrated this especially clearly: at some moments the difference between the two filmed monitors is noticeably greater than the aforementioned 47 ms, at others the windows move in sync as if there were no delay at all... In short, complete chaos, senseless and merciless.

So, let's make a short conclusion:

a) In some monitors, the display delay is objectively present; the maximum reliably recorded value is 47 ms.

b) A delay of this magnitude cannot be noticed either during normal work or in films. In games it can be significant at some points for well-trained players, but in most cases and for most people it is invisible in games.

c) As a rule, discomfort when changing a monitor to a model with a larger diagonal and resolution occurs due to insufficient speed or sensitivity of the mouse, insufficient speed of the video card, as well as the change in screen size itself. However, many people, having read too much on forums, a priori attribute any discomfort on a new monitor to problems with display lag.

To put it in a nutshell: theoretically the problem exists, but its practical significance is greatly exaggerated. The vast majority of people will never notice a delay of 47 ms anywhere, let alone lower delay values.

Contrast: nameplate, real and dynamic

Perhaps the statement "the contrast of a good CRT monitor is higher than that of an LCD monitor" has long been taken by many as an a priori truth requiring no additional proof - after all, we can all see how noticeably a black background glows in the dark on an LCD screen. No, I am not going to refute this outright; it is hard to argue with what you see perfectly well with your own eyes, even sitting in front of the latest S-PVA matrix with a rated contrast ratio of 1000:1.

Rated contrast is, as a rule, measured not by the monitor manufacturers but by the manufacturers of the LCD matrices themselves, on a special test stand, with a specific input signal and a specific backlight brightness. It is the ratio of the white level to the black level.

In finished monitors, the picture is complicated, first of all, by the fact that the black level is determined not only by the characteristics of the matrix but also - sometimes - by the monitor's own settings, primarily in models where brightness is controlled via the matrix rather than the backlight lamps. In such a case, if the monitor is tuned carelessly, its contrast can turn out to be much lower than the rated contrast of the matrix. This effect is clearly visible on Sony monitors, which have two brightness adjustments at once - via the matrix and via the lamps: when the matrix brightness is raised above 50%, black quickly turns into gray.

Here I would like to note once again that the opinion that rated contrast can be raised by increasing the backlight brightness - supposedly the reason many monitor manufacturers install such powerful lamps - is completely wrong. As backlight brightness grows, the white and black levels increase at the same rate, so their ratio, which is the contrast, does not change. Backlighting alone cannot raise the white level without raising the brightness of black.
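The invariance is easy to demonstrate with any numbers; a minimal sketch (the brightness values are made up purely for illustration):

```python
# Contrast is the white/black ratio; scaling the backlight scales both levels
# by the same factor k, so the ratio cannot change. (Illustrative values only.)
white, black = 500.0, 0.5    # cd/m^2 -> rated contrast 1000:1
k = 2.0                      # backlight brightness doubled

assert (white * k) / (black * k) == white / black
print(white / black)         # 1000.0 either way
```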

However, all this has already been said many times before, so let's move on to consider other issues.

Undoubtedly, the rated contrast of modern LCD monitors is still not high enough to compete with good CRT monitors on this parameter - in the dark, their screens glow noticeably even when the picture is completely black. But we most often use monitors not in darkness but in daylight, sometimes quite bright daylight. Obviously, the real contrast we observe then differs from the rated figure measured in the semi-darkness of a laboratory: external light reflected by the screen is added to the monitor's own glow.


Above is a photo of two monitors standing side by side: a Samsung SyncMaster 950p+ CRT and a SyncMaster 215TW LCD. Both are switched off; the lighting is ordinary daylight on a cloudy day. It is clearly visible that under external lighting the CRT screen is not just lighter, but much lighter than the LCD screen - exactly the opposite of what we see in the dark with the monitors switched on.

The explanation is very simple: the phosphor used in cathode ray tubes is itself light gray. To darken the screen, a tinted film is applied to its glass. Since the phosphor's own glow passes through this film once, while external light passes through it twice (first on the way in to the phosphor, then, reflected from the phosphor, on the way out to our eye), the latter is attenuated by the film considerably more than the former.

However, a completely black screen cannot be achieved on a CRT: as the film's transparency is reduced, the brightness of the phosphor glow has to be increased, because the film attenuates it too. And that brightness is limited in a CRT to a fairly modest level, since too large an electron beam current badly degrades its focusing, making the image fuzzy and blurry. For this reason, the maximum reasonable brightness of CRT monitors does not exceed 150 cd/m².

In an LCD matrix, there is practically nothing for external light to be reflected from; there is no phosphor in it, only layers of glass, polarizers and liquid crystals. Of course, some small part of the light is reflected from the outer surface of the screen, but most of it freely passes inside and is lost there forever. Therefore, in daylight, the screen of an LCD monitor that is turned off looks almost black.

So, in daylight with both monitors off, a CRT screen is significantly lighter than an LCD screen. If we turn both monitors on, the LCD, due to its lower rated contrast, gains a larger increase in black level than the CRT - but even so, its black remains darker than the CRT's. If we now draw the curtains, "switching off" the daylight, the situation reverses, and the CRT ends up with the deeper black.

Thus, the real contrast of monitors depends on the external illumination: the higher it is, the more advantageous the position is for LCD monitors; even in bright light, the picture on them remains contrasty, while on a CRT it noticeably fades. In the dark, on the contrary, the advantage is on the side of the CRT.

By the way, this is partly the basis for the good appearance - at least in the store window - of monitors with a glossy screen surface. A regular matte coating scatters the light falling on it in all directions, while a glossy one reflects it purposefully, like a regular mirror - therefore, if the light source is not located directly behind you, then a matrix with a glossy coating will look more contrasting than a matte one. Alas, if the light source suddenly turns out to be behind you, the picture changes radically - a matte screen still scatters light more or less evenly, but a glossy one will reflect it directly into your eyes.

Note that all these considerations apply not only to LCD and CRT monitors but to other display technologies as well. For example, the SED panels promised to us in the near future by Toshiba and Canon, with their fantastic rated contrast of 100,000:1 (in other words, black on them in the dark is truly black), will fade in daylight exactly as CRTs do. They use the same phosphor excited by an electron beam, with the same black tinted film in front of it; but if in a CRT the reduction of the tint's transparency (and thus the increase in contrast) was limited by beam defocusing, in SED it will be limited by the lifetime of the emitter cathodes, which drops noticeably as beam current increases.

However, LCD monitor models have recently appeared on the market with unusually high claimed contrast figures - up to 3000:1 - while using the same matrices as monitors with more familiar numbers in their specifications. The explanation is that such values, large by LCD standards, refer not to the "normal" contrast but to the so-called dynamic contrast.

The idea is, in general, simple: any film contains both light scenes and dark ones. In either case our eye perceives the brightness of the picture as a whole: if most of the screen is light, the black level in a few dark areas does not matter much, and vice versa. It therefore seems quite reasonable to adjust the backlight brightness automatically according to the on-screen image: in dark scenes the backlight can be dimmed, making them darker still, and in light scenes driven to maximum brightness. This automatic adjustment is what is called "dynamic contrast".

The official figures for dynamic contrast are obtained very simply: the white level is measured at maximum backlight brightness, the black level at minimum. As a result, if the matrix has a rated contrast of 1000:1 and the monitor electronics can automatically vary the backlight brightness threefold, the resulting dynamic contrast will be 3000:1.
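In other words, the headline figure is just the product of two numbers; a one-line sketch with the values from the text:

```python
# Dynamic contrast as advertised: static matrix contrast times the factor by
# which the electronics can automatically vary the backlight brightness.
static_contrast = 1000   # matrix rated at 1000:1
backlight_factor = 3     # backlight adjustable over a 3x range

print(f"{static_contrast * backlight_factor}:1")   # 3000:1
```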

At the same time, you need to understand that dynamic contrast mode is suitable only for films, and perhaps games - though in games players would sooner raise the brightness in dark scenes to make it easier to see what is happening than lower it. For ordinary work, brightness that automatically changes with the on-screen image is not merely useless but downright irritating.

Of course, at any given moment the screen contrast - the ratio of white level to black level - does not exceed the monitor's rated static contrast. However, as noted above, in light scenes the eye cares little about the black level, and in dark scenes little about the white level, so automatic brightness adjustment in films is genuinely useful and really does give the impression of a monitor with a noticeably extended dynamic range.

The only downside of the technology is that brightness is controlled for the screen as a whole, so in scenes combining light and dark objects in equal proportions the monitor will simply settle on some average brightness. Dynamic contrast also gives nothing in dark scenes with individual small, very bright objects (say, a night street with lanterns): since the overall background is dark, the monitor will reduce the brightness to minimum, dimming the bright objects accordingly. Still, as noted above, thanks to the peculiarities of our perception these shortcomings are barely noticeable and in any case less significant than the insufficient contrast of conventional monitors. Overall, the new technology should appeal to many users.

Color rendering: color gamut and LED backlight

A little over two years ago, in the article “Parameters of modern LCD monitors,” I wrote that such a parameter as color gamut is generally unimportant for monitors - simply because it is the same for all monitors. Fortunately, since then the situation has changed for the better - monitor models with increased color gamut have begun to appear on sale.

So what is color gamut?

As is known, humans see light in the wavelength range from approximately 380 to 700 nm, from violet to red. Four types of detectors act as light-sensitive elements in our eye - one type of rods and three types of cones. Rods have excellent sensitivity, but do not distinguish between different wavelengths at all; they perceive the entire range as a whole, which gives us black-and-white vision. Cones, on the contrary, have significantly less sensitivity (and therefore stop working at dusk), but with sufficient illumination they give us color vision - each of the three types of cones is sensitive to its own wavelength range. If a beam of monochromatic light with a wavelength of, say, 400 nm hits our eye, then only one type of cone, responsible for blue color, will react to it. Thus, different types of cones perform approximately the same function as the RGB filters in front of the digital camera sensor.

Although at first glance it might seem that our color vision can therefore be described by three numbers corresponding to the levels of red, green and blue, this is not the case. As experiments conducted at the beginning of the last century showed, the processing of information by the eye and brain is more complicated: if we try to describe color perception in three coordinates (red, green, blue), it turns out that the eye can happily perceive colors for which the red value in such a system comes out... negative. In other words, human vision cannot be fully described by the RGB system - the actual spectral sensitivity curves of the different cone types are somewhat more complex.


As a result of those experiments, a system was created that describes the entire range of colors perceived by our eyes. Its graphical representation is called the CIE diagram and is shown in the figure above. Inside the shaded area lie all the colors our eyes perceive; the outline of the area corresponds to pure, monochromatic colors, and the interior to non-monochromatic ones, up to white (marked with a white dot). In fact, "white" is, from the eye's point of view, a relative concept: depending on conditions, we can consider quite different colors to be white. On the CIE diagram the white point is usually taken to be the so-called flat-spectrum point with coordinates x = y = 1/3; under normal conditions the corresponding color looks very cold, bluish.

On the CIE diagram, any color perceived by the human eye can be specified by two numbers: its coordinates x and y along the horizontal and vertical axes. What is remarkable is not this, but the fact that any color can be recreated from a set of several monochromatic colors mixed in a certain proportion - our eye is entirely indifferent to what spectrum the incoming light actually had; all that matters is how strongly each type of receptor, rod and cone, was excited.

If human vision were successfully described by the RGB model, then to emulate any color the eye can see it would suffice to take three sources - red, green and blue - and mix them in the right proportions. But since, as noted above, we actually see more colors than RGB can describe, in practice the question is the reverse: given three sources of different colors, which colors can we obtain by mixing them?


The answer is simple and intuitive: if you plot the coordinates of these three colors on the CIE diagram, everything obtainable by mixing them lies inside the triangle with vertices at those points. This triangle is what is called the "color gamut".
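Checking whether a given color is reproducible by three primaries thus reduces to a plain point-in-triangle test on the CIE xy plane. A minimal sketch (the primary coordinates here are the common sRGB values, used purely as an example - they are not taken from the article):

```python
# Is a color (x, y) on the CIE diagram inside the gamut triangle of 3 primaries?
def edge_sign(p, a, b):
    """Signed-area test: which side of the edge a->b the point p falls on."""
    return (p[0] - b[0]) * (a[1] - b[1]) - (a[0] - b[0]) * (p[1] - b[1])

def in_gamut(p, r, g, b):
    s1, s2, s3 = edge_sign(p, r, g), edge_sign(p, g, b), edge_sign(p, b, r)
    # Inside (or on an edge) iff all three signs agree.
    has_neg = s1 < 0 or s2 < 0 or s3 < 0
    has_pos = s1 > 0 or s2 > 0 or s3 > 0
    return not (has_neg and has_pos)

R, G, B = (0.64, 0.33), (0.30, 0.60), (0.15, 0.06)   # sRGB primaries (CIE xy)
print(in_gamut((0.3127, 0.3290), R, G, B))   # D65 white point -> True
print(in_gamut((0.08, 0.84), R, G, B))       # saturated green -> False
```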

The maximum possible gamut for a three-primary system is provided by the so-called laser display (see the figure above), whose primaries are formed by three lasers: red, green and blue. A laser's emission spectrum is extremely narrow - its monochromaticity is excellent - so the coordinates of the corresponding primaries lie exactly on the border of the diagram. They cannot be moved outside the border, which is a non-physical region whose points correspond to no light at all, and any shift of the points inward reduces the area of the triangle and, accordingly, the gamut.

As the figure clearly shows, even a laser display cannot reproduce every color the human eye sees, though it comes quite close. The gamut can be increased only by using more primaries (four, five, and so on), or by building some hypothetical system that changes the coordinates of its primaries on the fly - but while the first is merely technically difficult at present, the second is unrealizable altogether.

In any case, it is too early to grieve over the shortcomings of laser displays: we do not have them yet, and what we do have falls far short of them. In real monitors, both CRT and LCD (with the exception of a few models discussed below), the spectrum of each primary is quite far from monochromatic; on the CIE diagram this means the vertices of the triangle shift from the diagram's borders toward its center, and the triangle's area shrinks noticeably.

The figure above shows two triangles: one for a laser display and one for so-called sRGB. In short, the latter corresponds almost exactly to the typical color gamut of modern LCD and CRT monitors. A sad picture, isn't it? I am afraid we will not be seeing pure green for a while yet...

The reason for this - in the case of LCD monitors - is the very poor spectrum of the panels' backlight lamps. Cold-cathode fluorescent lamps (CCFL) are used for this purpose: the discharge burning inside them produces ultraviolet radiation, which a phosphor deposited on the walls of the lamp's bulb converts into ordinary white light.

In nature, our light sources are usually hot bodies, above all our Sun. The emission spectrum of such a body is described by Planck's law, but the key point is that it is continuous: all wavelengths are present, and the intensities at nearby wavelengths differ only slightly.

A fluorescent lamp, like other gas-discharge light sources, produces a line spectrum: at some wavelengths there is no radiation at all, while the intensities of spectral components only a few nanometers apart can differ by tens or hundreds of times. Since our eye is indifferent to the specific shape of the spectrum, from its point of view both the Sun and a fluorescent lamp give exactly the same white light. In a monitor, however, things turn out to be somewhat more complicated...

So, several fluorescent lamps behind the LCD matrix shine through it. On the matrix is a grid of colored filters - red, green and blue - forming triads of subpixels. Each filter cuts out of the lamp's light the piece of spectrum corresponding to its passband - and, as we remember, for maximum gamut this piece should be as narrow as possible. Now suppose that at a wavelength of 620 nm the backlight spectrum has an intensity peak of... let's say 100 arbitrary units. For the red subpixel we then install a filter with maximum transmission at that same 620 nm and, it would seem, obtain the first vertex of the gamut triangle lying neatly on the border of the diagram. Or so it would seem.

The phosphor of even a modern fluorescent lamp is a capricious thing: we cannot shape its spectrum at will, only select from the set of phosphors known to chemistry the one that more or less meets our needs. And the best we can choose has another peak in its spectrum, of the same 100 arbitrary units, at a wavelength of 575 nm (that is yellow). Our red filter with its maximum at 620 nm has, at that point, a transmittance of, say, 1/10 of its maximum.

What does this mean? That at the filter's output we get not one wavelength but two: 620 nm at an intensity of 100 units, and 575 nm at 100 × 1/10 = 10 units (the intensity of the lamp's spectral line multiplied by the filter's transmittance at that wavelength). Not so little, on the whole.
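The same bookkeeping as a tiny sketch, using the text's arbitrary units (the transmittance figures are the assumed ones from the example above):

```python
# Lamp spectral peaks leaking through the "red" filter: output intensity is
# the lamp-line intensity times the filter transmittance at that wavelength.
lamp_peaks = {620: 100, 575: 100}     # nm -> intensity, arbitrary units
red_filter = {620: 1.0, 575: 0.1}     # filter transmittance at those lines

output = {wl: i * red_filter[wl] for wl, i in lamp_peaks.items()}
print(output)   # the 575 nm "yellow" line survives at a tenth of its intensity
```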

Thus, due to the “extra” peak in the spectrum of the lamp, which partially breaks through the filter, we got a polychromatic color instead of monochromatic red - red with an admixture of yellow. On the CIE diagram, this means that the corresponding vertex of the gamut triangle has moved from the bottom edge of the diagram upward, closer to yellow shades, reducing the area of ​​the gamut triangle.

However, as they say, it is better to see once than to hear five times. To see the above for myself, I turned for help to the Plasma Physics Department of the Skobeltsyn Institute of Nuclear Physics, and soon had an automated spectrographic system at my disposal. It was built to study and control the growth of artificial diamond films in microwave plasma from the plasma's emission spectra, so it could surely handle a mundane LCD monitor without difficulty.


We switch on the system (the large, angular black box is the Solar TII MS3504i monochromator; on the left is its input port, facing a light guide with an optical system; on the right, attached to the monochromator's output port, is the orange cylinder of the photosensor; the system's power supply sits on top)...


We install the input optical system at the required height and connect the second end of the light guide to it...


And finally, we place it in front of the monitor. The entire system is controlled by a computer, so the process of taking the spectrum in the entire range of interest to us (from 380 to 700 nm) is completed in just a couple of minutes:


The horizontal axis of the graph shows wavelength in ångströms (10 Å = 1 nm), the vertical axis intensity in arbitrary units. For clarity, the graph is colored according to wavelength - as our eyes perceive it.

The test monitor in this case was the Samsung SyncMaster 913N, a fairly old budget model on a TN matrix - but that does not really matter: the same lamps with the same spectrum are used in the vast majority of other modern LCD monitors.

So what do we see in the spectrum? Exactly what was described above: besides three distinct tall peaks corresponding to the blue, green and red subpixels, there is some entirely unwanted garbage in the regions of 570...600 nm and 480...500 nm. It is these extra peaks that push the vertices of the gamut triangle deep into the CIE diagram.

Of course, the best remedy is to abandon CCFL altogether - and some manufacturers have done just that, such as Samsung with its SyncMaster XL20 monitor. Instead of fluorescent lamps, its backlight is a block of LEDs of three colors - red, green and blue (exactly three, because white LEDs would make no sense: the filters would cut red, green and blue out of the backlight spectrum anyway). Each LED has a neat, smooth spectrum that matches the passband of its filter and carries no unwanted side components:


It's fun to watch, isn't it?

Of course, the band of each of the LEDs is quite wide, and their radiation cannot be called strictly monochromatic, so it will not compete with a laser display; but compared with the CCFL spectrum it is a very pleasant picture, in which it is especially worth noting the neat, smooth minima in the two regions where the CCFL had completely unnecessary peaks. It is also interesting that the positions of the maxima of all three peaks have shifted slightly, with red now noticeably closer to the edge of the visible spectrum, which will also have a positive effect on the color gamut.


And here, in fact, is the color gamut. We see that the coverage triangle of the SyncMaster 913N is practically no different from the modest sRGB, and compared to the coverage of the human eye, the green color suffers the most in it. But the color gamut of the XL20 is difficult to confuse with sRGB - it easily captures a significantly larger part of the shades of green and blue-green, as well as deep red. This is, of course, not a laser display, but it is impressive.

However, we won't see LED-backlit home monitors for a long time yet. Even the SyncMaster XL20, whose sales are scheduled to start this spring, will cost about $2000 for a 20" diagonal, and the 21" NEC SpectraView Reference 21 LED costs three times as much; only print-industry professionals, for whom both of these models are primarily intended, are accustomed to such prices for monitors - certainly not home users.

However, do not despair: there is hope for the rest of us too. It lies in the appearance on the market of monitors backlit by the same fluorescent lamps, but with a new phosphor in which the unnecessary peaks in the spectrum are partly suppressed. These lamps are not as good as LEDs, but they are still noticeably better than the old ones: the color gamut they provide is roughly halfway between that of models with old lamps and that of LED-backlit models.

For a numerical comparison, it is customary to express a monitor's color gamut as a percentage of one of the standard gamuts; sRGB is quite small, so NTSC is often used as the reference. Regular sRGB monitors cover about 72% of NTSC, monitors with the enhanced backlight lamps about 97%, and LED-backlit monitors about 114%.
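These percentages are simply ratios of triangle areas on the chromaticity diagram. As a rough illustration - a sketch of the principle, not the exact method used in reviews, since published coverage figures are often computed in the CIE 1976 u'v' space and come out slightly different - here is how the sRGB triangle compares with NTSC 1953 in CIE 1931 xy coordinates:

```python
# Sketch: compare the area of the sRGB color triangle with the
# NTSC 1953 triangle on the CIE 1931 xy chromaticity diagram.
# Note: quoted "% NTSC" figures are often computed in CIE 1976 u'v',
# so this result differs slightly from the commonly cited 72%.

def triangle_area(p1, p2, p3):
    """Shoelace formula for the area of a triangle given (x, y) vertices."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

# Standard primary chromaticities (red, green, blue) in CIE 1931 xy
srgb = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
ntsc = [(0.670, 0.330), (0.210, 0.710), (0.140, 0.080)]

coverage = triangle_area(*srgb) / triangle_area(*ntsc)
print(f"sRGB gamut area is {coverage:.1%} of NTSC 1953")  # roughly 71%
```

The shoelace formula gives the triangle areas directly from the three primaries' chromaticity coordinates, so the whole comparison reduces to one division.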

What does the increased color gamut give us? Manufacturers of LED-backlit monitors usually place photographs of the new monitors next to old ones in their press releases, simply boosting the color saturation in the former. This is not entirely honest, because in reality the new monitors only improve the saturation of colors that lie beyond the gamut of the old ones. And of course, when viewing such a press release on your old monitor, you will never see the difference, because your monitor cannot reproduce those colors anyway - it is like trying to judge a color TV broadcast on a black-and-white set. Then again, the manufacturers can be understood: they have to convey the advantages of the new models in press releases somehow...

In practice, however, there is a difference - I wouldn't call it fundamental, but it definitely speaks in favor of models with an increased color gamut. It shows in very pure, deep red and green colors: if after a long period of working on an LED-backlit monitor you switch back to the good old CCFL, at first you just want to turn up its color saturation, until you realize that this will not help at all - red and green will remain somewhat dull and dirty compared to the "LED" monitor.

Unfortunately, so far the spread of models with the improved backlight is not going quite as we would like: at Samsung, for example, it started with the SyncMaster 931C, a model on a TN matrix. Of course, budget TN monitors also benefit from an increased color gamut, but hardly anyone uses such models for color work, given their frankly poor viewing angles. However, all major manufacturers of LCD panels - LG.Philips LCD, AU Optronics and Samsung - already have S-IPS, MVA and S-PVA panels with diagonals of 26-27" and the new backlight lamps.

In the future, undoubtedly, lamps with new phosphors will completely replace the old ones - and we will finally go beyond the modest coverage of sRGB, for the first time in the entire existence of color computer monitors.

Color rendering: color temperature

In the previous section, I briefly mentioned that the concept of "white color" is subjective and depends on external conditions; now I would like to expand on this topic in a little more detail.

So, there really is no standard white color. One could take a flat spectrum as the standard (that is, one in which the intensities at all wavelengths of the optical range are the same), but there is one problem: in most cases it will look to the human eye not white, but very cold, with a bluish tint.

The point is that, just as a camera adjusts its white balance, our brain adjusts this balance for itself depending on the ambient lighting. The light of an incandescent bulb at home in the evening seems to us only slightly yellowish, while the same lamp lit in light shade on a fine sunny day already looks completely yellow: in both cases our brain tunes its white balance to the prevailing lighting, and in these two cases it is different.

The desired white color is usually specified through the concept of "color temperature": the temperature to which a black body must be heated for the light it emits to look the desired way. Say, the surface of the Sun has a temperature of about 6000 K, and indeed the color temperature of sunlight on a clear day is defined as 6000 K. The filament of an incandescent lamp has a temperature of about 2700 K, and the color temperature of its light is likewise 2700 K. Funnily enough, the higher the body's temperature, the colder its light seems to us, because blue tones begin to predominate in it.
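The link between a body's temperature and the apparent color of its light can be illustrated with Wien's displacement law: the hotter the body, the shorter the wavelength at which its emission peaks. A small sketch:

```python
# Wien's displacement law: the wavelength of peak black-body emission
# is inversely proportional to temperature, lambda_max = b / T.
WIEN_B = 2.898e-3  # Wien's displacement constant, m*K

def peak_wavelength_nm(temperature_k):
    """Peak emission wavelength (in nm) of a black body at the given temperature."""
    return WIEN_B / temperature_k * 1e9

# A ~6000 K body (the Sun) peaks in the middle of the visible range,
# while a ~2700 K filament peaks far into the infrared - its visible
# tail is dominated by red, hence the "warm" light.
print(f"6000 K: peak at {peak_wavelength_nm(6000):.0f} nm")  # ~483 nm (blue-green)
print(f"2700 K: peak at {peak_wavelength_nm(2700):.0f} nm")  # ~1073 nm (infrared)
```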

For sources with a line spectrum - such as the CCFL mentioned above - the concept of color temperature becomes somewhat more conventional, because their radiation obviously cannot be matched directly against the continuous spectrum of a black body. In their case we have to rely on how the eye perceives the spectrum, and instruments for measuring the color temperature of light sources have to mimic the eye's tricky color-perception characteristics.

In the case of monitors, we can adjust the color temperature from the menu: as a rule, there are three or four preset values (on some models, significantly more) plus individual adjustment of the levels of the basic RGB colors. The latter is inconvenient compared to CRT monitors, where it was the temperature itself, not the RGB levels, that was adjusted, but for LCD monitors, apart from some expensive models, this is unfortunately the de facto standard. The purpose of adjusting the monitor's color temperature is obvious: since the ambient light serves as the reference for the eye's white balance, the monitor must be matched to it so that white looks white on it, not bluish or reddish.

What is even more regrettable is that in many monitors the color temperature varies greatly between different gray levels. Gray differs from white only conditionally, merely in brightness, so nothing prevents us from speaking not of white balance but of gray balance, which would be even more correct - and many monitors have a different balance at different gray levels.


Above is a photograph of the ASUS PG191 monitor screen displaying four gray squares of different brightness - more precisely, three versions of this photograph combined. In the first, the gray balance is set by the rightmost (fourth) square; in the second, by the third; in the last, by the second. We cannot say of any of them that it is correct and the others are wrong - in fact, they are all incorrect, because the monitor's color temperature should not depend on which gray level we calculate it from, and here it clearly does. This situation can only be corrected by a hardware calibrator, not by the monitor's own settings.

For this reason, in each review I provide for each monitor a table with color temperature measurements at four different gray levels - if they differ greatly from one another, the monitor's image will be tinted in different tones, as in the picture above.
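For those curious how a number like "6500 K" is obtained from a measurement: an instrument first measures the chromaticity (x, y) of the light, then converts it to a correlated color temperature. A common shortcut for that conversion is McCamy's polynomial approximation; the sketch below illustrates the general principle, not necessarily the algorithm of any particular calibrator:

```python
# Sketch: correlated color temperature (CCT) from CIE 1931 (x, y)
# chromaticity using McCamy's polynomial approximation. Real
# instruments may use other methods (e.g. Robertson's), but the idea
# is the same: chromaticity in, temperature in kelvin out.

def mccamy_cct(x, y):
    """Approximate CCT in kelvin from CIE 1931 xy chromaticity."""
    n = (x - 0.3320) / (0.1858 - y)
    return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33

# The D65 white point (x=0.3127, y=0.3290) should come out near 6500 K
print(f"D65: {mccamy_cct(0.3127, 0.3290):.0f} K")
```

Running this function on the chromaticity of each gray square in the photo above would give a different temperature for each - exactly the mismatch the measurement table in the reviews is meant to expose.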

Workspace ergonomics and monitor settings

Although this topic is not directly related to monitor parameters, I would like to cover it at the end of the article because, as practice shows, the initial setup of an LCD monitor can be difficult for many people, especially those accustomed to CRT monitors.

Firstly, placement. The monitor should be located at arm's length from the person working at it, perhaps slightly farther for a large screen. You shouldn't place the monitor too close - so if you are going to buy a model with a small pixel size (17" monitors at 1280x1024, 20" monitors at 1600x1200 and 1680x1050, 23" at 1920x1200...), consider whether the image will seem too small and illegible to you. If you have such concerns, it is better to look at monitors with the same resolution but a larger diagonal, since the only remaining countermeasure is scaling the fonts and interface elements of Windows (or whatever OS you use), which is not available in all programs and does not always give good results.
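Whether a pixel will be "too small" is easy to estimate in advance from the diagonal and the native resolution. A small sketch, using the model sizes mentioned above as examples:

```python
# Sketch: estimate pixel density (PPI) and pixel pitch from a monitor's
# screen diagonal (inches) and native resolution.
import math

def pixel_density(diagonal_inches, width_px, height_px):
    """Return (pixels per inch, pixel pitch in mm)."""
    diagonal_px = math.hypot(width_px, height_px)  # screen diagonal in pixels
    ppi = diagonal_px / diagonal_inches
    pitch_mm = 25.4 / ppi  # 1 inch = 25.4 mm
    return ppi, pitch_mm

# The examples from the text - all with a fairly fine pixel pitch
for diag, w, h in [(17, 1280, 1024), (20, 1600, 1200), (23, 1920, 1200)]:
    ppi, pitch = pixel_density(diag, w, h)
    print(f'{diag}" {w}x{h}: {ppi:.0f} ppi, pitch {pitch:.3f} mm')
```

For comparison, a 19" 1280x1024 panel works out to a noticeably coarser pitch of about 0.294 mm, which is why the same resolution looks more legible on the larger diagonal.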

Ideally, the height of the monitor should be adjusted so that the top edge of the screen is at eye level: then, while working, your gaze will be directed slightly downward and your eyes will be half-covered by the eyelids, which protects them from drying out (as is well known, we blink too rarely when working). Many budget monitors, even 20" and 22" models, use stands without height adjustment - given the choice, it is better to avoid such models, and in monitors with height adjustment, pay attention to the range of that adjustment. However, almost all modern monitors allow you to remove the original stand and install a standard VESA bracket - an option sometimes worth taking, because a good bracket gives not only freedom to move the screen but also the ability to set it at exactly the height you need, starting from zero relative to the tabletop.

An important point is the lighting of the workplace. Working at a monitor in complete darkness is strictly contraindicated: the sharp transition between a bright screen and a dark background tires the eyes greatly. For watching movies and gaming, a little background lighting is enough, such as one desk or wall lamp; for work, it is better to arrange full lighting of the workplace. You can use incandescent lamps or fluorescent lamps with electronic ballast (both compact ones with E14 or E27 bases and ordinary "tubes"), but fluorescent lamps with electromagnetic ballast should be avoided: such lamps flicker strongly at twice the mains frequency, i.e. 100 Hz, and this flicker can beat against the scan or against the monitor's own backlight flicker, sometimes creating extremely unpleasant effects. Large office premises use banks of fluorescent lamps whose tubes flicker in different phases (achieved either by connecting different lamps to different phases of the supply or by installing phase-shifting circuits), which greatly reduces the visibility of the flicker. At home, where there is usually only one lamp, there is also only one way to combat flicker: modern lamps with electronic ballast.

Having installed the monitor in physical space, you can connect it to the computer and continue the setup in virtual space.

An LCD monitor, unlike a CRT, has exactly one resolution at which it performs well; in all other resolutions it performs worse, so it is best to set its native resolution in the video card settings right away. Here, again, it is worth thinking before the purchase whether the native resolution of the chosen model will seem too large or too small to you, and if necessary adjusting your plans toward a model with a different diagonal or a different resolution.

The frame rate of modern monitors is, by and large, the same for all: 60 Hz. Despite the 75 Hz or even 85 Hz formally declared for many models, when these are selected the monitor's matrix usually continues to operate at the same 60 Hz, and the monitor's electronics simply discard the "extra" frames. So there is no point in chasing high refresh rates: unlike CRTs, LCD monitors do not flicker.

If your monitor has two inputs, digital DVI-D and analog D-Sub, it is better to use the former: it not only gives a higher-quality picture at high resolutions but also simplifies setup. If you have only an analog input, then after connecting and setting the native resolution you should open some sharp, contrasty image - for example, a page of text - and check for unpleasant artifacts such as flickering, waves, interference, fringes around characters and the like. If anything of the sort appears, press the auto-adjust button on the monitor; on many models auto-adjustment runs automatically when the resolution changes, but the smooth, low-contrast picture of the Windows desktop is not always enough for it to succeed, so you may have to run it again manually. With a DVI-D digital connection such problems do not arise, so when buying a monitor it is worth paying attention to its set of inputs and preferring models with DVI-D.

Almost all modern monitors have default settings that give very high brightness, around 200 cd/m². Such brightness is suitable for working on a sunny day or for watching movies, but not for everyday work: for comparison, the typical brightness of a CRT monitor is about 80...100 cd/m². Therefore, the first thing to do after turning on a new monitor is to set a comfortable brightness. The main thing is to do it without haste, without trying to get the perfect result in one go, and especially without trying to make it "like the old monitor". That an old monitor is pleasing to your eyes says nothing about its fine tuning or image quality - only that your eyes are used to it. A person who has switched to a new monitor from an old CRT with a worn-out tube and a dim image may at first complain about excessive brightness and sharpness - but if a month later the old CRT is put in front of him again, it turns out that he can no longer sit at it, because the picture is too dull and dark.

For this reason, if your eyes feel discomfort when working at the monitor, try changing its settings gradually and in combination: reduce the brightness and contrast a little, work some more, and if the discomfort remains, turn them down a little further... After each such change, give your eyes time to get used to the picture.

In principle, there is a good trick that allows you to quickly bring the brightness of an LCD monitor to an acceptable level: place a sheet of white paper next to the screen and adjust the monitor's brightness and contrast so that the brightness of white on the screen is close to the brightness of the sheet. Of course, this technique assumes that your workplace is well lit.

It is also worth experimenting a little with the color temperature: ideally, white on the monitor screen should be perceived by the eye as white, not bluish or reddish. This perception, however, depends on the ambient lighting, while monitors are factory-adjusted for some average conditions, and many models are also adjusted very sloppily. Try shifting the color temperature warmer or cooler, or moving the RGB level sliders in the monitor menu - this can also have a positive effect, especially if the default color temperature is too high: the eyes tolerate cool shades worse than warm ones.

Unfortunately, many users do not follow these generally simple recommendations - and as a result, multi-page forum topics are born in the spirit of "Help me choose a monitor that doesn't tire the eyes," going as far as compiling lists of monitors that supposedly don't tire the eyes. Gentlemen, I have worked with dozens of monitors, and my eyes never got tired of any of them, with the exception of a couple of ultra-budget models that simply had problems with image sharpness or completely botched color settings. Your eyes get tired not from the monitor, but from its incorrect settings.

In such forum topics it sometimes gets downright ridiculous: people discuss the effect on vision of backlight-lamp flicker (its frequency in modern monitors is usually 200...250 Hz, which of course the eye does not perceive at all), the effect of polarized light, the effect of the too-low or too-high (depending on taste) contrast of modern LCD monitors; there was once even a topic discussing the effect of the backlight lamps' line spectrum on vision. But that, it seems, is a topic for another article - an April Fools' one...