Disk data cache. Increasing the speed of the hard drive. Enabling Record Caching

Cache memory is ultra-fast memory, considerably faster than ordinary RAM.

Cache memory complements RAM rather than replacing it.
When a computer is running, all calculations take place in the processor, while the data for those calculations and their results are stored in RAM. The processor is many times faster than the RAM it exchanges data with, so between two processor operations it may have to wait for one or more accesses to the slower memory. As a result, the processor sits idle from time to time and the overall speed of the computer drops.

The cache memory is managed by a special controller which, by analyzing the program being executed, tries to predict what data and instructions the processor is most likely to need in the near future and loads them into the cache in advance. In other words, the cache controller moves the needed data from RAM into the cache and, when necessary, writes data modified by the processor back to RAM.

The processor's cache serves roughly the same purpose as RAM, but the cache is built into the processor and is therefore faster, partly because of its location: the signal lines running across the motherboard and through the socket are detrimental to speed. Because the cache of a modern personal computer sits directly on the processor die, the signal paths are short and their electrical characteristics are better.

The processor uses cache memory to buffer the most frequently used data, which significantly reduces the time of subsequent accesses to it.

All modern processors have a cache (in English, cache): an array of ultra-fast memory serving as a buffer between the processor and the relatively slow system memory. This buffer stores the blocks of data the CPU is currently working with, significantly reducing the number of accesses the processor makes to system memory, which is extremely slow compared with the processor's own speed.

This significantly increases the overall performance of the processor.
Moreover, in modern processors the cache is no longer a single memory array, as it once was, but is divided into several levels. The fastest but smallest first-level cache (L1), which the processor core works with directly, is most often split into two halves: an instruction cache and a data cache. The L1 cache is backed by the second-level cache, L2, which as a rule is much larger and is unified, with no division into instruction and data caches.

Some desktop processors, following the example of server processors, also acquire a third-level cache, L3. The L3 cache is usually even larger, although somewhat slower than L2 (because the bus between L2 and L3 is narrower than the bus between L1 and L2), but its speed is in any case incomparably higher than that of system memory.
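The benefit of this hierarchy is easy to feel even from a high-level language. The sketch below (an illustration I added, not anything from the article; the matrix size is arbitrary) sums the same matrix twice: row by row, which reads memory sequentially and keeps the caches hot, and column by column, which jumps around and works against prefetching. In CPython the effect is muted by interpreter overhead; in C or C++ the difference can be several-fold.

```python
import time

N = 500
matrix = [[i * N + j for j in range(N)] for i in range(N)]

def sum_rows(m):
    # Sequential (row-major) access: neighbouring elements tend to sit
    # in the same cache line, so most reads are cache hits.
    total = 0
    for row in m:
        for x in row:
            total += x
    return total

def sum_cols(m):
    # Strided access: each read jumps a whole row ahead, which works
    # against the cache controller's prefetching.
    n = len(m)
    total = 0
    for j in range(n):
        for i in range(n):
            total += m[i][j]
    return total

t0 = time.perf_counter(); by_rows = sum_rows(matrix); t1 = time.perf_counter()
by_cols = sum_cols(matrix); t2 = time.perf_counter()
print(f"rows: {t1 - t0:.4f}s, cols: {t2 - t1:.4f}s")
assert by_rows == by_cols  # same data, same sum; only the access order differs
```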

There are two cache organizations: exclusive and inclusive. In the first case, the information in the caches of all levels is strictly demarcated: each level contains only unique data. In an inclusive cache, information may be duplicated at all caching levels. It is hard to say which of the two schemes is more correct; both have their pros and cons. The exclusive caching scheme is used in AMD processors, the inclusive one in Intel processors.

Exclusive cache memory

An exclusive cache assumes that the information held in L1 and L2 is unique.
When information is read from RAM into the cache, it goes straight into L1. When L1 fills up, information is transferred from L1 to L2.
If the processor does not find the information it needs in L1, it looks in L2. On an L2 hit, the first- and second-level caches exchange lines: the oldest line in L1 is moved to L2, and the required line from L2 is written in its place. On an L2 miss, the access goes to RAM.
The exclusive architecture is used in systems where the difference between the sizes of the first- and second-level caches is relatively small.
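As a sanity check on the description above, here is a toy model of an exclusive two-level cache (a sketch under my own simplifying assumptions: real hardware works with cache lines and sets, and the class name and sizes here are invented for illustration):

```python
from collections import OrderedDict

class ExclusiveCache:
    """Toy exclusive two-level cache: a line lives in L1 or L2, never both."""

    def __init__(self, l1_size=2, l2_size=4):
        self.l1 = OrderedDict()   # most recently used entries at the end
        self.l2 = OrderedDict()
        self.l1_size, self.l2_size = l1_size, l2_size

    def read(self, addr, ram):
        if addr in self.l1:                 # L1 hit
            self.l1.move_to_end(addr)
            return self.l1[addr]
        if addr in self.l2:                 # L2 hit: the line moves up to L1
            value = self.l2.pop(addr)       # ...and leaves L2 (exclusivity)
        else:                               # miss in both levels
            value = ram[addr]
        self._insert_l1(addr, value)
        return value

    def _insert_l1(self, addr, value):
        if len(self.l1) >= self.l1_size:
            old_addr, old_val = self.l1.popitem(last=False)  # oldest L1 line
            self.l2[old_addr] = old_val                      # demoted to L2
            if len(self.l2) > self.l2_size:
                self.l2.popitem(last=False)                  # falls back to RAM
        self.l1[addr] = value

ram = {i: i * 10 for i in range(8)}
cache = ExclusiveCache()
for a in (0, 1, 2):          # fills L1; address 0 is demoted to L2
    cache.read(a, ram)
cache.read(0, ram)           # L2 hit: 0 returns to L1 and leaves L2
assert 0 in cache.l1 and 0 not in cache.l2
```

Note the line exchange on an L2 hit: the requested line leaves L2 for L1 while the oldest L1 line takes its place, so no address is ever duplicated across levels.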

Inclusive cache

An inclusive architecture allows duplication of the information held in L1 and L2.
It works as follows. When information is copied from RAM into the cache, two copies are made: one is stored in L2, the other in L1. When L1 is completely full, information is replaced according to the LRU (Least Recently Used) principle: the data unused for the longest time is evicted. The same happens in the second-level cache, but since it is larger, information survives there longer.

When the processor reads from the cache, the information is taken from L1. If it is not in the first-level cache, it is looked for in L2. If it is found in the second-level cache, it is duplicated into L1 (evicting by the LRU principle) and then passed to the processor. If it is not found in L2, it is read from RAM.
The inclusive architecture is used in systems where the difference between the sizes of the first- and second-level caches is large.
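The same reading procedure can be sketched for the inclusive scheme (again a toy model with invented names and sizes; real hardware tracks cache lines, not dictionary keys):

```python
from collections import OrderedDict

class InclusiveCache:
    """Toy inclusive two-level cache: data read from RAM is copied into
    both levels, and each level evicts its oldest entry by LRU."""

    def __init__(self, l1_size=2, l2_size=4):
        self.l1 = OrderedDict()
        self.l2 = OrderedDict()
        self.l1_size, self.l2_size = l1_size, l2_size

    def _put(self, level, size, addr, value):
        level[addr] = value
        level.move_to_end(addr)
        if len(level) > size:
            level.popitem(last=False)       # LRU: drop the oldest entry

    def read(self, addr, ram):
        if addr in self.l1:                 # L1 hit
            self.l1.move_to_end(addr)
            return self.l1[addr]
        if addr in self.l2:                 # L2 hit: duplicate into L1
            self.l2.move_to_end(addr)
            self._put(self.l1, self.l1_size, addr, self.l2[addr])
            return self.l1[addr]
        value = ram[addr]                   # miss: copy into both levels
        self._put(self.l2, self.l2_size, addr, value)
        self._put(self.l1, self.l1_size, addr, value)
        return value

ram = {i: i for i in range(8)}
cache = InclusiveCache()
for a in (0, 1, 2):          # address 0 is evicted from L1...
    cache.read(a, ram)
assert 0 not in cache.l1 and 0 in cache.l2   # ...but its copy survives in L2
```

Contrast with the exclusive scheme: here a line evicted from L1 needs no write-back to L2, because a copy is already there.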

However, cache memory is ineffective when working with large amounts of data (video, sound, graphics, archives). Such files simply do not fit in the cache, so the processor has to access RAM constantly, or even the hard drive, and all the advantages disappear. This is why budget processors with a reduced cache (for example, Intel Celeron) remain popular: in multimedia tasks that process large volumes of data, performance depends little on cache size, even despite the Celeron's reduced bus frequency.

Hard drive cache

As a rule, all modern hard drives have their own RAM, called cache memory or simply cache; hard drive manufacturers often refer to it as buffer memory. Its size and structure differ significantly between manufacturers and between drive models.

Cache memory acts as a buffer for intermediate data that has already been read from the hard drive but has not yet been transferred for further processing, as well as for data the system accesses frequently. The need for this transit storage stems from the difference between the speed of reading data from the drive and the system's throughput.

Typically, the cache is used for both writing and reading data, although on SCSI drives write caching is usually disabled by default and sometimes has to be enabled explicitly. In any case, the size of the cache memory is not by itself decisive for performance.

What matters more is how the exchange of data with the cache is organized, since that determines disk performance as a whole.
In addition, overall performance is affected by the algorithms of the drive's control electronics, which prevent errors when working with the buffer (storing stale data, segmentation, and so on).

In theory, the larger the cache memory, the higher the likelihood that the necessary data is already in the buffer and there will be no need to disturb the hard drive itself. In practice, however, a disk with a large cache is often not much faster than one with a smaller cache; this is particularly true when working with large files.

The normal functioning of the operating system and the fast operation of programs are ensured by RAM: every user knows that the number of tasks a PC can perform simultaneously depends on how much of it there is. Some computer components are equipped with similar memory of their own, only in smaller amounts. Below we will talk about hard drive cache memory.

Cache memory (or buffer memory, or simply the buffer) is an area where data is stored that has already been read from the hard drive but has not yet been transferred for further processing. The information Windows uses most often is also kept there. The need for this storage arose from the large difference between the speed of reading data from the drive and the system's throughput. Other computer components (processors, video cards, network cards, and so on) have a similar buffer.

Cache volumes

When choosing an HDD, the amount of buffer memory is of no small importance. These devices typically come with 8, 16, 32 or 64 MB of cache, though 128 and 256 MB buffers also exist. The cache fills up and is flushed constantly, so in this respect more is generally better.

Modern HDDs mostly come with 32 or 64 MB of cache (smaller sizes are already rare). Usually this is enough, especially since the operating system has its own caching, which together with RAM speeds up hard drive operation. That said, not everyone chooses the drive with the largest buffer, since such drives are expensive and this parameter is far from the only one that matters.

The main task of cache memory

The cache is used to write and read data but, as already mentioned, this is not the main factor in efficient hard drive operation. What also matters is how the exchange of information with the buffer is organized and how well the error-prevention technologies work.

The buffer holds the most frequently used data, which is served directly from the cache, so performance increases several times over. The point is that no physical read is needed: there is no direct access to the drive and its sectors. A physical read is slow, taking milliseconds, while data is transferred from the buffer many times faster.
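The gap is easy to observe. The snippet below (an illustration I added; the file name and size are arbitrary) reads the same file twice: the first read may have to touch the physical disk, while the second is usually served from cache (the drive's buffer and/or the operating system's own page cache) and completes far faster while returning the very same bytes.

```python
import os
import tempfile
import time

path = os.path.join(tempfile.gettempdir(), "cache_demo.bin")
with open(path, "wb") as f:
    f.write(os.urandom(4 * 1024 * 1024))   # 4 MB of throwaway test data

def timed_read(p):
    start = time.perf_counter()
    with open(p, "rb") as f:
        data = f.read()
    return data, time.perf_counter() - start

cold_data, cold_t = timed_read(path)   # may have to go to the disk
warm_data, warm_t = timed_read(path)   # usually answered from cache
print(f"cold: {cold_t:.4f}s, warm: {warm_t:.4f}s")
assert cold_data == warm_data          # caching never changes the data
os.remove(path)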

Benefits of cache memory

A cache is first of all about fast data processing, but it has other benefits as well. A drive with a large buffer can noticeably offload the processor, keeping its involvement in disk operations to a minimum.

Buffer memory is a kind of accelerator that makes the HDD fast and efficient. It has a positive effect on program startup whenever the same data, no larger than the buffer, is accessed repeatedly. For an ordinary user, 32 or 64 MB is more than enough; beyond that the characteristic loses its significance, since with large files the difference is negligible, and who wants to overpay for a larger cache?

Find out the cache size

While the capacity of a hard drive is easy to find out, the buffer size is another matter. Not every user is interested in this characteristic, but if the question comes up, the figure is usually printed on the device's packaging. Failing that, you can look it up on the Internet or use the free HD Tune program.

The utility, designed for HDDs and SSDs, performs secure data erasure, assesses device health, scans for errors, and shows detailed information about the drive's characteristics.


In this article, we explained what buffer memory is, what tasks it performs, what its advantages are, and how to find out its capacity on a hard drive. It is important, but it is not the main criterion when choosing a hard drive, and that is a good thing, given the high cost of drives equipped with a large cache.

You've probably been interested at least once in the question of how to increase your hard drive cache and how safe it is.

Unfortunately, I hasten to disappoint you: that is exactly what is impossible, because the cache chips are mounted inside the drive itself and we have no access to them. What can be increased is a disk cache held in RAM, and this is quite simple; the only catch is that you have to sacrifice some of your RAM for it. A driver-level program (meaning the system will not see this cache; it will only see a program that has taken up a lot of RAM) will cache not files but, importantly, blocks.

This program is called PrimoCache. It starts working only after Windows itself has booted, so booting will not be accelerated. It is important to understand how it works so as not to jump to conclusions.

The program is paid, but there is a 60-day trial period, which is enough to evaluate the effect.

You should not allocate too much RAM to the disk cache, otherwise the system will most likely start to slow down badly because it runs out of RAM. Personally, I recommend setting the cache to no more than 50% of total RAM if you have more than 4 GB, and no more than 25% if you have less.
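That rule of thumb is trivial to encode (the function name is mine; the 50%/25% thresholds are the ones recommended above):

```python
def recommended_cache_mb(total_ram_mb: int) -> int:
    """Cap the RAM block cache at 50% of RAM above 4 GB, else at 25%."""
    if total_ram_mb > 4096:
        return total_ram_mb // 2
    return total_ram_mb // 4

print(recommended_cache_mb(8192))   # an 8 GB machine: up to 4096 MB
print(recommended_cache_mb(2048))   # a 2 GB machine: up to 512 MB
```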

Such a cache speeds up disk operation if only because writing is deferred and its overhead is reduced: it does not happen constantly, as with some programs, but after an interval that can be specified in the settings. What does this give? Maximum speed when reading data from the disk that is not yet in the cache, because reads are no longer interfered with by the writing process.

PrimoCache can also noticeably extend the service life of an SSD, since it reduces the write load on it; this is especially useful for cheap SSD drives. Another advantage is that flushing the cache to an SSD is much faster than to a conventional drive, which is convenient when the cache is large.

Another disk bottleneck is writing a large number of small files. The cache solves this problem too: small files are first placed in the cache and are later written to disk without getting in the user's way.
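The idea of coalescing small writes can be sketched as follows (my own toy model, not PrimoCache's actual code: chunks are held in RAM and written to disk in one batch instead of one disk access per file):

```python
import os
import tempfile

class WriteBehindBuffer:
    """Toy deferred-write cache: accumulate small writes in RAM and
    flush them to disk in a single batch."""

    def __init__(self, path, flush_threshold=10):
        self.path = path
        self.pending = []                  # small writes held in RAM
        self.flush_threshold = flush_threshold

    def write(self, chunk: bytes):
        self.pending.append(chunk)
        if len(self.pending) >= self.flush_threshold:
            self.flush()

    def flush(self):
        if not self.pending:
            return
        with open(self.path, "ab") as f:   # one disk access for many writes
            f.write(b"".join(self.pending))
        self.pending.clear()

fd, path = tempfile.mkstemp()
os.close(fd)
buf = WriteBehindBuffer(path, flush_threshold=10)
for _ in range(25):
    buf.write(b"x")    # two automatic flushes happen along the way
buf.flush()            # push the remaining 5 chunks to disk
assert os.path.getsize(path) == 25
```

A real block-level cache also has to preserve ordering and survive crashes, which this sketch deliberately ignores.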

So if this utility interests you, see the review "New PrimoCache 2.0.0 is a super cache for your disk!". In that article I look at the second version of the utility; I don't think there will be any problems with installation and configuration.

If the computer has no "heavy" disk workloads (writing a large number of files, reading them, or both at once), you may not notice any effect from PrimoCache. The utility does not speed up the hard drive itself; it lets the drive work at its maximum, while frequently used file-system blocks become accessible instantly.

I'll also add that the system caches files themselves, not blocks, whereas PrimoCache caches file-system blocks and does not care what file they belong to. That is why, for me personally, Windows 10 runs noticeably faster with it: the cache holds many blocks that the system itself uses directly.

Write caching on a storage device means using high-speed volatile memory to accumulate write commands sent to the device and hold them until the slower medium (either physical disks or inexpensive flash memory) can process them. Most devices that use write caching require a continuous power supply.

To manage disk write caching, open Control Panel - Device Manager.

In the Disk drives section, double-click the desired drive.

Go to the Policies tab.

Quick removal

This value is usually the best choice for devices that may need to be removed from the system frequently, such as USB flash drives, SD, MMC or CompactFlash memory cards, and other external plug-in storage devices.

If Quick removal is selected, Windows manages the commands sent to the device using a method called write-through caching. With write-through caching the device processes write commands as if no cache were present. The cache may still give a small performance gain, but the emphasis is on maximum data safety: commands are passed straight through to the primary storage device. The main benefit is the ability to remove the device quickly without risk of data loss. For example, if a flash drive is accidentally pulled from its port, the likelihood of losing data that was being written to it is greatly reduced.

Optimal performance

This option is usually best for devices that must deliver the fastest possible performance and that are rarely removed from the system. If it is selected and the device is disconnected before all data has been written to it (for example, when a USB flash drive is unplugged), data may be lost.

If Optimal performance is selected, Windows uses a method called write-back caching. This method lets the storage device decide whether its high-speed cache will save time on write commands. If so, the device reports to the computer that the data has been saved successfully, even though it may not yet be on the primary storage medium (the disk or flash memory). This significantly improves the performance of write operations, which are often the main bottleneck for the system as a whole. But if the device loses power for any reason, all the data in the cache (which the computer believes to be safely stored) may be lost.

Flushing the write cache to disk

By default, Windows flushes the write cache to disk: the system periodically instructs the storage device to transfer all the data held in its cache to the primary storage medium. Selecting the "Turn off Windows write-cache buffer flushing" option disables these periodic transfer commands. Not all devices support all of these features.
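From an application's point of view the flip side matters: after a plain write the data may still be sitting in volatile caches. When durability is required, a program can force a flush itself; a small sketch (the function name is mine):

```python
import os
import tempfile

def durable_write(path, data: bytes):
    with open(path, "wb") as f:
        f.write(data)          # the data may land only in caches so far
        f.flush()              # empty Python's user-space buffer into the OS
        os.fsync(f.fileno())   # ask the OS (and, where supported, the drive)
                               # to commit the data to stable storage

path = os.path.join(tempfile.gettempdir(), "durable_demo.bin")
durable_write(path, b"committed")
```

This is the same trade-off as in the Windows policies above: fsync-style flushes buy safety at the cost of write throughput.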

If high data transfer speed is your primary concern, enable both options: in the Removal policy section select Optimal performance, and in the Write-caching policy section select "Enable write caching on the device" (if your system hardware and storage device support these features).

How do I change the write-caching settings for a device?

Most consumer storage devices, such as USB flash drives, SD or MMC memory cards, or external drives, do not let you change their caching settings. Internal SATA or SAS hard drives usually do allow these settings to be changed under Windows (depending on the manufacturer). To understand the caching capabilities a particular device provides and to determine which options best suit your needs, consult the documentation provided by the manufacturer.

Learn more about data loss prevention

Any system that enables write caching anywhere between the application and the storage device must be stable and unaffected by power surges. When a device uses write caching, its caching algorithms assume that power is always available for the cache and for moving data to and from it. If the system or its power supply is known to have power problems, these features should not be used.

You should also remove removable storage devices (USB flash drives, SD, MMC or CompactFlash memory cards, external drives) carefully. With the Safely Remove Hardware feature, Windows can protect user data in most scenarios. However, certain drivers or applications may not follow the Windows model, which can lead to data loss when such devices are removed. Whenever possible, invoke Safely Remove Hardware before disconnecting any external storage device from the system.

Sources: Windows Help Documentation.

There is a simple sequence of settings that helps speed up a hard drive, and the work really does speed up. You just need to follow the proposed steps in order, and the question of how to speed up an HDD comes down to simple settings and trivial changes.

How to speed up an HDD?

A little introduction so that you understand what you are dealing with. SSDs have become familiar devices in the era of computer technology, but they are so expensive that few people can afford them at all. Cheap SSDs are a different story: not only are they controversial, they are often simply ineffective. Their price is determined by a few recurring traits:

- short service life;
- small capacity;
- unimpressive speed.

The conclusion suggests itself: if you want to save money but don't know how, this article will help. Read on to learn how to reorganize your hard drive and make it a little faster.

System settings for better hard drive performance

The rules for speeding up an HDD really are quite simple. It turns out that Windows, by its nature, likes to use the disk for its own background needs, and those needs may not matter to the user at all. The most common example is defragmentation, which often runs strictly on a schedule, and that is fundamentally wrong: nothing else should be touching the disk during such an operation, and it is better done manually.
So the first priority is to disable this scheduled defragmentation.

Disabling defragmentation

Here's what to do:

1) my computer;
2) properties;
3) tools;
4) defragment now;
5) set up a schedule.
In the last step, simply uncheck the "run on a schedule" box.
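The same steps can be scripted. The sketch below builds the schtasks command for the scheduled-defrag task; the task path \Microsoft\Windows\Defrag\ScheduledDefrag is the standard one on Windows Vista/7, but verify it on your machine, and note that actually running the command requires administrator rights.

```python
import subprocess
import sys

TASK = r"\Microsoft\Windows\Defrag\ScheduledDefrag"

def disable_defrag_command(task_name=TASK):
    # Returned as a list so it can be inspected (or tested) without running.
    return ["schtasks", "/Change", "/TN", task_name, "/Disable"]

cmd = disable_defrag_command()
if sys.platform == "win32":
    subprocess.run(cmd, check=True)       # needs an elevated prompt
else:
    print("Would run:", " ".join(cmd))    # dry run on non-Windows systems
```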

Disabling indexing

Disabling indexing is a simple action on the user's part that removes the indexing used for quick search. It is of little use to ordinary users, and turning it off will also speed up your HDD, because it frees up a small amount of disk activity.

Work algorithm:

1) my computer;
2) properties;
3) the general tab. Here you need to uncheck "Allow files on this drive to have contents indexed in addition to file properties". Now just click the apply button.

After clicking apply, a confirmation prompt will appear. In it, select "Apply changes to drive C:\, subfolders and files".

Enable write caching for hard drives

This action can also improve performance. Here are the steps to take:

1) my computer;
2) properties;
3) hardware;
4) select the drive, click the properties button, and open the "policy" tab that appears. Here you need to check two boxes at once: "Enable write caching on the device" and "Turn off Windows write-cache buffer flushing on the device".

Let's recap: we speed up the work in just three steps:

1. Disable scheduled defragmentation.
2. Disable indexing.
3. Enable write caching for hard drives.

All this really does increase HDD performance right before your eyes. The approach applies to operating systems such as:

- Windows XP;
- Windows Vista;
- Windows 7.

All these measures apply not only to hard drives; they can also be used to make portable devices work efficiently. These include:

— external hard drives;
- flash drives;
- IDE drives.

The only difference will be in the set of checkboxes on the "policy" tab. Having read even this short guide, you can easily apply it on your device. Computing done right means a device that works at full capacity without that work shortening its service life. Teach your disk to work at a brisk pace, without unnecessary actions.
