What is Cache in Computer | Write Policy in Cache Memory





A cache is a hardware or software component that stores data so future requests for that data can be served faster; the data stored in a cache might be the result of an earlier computation or a duplicate of data stored elsewhere. A cache hit occurs when the requested data can be found in the cache, while a cache miss occurs when it cannot. Cache hits are served by reading data from the cache, which is faster than recomputing a result or reading from a slower data store; thus, the more requests that can be served from the cache, the faster the system performs.

To be cost-effective and to enable efficient use of data, caches are relatively small. Nevertheless, caches have proven themselves in many areas of computing because access patterns in typical computer applications exhibit locality of reference. Access patterns exhibit temporal locality when data that has recently been requested is requested again, and spatial locality when requests are for data stored physically close to data that has already been requested.
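As a rough illustration of spatial locality, the Python sketch below (the table size and function names are illustrative, not from the original article) sums the same table twice: row by row, so consecutive reads touch neighbouring memory, and column by column, so they do not. On most machines the first loop finishes faster; in CPython part of the gap comes from interpreter overhead, and the locality effect is far more pronounced in lower-level languages where memory layout is explicit.

    import time

    N = 2000
    table = [[1] * N for _ in range(N)]   # an N x N table of small integers

    def sum_row_major(t):
        # Consecutive accesses stay inside one inner list: good spatial locality.
        total = 0
        for row in t:
            for value in row:
                total += value
        return total

    def sum_column_major(t):
        # Consecutive accesses jump between inner lists: poor spatial locality.
        total = 0
        for j in range(N):
            for i in range(N):
                total += t[i][j]
        return total

    for fn in (sum_row_major, sum_column_major):
        start = time.perf_counter()
        fn(table)
        print(f"{fn.__name__}: {time.perf_counter() - start:.3f}s")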





Latency


A larger resource incurs significant latency for access; for example, it can take hundreds of clock cycles for a modern 4 GHz processor to reach DRAM. This is mitigated by reading in large chunks, in the hope that subsequent reads will be from nearby locations. Prediction or explicit prefetching might also guess where future reads will come from and make requests ahead of time; if done correctly, the latency is bypassed altogether.

Throughput and granularity

The use of a cache also allows for higher throughput from the underlying resource, by assembling multiple fine-grain transfers into larger, more efficient requests. In the case of DRAM, this might be served by a wider bus. Imagine a program scanning bytes in a 32-bit address space but being served by a 128-bit off-chip data bus; individual uncached byte accesses would allow only 1/16th of the total bandwidth to be used, and 80% of the data movement would be memory addresses rather than the data itself. Reading larger chunks reduces the fraction of bandwidth required for transmitting address information.
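For a quick check of those figures, here is a small back-of-the-envelope calculation in Python, using the constants from the example above (32-bit addresses, single-byte reads, a 128-bit bus):

    BUS_WIDTH_BITS = 128        # off-chip data bus width
    DATA_BITS_PER_READ = 8      # one uncached byte access
    ADDRESS_BITS = 32           # address sent with every request

    # Only 8 of the 128 bus bits carry payload per transfer: 1/16.
    print(DATA_BITS_PER_READ / BUS_WIDTH_BITS)                  # 0.0625

    # Of the bits actually moved (address + data), the address is 80%.
    print(ADDRESS_BITS / (ADDRESS_BITS + DATA_BITS_PER_READ))   # 0.8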


Hardware implements a cache as a block of memory for temporary storage of data likely to be used again. Central processing units (CPUs) and hard disk drives (HDDs) frequently use a cache, as do web browsers and web servers.

A cache is made up of a pool of entries. Each entry has associated data, which is a copy of the same data in some backing store. Each entry also has a tag, which specifies the identity of the data in the backing store of which the entry is a copy.


When the cache client (a CPU, web browser, or operating system) needs to access data presumed to exist in the backing store, it first checks the cache. If an entry can be found with a tag matching that of the desired data, the data in the entry is used instead. This situation is known as a cache hit. So, for example, a web browser program might check its local cache on disk to see if it has a local copy of the contents of a web page at a particular URL. In this example, the URL is the tag, and the contents of the web page are the data. The percentage of accesses that result in cache hits is known as the hit rate or hit ratio of the cache.

The alternative situation, when the cache is consulted and found not to contain data with the desired tag, has become known as a cache miss. The previously uncached data fetched from the backing store during miss handling is usually copied into the cache, ready for the next access.
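The whole hit/miss cycle fits in a few lines. The Python sketch below is only illustrative (the backing store is just another dictionary standing in for a disk, DRAM, or a web server), but it shows the tag lookup, the miss handling that copies data into the cache, and the hit ratio described above:

    # A pool of entries keyed by tag, backed by a slower store.
    backing_store = {"https://example.com/page": "<html>...</html>"}
    cache = {}
    hits = misses = 0

    def read(tag):
        global hits, misses
        if tag in cache:                # cache hit: serve the stored copy
            hits += 1
            return cache[tag]
        misses += 1                     # cache miss: go to the backing store
        data = backing_store[tag]
        cache[tag] = data               # copy it in, ready for the next access
        return data

    read("https://example.com/page")    # miss
    read("https://example.com/page")    # hit
    print(f"hit ratio: {hits / (hits + misses):.0%}")   # 50%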

During a cache miss, the cache usually ejects some other entry in order to make room for the previously uncached data. The heuristic used to select the entry to eject is known as the replacement policy. One popular replacement policy, "least recently used" (LRU), replaces the least recently used entry (see cache algorithm). More sophisticated caches also take into account the frequency of use of entries, the size of the stored contents, and the latencies and throughputs of both the cache and the backing store. This works well for larger amounts of data, longer latencies, and slower throughputs, such as those experienced with a hard drive or the Internet, but is not efficient for a CPU cache.
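A minimal LRU variant of the sketch above, assuming Python's collections.OrderedDict and an artificially small capacity so evictions actually happen:

    from collections import OrderedDict

    CAPACITY = 2                        # tiny on purpose, to force evictions
    cache = OrderedDict()               # key order tracks recency of use

    def read_lru(tag, backing_store):
        if tag in cache:
            cache.move_to_end(tag)      # hit: mark entry most recently used
            return cache[tag]
        if len(cache) >= CAPACITY:
            cache.popitem(last=False)   # evict the least recently used entry
        data = backing_store[tag]       # miss: fetch and insert
        cache[tag] = data
        return data

    store = {"a": 1, "b": 2, "c": 3}
    for tag in ("a", "b", "a", "c"):    # "b" is least recently used when "c" arrives
        read_lru(tag, store)
    print(list(cache))                  # ['a', 'c'] -- "b" was evicted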

Writing policies



When a system writes data to cache, it must at some point write that data to the backing store as well. The timing of this write is controlled by what is known as the write policy.

There are two basic writing approaches:

  • Write-through: writes are done synchronously both to the cache and to the backing store.
  • Write-back (also called write-behind): initially, writing is done only to the cache. The write to the backing store is postponed until the cache blocks containing the data are about to be replaced by new content.

A write-back cache is more complex to implement, since it needs to track which of its locations have been written over and mark them as dirty for later writing to the backing store. The data in these locations is written back to the backing store only when it is evicted from the cache, an effect referred to as a lazy write (see the sketch below). For this reason, a read miss in a write-back cache (which requires a block to be replaced by another) will often require two memory accesses to service: one to write the replaced data from the cache back to the store, and then one to retrieve the needed data.

Other policies may also trigger data write-back. For example, the client may make changes to data in the cache and then explicitly notify the cache to write the data back.
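A hedged sketch of the write-back behaviour (the names and dictionary-based structure are illustrative): writes touch only the cache and mark the entry dirty, and the backing store sees the data only when the entry is evicted.

    cache = {}          # tag -> (data, dirty flag)
    backing_store = {}

    def write(tag, data):
        cache[tag] = (data, True)       # write-back: update the cache only, mark dirty

    def evict(tag):
        data, dirty = cache.pop(tag)
        if dirty:                       # the "lazy write" happens only now
            backing_store[tag] = data

    write("block7", "new contents")
    evict("block7")                     # backing store updated at eviction time
    print(backing_store)                # {'block7': 'new contents'}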


Since no data is returned to the requester on write operations, a decision must be made on write misses: whether or not to load the data into the cache. There are two approaches:

  • Write allocate (also called fetch on write): data at the missed-write location is loaded into the cache, followed by a write-hit operation. In this approach, write misses are similar to read misses.
  • No-write allocate (also called write-no-allocate or write around): data at the missed-write location is not loaded into the cache and is written directly to the backing store. In this approach, only reads are cached. (Both approaches are sketched below.)
Both write-through and write-back policies can use either of these write-miss policies, but usually they are paired in this way:

  • A write-back cache uses write allocate, hoping for subsequent writes (or even reads) to the same location, which is now cached.
  • A write-through cache uses no-write allocate. Here, subsequent writes have no advantage, since they still need to be written directly to the backing store.
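To make the pairing concrete, here is an illustrative sketch of the two write-miss policies side by side (again assuming plain dictionaries for the cache and the backing store):

    cache = {}
    backing_store = {"x": 0}

    def write_allocate(tag, data):
        # Load the missed block into the cache, then treat it as a write hit;
        # the dirty copy would be written back later, on eviction.
        if tag not in cache:
            cache[tag] = backing_store[tag]
        cache[tag] = data

    def write_no_allocate(tag, data):
        # Write around the cache: update the backing store directly and
        # only touch the cache if the block already happens to be there.
        if tag in cache:
            cache[tag] = data
        backing_store[tag] = data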
Entities other than the cache may change the data in the backing store, in which case the copy in the cache may become out-of-date or stale. Alternatively, when the client updates the data in the cache, copies of that data in other caches will become stale. Communication protocols between the cache managers that keep the data consistent are known as coherency protocols.


I hope you liked this post on What is Cache in Computer | Write Policy in Cache Memory.
