With an "n"-stage pipeline, the ideal speedup is a factor of n: once the pipeline is full, one instruction completes every cycle. In practice this ideal is not reached, because the pipe stages cannot be perfectly balanced (the time to perform the task differs from stage to stage), and furthermore pipelining itself involves some overhead.
Java, for example, has a feature for exception handling, the try-catch block, which handles exceptions such as divide-by-zero and array-index-out-of-bounds.
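A minimal sketch of the try-catch mechanism mentioned above, catching the two exceptions named (the class and method names here are illustrative, not part of any standard API):

```java
// Demonstrates Java try-catch for divide-by-zero and out-of-bounds errors.
public class ExceptionDemo {
    // Integer division by zero throws ArithmeticException; we catch it
    // and return a message instead of letting the program crash.
    static String safeDivide(int a, int b) {
        try {
            return "result: " + (a / b);
        } catch (ArithmeticException e) {
            return "caught: " + e.getMessage();
        }
    }

    // An invalid array index throws ArrayIndexOutOfBoundsException.
    static String safeIndex(int[] arr, int i) {
        try {
            return "element: " + arr[i];
        } catch (ArrayIndexOutOfBoundsException e) {
            return "caught out-of-bounds index " + i;
        }
    }

    public static void main(String[] args) {
        System.out.println(safeDivide(10, 0));
        System.out.println(safeIndex(new int[]{1, 2, 3}, 5));
    }
}
```

Without the try blocks, either exception would terminate the program; with them, control transfers to the matching catch clause and execution continues.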
The comparison between write-through and write-back can be made on two factors: data integrity and performance.
Write-through is better for data integrity, as it flushes every write to main memory immediately.
Write-back holds up the write until the cache line has to be evicted, for example to make room for a read. This raises data-integrity questions when multiple processors access the same region of memory, each through its own internal cache.
Write-back gives good performance, as it saves many memory write cycles by coalescing repeated writes to the same line.
Write-through does not match this performance, because every write goes out to memory.
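The performance difference can be illustrated with a toy model of a single cache line: write-through touches memory on every store, while write-back only dirties the line and writes memory once on eviction. The class and method names are hypothetical, for illustration only:

```java
// Toy comparison of memory-write traffic for the two cache write policies.
public class WritePolicyDemo {
    // Write-through: every store is propagated to main memory immediately,
    // so N stores cost N memory writes.
    static int writeThroughMemWrites(int numStores) {
        return numStores;
    }

    // Write-back: stores only mark the cache line dirty; main memory is
    // written once, when the dirty line is finally evicted.
    static int writeBackMemWrites(int numStores) {
        boolean dirty = false;
        for (int i = 0; i < numStores; i++) {
            dirty = true; // each store hits the cache, no memory access
        }
        return dirty ? 1 : 0; // one write-back on eviction
    }

    public static void main(String[] args) {
        System.out.println("write-through: " + writeThroughMemWrites(100));
        System.out.println("write-back:    " + writeBackMemWrites(100));
    }
}
```

For 100 stores to the same line, write-through performs 100 memory writes while write-back performs only one, which is the saving the answer above refers to; the integrity cost is that main memory is stale until the eviction happens.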
Virtual memory is a concept that, when implemented by a computer and its operating system, allows programmers to use a very large range of memory or storage addresses for stored data. The computing system maps the programmer's virtual addresses to real hardware storage addresses. Usually, the programmer is freed from having to be concerned about the availability of data storage.
In addition to managing the mapping of virtual storage addresses to real storage addresses, a computer implementing virtual memory also manages swapping between active storage (RAM) and hard disk or other high-volume storage devices. Data is read in units called "pages", with sizes ranging from a thousand bytes (actually 1,024 bytes) up to several megabytes. This reduces the number of physical storage accesses required and speeds up overall system performance.
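The address mapping described above can be sketched with a worked example, assuming the 1,024-byte page size mentioned and a made-up page table (the table contents and class name are illustrative, not from any real system):

```java
// Sketch of virtual-to-physical address translation with 1 KiB pages.
public class VirtualMemoryDemo {
    static final int PAGE_SIZE = 1024; // 1,024 bytes per page, as above

    // Hypothetical page table maintained by the OS:
    // virtual page number -> physical frame number.
    static final int[] PAGE_TABLE = {5, 2, 7, 0};

    static int translate(int virtualAddress) {
        int pageNumber = virtualAddress / PAGE_SIZE; // which virtual page
        int offset     = virtualAddress % PAGE_SIZE; // position within the page
        int frame      = PAGE_TABLE[pageNumber];     // look up the mapping
        return frame * PAGE_SIZE + offset;           // physical address
    }

    public static void main(String[] args) {
        // Virtual address 2100 = page 2, offset 52; page 2 maps to frame 7,
        // so the physical address is 7 * 1024 + 52.
        System.out.println(translate(2100));
    }
}
```

This is the mapping step the programmer is freed from: the hardware and OS perform the page-table lookup on every access, and if the page is not resident it is swapped in from disk first.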
Windows DNA supports two-tier, three-tier, and n-tier application architectures, in which components can communicate and transfer data across tiers, although in practice this was very difficult to achieve with Windows DNA. The DNA architecture uses only the basic encapsulation mechanism of COM for local, memory-resident objects.