This paper explores dynamically employing Error Correction Codes (ECC) in caches to balance data protection against performance. It presents a method that self-tunes ECC usage based on cache activity: by tracking ACE and Un-ACE data over periods of the cache lifetime, the system disables ECC when vulnerability is low and keeps it on when protection is needed. This reduces ECC overhead and improves cache efficiency without compromising data integrity.
Self-* Systems (CSE 598B) • Paper title: Dynamically employing ECC in caches • Presented by: Niranjan Soundararajan
Presentation layout • Background Work • Cache Lifetime • Implementation • Performance issues • Conclusion
Background Work • Caches are protected by ECC to tolerate errors in data. • ECC requires computation and sits on the critical path, so it adds to cache access latency. • Protection is not always needed: there are periods during execution when the cache's vulnerability is low.
Cache Lifetime • Data in the cache matters only while it will be read for further execution or sent as output to the user; such data is called ACE, and all other data is Un-ACE. • The amount of ACE data in the cache during program execution determines the cache's vulnerability. • ACE interval components – RR, WR, FR, WE • Un-ACE interval components – WW, RW, FW, RE, FE, IF, EF (data in these intervals is not important; see the classification sketch below).
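The slide only lists the interval labels, so the following is an illustrative sketch (not the paper's code) of how a lifetime interval, bounded by two consecutive events on a cache line, could be classified as ACE or Un-ACE. The event names and the `isACE` helper are assumptions for illustration; the ACE/Un-ACE partition follows the slide.

```cpp
// Sketch: classify a cache line's lifetime interval as ACE or Un-ACE from the
// pair of events that bound it. ACE = {RR, WR, FR, WE}; the rest are Un-ACE.
#include <iostream>
#include <string>
#include <unordered_set>

enum class Event { Fill, Read, Write, Evict, Invalidate };

// One-letter code per event, used to build the two-letter interval label.
static char code(Event e) {
    switch (e) {
        case Event::Fill:       return 'F';
        case Event::Read:       return 'R';
        case Event::Write:      return 'W';
        case Event::Evict:      return 'E';
        case Event::Invalidate: return 'I';
    }
    return '?';
}

bool isACE(Event prev, Event next) {
    static const std::unordered_set<std::string> ace = {"RR", "WR", "FR", "WE"};
    std::string label{code(prev), code(next)};
    return ace.count(label) > 0;  // WW, RW, FW, RE, FE, IF, EF fall through as Un-ACE
}

int main() {
    // A write followed by a read is ACE (the data will be consumed);
    // a read followed by an eviction is Un-ACE (the data is dead).
    std::cout << isACE(Event::Write, Event::Read) << '\n';  // 1
    std::cout << isACE(Event::Read,  Event::Evict) << '\n'; // 0
}
```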
Cache Lifetime … • Prior work reports only the total ACE and Un-ACE activity over the whole cache lifetime. • What is needed is to track these activities within each period and take the appropriate action. • If Un-ACE operations dominate cache activity, cache performance can be given more importance than data protection.
Implementation • The cache lifetime is broken into multiple periods of execution. • In each period, the Un-ACE portion of the activity is tracked. • If the Un-ACE portion is significant enough, ECC is not used for the rest of that period. • The cache self-tunes to establish the threshold for switching ECC off (a sketch of this control loop follows).
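The slide does not spell out how the threshold is tuned, so the sketch below is a minimal assumption-laden version of the control loop: per-period ACE/Un-ACE counters, an ECC-off decision once the Un-ACE fraction crosses a threshold, and a crude per-period threshold adjustment driven by how much ACE data was left exposed. The class name, constants, and tuning rule are all hypothetical.

```cpp
// Sketch of a per-period ECC controller (not the paper's implementation).
#include <cstdint>

class EccController {
public:
    // Called once per observed lifetime interval during the current period.
    void recordInterval(bool ace) {
        ace ? ++aceCount_ : ++unaceCount_;
        if (!eccEnabled_ && ace) ++aceWhileOff_;   // vulnerability incurred with ECC off
        uint64_t total = aceCount_ + unaceCount_;
        // Disable ECC for the rest of the period if Un-ACE activity dominates.
        if (eccEnabled_ && total >= kMinSamples &&
            unaceCount_ * 100 >= threshold_ * total) {
            eccEnabled_ = false;
        }
    }

    // Called at a period boundary: adjust the threshold and reset counters.
    void endPeriod() {
        // Crude self-tuning: if too much ACE data was exposed while ECC was
        // off, demand a higher Un-ACE fraction before switching off again.
        if (aceWhileOff_ > kExposureBudget && threshold_ < 95)
            threshold_ += kStep;
        else if (threshold_ > kStep)
            threshold_ -= kStep;
        aceCount_ = unaceCount_ = aceWhileOff_ = 0;
        eccEnabled_ = true;   // every period starts protected
    }

    bool eccEnabled() const { return eccEnabled_; }

private:
    static constexpr uint64_t kMinSamples = 1024;    // avoid deciding on noise
    static constexpr uint64_t kExposureBudget = 16;  // tolerated ACE intervals with ECC off
    static constexpr unsigned kStep = 5;             // threshold adjustment (%)
    unsigned threshold_ = 75;                        // Un-ACE % needed to drop ECC
    uint64_t aceCount_ = 0, unaceCount_ = 0, aceWhileOff_ = 0;
    bool eccEnabled_ = true;
};
```

The bookkeeping lives alongside the cache rather than inside the access path, which is consistent with the performance argument on the next slide.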
Performance issues • When ECC is required again, the check bits must be recomputed for the data, since writes may have updated the cache while ECC was off. • Keeping track of multiple activities during execution may itself become an overhead. • Main advantage: neither of these costs falls on the critical path of execution (a sketch of off-critical-path ECC regeneration follows).
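Since lines written while checking was off carry stale check bits, re-enabling protection requires regenerating them; the slide notes this can be kept off the critical path, e.g. as a background walk over the cache. The sketch below assumes hypothetical `CacheLine` and `computeEcc` definitions, and uses a trivial parity fold where a real design would use a SECDED code.

```cpp
// Sketch: background ECC regeneration when protection is switched back on.
#include <array>
#include <cstdint>
#include <vector>

struct CacheLine {
    std::array<uint8_t, 64> data{};
    uint8_t ecc = 0;               // stored check bits (width is illustrative)
    bool dirtySinceEccOff = false; // written while ECC checking was disabled
};

uint8_t computeEcc(const std::array<uint8_t, 64>& data) {
    // Placeholder check-bit computation; a real cache would use SECDED.
    uint8_t p = 0;
    for (uint8_t b : data) p ^= b;
    return p;
}

// Background scrub pass run at the period boundary, off the access path.
void regenerateEcc(std::vector<CacheLine>& cache) {
    for (auto& line : cache) {
        if (line.dirtySinceEccOff) {
            line.ecc = computeEcc(line.data);
            line.dirtySinceEccOff = false;
        }
    }
}
```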
Conclusion • How effective is the scheme … …