Institute for Computing Systems Architecture - School of Informatics, University of Edinburgh

Smart Cache: A Self Adaptive Cache Architecture for Energy Efficiency

  • Type: Paper
  • Authors:
    K. Sundararajan, T. Jones and N. Topham.
  • Proceedings of the International Symposium on Systems, Architectures, Modeling, and Simulation (SAMOS'11), Samos, Greece, July 19-22, 2011.
  • Abstract:

    The demand for low-power embedded systems requires designers to tune processor parameters to avoid excessive energy wastage. Tuning on a per-application or per-application-phase basis allows a greater saving in energy consumption without a noticeable degradation in performance. On-chip caches often consume a significant fraction of the total energy budget and are therefore prime candidates for adaptation. Fixed-configuration caches must be designed to deliver low average memory access times across a wide range of potential applications. However, this can lead to excessive energy consumption for applications that do not require the full capacity or associativity of the cache at all times. Furthermore, in systems where the clock period is constrained by the access times of level-1 caches, the clock frequency for all applications is effectively limited by the cache requirements of the most demanding phase within the most demanding application. This results in both performance and energy efficiency that represent the lowest common denominator across the applications. In this paper we present a Set and way Management cache Architecture for Run-Time reconfiguration (SMART cache), a cache architecture that allows reconfiguration in both its size and associativity. Results show the energy-delay of the Smart cache is on average 14% better than state-of-the-art cache reconfiguration architectures. We then leverage the flexibility provided by our cache to dynamically reconfigure the hierarchy as a program runs. We develop a decision-tree-based machine learning model to control the adaptation and automatically reconfigure the cache to the best configuration. Results show an average reduction in energy-delay product of 17% in the data cache (just 1% away from an oracle result) and 34% in the level-2 cache (just 5% away from an oracle), with an overall performance degradation of less than 2% compared with a baseline statically-configured cache.
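
    The decision-tree controller described in the abstract can be pictured as a small policy that inspects per-phase cache statistics and selects a size and associativity setting for the next phase. The sketch below is purely illustrative and is not the model from the paper: the feature names (miss rate, accesses per thousand instructions), the thresholds, and the configuration values are assumptions chosen for the example.

    /* Minimal sketch (not the paper's trained model): a hand-written decision
     * tree that maps per-phase cache statistics to a (sets, ways) setting.
     * Feature names, thresholds and configuration values are illustrative. */
    #include <stdio.h>

    typedef struct {
        double miss_rate;      /* misses per access in the current phase   */
        double accesses_per_k; /* cache accesses per 1000 instructions     */
    } phase_stats_t;

    typedef struct {
        int active_sets;       /* number of set groups kept powered        */
        int active_ways;       /* associativity enabled for the next phase */
    } cache_config_t;

    /* Decision-tree style policy: each branch tests one feature against a
     * threshold, and each leaf selects a cache configuration. */
    static cache_config_t choose_config(const phase_stats_t *s)
    {
        if (s->accesses_per_k < 50.0) {
            /* Cache lightly used: shrink aggressively to cut leakage energy. */
            return (cache_config_t){ .active_sets = 64, .active_ways = 1 };
        }
        if (s->miss_rate < 0.02) {
            /* Working set already fits: a small, low-associativity cache suffices. */
            return (cache_config_t){ .active_sets = 128, .active_ways = 2 };
        }
        /* High miss rate under heavy use: keep the full capacity enabled. */
        return (cache_config_t){ .active_sets = 256, .active_ways = 4 };
    }

    int main(void)
    {
        phase_stats_t phase = { .miss_rate = 0.01, .accesses_per_k = 120.0 };
        cache_config_t cfg = choose_config(&phase);
        printf("next phase: %d sets x %d ways\n", cfg.active_sets, cfg.active_ways);
        return 0;
    }

    In the paper the tree is produced by machine learning rather than written by hand; the hard-coded branches above only stand in for that learned structure.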