To test whether Platform Cache improves performance in your application, calculate the elapsed time with and without using the cache. Don’t rely on the Apex debug log timestamp for the execution time. Use the System.currentTimeMillis() method instead. For example, first call System.currentTimeMillis() to get the start time. Perform application logic, fetching the data from either the cache or another data source. Then calculate the elapsed time.
long startTime = System.currentTimeMillis();
// Your code here
long elapsedTime = System.currentTimeMillis() - startTime;
System.debug(elapsedTime);
Ensure that your code handles cache misses by testing cache requests that return null. To help with debugging, add logging information for cache operations.
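For instance, a minimal miss-handling pattern might look like the following sketch, where the key name and the rebuild method are illustrative:

```apex
// On a miss, get() returns null: log it, rebuild the value from the
// data source, and repopulate the cache for subsequent requests.
Object summary = Cache.Session.get('orderSummary');
if (summary == null) {
    System.debug(LoggingLevel.DEBUG, 'Cache miss for key orderSummary');
    summary = OrderService.computeSummary(); // hypothetical data-source call
    Cache.Session.put('orderSummary', summary);
}
```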
Alternatively, use the Cache.CacheBuilder interface, which handles cache misses for you: on a miss, it loads the value, caches it, and returns it.
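As a minimal sketch, a CacheBuilder implementation that caches a User record by Id might look like this (the class name is illustrative):

```apex
// The doLoad() method runs only when the requested key isn't in the cache;
// its return value is cached automatically and returned to the caller.
public class UserInfoCache implements Cache.CacheBuilder {
    public Object doLoad(String userId) {
        return [SELECT Id, Name, IsActive FROM User WHERE Id = :userId];
    }
}
```

Retrieving a value then becomes a single call, for example `User u = (User) Cache.Session.get(UserInfoCache.class, someUserId);`, with the miss check and repopulation handled for you.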
public class CacheManager {
    private Boolean cacheEnabled;

    public CacheManager() {
        cacheEnabled = true;
    }

    public Boolean toggleEnabled() {
        // Use for testing misses
        cacheEnabled = !cacheEnabled;
        return cacheEnabled;
    }

    public Object get(String key) {
        if (!cacheEnabled) return null;
        Object value = Cache.Session.get(key);
        if (value != null) {
            System.debug(LoggingLevel.DEBUG, 'Hit for key ' + key);
        }
        return value;
    }

    public void put(String key, Object value, Integer ttl) {
        if (!cacheEnabled) return;
        Cache.Session.put(key, value, ttl);
        System.debug(LoggingLevel.DEBUG, 'put() for key ' + key);
    }

    public Boolean remove(String key) {
        if (!cacheEnabled) return false;
        Boolean removed = Cache.Session.remove(key);
        if (removed) {
            System.debug(LoggingLevel.DEBUG, 'Removed key ' + key);
        }
        return removed;
    }
}
When possible, group cache requests, but be aware of caching limits. To help improve performance, perform cache operations on a list of keys rather than on individual keys. For example, if you know which keys are necessary to invoke a Visualforce page or perform a task in Apex, retrieve all keys at once. To retrieve multiple keys, call get(keys) in an initialization method.
It’s more efficient to cache a few large items than to cache many small items separately. Caching many small items decreases performance and increases overhead, including total serialization size, serialization time, cache commit time, and cache capacity usage.
Don’t add many small items to the Platform Cache within one request. Instead, wrap data in larger items, such as lists. If a list is large, consider breaking it into multiple items. Here’s an example of what to avoid.
// Don't do this!
public class MyController {
    public void initCache() {
        List<Account> accts = [SELECT Id, Name, Phone, Industry, Description
                               FROM Account LIMIT 1000];
        for (Integer i = 0; i < accts.size(); i++) {
            Cache.Org.put('acct' + i, accts.get(i));
        }
    }
}
Instead, wrap the data in a few reasonably large items without exceeding the limit on the size of single cached items.
// Do this instead.
public class MyController {
    public void initCache() {
        List<Account> accts = [SELECT Id, Name, Phone, Industry, Description
                               FROM Account LIMIT 1000];
        Cache.Org.put('accts', accts);
    }
}
Another way to cache larger items is to encapsulate related data in an Apex class. For example, you can create a class that wraps session data and cache an instance of that class rather than the individual data items. Caching the class instance improves overall serialization size and performance.
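As a sketch, a wrapper class along these lines (the class and field names are illustrative) lets you cache one object instead of several small entries:

```apex
// One cached item that bundles related session data.
public class SessionInfo {
    public String userName;
    public Map<String, String> preferences;
    public List<Id> recentRecordIds;
}
```

Cache an instance with a single call such as `Cache.Session.put('sessionInfo', info);` and retrieve all the related data with one get() call.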
When you add items to the cache, make sure that you don't exceed the local cache limit within a request: 500 KB for the session cache and 1,000 KB for the org cache. If you exceed the local cache limit, items can be evicted from the local cache before the request is committed. This eviction can cause unexpected misses, long serialization times, and wasted resources.
Consider the following guidelines to minimize expensive operations.