Is there any LRU implementation of IDictionary?


I would like to implement a simple in-memory LRU cache and I was thinking about a solution based on an IDictionary implementation that could handle a hashed LRU mechanism. Coming from Java, I have experience with LinkedHashMap, which works fine for what I need, but I can't find a similar solution anywhere for .NET.

Has anyone developed one, or had experience with something like this?

9 Answers

#1


There is nothing in the base class libraries that does this.

On the free side, maybe something like C5's HashedLinkedList would work.

If you're willing to pay, maybe check out this C# toolkit. It contains an implementation.

#2


This is a very simple and fast implementation we developed for a web site we own.

We tried to improve the code as much as possible while keeping it thread safe. I think the code is very simple and clear, but if you need some explanation or a guide on how to use it, don't hesitate to ask.

using System.Collections.Generic;
using System.Runtime.CompilerServices;

namespace LRUCache
{
    public class LRUCache<K, V>
    {
        private int capacity;
        private Dictionary<K, LinkedListNode<LRUCacheItem<K, V>>> cacheMap = new Dictionary<K, LinkedListNode<LRUCacheItem<K, V>>>();
        private LinkedList<LRUCacheItem<K, V>> lruList = new LinkedList<LRUCacheItem<K, V>>();

        public LRUCache(int capacity)
        {
            this.capacity = capacity;
        }

        [MethodImpl(MethodImplOptions.Synchronized)]
        public V get(K key)
        {
            LinkedListNode<LRUCacheItem<K, V>> node;
            if (cacheMap.TryGetValue(key, out node))
            {
                V value = node.Value.value;

                // Move the node to the end of the list to mark it as
                // most recently used.
                lruList.Remove(node);
                lruList.AddLast(node);
                return value;
            }

            // Note: a miss is indistinguishable from a cached default(V).
            return default(V);
        }

        [MethodImpl(MethodImplOptions.Synchronized)]
        public void add(K key, V val)
        {
            // Replace any existing entry first; otherwise cacheMap.Add
            // below would throw on a duplicate key.
            LinkedListNode<LRUCacheItem<K, V>> existingNode;
            if (cacheMap.TryGetValue(key, out existingNode))
            {
                lruList.Remove(existingNode);
                cacheMap.Remove(key);
            }
            else if (cacheMap.Count >= capacity)
            {
                RemoveFirst();
            }

            LRUCacheItem<K, V> cacheItem = new LRUCacheItem<K, V>(key, val);
            LinkedListNode<LRUCacheItem<K, V>> node = new LinkedListNode<LRUCacheItem<K, V>>(cacheItem);
            lruList.AddLast(node);
            cacheMap.Add(key, node);
        }

        private void RemoveFirst()
        {
            // Remove the least recently used entry from the front of the list
            LinkedListNode<LRUCacheItem<K, V>> node = lruList.First;
            lruList.RemoveFirst();

            // Remove from cache
            cacheMap.Remove(node.Value.key);
        }
    }

    class LRUCacheItem<K, V>
    {
        public LRUCacheItem(K k, V v)
        {
            key = k;
            value = v;
        }

        public K key;
        public V value;
    }
}
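For reference, a quick usage sketch (this assumes the LRUCache class above is compiled in the same project; the keys and values here are just illustrative):

```csharp
using LRUCache; // the namespace from the snippet above

var cache = new LRUCache<string, int>(2); // capacity of 2

cache.add("a", 1);
cache.add("b", 2);
cache.get("a");     // touching "a" makes it the most recently used entry
cache.add("c", 3);  // evicts "b", now the least recently used entry
```

After this sequence, `cache.get("b")` returns `default(int)`, i.e. 0. Because misses are reported as `default(V)`, a caller cannot distinguish "not cached" from "cached default value"; a `TryGetValue`-style API (as in the later answer below) avoids that ambiguity.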

#3


Found your question while googling, and also found this:

http://code.google.com/p/csharp-lru-cache/

csharp-lru-cache: LRU cache collection class library

This is a collection class that functions as a least-recently-used cache. It implements ICollection<T>, but also exposes three other members:

  • Capacity, the maximum number of items the cache can contain. Once the collection is at capacity, adding a new item to the cache will cause the least recently used item to be discarded. If the Capacity is set to 0 at construction, the cache will not automatically discard items.

  • Oldest, the oldest (i.e. least recently used) item in the collection.

  • DiscardingOldestItem, an event raised when the cache is about to discard its oldest item.

This is an extremely simple implementation. While its Add and Remove methods are thread-safe, it shouldn't be used in heavy multithreading environments because the entire collection is locked during those methods.
#4


I've recently released a class called LurchTable to address the need for a C# variant of the LinkedHashMap. A brief discussion of the LurchTable can be found here.

Basic features:

  • Linked Concurrent Dictionary by Insertion, Modification, or Access

  • Dictionary/ConcurrentDictionary interface support

  • Peek/TryDequeue/Dequeue access to 'oldest' entry

  • Allows a hard limit on items, enforced at insertion

  • Exposes events for add, update, and remove
Source Code: http://csharptest.net/browse/src/Library/Collections/LurchTable.cs

GitHub: https://github.com/csharptest/CSharpTest.Net.Collections

HTML Help: http://help.csharptest.net/

PM> Install-Package CSharpTest.Net.Collections

#5


This takes Martin's code with Mr T's suggestions and makes it StyleCop friendly. Oh, and it also allows for disposal of values as they cycle out of the cache.

namespace LruCache
{
    using System;
    using System.Collections.Generic;

    /// <summary>
    /// A least-recently-used cache stored like a dictionary.
    /// </summary>
    /// <typeparam name="TKey">
    /// The type of the key to the cached item
    /// </typeparam>
    /// <typeparam name="TValue">
    /// The type of the cached item.
    /// </typeparam>
    /// <remarks>
    /// Derived from https://stackoverflow.com/a/3719378/240845
    /// </remarks>
    public class LruCache<TKey, TValue>
    {
        private readonly Dictionary<TKey, LinkedListNode<LruCacheItem>> cacheMap =
            new Dictionary<TKey, LinkedListNode<LruCacheItem>>();

        private readonly LinkedList<LruCacheItem> lruList =
            new LinkedList<LruCacheItem>();

        private readonly Action<TValue> dispose;

        /// <summary>
        /// Initializes a new instance of the <see cref="LruCache{TKey, TValue}"/>
        /// class.
        /// </summary>
        /// <param name="capacity">
        /// Maximum number of elements to cache.
        /// </param>
        /// <param name="dispose">
        /// When elements cycle out of the cache, disposes them. May be null.
        /// </param>
        public LruCache(int capacity, Action<TValue> dispose = null)
        {
            this.Capacity = capacity;
            this.dispose = dispose;
        }

        /// <summary>
        /// Gets the capacity of the cache.
        /// </summary>
        public int Capacity { get; }

        /// <summary>Gets the value associated with the specified key.</summary>
        /// <param name="key">
        /// The key of the value to get.
        /// </param>
        /// <param name="value">
        /// When this method returns, contains the value associated with the specified
        /// key, if the key is found; otherwise, the default value for the type of the 
        /// <paramref name="value" /> parameter. This parameter is passed
        /// uninitialized.
        /// </param>
        /// <returns>
        /// true if the <see cref="T:System.Collections.Generic.Dictionary`2" /> 
        /// contains an element with the specified key; otherwise, false.
        /// </returns>
        public bool TryGetValue(TKey key, out TValue value)
        {
            lock (this.cacheMap)
            {
                LinkedListNode<LruCacheItem> node;
                if (this.cacheMap.TryGetValue(key, out node))
                {
                    value = node.Value.Value;
                    this.lruList.Remove(node);
                    this.lruList.AddLast(node);
                    return true;
                }

                value = default(TValue);
                return false;
            }
        }

        /// <summary>
        /// Looks for a value for the matching <paramref name="key"/>. If not found, 
        /// calls <paramref name="valueGenerator"/> to retrieve the value and add it to
        /// the cache.
        /// </summary>
        /// <param name="key">
        /// The key of the value to look up.
        /// </param>
        /// <param name="valueGenerator">
        /// Generates a value if one isn't found.
        /// </param>
        /// <returns>
        /// The requested value.
        /// </returns>
        public TValue Get(TKey key, Func<TValue> valueGenerator)
        {
            lock (this.cacheMap)
            {
                LinkedListNode<LruCacheItem> node;
                TValue value;
                if (this.cacheMap.TryGetValue(key, out node))
                {
                    value = node.Value.Value;
                    this.lruList.Remove(node);
                    this.lruList.AddLast(node);
                }
                else
                {
                    value = valueGenerator();
                    if (this.cacheMap.Count >= this.Capacity)
                    {
                        this.RemoveFirst();
                    }

                    LruCacheItem cacheItem = new LruCacheItem(key, value);
                    node = new LinkedListNode<LruCacheItem>(cacheItem);
                    this.lruList.AddLast(node);
                    this.cacheMap.Add(key, node);
                }

                return value;
            }
        }

        /// <summary>
        /// Adds the specified key and value to the dictionary.
        /// </summary>
        /// <param name="key">
        /// The key of the element to add.
        /// </param>
        /// <param name="value">
        /// The value of the element to add. The value can be null for reference types.
        /// </param>
        public void Add(TKey key, TValue value)
        {
            lock (this.cacheMap)
            {
                // Replace any existing entry first; Dictionary.Add below
                // would throw on a duplicate key.
                LinkedListNode<LruCacheItem> existing;
                if (this.cacheMap.TryGetValue(key, out existing))
                {
                    this.lruList.Remove(existing);
                    this.cacheMap.Remove(key);
                }
                else if (this.cacheMap.Count >= this.Capacity)
                {
                    this.RemoveFirst();
                }

                LruCacheItem cacheItem = new LruCacheItem(key, value);
                LinkedListNode<LruCacheItem> node = 
                    new LinkedListNode<LruCacheItem>(cacheItem);
                this.lruList.AddLast(node);
                this.cacheMap.Add(key, node);
            }
        }

        private void RemoveFirst()
        {
            // Remove from LRUPriority
            LinkedListNode<LruCacheItem> node = this.lruList.First;
            this.lruList.RemoveFirst();

            // Remove from cache
            this.cacheMap.Remove(node.Value.Key);

            // dispose
            this.dispose?.Invoke(node.Value.Value);
        }

        private class LruCacheItem
        {
            public LruCacheItem(TKey k, TValue v)
            {
                this.Key = k;
                this.Value = v;
            }

            public TKey Key { get; }

            public TValue Value { get; }
        }
    }
}
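A short usage sketch of the dispose callback (this assumes the LruCache class above is compiled in the same project; the keys and values are illustrative):

```csharp
using System.Collections.Generic;
using LruCache; // the namespace from the snippet above

var disposed = new List<string>();

// The dispose delegate is invoked for each value as it cycles out.
var cache = new LruCache<int, string>(2, v => disposed.Add(v));

cache.Add(1, "one");
cache.Add(2, "two");
cache.Get(1, () => "unused");  // key 1 exists, so it just becomes MRU
cache.Add(3, "three");         // evicts key 2; the callback receives "two"
```

Passing the owned value to a callback on eviction (rather than calling Dispose directly) lets the cache hold IDisposable values without taking a dependency on any particular cleanup policy.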

#6


I don't believe so. I've certainly seen hand-rolled ones implemented several times in various unrelated projects (which more or less confirms this; if there were one, surely at least one of those projects would have used it).

It's pretty simple to implement, and usually gets done by creating a class which contains both a Dictionary and a List.

The keys go in the list (in order) and the items go in the dictionary. When you Add a new item to the collection, the function checks the length of the list, pulls out the last key (if it's too long) and then evicts the key and value from the dictionary to match. Not much more to it, really.
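A minimal sketch of that Dictionary + List approach (all names here are illustrative, not from any library; note that List.Remove is O(n), which is why the other answers prefer LinkedList):

```csharp
using System.Collections.Generic;

public class SimpleLruCache<TKey, TValue>
{
    private readonly int capacity;
    private readonly Dictionary<TKey, TValue> items = new Dictionary<TKey, TValue>();
    private readonly List<TKey> order = new List<TKey>(); // oldest key first

    public SimpleLruCache(int capacity)
    {
        this.capacity = capacity;
    }

    public void Add(TKey key, TValue value)
    {
        if (items.ContainsKey(key))
        {
            order.Remove(key);       // existing key: just refresh its position
        }
        else if (items.Count >= capacity)
        {
            TKey oldest = order[0];  // list is too long: evict the oldest key
            order.RemoveAt(0);
            items.Remove(oldest);    // keep the dictionary in step
        }

        items[key] = value;
        order.Add(key);              // newest keys live at the end
    }

    public bool TryGet(TKey key, out TValue value)
    {
        if (items.TryGetValue(key, out value))
        {
            order.Remove(key);       // move the key to the MRU end
            order.Add(key);
            return true;
        }
        return false;
    }
}
```
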

#7


The Caching Application Block of EntLib has an LRU scavenging option out of the box and can run in memory. It might be a bit heavyweight for what you want, though.

#8


I like Lawrence's implementation. Hashtable + LinkedList is a good solution. Regarding threading, rather than locking with [MethodImpl(MethodImplOptions.Synchronized)], I would use ReaderWriterLockSlim or a spin lock (since contention is usually brief) instead. In the get function I would first check whether the node is already the first item, rather than always removing and re-adding it. That makes it possible to stay within a reader lock that doesn't block other readers.
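A sketch of that suggestion (all names here are illustrative, not from the answer above): a read that finds the entry already in the most-recently-used position returns under the shared read lock; only a read that must reorder the list takes the exclusive write lock.

```csharp
using System.Collections.Generic;
using System.Threading;

public class RwLruCache<K, V>
{
    private readonly int capacity;
    private readonly Dictionary<K, LinkedListNode<KeyValuePair<K, V>>> map =
        new Dictionary<K, LinkedListNode<KeyValuePair<K, V>>>();
    private readonly LinkedList<KeyValuePair<K, V>> lru =
        new LinkedList<KeyValuePair<K, V>>(); // least recent at the front
    private readonly ReaderWriterLockSlim rwLock = new ReaderWriterLockSlim();

    public RwLruCache(int capacity)
    {
        this.capacity = capacity;
    }

    public bool TryGet(K key, out V value)
    {
        rwLock.EnterReadLock();
        try
        {
            LinkedListNode<KeyValuePair<K, V>> node;
            // Fast path: already the most recently used entry, so no
            // reordering is needed and other readers are not blocked.
            if (map.TryGetValue(key, out node) && node == lru.Last)
            {
                value = node.Value.Value;
                return true;
            }
        }
        finally { rwLock.ExitReadLock(); }

        rwLock.EnterWriteLock();
        try
        {
            LinkedListNode<KeyValuePair<K, V>> node;
            if (map.TryGetValue(key, out node))
            {
                lru.Remove(node);
                lru.AddLast(node); // move to the most-recently-used position
                value = node.Value.Value;
                return true;
            }
            value = default(V);
            return false;
        }
        finally { rwLock.ExitWriteLock(); }
    }

    public void Add(K key, V value)
    {
        rwLock.EnterWriteLock();
        try
        {
            LinkedListNode<KeyValuePair<K, V>> existing;
            if (map.TryGetValue(key, out existing))
            {
                lru.Remove(existing); // replace an existing entry
                map.Remove(key);
            }
            else if (map.Count >= capacity)
            {
                map.Remove(lru.First.Value.Key); // evict least recently used
                lru.RemoveFirst();
            }
            var node = new LinkedListNode<KeyValuePair<K, V>>(
                new KeyValuePair<K, V>(key, value));
            lru.AddLast(node);
            map.Add(key, node);
        }
        finally { rwLock.ExitWriteLock(); }
    }
}
```

Note that the fast path re-checks nothing after the read lock is released; in the worst case a concurrent writer reorders the list between the two locks, which is harmless here because the write-locked path re-reads the map from scratch.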

#9


If it's an ASP.NET app you can use the Cache class [1], but you'll be competing for space with other cached content, which may or may not be what you want.

[1] http://msdn.microsoft.com/en-us/library/system.web.caching.cache.aspx

