Go large-memory garbage collection performance
Question
I am considering implementing a memory-caching daemon in Go. It could see serious memory utilization (say, a terabyte). Fragmenting the data into separate heaps is not a good option; I want it all in one memory space. Does anyone have experience running Go with such huge memory sizes? Will the GC perform acceptably?
Answer 1
Score: 3
I am trying to do the same, and the only project that gave me good performance for caching data was the treap binary tree (https://github.com/stathat/treap), which handled more than 1 million nodes in memory on an Ubuntu 12.04 LTS machine with 8 GB of RAM. It was also fast at loading and searching data.

Other projects I tested were LMDB, which did not support holding many nodes in memory, as well as kv, go-cache, and goleveldb, but none of them retrieved data from memory as fast as treap.
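For context on what such an in-memory cache looks like: the sketch below is a minimal map-backed cache guarded by an `sync.RWMutex`. It is an illustrative assumption, not the treap library's API; a treap additionally keeps keys ordered (enabling range scans), which a plain Go map does not.

```go
package main

import (
	"fmt"
	"sync"
)

// Cache is a minimal in-memory key/value store guarded by an RWMutex,
// so many readers can proceed concurrently while writes are exclusive.
type Cache struct {
	mu    sync.RWMutex
	items map[string][]byte
}

func NewCache() *Cache {
	return &Cache{items: make(map[string][]byte)}
}

// Set stores or overwrites the value for key.
func (c *Cache) Set(key string, val []byte) {
	c.mu.Lock()
	defer c.mu.Unlock()
	c.items[key] = val
}

// Get returns the value for key and whether it was present.
func (c *Cache) Get(key string) ([]byte, bool) {
	c.mu.RLock()
	defer c.mu.RUnlock()
	v, ok := c.items[key]
	return v, ok
}

func main() {
	c := NewCache()
	c.Set("user:1", []byte("alice"))
	if v, ok := c.Get("user:1"); ok {
		fmt.Println(string(v)) // prints "alice"
	}
}
```

Swapping the map for a treap (or any ordered structure) changes `Get`/`Set` internals but not this overall shape; the locking strategy and value representation matter far more for GC behavior at large sizes.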