Go large memory garbage collection performance

Question

I am considering implementing a memory caching daemon in Go. It has the potential for some serious memory utilization (say, a terabyte). Fragmenting the data into separate heaps is not a good option; I want it all in one memory space. Does anyone have experience running Go with such huge memory sizes? Will the GC perform acceptably?
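
One way to get an early feel for whether the collector will cope is to pre-allocate a chunk of heap and inspect the pause statistics the runtime reports. The sketch below is only an illustration using the standard runtime and runtime/debug packages; the 1 GB heap, object count and GC percentage are arbitrary assumptions for experimentation, not a terabyte-scale benchmark.

```go
package main

import (
	"fmt"
	"runtime"
	"runtime/debug"
	"time"
)

func main() {
	// Assumption: 1 GB spread over 1M objects of 1 KB each; scale up as needed.
	const numObjs = 1 << 20
	objs := make([][]byte, numObjs)
	for i := range objs {
		objs[i] = make([]byte, 1024)
	}

	// Optionally make the collector less aggressive on a large, mostly static heap.
	debug.SetGCPercent(400)

	// Force a collection and look at the recorded pause times.
	start := time.Now()
	runtime.GC()
	fmt.Println("forced GC took", time.Since(start))

	var ms runtime.MemStats
	runtime.ReadMemStats(&ms)
	fmt.Printf("heap alloc: %d MB, collections: %d, last pause: %v\n",
		ms.HeapAlloc>>20, ms.NumGC, time.Duration(ms.PauseNs[(ms.NumGC+255)%256]))

	// Keep the allocated objects live so the stats reflect the full heap.
	runtime.KeepAlive(objs)
}
```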

Answer 1

Score: 3

I am trying to do the same thing, and the only project that gave me good performance for caching data was the binary tree (treap) at https://github.com/stathat/treap, which supported more than 1 million nodes in memory on a single Ubuntu 12.04 LTS machine with 8 GB of memory. It was also fast at loading and searching the data.

Other projects I tested included LMDB, which did not support many nodes in memory, as well as kv, go-cache and goleveldb, but none of them was as fast at recovering data from memory as treap.
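
For readers unfamiliar with the data structure this answer recommends, here is a minimal, self-contained treap sketch in Go (a binary search tree balanced by random priorities). It is not the stathat/treap API, whose actual type and method names may differ; it only illustrates why inserts and lookups stay fast: the random priorities keep the tree balanced in expectation.

```go
package main

import (
	"fmt"
	"math/rand"
)

// node is a treap node: ordered by key, max-heap ordered by a random priority.
type node struct {
	key         string
	value       []byte
	priority    int64
	left, right *node
}

// rotateRight and rotateLeft restore the heap property on priorities.
func rotateRight(n *node) *node {
	l := n.left
	n.left, l.right = l.right, n
	return l
}

func rotateLeft(n *node) *node {
	r := n.right
	n.right, r.left = r.left, n
	return r
}

// insert adds or replaces key in the subtree rooted at n and returns the new root.
func insert(n *node, key string, value []byte) *node {
	if n == nil {
		return &node{key: key, value: value, priority: rand.Int63()}
	}
	switch {
	case key < n.key:
		n.left = insert(n.left, key, value)
		if n.left.priority > n.priority {
			n = rotateRight(n)
		}
	case key > n.key:
		n.right = insert(n.right, key, value)
		if n.right.priority > n.priority {
			n = rotateLeft(n)
		}
	default:
		n.value = value // key already present: overwrite
	}
	return n
}

// get looks key up iteratively; expected O(log n) comparisons.
func get(n *node, key string) ([]byte, bool) {
	for n != nil {
		switch {
		case key < n.key:
			n = n.left
		case key > n.key:
			n = n.right
		default:
			return n.value, true
		}
	}
	return nil, false
}

func main() {
	var root *node
	for i := 0; i < 1000; i++ {
		k := fmt.Sprintf("key-%04d", i)
		root = insert(root, k, []byte(fmt.Sprintf("value-%d", i)))
	}
	if v, ok := get(root, "key-0042"); ok {
		fmt.Println("found:", string(v))
	}
}
```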
