Go Redis Loading


I am trying to load 200 million keys into Redis, and I usually start to get errors at around 31 million keys and have to stop. I am using Go and the Redis library "github.com/garyburd/redigo/redis".

I set up a connection pool as so:

func newPool(server string) *redis.Pool {
    return &redis.Pool{
        MaxIdle:     3,
        MaxActive:   10,
        IdleTimeout: 240 * time.Second,
        Dial: func() (redis.Conn, error) {
            c, err := redis.Dial("tcp", server)
            if err != nil {
                return nil, err
            }
            return c, nil
        },
        TestOnBorrow: func(c redis.Conn, t time.Time) error {
            _, err := c.Do("PING")
            return err
        },
    }
}

I then try to fill Redis with values using this function:

func RedisServerBatchLoadKeys(rtbExchange string, keys []string) {
    redisLock.Lock()
    defer redisLock.Unlock()
    retry := 0
    for {
        conn := GetConnOrPanic(rtbExchange)
        // Queue SET/EXPIRE pairs inside a MULTI/EXEC transaction.
        conn.Send("MULTI")
        for _, key := range keys {
            conn.Send("SET", key, maxCount)
            conn.Send("EXPIRE", key, numSecondsExpire)
        }
        _, err := conn.Do("EXEC")
        // Close each connection before retrying; a defer inside the loop
        // would leak connections until the function returns.
        conn.Close()
        if err == nil {
            break
        } else if err != io.EOF {
            CheckRedisError(err, rtbExchange, "Could not load batch")
        } else {
            retry++
        }
        if retry >= 10 {
            CheckRedisError(err, rtbExchange, "Could not load batch - 10 retries")
        }
    }
}

I have been getting numerous errors such as:

  • read tcp 10.249.15.194:6379: connection reset by peer
  • dial tcp 10.249.15.194:6379: connection refused
  • redis#RedisError: EOF

Am I doing something fundamentally wrong, or do I need to add more error checks (aside from the EOF check I already added)?

Thanks,

Answer 1

Score: 2


Just a guess: 200 million keys is a lot. Do you have enough memory for that size database?

The Redis docs say:

> Redis can handle up to 2^32 keys, and was tested in practice to handle at least 250 million of keys per instance.
>
> In other words your limit is likely the available memory in your system.

They also say:

> What happens if Redis runs out of memory?
>
> Redis will either be killed by the Linux kernel OOM killer, crash with an error, or will start to slow down.

It seems plausible to me that you're not able to connect because the server is actually down. Perhaps it gets restarted, and the next time you run your script it gets to the same place every time because that's when you run out of memory.
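One way to test this theory is to watch `used_memory` from the `INFO memory` reply between batches. The sketch below only parses the INFO text, so it runs standalone; in the question's code the string would come from `conn.Do("INFO", "memory")` via redigo (the sample reply here is illustrative):

```go
package main

import (
	"fmt"
	"strings"
)

// parseUsedMemory extracts used_memory (bytes) from the text returned by
// the Redis INFO command. It returns -1 if the field is absent.
func parseUsedMemory(info string) int64 {
	for _, line := range strings.Split(info, "\r\n") {
		if strings.HasPrefix(line, "used_memory:") {
			var n int64
			fmt.Sscanf(strings.TrimPrefix(line, "used_memory:"), "%d", &n)
			return n
		}
	}
	return -1
}

func main() {
	// Example of what `redis.String(conn.Do("INFO", "memory"))` might return.
	sample := "# Memory\r\nused_memory:1032192\r\nused_memory_human:1008.00K\r\n"
	fmt.Println(parseUsedMemory(sample)) // → 1032192
}
```

Logging this value each batch would show whether the failures line up with memory approaching the server's limit.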

If this is your problem there are a couple things you could try:

  1. Use a Redis hash, which can store data more efficiently. See http://redis.io/topics/memory-optimization
  2. Partition (shard) your data set across multiple servers (for example, with 4 servers you could take key % 4 to determine which Redis server to store under). If O(1) lookup is what you're after, you'll still get it, though the system becomes more brittle because there are multiple points of failure.
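The sharding idea in (2) can be sketched as follows. Since the question's keys are strings rather than integers, a stable hash stands in for "key % 4"; the FNV choice and the server addresses are assumptions for illustration:

```go
package main

import (
	"fmt"
	"hash/fnv"
)

// shardIndex maps a string key to one of n servers using a stable hash,
// the string-key analogue of the answer's "key % 4".
func shardIndex(key string, n int) int {
	h := fnv.New32a()
	h.Write([]byte(key))
	return int(h.Sum32()) % n
}

func main() {
	// Hypothetical shard addresses; in the question's code each would get
	// its own pool from newPool.
	servers := []string{"redis-0:6379", "redis-1:6379", "redis-2:6379", "redis-3:6379"}
	for _, key := range []string{"user:1001", "user:1002", "user:1003"} {
		fmt.Printf("%s -> %s\n", key, servers[shardIndex(key, len(servers))])
	}
}
```

Because the hash is deterministic, every lookup for a given key lands on the same server, preserving the O(1) access the answer mentions.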

huangapple
  • Posted on 2015-01-16 08:31:26
  • Please retain this link when reposting: https://go.coder-hub.com/27975494.html