Using Redis at Facebook

Page 1: Using Redis at Facebook
Page 2: Using Redis at Facebook

Redis @ Facebook

Guy Yonish, Production Engineering Manager

Page 3: Using Redis at Facebook

Problem #1 - User configuration

• Need to read user configs in real time for every request
• Need to update user config in some cases
• High request rate - tens of thousands per second; some distribution is needed
• Requests are blocked on the results; answers are needed as quickly as possible

Page 4: Using Redis at Facebook

The Solution:

Page 5: Using Redis at Facebook

Hierarchical master/slave replication

Global Master
├── Regional Master
│   ├── Slave
│   ├── Slave
│   └── Slave
└── Regional Master
    ├── Slave
    ├── Slave
    └── Slave
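A hierarchy like this can be wired up with plain Redis replication config. A minimal sketch (hostnames are hypothetical; `slaveof` is the replication directive in Redis versions of this era):

```
# global-master redis.conf: no slaveof line, accepts all writes

# regional-master redis.conf: replicate from the global master
slaveof global-master.example.com 6379

# leaf slave redis.conf: replicate from its regional master
slaveof regional-master-east.example.com 6379
# keep serving (possibly stale) reads if the link to the master drops
slave-serve-stale-data yes
```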

Page 6: Using Redis at Facebook

Hierarchical master/slave replication

• User configs are saved as Hashes
• The multi-level replication brings the data close to the server
• Reads are distributed across the Slaves
• If more rps is needed, more slaves can be added - very scalable
• Writes can be performed at every level depending on the need
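As a rough sketch of the data model (the key name `user:config:<id>` and the in-memory stand-in are illustrative, not Facebook's actual schema), each user's config maps naturally onto a Redis hash: writes go to the master as HSET, and a request fetches the whole config in one HGETALL, which in production would hit a nearby slave:

```ruby
# Minimal in-memory stand-in for a Redis client, just enough to
# illustrate the two hash commands used here (HSET / HGETALL).
class FakeRedis
  def initialize
    @data = Hash.new { |h, k| h[k] = {} }
  end

  # HSET key field value - set one field of a hash
  def hset(key, field, value)
    @data[key][field] = value.to_s
  end

  # HGETALL key - return the whole hash as { field => value }
  def hgetall(key)
    @data[key].dup
  end
end

master = FakeRedis.new

# Write path: update individual fields of a user's config on the master.
master.hset('user:config:42', 'language', 'en_US')
master.hset('user:config:42', 'notifications', 'off')

# Read path: one round trip fetches the full config
# (against a local slave in the replicated setup above).
config = master.hgetall('user:config:42')
puts config['language']
```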

Page 7: Using Redis at Facebook

Balance the load easily - hash
Easy to use in your attributes file for every server

require 'digest'

# Use my fqdn as the shard key
shard_key = node['fqdn']

# Returns a stable, evenly distributed position in the redis servers array
redis_shard = Digest::MD5.hexdigest(shard_key)[0...7].to_i(16) % redis_servers.length

my_redis = redis_servers[redis_shard]
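To see the hashing in action (the hostnames and pool below are made up for illustration): the same fqdn always maps to the same server, while different hosts spread across the pool:

```ruby
require 'digest'

redis_servers = ['redis001', 'redis002', 'redis003', 'redis004']

# Same computation as in the attributes file above: first 7 hex chars
# of the MD5 of the shard key, taken modulo the pool size.
def shard_for(key, pool)
  Digest::MD5.hexdigest(key)[0...7].to_i(16) % pool.length
end

# Deterministic: one host always picks the same server, so its
# connections and cached data stay on one shard.
a = shard_for('web001.example.com', redis_servers)
b = shard_for('web001.example.com', redis_servers)

# Different hosts land on (generally) different shards.
puts ['web001', 'web002', 'web003']
  .map { |h| shard_for("#{h}.example.com", redis_servers) }
  .inspect
```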

Page 8: Using Redis at Facebook

Problem #2 - Data processing (stats)

• Data processing and insertion into long-term storage takes time
• Stats are needed in real time for the user

Page 9: Using Redis at Facebook

The Solution:

Page 10: Using Redis at Facebook

Split data processing model

• Perform daily bulk processing of that day's data. Can take a long time.
• Tailers process stats in real time and insert them into a sharded Redis cluster
• Using HINCRBY makes it easy to increment counters
• Updates are performed in bulks (batches)
• Using Redis TTL to automagically delete data after X days (once the bulk processing has finished)
• 10s of GB of data, sharded using twemproxy
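A rough sketch of the tailer's write path (class names, key names, and the 7-day retention are illustrative, not the real pipeline): increments are buffered locally and flushed in batches of HINCRBY calls, and each daily key gets a TTL so Redis drops it once the bulk job has caught up. An in-memory stand-in replaces the real sharded client so the sketch is self-contained:

```ruby
# In-memory stand-in for a Redis client: supports HINCRBY and EXPIRE.
class FakeStatsRedis
  attr_reader :data, :ttls

  def initialize
    @data = Hash.new { |h, k| h[k] = Hash.new(0) }
    @ttls = {}
  end

  # HINCRBY key field amount - atomic counter increment in Redis
  def hincrby(key, field, amount)
    @data[key][field] += amount
  end

  # EXPIRE key seconds - Redis deletes the key after the TTL elapses
  def expire(key, seconds)
    @ttls[key] = seconds
  end
end

# Buffers counter increments locally and flushes them in batches,
# mirroring the "updates are performed in bulks" point above.
class Tailer
  RETENTION_DAYS = 7  # illustrative value for "delete data after X days"

  def initialize(redis, batch_size: 100)
    @redis = redis
    @batch_size = batch_size
    @buffer = Hash.new(0)
  end

  def record(day, counter, amount = 1)
    @buffer[["stats:#{day}", counter]] += amount
    flush if @buffer.size >= @batch_size
  end

  def flush
    @buffer.each do |(key, field), amount|
      @redis.hincrby(key, field, amount)
      # The TTL lets Redis delete the day's stats on its own later.
      @redis.expire(key, RETENTION_DAYS * 24 * 3600)
    end
    @buffer.clear
  end
end

redis = FakeStatsRedis.new
tailer = Tailer.new(redis, batch_size: 3)
tailer.record('2013-05-01', 'page_views')
tailer.record('2013-05-01', 'page_views')
tailer.record('2013-05-01', 'clicks')
tailer.flush
puts redis.data['stats:2013-05-01']['page_views']
```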

Page 11: Using Redis at Facebook