
Crushmap for 2 DC

Update 2021: Since the Pacific release, there is a specific operating mode for the monitors in the case of a stretched cluster. See: https://docs.ceph.com/en/latest/rados/operations/stretch-mode/ For more information, check out Greg's talk at FOSDEM.
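
For Pacific and later, entering stretch mode looks roughly like this (a sketch based on the documentation linked above; monitor names a..e, the datacenter names and the stretch_rule CRUSH rule are placeholders, and the layout assumes 5 monitors with the tiebreaker in a third site):

# Use the connectivity-based election strategy for the monitors
ceph mon set election_strategy connectivity

# Declare the location of each monitor (2 per datacenter, 1 tiebreaker in a 3rd site)
ceph mon set_location a datacenter=dc1
ceph mon set_location b datacenter=dc1
ceph mon set_location c datacenter=dc2
ceph mon set_location d datacenter=dc2
ceph mon set_location e datacenter=dc3

# Enter stretch mode with "e" as tiebreaker and a CRUSH rule that splits
# replicas across the datacenter level
ceph mon enable_stretch_mode e stretch_rule datacenter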

An example of a crushmap for 2-datacenter replication with 2 or 3 replicas:

rule replicated_ruleset {
    ruleset X                                # rule id (placeholder)
    type replicated
    min_size 2                               # applies to pools with size 2...
    max_size 3                               # ...up to size 3
    step take default                        # start from the root of the hierarchy
    step choose firstn 2 type datacenter     # select 2 datacenter buckets
    step chooseleaf firstn -1 type host      # pick up to (size - 1) OSDs on distinct hosts in each
    step emit
}
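
To try it, a possible workflow (file names here are arbitrary) is to decompile the current crushmap, add the rule, check the mappings with crushtool, and inject it back:

# Extract and decompile the current crushmap
ceph osd getcrushmap -o crushmap.bin
crushtool -d crushmap.bin -o crushmap.txt

# Add the rule above to crushmap.txt, then recompile
crushtool -c crushmap.txt -o crushmap-new.bin

# Verify the placements generated by the rule (replace X with the rule id)
crushtool --test -i crushmap-new.bin --rule X --num-rep 3 --show-mappings

# Inject the new crushmap into the cluster
ceph osd setcrushmap -i crushmap-new.bin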

This works well with pool size=2 (not recommended!) or 3. If you set the pool size to more than 3 (and increase max_size in the CRUSH rule), be careful: you will end up with n-1 replicas in one datacenter and only one in the other.
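
For example, to use this rule on a pool with 3 replicas ("mypool" is just a placeholder; on older releases the pool variable is crush_ruleset with the numeric rule id instead of crush_rule):

ceph osd pool set mypool crush_rule replicated_ruleset
ceph osd pool set mypool size 3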

If you want to be able to write data even when one of the datacenters is unreachable, the pool min_size should be set to 1, even if size is set to 3. In this case, pay attention to the location of the monitors: a majority of them must remain reachable to keep quorum.
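
With the same placeholder pool name, that gives:

# Allow writes with a single remaining replica while a datacenter is down
ceph osd pool set mypool min_size 1

Keep in mind that with min_size 1 a single copy accepts writes, so it is safer to set it back to 2 once the failed datacenter is recovered.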

Other posts about crushmap: http://cephnotes.ksperis.com/tag/crushmap.html
