rados

The Definitive Guide: Ceph Cluster on Raspberry Pi

Learn how to deploy a Ceph cluster on a Raspberry Pi: commodity off-the-shelf hardware on a budget.
A Ceph cluster on Raspberry Pi is a great way to build a RADOS home storage solution (NAS) that is highly redundant and draws little power. It is also a low-cost way to get into Ceph, which may or may not be the future of storage (software-defined storage as a whole certainly is). Ceph on ARM is an interesting idea in its own right. I built one of these as a development environment (playground) for home.

Copy Ceph Pool Objects to Another Pool

Code and explanation for copying objects between Ceph pools using librados.
Sometimes it is necessary to copy Ceph pool objects from one pool to another, such as when changing CRUSH or erasure rule sets on an expanding cluster. RADOS has a built-in command for this, rados cppool, but it has limitations: it only seems to work with replicated target pools, so it cannot copy objects from an erasure-coded pool to a replicated pool, or between erasure-coded pools. A simple copy loop written against librados avoids that restriction, as sketched below.
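
As a rough illustration, here is a minimal sketch of such a copy loop using the python-rados bindings for librados. The pool names and ceph.conf path are assumptions for the example, and it copies object data only (xattrs, omap entries and snapshots are not handled):

```python
#!/usr/bin/env python3
# Hedged sketch: copy every object from one Ceph pool to another using
# python-rados (librados bindings). Pool names and conffile path below
# are assumptions -- adjust for your cluster. Object data only; xattrs,
# omap entries and snapshots are not copied here.
import rados

SRC_POOL = "ec-pool"           # assumed source pool (e.g. erasure-coded)
DST_POOL = "replicated-pool"   # assumed destination pool

cluster = rados.Rados(conffile="/etc/ceph/ceph.conf")
cluster.connect()
try:
    src = cluster.open_ioctx(SRC_POOL)
    dst = cluster.open_ioctx(DST_POOL)
    try:
        for obj in src.list_objects():
            size, _mtime = src.stat(obj.key)               # object size in bytes
            data = src.read(obj.key, size) if size else b""
            dst.write_full(obj.key, data)                   # create/overwrite in target
            print(f"copied {obj.key} ({size} bytes)")
    finally:
        src.close()
        dst.close()
finally:
    cluster.shutdown()
```

Because write_full simply rewrites the whole object in the target pool, the same loop works whether the destination is replicated or erasure-coded.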