Your input is very helpful, Josiah. Thanks a lot!
We thought that, to reduce the number of instances, we could run the sentinels and the Redis processes on the same servers. And we understood that you need at least 2 or 3 sentinels for it to function correctly.
Our setup now is this: (4 servers in 2 zones)
Zone A             | Zone B
-------------------+------------------
Master + sentinel  | Slave + sentinel
Slave + sentinel   | Slave + sentinel
Does this make sense? This way, if Zone A fails entirely, Zone B still has 2 sentinels and they can agree to promote one of the 2 remaining slaves to master.
We do need it to work at 3AM when we are sleeping :)
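For what it's worth, here is roughly how I picture the node.js side connecting, so that a failover at 3AM is transparent to the app. This is only a sketch assuming we use ioredis and connect through the sentinels; the host names and the master name "mymaster" are placeholders:

const Redis = require('ioredis');

// Connect through the sentinels instead of directly to a master.
// ioredis asks the sentinels who the current master for "mymaster" is
// and reconnects to the new master after a failover.
const redis = new Redis({
  sentinels: [
    { host: 'zone-a-1.internal', port: 26379 }, // placeholder host names
    { host: 'zone-a-2.internal', port: 26379 },
    { host: 'zone-b-1.internal', port: 26379 },
    { host: 'zone-b-2.internal', port: 26379 },
  ],
  name: 'mymaster', // the master name we would configure in sentinel.conf
});

redis.on('error', (err) => console.error('redis error:', err));

async function demo() {
  await redis.set('greeting', 'hello from whichever zone is master');
  console.log(await redis.get('greeting'));
}

demo();

That part seems manageable; it's the sharding that worries me more.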
So on to sharding: my understanding is that I should have 2 masters and my applications (node.js) should connect to both (more configuration and code).
Then they can use something like crc32 to determine which key/event goes to which server.
That way at most 50% of my messages should fail at the same time, unless both masters go down of course.
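To make that concrete, here is a rough sketch of the client-side sharding I have in mind. The shard host names are placeholders, and in practice each shard would really be its own sentinel-monitored master, connected like in the sketch above:

const Redis = require('ioredis');

// Plain table-based CRC32, so we don't need an extra npm package.
const CRC_TABLE = (() => {
  const t = new Uint32Array(256);
  for (let n = 0; n < 256; n++) {
    let c = n;
    for (let k = 0; k < 8; k++) c = c & 1 ? 0xEDB88320 ^ (c >>> 1) : c >>> 1;
    t[n] = c >>> 0;
  }
  return t;
})();

function crc32(str) {
  let crc = 0xFFFFFFFF;
  for (let i = 0; i < str.length; i++) {
    crc = CRC_TABLE[(crc ^ str.charCodeAt(i)) & 0xFF] ^ (crc >>> 8);
  }
  return (crc ^ 0xFFFFFFFF) >>> 0; // force unsigned
}

// Two shards = two masters (placeholder hosts).
const shards = [
  new Redis({ host: 'redis-shard-0.internal', port: 6379 }),
  new Redis({ host: 'redis-shard-1.internal', port: 6379 }),
];

function shardFor(key) {
  return shards[crc32(key) % shards.length];
}

async function storeEvent(key, event) {
  // Only the shard that owns this key is involved, so if one master is
  // down, keys that hash to the other shard keep working.
  await shardFor(key).rpush(key, JSON.stringify(event));
}

Even for just two shards, every app that touches Redis would need this same hashing logic and the same list of hosts.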
I've read a bunch of articles on this online, but it still sounds pretty complicated, requiring a lot of configuration and code, and leaving plenty of places for bugs.
If you have the time to help us with this but we need to take it to a PM so as not to spam everyone with updates, that's ok. Let me know.
Cheers,
Guus