The Redis server version I use is 2.8.9 from the MSOpenTech GitHub. Can anyone shed light on why the Redis "info" command indicates that used memory is 21 GB even though the RDB file saved on disk is less than 4 GB? I did successfully run a "save" command before noting down the size of the RDB file. The qfork heap file is 30 GB, as configured in redis.windows.conf.
Configuration:

maxheap 30gb
maxmemory 20gb
appendonly no
save 18000 1
The server has 192 GB of physical RAM but unfortunately only about 60 GB of free disk space, so I had to set maxheap and maxmemory to 30 GB and 20 GB respectively to leave enough room to persist the data on disk.
I'm using Redis as a cache, and the save interval is large because seeding the data takes a long time and I don't want constant writing to disk. Once seeding is done, the DB is updated with newer data once a day.
My questions are:
How is the saved RDB file so small? Is it solely due to compression (rdbcompression yes)? If so, can the same compression mechanism be used for the in-memory data too? I make use of lists extensively. (See the redis-cli sketch after these questions for the kind of check I have in mind.)
Before I ran the "save" command, the working set and private bytes in process-explorer were very small. Is there a way I can break down memory usage by data structure? For example: lists use x amount, hashes use y amount, etc.
Is there any way I can store the AOF file (I turned off AOF and use RDB because the AOF files were filling up disk space fast) on a network path (shared drive or NAS)? I tried setting the dir config to \someip\some folder, but the service failed to start with the message "Cant CHDIR to location".
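For the first question, this is the kind of check I have in mind, run from redis-cli (mylist is a made-up key name standing in for one of my real keys):

redis-cli CONFIG GET rdbcompression
redis-cli CONFIG GET list-max-ziplist-entries
redis-cli CONFIG GET list-max-ziplist-value
redis-cli OBJECT ENCODING mylist

My understanding is that rdbcompression only affects the file on disk, and that a list reported as linkedlist (rather than ziplist) by OBJECT ENCODING is kept in the regular, pointer-heavy in-memory representation. Please correct me if that's wrong.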
I'm unable to post images; the Virtual Memory and Physical Memory figures that process-explorer shows for the redis-server instance did not paste through.
The latest saved dump.rdb is 3.81 GB and the heap file is 30 GB.
# Server
redis_version:2.8.9
redis_git_sha1:00000000
redis_git_dirty:0
redis_build_id:1fe181ad2447fe38
redis_mode:standalone
os:Windows
arch_bits:64
multiplexing_api:winsock_IOCP
gcc_version:0.0.0
process_id:12772
run_id:553f2b4665edd206e632b7040aa76c0b76083f4d
tcp_port:6379
uptime_in_seconds:24087
uptime_in_days:0
hz:50
lru_clock:14825512
config_file:D:\RedisService/redis.windows.conf
# Clients
connected_clients:2
client_longest_output_list:0
client_biggest_input_buf:0
blocked_clients:0
# Memory
used_memory:21484921736
used_memory_human:20.01G
used_memory_rss:21484870536
used_memory_peak:21487283360
used_memory_peak_human:20.01G
used_memory_lua:3156992
mem_fragmentation_ratio:1.00
mem_allocator:dlmalloc-2.8
# Persistence
loading:0
rdb_changes_since_last_save:0
rdb_bgsave_in_progress:0
rdb_last_save_time:1407328559
rdb_last_bgsave_status:ok
rdb_last_bgsave_time_sec:1407328560
rdb_current_bgsave_time_sec:-1
aof_enabled:0
aof_rewrite_in_progress:0
aof_rewrite_scheduled:0
aof_last_rewrite_time_sec:-1
aof_current_rewrite_time_sec:-1
aof_last_bgrewrite_status:ok
aof_last_write_status:ok
# Stats
total_connections_received:9486
total_commands_processed:241141370
instantaneous_ops_per_sec:0
rejected_connections:0
sync_full:0
sync_partial_ok:0
sync_partial_err:0
expired_keys:0
evicted_keys:0
keyspace_hits:30143
keyspace_misses:81
pubsub_channels:0
pubsub_patterns:0
latest_fork_usec:1341134
Hi Josiah,

The RDB file being smaller makes sense, although I did not expect to see such a large difference between the in-memory representation and the on-disk file.
> In your case, there are a couple of things that you might be able to do to reduce the amount of memory that Redis uses, but it depends on the data you are storing in your lists. If you can share, how big is your typical item in a list, and how large do your lists tend to be?
The list has one entry per date and each item is a delimited string which is between 100 and 250 characters long, depending on the date.
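To make that concrete, here is a rough sketch with a made-up key and value (my real keys and fields are different):

redis-cli RPUSH mylist "2014-08-06|fieldA|fieldB|123.45"
redis-cli OBJECT ENCODING mylist
redis-cli DEBUG OBJECT mylist

As far as I can tell, the serializedlength reported by DEBUG OBJECT reflects the serialized (RDB-like) size rather than the actual in-memory footprint, which may explain part of the gap I'm seeing.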
> I'd check out redis-rdb-tools first, then look for others if it isn't able to do what you need.
I'll check this out. Thank you.
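If I'm reading the redis-rdb-tools README correctly, generating a per-key memory report from my existing dump would look roughly like this (the dump path is a placeholder):

pip install rdbtools
rdb -c memory D:\RedisService\dump.rdb -f memory.csv

The resulting CSV is supposed to list each key with its type and an estimated in-memory size, which would let me aggregate usage by data structure.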
> Alternatively, the network paths for Windows are usually of the form \\hostname\path, not \hostname\path as you listed.

This was a typo; the config entry I put in was of the form \\hostname\path. I also tried creating a mapped drive, but Redis couldn't read the path and I get the same "Cant CHDIR" error. I did ensure that the service account Redis runs under has permissions. I suspect it's unable to parse the Windows path format, e.g. T:\RedisRDB. The documentation also states that the heap file (QFork) must be on a local drive. I'm not sure how it determines whether a drive is local as opposed to a mapped drive, but that's the biggest stumbling block for me right now.
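For reference, these are the kinds of dir entries I tried in redis.windows.conf (hostname and share names changed):

dir \\fileserver\redisshare
dir T:\RedisRDB

Both attempts fail at service start with the "Cant CHDIR to location" message.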
> Are you sure you can't just slip in a 1-4TB drive without anyone noticing?

Unfortunately no :( These are servers in the enterprise data center and it would take weeks, if not months, to add storage. I will play around with list-max-ziplist-entries 512 and list-max-ziplist-value 64 to see if they make a difference, understanding that insertion/read speed may be affected; the lines I plan to experiment with are sketched below.
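Purely as a sketch of what I mean (the values are guesses I'd still have to tune; since my items are 100-250 characters, the value threshold probably needs to go above the default 64 for the lists to stay ziplist-encoded):

list-max-ziplist-entries 512
list-max-ziplist-value 256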