Hello
I set ac_in_buffer_size to 100 MB, but pyftpdlib still seems to write to the file as soon as data becomes available. This becomes a problem when a client has a fast connection: disk I/O goes up to about 99%, the server load climbs, and CPU usage reaches about 50% (with an upload speed of around 100 MB/s).
So my question is: why is ac_in_buffer_size not working, and how am I supposed to make it work? I've been working on this for several days now with no luck.
from pyftpdlib.authorizers import DummyAuthorizer
from pyftpdlib.handlers import FTPHandler, DTPHandler
from pyftpdlib.servers import ThreadedFTPServer

class Ftp(FTPHandler):
    pass  # some methods to handle uploading (client to server)

class Authorizer(DummyAuthorizer):
    pass  # some methods to handle auth

if __name__ == '__main__':
    handler = Ftp
    authorizer = Authorizer()
    data_handler = DTPHandler
    data_handler.ac_in_buffer_size = 104857600  # 100 MB
    handler.dtp_handler = data_handler
    handler.authorizer = authorizer
    server = ThreadedFTPServer(('', 21), handler)
    server.serve_forever()
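For reference, here is the behavior I expected from a 100 MB buffer, sketched with only Python's io module (no pyftpdlib involved): small incoming chunks accumulate in a userspace buffer and reach the underlying "disk" in far fewer, larger writes. CountingRaw is just a hypothetical stand-in for the file so the write calls can be counted.

```python
import io

class CountingRaw(io.RawIOBase):
    """Raw sink that counts how many low-level write calls it receives."""
    def __init__(self):
        super().__init__()
        self.calls = 0

    def writable(self):
        return True

    def write(self, b):
        self.calls += 1
        return len(b)

raw = CountingRaw()
# Wrap the raw sink in a 1 MiB userspace buffer.
buffered = io.BufferedWriter(raw, buffer_size=1024 * 1024)

# Simulate 1000 small network chunks of 1 KiB each (~1 MB total).
for _ in range(1000):
    buffered.write(b"x" * 1024)
buffered.flush()

# All 1000 chunks fit in the buffer, so only one write reaches the raw sink.
print(raw.calls)
```

This is the write pattern I was hoping ac_in_buffer_size would produce for uploads; instead, the I/O figures above suggest each received chunk is written out immediately.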
I run the FTP server with Python 3.5.1 on CentOS 7, on an SSD. (I also tested on another server with a SATA3 disk: during uploads, I/O went to 100% and the load average increased dramatically, to about 5-6, while it stays below 2 on the SSD. In my opinion this shows the data is not being buffered as specified.)
Thank you