I'm trying to upload a large file (1 GB) to an SFTP server, but I keep getting some variant of:

```
  File "/venv/lib/python2.7/site-packages/paramiko/file.py", line 339, in write
    self._write_all(data)
  File "/venv/lib/python2.7/site-packages/paramiko/file.py", line 456, in _write_all
    count = self._write(data)
  File "/venv/lib/python2.7/site-packages/paramiko/sftp_file.py", line 180, in _write
    t, msg = self.sftp._read_response(req)
  File "/venv/lib/python2.7/site-packages/paramiko/sftp_client.py", line 762, in _read_response
    raise SSHException('Server connection dropped: %s' % str(e))
SSHException: Server connection dropped:
```

I noticed that if I change MAX_REQUEST_SIZE (in sftp_file.py) from 32768 to 1024, the upload works. Does this mean that my only option is to maintain a copy/pasted custom version of sftp_file.py with MAX_REQUEST_SIZE = 1024? Does anyone have suggestions that won't slow down uploads?

Update: the last few times I tried lowering MAX_REQUEST_SIZE, it ended up throwing an OperationalError: SSL SYSCALL error: EOF detected instead.

For reference, this is what I'm currently doing:

```python
import paramiko

transport = paramiko.Transport((hostname, port))
transport.connect(username=username, password=password)
sftp = paramiko.SFTPClient.from_transport(transport)

f = sftp.open(ftp_path, 'wb')
f.write(file_obj.read())
f.close()

sftp.close()
transport.close()
```
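One alternative I've considered, rather than copy/pasting the whole module: since MAX_REQUEST_SIZE is a class-level attribute on SFTPFile, it looks like it can be overridden at runtime. A minimal sketch of what I mean (assuming the attribute is still consulted when writes are chunked into requests):

```python
import paramiko.sftp_file

# Override the class-level constant instead of editing the installed
# paramiko source; this affects every SFTPFile opened afterwards.
paramiko.sftp_file.SFTPFile.MAX_REQUEST_SIZE = 1024
```

That avoids forking the file, but it would still have the same slowdown problem, so I'm hoping there's a better approach.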
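I also realize that file_obj.read() pulls the entire 1 GB into memory and hands paramiko one giant buffer. If the answer involves streaming instead, a chunked loop like this sketch is what I'd try (the 32768 chunk size is an arbitrary choice matching paramiko's default request size):

```python
# Stream the upload in fixed-size chunks instead of materializing
# the whole file in memory with file_obj.read().
chunk_size = 32768
f = sftp.open(ftp_path, 'wb')
while True:
    data = file_obj.read(chunk_size)
    if not data:
        break
    f.write(data)
f.close()
```

Depending on the paramiko version, sftp.putfo(file_obj, ftp_path) may do essentially the same thing internally, but I haven't verified whether it behaves any differently with respect to the dropped connection.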