TCP Socket Timeout Issues When Sending Large Data in Python 3.8
I'm running into an issue with my TCP socket implementation in Python 3.8 where the connection times out when I try to send large amounts of data. I have a server that accepts connections and a client that sends data, but when the data size exceeds 64KB, I get a timeout. The error message I receive is `socket.timeout: timed out`. I've set the timeout using `socket.settimeout(5)` on the client side, but I suspect the connection is being interrupted by the large payload.

Here's a simplified version of my server code:

```python
import socket

server_socket = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server_socket.bind(('127.0.0.1', 65432))
server_socket.listen(1)

while True:
    conn, addr = server_socket.accept()
    print(f'Connected by {addr}')
    data = conn.recv(1024)
    print(f'Received data: {data}')
    conn.close()
```

And here's the client code:

```python
import socket

client_socket = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
client_socket.settimeout(5)
client_socket.connect(('127.0.0.1', 65432))

large_data = b'x' * 100000  # 100KB of data

try:
    client_socket.sendall(large_data)
except socket.timeout as e:
    print(f'Timeout occurred: {e}')
finally:
    client_socket.close()
```

I've tried adjusting the buffer size in the `recv` call, but it didn't seem to help. The timeout seems to happen before the server can process the data, and I'm not sure whether it's an issue with the way I handle the socket or whether there are inherent limitations on sending large packets over TCP. Any advice on how to properly send large data without hitting timeouts would be greatly appreciated. I'm developing on Ubuntu 22.04 with Python 3.8. Could this be a known issue?
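For reference, here's roughly the kind of recv-loop variation I experimented with when adjusting the buffer size (this is a self-contained sketch, not my exact code; the port 65433, the 4096 buffer size, and the threading are just so it runs standalone):

```python
import socket
import threading

HOST, PORT = '127.0.0.1', 65433  # different port so it won't clash with the real server
ready = threading.Event()
received = []

def serve_once():
    # Read until the peer closes the connection (recv() returns b''),
    # instead of a single fixed-size recv() call.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((HOST, PORT))
        srv.listen(1)
        ready.set()               # tell the client it's safe to connect
        conn, _ = srv.accept()
        chunks = []
        with conn:
            while True:
                chunk = conn.recv(4096)   # larger buffer than my original 1024
                if not chunk:             # b'' means the client closed its end
                    break
                chunks.append(chunk)
        received.append(b''.join(chunks))

t = threading.Thread(target=serve_once)
t.start()
ready.wait()

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
    cli.settimeout(5)
    cli.connect((HOST, PORT))
    cli.sendall(b'x' * 100000)    # same 100KB payload as in the question

t.join()
print(len(received[0]))
```

In this standalone form all 100KB arrives, so I suspect the problem in my real setup is elsewhere, but I'm including it in case I'm misusing `recv` in some way.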