I've got a client and a server running on the same box. The client can send requests via urllib.request.urlopen() to external URLs and get responses back in about 0.1 seconds, fast enough that as a user I don't notice any real delay. But when sending a request to my local http.server.HTTPServer, there's a 1-second delay between the client calling urllib.request.urlopen() and the server's do_GET() even receiving the request.
I saw some answers suggesting that the delay comes from the reverse DNS lookup the server does for logging, and that overriding address_string(self) would fix it, but I'm still seeing the exact same delay with or without that modification.
Server:
import http.server
import time

class MyHTTPHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        start = time.time()
        print(start)
        self.send_response(200)
        self.end_headers()

    def address_string(self):
        # Return the raw client address, skipping the reverse DNS lookup
        # the default implementation does for logging
        host, port = self.client_address[:2]
        return host

server = http.server.HTTPServer(('localhost', 9999), MyHTTPHandler)
try:
    server.serve_forever()
except KeyboardInterrupt:
    print('Stopping server')
Client:
import urllib.request
import urllib.error
import time

def send_data():
    start = time.time()
    r = urllib.request.Request(url='http://localhost:9999')
    print(time.time())
    urllib.request.urlopen(r)
    print(time.time() - start)

while True:
    input('Press enter to send')
    send_data()
Any ideas what I'm missing to get rid of that one-second delay? I'd like this basic server to respond at least as fast as an external web server does.
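For what it's worth, one thing I can check on this box is what 'localhost' actually resolves to and how long that resolution takes (a quick diagnostic sketch, not part of the app itself):

```python
import socket
import time

# Diagnostic sketch: time the name resolution for 'localhost' and list
# the addresses it returns. If it yields both an IPv6 (::1) and an IPv4
# (127.0.0.1) address, the client may be trying one family first and
# only falling back to the other after that attempt fails.
start = time.time()
infos = socket.getaddrinfo('localhost', 9999, type=socket.SOCK_STREAM)
elapsed = time.time() - start
addresses = [info[4][0] for info in infos]
print(f'resolved in {elapsed:.4f}s: {addresses}')
```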