I got a response when using the requests library but not when using urllib2, so I experimented with the HTTP request headers.
As it turns out, the server expects an Accept header; urllib2 doesn't send one by default, while requests and cURL both send Accept: */*.
Send one with urllib2 as well:
url = 'http://www.winkworth.co.uk/sale/property/flat-for-sale-in-masefield-court-london-n5/HIH140004'
req = urllib2.Request(url, headers={'accept': '*/*'})
response = urllib2.urlopen(req)
Demo:
>>> import urllib2
>>> url = 'http://www.winkworth.co.uk/sale/property/flat-for-sale-in-masefield-court-london-n5/HIH140004'
>>> len(urllib2.urlopen(url).read())
0
>>> request = urllib2.Request(url, headers={'accept': '*/*'})
>>> len(urllib2.urlopen(request).read())
37197
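The same behaviour carries over to Python 3, where urllib2 became urllib.request. A quick sketch to illustrate the difference without touching the network: a bare Request object has no Accept header at all until you set one yourself (the example.com URL here is just a placeholder).

    # Python 3 sketch: urllib.request.Request, like urllib2.Request,
    # sends no Accept header unless you add one explicitly.
    from urllib.request import Request

    url = 'http://example.com/'  # placeholder URL for illustration

    bare = Request(url)
    print(bare.get_header('Accept'))    # None -- no Accept header will be sent

    fixed = Request(url, headers={'Accept': '*/*'})
    print(fixed.get_header('Accept'))   # */*

Note that urlopen would need to be called with the second Request object for the header to actually reach the server.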
The server is at fault here; RFC 2616 (section 14.1) states:
If no Accept header field is present, then it is assumed that the
client accepts all media types.
Comment: Is urlopen asynchronous? If so, maybe it isn't finished downloading when you try to read it?
Reply: No, urlopen is synchronous. It is the server that is broken: it returns nothing when no Accept header is present.