Discussion:
How to write a non-blocking SimpleHTTPRequestHandler?
y***@yahoo.com
2015-02-02 09:54:10 UTC
I wrote a little script that acts like a proxy, you just give it a URL and it will fetch the content and display it back to you.

For some reason, this proxy blocks sometimes and refuses to serve any new queries. The script still runs, but it seems like it's stuck somewhere.

When I strace it to see what it's doing, I find it hanging on this instruction :
***@backup[10.10.10.21] ~/SCRIPTS/INFOMANIAK # strace -fp 6918
Process 6918 attached - interrupt to quit
recvfrom(6,
^CProcess 6918 detached
***@backup[10.10.10.21] ~/SCRIPTS/INFOMANIAK #

I read in the SimpleHTTPServer source code that one can inherit from the SocketServer.ThreadingMixIn mixin to enable a threaded server to handle multiple requests at a time instead of just one (thinking maybe that's what was blocking it). However, it seems like it has nothing to do with my problem. What I need to do is not only handle multiple requests at a time, but more importantly to make the request handler non-blocking.

Any ideas? Here's some code:

import SimpleHTTPServer
import BaseHTTPServer
import SocketServer
import requests

class Handler(SocketServer.ThreadingMixIn,SimpleHTTPServer.SimpleHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header('Content-Type', 'text/html')
        self.end_headers()
        # self.path will contain a URL to be fetched by my proxy
        self.wfile.write(getFlux(self.path.lstrip("/")))

session = requests.Session()
IP,PORT = "MY_IP_HERE",8080

def getFlux(url):
    response = session.get(url)
    s = response.text
    return s

server = BaseHTTPServer.HTTPServer((IP,PORT),Handler)
server.serve_forever()

Thank you.
Amirouche Boubekki
2015-02-02 10:07:05 UTC
Post by y***@yahoo.com
I wrote a little script that acts like a proxy, you just give it a URL and
it will fetch the content and display it back to you.
For some reason, this proxy blocks sometimes and refuses to serve any new
queries. The script still runs, but it seems like it's stuck somewhere.
When I strace it to see what it's doing, I find it hanging on this
Process 6918 attached - interrupt to quit
recvfrom(6,
^CProcess 6918 detached
I read in the SimpleHTTPServer source code that one can inherit from the
SocketServer.ThreadingMixIn mixin to enable a threaded server to handle
multiple requests at a time instead of just one (thinking maybe that's what
was blocking it). However, it seems like it has nothing to do with my
problem. What I need to do is not only handle multiple requests at a time,
but more importantly to make the request handler non-blocking.
import SimpleHTTPServer
import BaseHTTPServer
import SocketServer
import requests
class Handler(SocketServer.ThreadingMixIn,SimpleHTTPServer.SimpleHTTPRequestHandler):
def do_GET(self):
self.send_response(200)
self.send_header('Content-Type', 'text/html')
self.end_headers()
# self.path will contain a URL to be fetched by my proxy
self.wfile.write(getFlux(self.path.lstrip("/")))
session = requests.Session()
IP,PORT = "MY_IP_HERE",8080
def getFlux(url):
response = session.get(url)
s = response.text
return s
server = BaseHTTPServer.HTTPServer((IP,PORT),Handler)
server.serve_forever()
Your code seems perfectly fine. I had some trouble with py3's http.server
with IE10 (in a virtualbox...), so I put together a small server script
similar to http.server that doesn't hang up on microsoft. It works with
asyncio. It's not ready to serve big files, but hopefully you can fix that.
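The kind of asyncio server being described can be sketched with the stream API (Python 3 only; this is a minimal, self-contained illustration of the idea, not the actual script, and the response body is made up):

```python
import asyncio

async def handle(reader, writer):
    await reader.readline()                      # request line, e.g. b"GET / HTTP/1.0\r\n"
    while await reader.readline() not in (b"\r\n", b""):
        pass                                     # drain the remaining headers
    body = b"Hello from asyncio\n"
    writer.write(b"HTTP/1.0 200 OK\r\nContent-Type: text/plain\r\n"
                 b"Content-Length: %d\r\n\r\n" % len(body) + body)
    await writer.drain()
    writer.close()

async def main():
    server = await asyncio.start_server(handle, "127.0.0.1", 0)  # port 0: any free port
    port = server.sockets[0].getsockname()[1]

    # Act as our own client so the sketch is self-contained.
    reader, writer = await asyncio.open_connection("127.0.0.1", port)
    writer.write(b"GET / HTTP/1.0\r\n\r\n")
    await writer.drain()
    raw = await reader.read()                    # server closes, so read to EOF
    writer.close()
    server.close()
    return raw.split(b"\r\n\r\n", 1)[1].decode()

print(asyncio.run(main()).strip())               # prints: Hello from asyncio
```

Because each `await` yields back to the event loop, a slow or silent client never blocks the other connections — which is the whole point of the asyncio approach.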


HTH
Post by y***@yahoo.com
Thank you.
--
https://mail.python.org/mailman/listinfo/python-list
Yassine Chaouche
2015-02-03 09:08:29 UTC
Thank you Amirouche. I was hoping to use something very simple and already provided by the standard library. If I can fix the script, so much the better. If the script can't be fixed, then I'll switch to another library (I already have one in mind).
Marko Rauhamaa
2015-02-03 09:27:01 UTC
Post by Yassine Chaouche
I was hoping to use something very simple and already provided by the
standard library.
The standard library and nonblocking can't be used in the same sentence.

That is, unless and until you go asyncio.


Marko
Yassine Chaouche
2015-02-03 10:47:16 UTC
Post by Marko Rauhamaa
The standard library and nonblocking can't be used in the same sentence.
Thanks Marko. It's a lost cause then. I am thinking about switching to one of the following :
- CherryPy
- Bottle
- circuits
- Quixote
- Weblayer.

If anybody has a pointer to an existing comparison, I'm interested.

Thanks for your help !
Chris Angelico
2015-02-03 10:54:49 UTC
On Tue, Feb 3, 2015 at 9:47 PM, Yassine Chaouche
Post by Yassine Chaouche
Post by Marko Rauhamaa
The standard library and nonblocking can't be used in the same sentence.
Thanks Marko. It's a lost cause then.
You trimmed out the part where he mentioned asyncio. :)

ChrisA
Yassine Chaouche
2015-02-03 11:04:59 UTC
Post by Chris Angelico
Post by Yassine Chaouche
Thanks Marko. It's a lost cause then.
You trimmed out the part where he mentioned asyncio. :)
ChrisA
IIRC asyncio is python 3 only and I'm not ready yet to make the leap.
Chris Angelico
2015-02-03 11:07:50 UTC
On Tue, Feb 3, 2015 at 10:04 PM, Yassine Chaouche
Post by Yassine Chaouche
Post by Chris Angelico
Post by Yassine Chaouche
Thanks Marko. It's a lost cause then.
You trimmed out the part where he mentioned asyncio. :)
ChrisA
IIRC asyncio is python 3 only and I'm not ready yet to make the leap.
Then you're stuck with whatever you have, because the Py2 standard
library isn't being expanded any. Why not make the leap? Py3 has a lot
of advantages over Py2.

ChrisA
Yassine Chaouche
2015-02-03 11:23:53 UTC
Thanks Chris, it's only a matter of time. I'll eventually make the transition to python3 when I've learned it well enough.
Marko Rauhamaa
2015-02-03 11:35:20 UTC
Post by Chris Angelico
On Tue, Feb 3, 2015 at 10:04 PM, Yassine Chaouche
Post by Yassine Chaouche
IIRC asyncio is python 3 only and I'm not ready yet to make the leap.
Then you're stuck with whatever you have, because the Py2 standard
library isn't being expanded any. Why not make the leap? Py3 has a lot
of advantages over Py2.
I'm all for Py3, but I'm not ready to conclude asyncio is the way to go.
The coroutines haven't won me over. The programming model is quite messy
and simply weird.

So far I've been happy with select.epoll(), socket.socket() and ten
fingers.


Marko
Yassine Chaouche
2015-02-03 12:56:17 UTC
Post by Marko Rauhamaa
So far I've been happy with select.epoll(), socket.socket() and ten
fingers.
Marko
There's already software written to take care of much of the HTTP protocol stuff, the headers etc. I wouldn't rewrite it. I prefer to monkey-patch parts of existing code rather than rewrite all of it.

But your comment is interesting because, as I understand it, a non-blocking web server is simply a matter of setting timeouts on sockets, catching the exceptions and moving on. I don't know why that wouldn't be possible with the python stdlib?
Post by Marko Rauhamaa
The standard library and nonblocking can't be used in the same sentence.
?
Marko Rauhamaa
2015-02-03 13:45:55 UTC
Post by Yassine Chaouche
Post by Marko Rauhamaa
So far I've been happy with select.epoll(), socket.socket() and ten
fingers.
[...]
But your comment is interesting because, as I understand it, a
non-blocking web server is simply a matter of setting timeouts on
sockets, catch the exceptions and move on.
Now I think you might have some misconceptions about nonblocking
networking I/O. Nonblocking I/O is done using asynchronous, or
event-driven, programming. Your code reacts to external stimuli, never
blocking, mostly just sleeping. The reactions are defined in callback
routines, aka listeners, aka event handlers.
Post by Yassine Chaouche
I don't know why wouldn't that be possible with python stdlib ?
It is possible using the low-level facilities. However, the traditional
high-level facilities are built on multithreading, which (as a rule) is
based on blocking I/O.


Marko
Chris Angelico
2015-02-03 14:03:37 UTC
Post by Marko Rauhamaa
Post by Yassine Chaouche
But your comment is interesting because, as I understand it, a
non-blocking web server is simply a matter of setting timeouts on
sockets, catch the exceptions and move on.
Now I think you might have some misconceptions about nonblocking
networking I/O. Nonblocking I/O is done using asynchronous, or
event-driven, programming. Your code reacts to external stimuli, never
blocking, mostly just sleeping. The reactions are defined in callback
routines, aka listeners, aka event handlers.
Not strictly true - that's just one convenient way of doing things. A
callback/event-handler structure lets you write a bunch of listeners
that coexist effortlessly, but it's not the only way to do
non-blocking I/O, and it's certainly not an intrinsic part of the
concept.

That said, though, it is a VERY convenient way to lay things out in
the code. The Pike system I offered, and most of the older
multiplexed-I/O systems I've used, did work that way. It just isn't
something that non-blocking necessarily implies.
Post by Marko Rauhamaa
Post by Yassine Chaouche
I don't know why wouldn't that be possible with python stdlib ?
It is possible using the low-level facilities. However, the traditional
high-level facilities are built on multithreading, which (as a rule) is
based on blocking I/O.
Multithreading is another way to cope with the same problem of wanting
to deal with different sockets on a single CPU, but I don't think it's
inherently a part of any of Python's own high-level facilities - not
that I can think of, at least? However, in terms of common programming
models, yes, multithreading+blocking I/O is an effective way to write
code, and will therefore be commonly used.

I wish more people had grown up on OS/2 instead of (or as well as)
Windows or Unix. Threading is not such a bugbear as a lot of people
seem to think. Yes, some platforms have traditionally had poor
implementations, and to be sure, you don't want to mix threading and
forking without a *lot* of care, but threads aren't inherently bad.
They're a useful tool in the toolbox. Sometimes non-blocking I/O is
the right thing to do; sometimes threads suit the problem better;
other times, something else again.

ChrisA
Marko Rauhamaa
2015-02-03 14:20:40 UTC
Post by Chris Angelico
Threading is not such a bugbear as a lot of people
seem to think. Yes, some platforms have traditionally had poor
implementations
Java has excellent multithreading facilities. Still, I have seen
seasoned Java developers commit atrocious crimes against thread-safety
that are impossible to troubleshoot and repair afterwards.

There are ways to manage the complications of multithreading, but no
universal practices are agreed upon so you can't count on other/legacy
software to obey your policies or your software obey theirs. And even
under the best guidelines there are seemingly intractable cornercases
where you feel helplessly lost.


Marko
Dennis Lee Bieber
2015-02-04 01:25:52 UTC
Post by Marko Rauhamaa
Now I think you might have some misconceptions about nonblocking
networking I/O. Nonblocking I/O is done using asynchronous, or
event-driven, programming. Your code reacts to external stimuli, never
blocking, mostly just sleeping. The reactions are defined in callback
routines, aka listeners, aka event handlers.
In my world, "sleep" is a blocking operation -- the process is blocked
from execution until the sleep expires or is otherwise canceled.

So a select.select() with timeout is a blocking operation -- just that
it is released when any of the multiple items (read sockets, write sockets,
timeout) is satisfied.
--
Wulfraed Dennis Lee Bieber AF6VN
***@ix.netcom.com HTTP://wlfraed.home.netcom.com/
Chris Angelico
2015-02-03 13:21:31 UTC
On Tue, Feb 3, 2015 at 11:56 PM, Yassine Chaouche
Post by Yassine Chaouche
But your comment is interesting because, as I understand it, a non-blocking web server is simply a matter of setting timeouts on sockets, catch the exceptions and move on. I don't know why wouldn't that be possible with python stdlib ?
Not really. You could, in theory, set very low timeouts and then poll
everything, but it's not efficient. What you want to do is say to the
system "Hey, see all these sockets? Let me know when *any one of them*
has stuff for me", where "stuff" would be a new connected client if
it's a listening socket, or some data written if it's a connected
socket; and you might need to check if there's room to write more
data, too, which you can do with the same syscall.

The key here is that you have a long timeout on the meta-event "any
one of these being ready". That's not simply a matter of setting
socket timeouts; you need a way to handle the meta-event, and that's
something along the lines of select():

http://linux.die.net/man/2/select
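The "let me know when any one of them has stuff for me" idea can be shown in a few lines with the stdlib's select module (using a local socketpair so the sketch needs no network):

```python
import select
import socket

a_in, a_out = socket.socketpair()   # pair 1: will have data pending
b_in, b_out = socket.socketpair()   # pair 2: stays silent

a_out.sendall(b"ping")              # make exactly one socket readable

# One call blocks (for at most 1 second) on the meta-event
# "any of these sockets is readable".
readable, writable, _ = select.select([a_in, b_in], [], [], 1.0)

print(a_in in readable)   # True  - it has "stuff" for us
print(b_in in readable)   # False - nothing arrived on this one
```

A real server would loop over select(), accepting on listening sockets and reading on connected ones as each becomes ready; epoll/kqueue (via select.epoll or the selectors module) are the scalable variants of the same syscall.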

Other languages have inbuilt asynchronous I/O handlers; eg Pike
handles this fairly well, and I've made some use of it with a generic
networking system:

https://github.com/Rosuav/Hogan

Basically, you spin up a server with any number of listening sockets,
each of which can talk to any number of connected clients, and all of
those sockets get smoothly multiplexed on a single thread. Lots of
other languages have similar facilities. Python 2.x doesn't have
anything of that nature; Python 3's asyncio is exactly that.

ChrisA
Amirouche Boubekki
2015-02-03 14:09:51 UTC
Post by Marko Rauhamaa
The standard library and nonblocking can't be used in the same sentence.
The python 2.x stdlib has no high-level support for *async* code. There is
the trollius library that ports asyncio to py2, though.

I was a bit quick in my first answer. What you want is to prevent the
socket from waiting indefinitely for data (based on the strace output), which is done
with socket.setblocking/settimeout [1]. asynchronous (asyncio) is something
else, and you would still need to handle blocking, I think.

There is contentbrowser [2], which is some kind of web proxy.

IMO, python 2 -> python 3 is not a big leap. Some things are better in
python 3.


[1] https://docs.python.org/2/library/socket.html#socket.socket.setblocking
[2] https://bitbucket.org/david/contentbrowser/src
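The settimeout behaviour in [1] is easy to demonstrate without any network: a recv that would otherwise hang forever raises socket.timeout once the deadline passes, letting the caller catch it and move on.

```python
import socket

srv, cli = socket.socketpair()     # cli's peer will simply stay silent
cli.settimeout(0.2)                # cap the wait at 200 ms

try:
    cli.recv(1024)                 # nothing ever arrives...
    timed_out = False
except socket.timeout:             # ...so this fires instead of blocking forever
    timed_out = True

print(timed_out)                   # True

srv.close()
cli.close()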





Yassine Chaouche
2015-02-03 14:50:11 UTC
What you want is to prevent the socket from waiting indefinitely for data (based on strace output) which is done with socket.setblocking/settimeout [1]
Exactly! Thanks for taking the time to read my original post a second time. Maybe I didn't express my problem well enough and the discussion drifted into an XY-problem type of discussion :)

I was about to say that most web frameworks for python already take care of the problem I'm facing, so I certainly confused people with an improper formulation.
There is contentbrowser [2], which is some kind of web proxy.
It's using werkzeug and it seems to be a full web application. I just want something that sits right above sockets and selects. All my script does is grab some json data from an external url provided by the browser and return a formatted page showing the json data as human-readable HTML. Nothing fancy.
Filadelfo Fiamma
2015-02-03 15:03:47 UTC
I think you can try the asyncore lib:
https://docs.python.org/2/library/asyncore.html


--
Filadelfo Fiamma
mail: ***@gmail.com
Mark Lawrence
2015-02-03 16:39:02 UTC
Post by Filadelfo Fiamma
https://docs.python.org/2/library/asyncore.html
People can try it but it's effectively deprecated, with its partner
asynchat, in favour of asyncio.

Also please don't top post here, thank you.
--
My fellow Pythonistas, ask not what our language can do for you, ask
what you can do for our language.

Mark Lawrence
Yassine Chaouche
2015-02-08 14:13:31 UTC
What you want is to prevent the socket from waiting indefinitely for data (based on strace output) which is done with socket.setblocking/settimeout [1]. asynchronous (asyncio) is something else, and you would still need to handle blocking I think.
I have installed faulthandler, a beautiful tool written by Victor "Haypo" Stinner, and thanks to it I could determine precisely where the program hangs. It is in ssl.py::SSLSocket::read :


def read(self, len=1024):

    """Read up to LEN bytes and return them.
    Return zero-length string on EOF."""
    try:
        return self._sslobj.read(len) # >*< python hangs on this line
    except SSLError, x:
        if x.args[0] == SSL_ERROR_EOF and self.suppress_ragged_eofs:
            return ''
        else:
            raise

From the traceback given by faulthandler it seems to me that the problem isn't my webserver trying to receive connections from clients, but my server (acting as a client) making a request to a distant server (https URL -> use of ssl.py). Here's the traceback :

1 Current thread 0x00007fb9cb41f700 (most recent call first):
2 File "/usr/lib/python2.7/ssl.py", line 160 in read
3 File "/usr/lib/python2.7/ssl.py", line 241 in recv
4 File "/usr/lib/python2.7/socket.py", line 447 in readline
5 File "/usr/lib/python2.7/httplib.py", line 365 in _read_status
6 File "/usr/lib/python2.7/httplib.py", line 407 in begin
7 File "/usr/lib/python2.7/httplib.py", line 1034 in getresponse
8 File "/usr/local/lib/python2.7/dist-packages/requests/packages/urllib3/connectionpool.py", line 353 in _make_request
9 File "/usr/local/lib/python2.7/dist-packages/requests/packages/urllib3/connectionpool.py", line 518 in urlopen
10 File "/usr/local/lib/python2.7/dist-packages/requests/adapters.py", line 370 in send
11 File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 573 in send
12 File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 461 in request
13 File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 473 in get
14 File "/usr/local/lib/python2.7/dist-packages/infomaniak/infomaniak.py", line 29 in getFlux
15 File "/usr/local/lib/python2.7/dist-packages/infomaniak/server.py", line 52 in do_GET
16 File "/usr/lib/python2.7/BaseHTTPServer.py", line 328 in handle_one_request
17 File "/usr/lib/python2.7/BaseHTTPServer.py", line 340 in handle
18 File "/usr/lib/python2.7/SocketServer.py", line 649 in __init__
19 File "/usr/lib/python2.7/SocketServer.py", line 334 in finish_request
20 File "/usr/lib/python2.7/SocketServer.py", line 321 in process_request
21 File "/usr/lib/python2.7/SocketServer.py", line 295 in _handle_request_noblock
22 File "/usr/lib/python2.7/SocketServer.py", line 238 in serve_forever
23 File "/usr/local/lib/python2.7/dist-packages/infomaniak/server.py", line 71 in <module>
24 File "/usr/bin/infomaniak", line 2 in <module>
25

If you look at line 13, it shows that the server is actually doing a get on an external URL via the requests library, which itself relies on urllib3, which in turn uses httplib.py.

Below is the code of the last three functions to have been called, in chronological order :

In socket.py::_fileobject::readline
[...]
while True:
    try:
        data = self._sock.recv(self._rbufsize) #<------------
    except error, e:
        if e.args[0] == EINTR:
            continue
        raise
    if not data:
        break
    nl = data.find('\n')
    if nl >= 0:
        nl += 1
        buf.write(data[:nl])
        self._rbuf.write(data[nl:])
        del data
        break
    buf.write(data)
return buf.getvalue()
[...]

In ssl.py::SSLSocket::recv

def recv(self, buflen=1024, flags=0):
    if self._sslobj:
        if flags != 0:
            raise ValueError(
                "non-zero flags not allowed in calls to recv() on %s" %
                self.__class__)
        return self.read(buflen) #<-------
    else:
        return self._sock.recv(buflen, flags)


In ssl.py::SSLSocket::read

def read(self, len=1024):

    """Read up to LEN bytes and return them.
    Return zero-length string on EOF."""

    try:
        return self._sslobj.read(len) # >*< python hangs on this line
    except SSLError, x:
        if x.args[0] == SSL_ERROR_EOF and self.suppress_ragged_eofs:
            return ''
        else:
            raise

I can't go any further because _sslobj is created via the _ssl.so library; it is very likely C code.

Do you have any idea about how I can investigate this any further ?
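For reference, the faulthandler technique described above can be reproduced in a few lines (Python 3 stdlib; on Python 2 it is the third-party "faulthandler" package). The worker thread here is a hypothetical stand-in for the blocking _sslobj.read() call:

```python
import faulthandler
import tempfile
import threading
import time

def worker():
    time.sleep(2)   # stand-in for a call stuck in blocking I/O

t = threading.Thread(target=worker)
t.start()
time.sleep(0.2)     # give the thread time to reach its blocking call

# faulthandler writes at the file-descriptor level, so it needs a real
# file (not a StringIO). In a live server you would instead call
# faulthandler.register(signal.SIGUSR1) once at startup and send that
# signal to the hung process to get the dump on stderr.
with tempfile.TemporaryFile(mode="w+") as f:
    faulthandler.dump_traceback(file=f, all_threads=True)
    f.seek(0)
    dump = f.read()

t.join()
print("worker" in dump)   # True: the stuck thread's frames appear in the dump
```

The dump shows every thread's Python stack, most recent call first, which is exactly how the ssl.py::read frame above was pinpointed.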
Mark Lawrence
2015-02-03 12:10:05 UTC
Post by Yassine Chaouche
Post by Chris Angelico
Post by Yassine Chaouche
Thanks Marko. It's a lost cause then.
You trimmed out the part where he mentioned asyncio. :)
ChrisA
IIRC asyncio is python 3 only and I'm not ready yet to make the leap.
The leap from Python 2 to Python 3 is about as high as the second
obstacle in this soon-to-be Olympic sport

--
My fellow Pythonistas, ask not what our language can do for you, ask
what you can do for our language.

Mark Lawrence
Irmen de Jong
2015-02-02 18:47:55 UTC
Post by y***@yahoo.com
I wrote a little script that acts like a proxy, you just give it a URL and it will
fetch the content and display it back to you.
For some reason, this proxy blocks sometimes and refuses to serve any new queries.
The script still runs, but it seems like it's stuck somewhere.
~/SCRIPTS/INFOMANIAK #
I read in the SimpleHTTPServer source code that one can inherit from the
SocketServer.ThreadingMixIn mixin to enable a threaded server to handle multiple
requests at a time instead of just one (thinking maybe that's what was blocking it).
However, it seems like it has nothing to do with my problem. What I need to do is not
only handle multiple requests at a time, but more importantly to make the request
handler non-blocking.
Why? If you have multiple threads serving some requests at the same time, doesn't that
already achieve your goal? In other words, have you tried what you describe above?
(make sure you close the connection correctly or you'll be hogging a thread which may
eventually make the server non responsive)

Irmen
Yassine Chaouche
2015-02-03 09:06:42 UTC
Hello Irmen,
Post by Irmen de Jong
Why? If you have multiple threads serving some requests at the same time, doesn't that
already achieve your goal?
Having multiple requests running at a time is one thing. Making them non-blocking is another. That's how I understand it.
Post by Irmen de Jong
In other words, have you tried what you describe above?
Yes, I already tried that. It's in the source code I posted: if you look closely you'll see that I am inheriting from the mixin. The server still hangs sometimes, until I send it the SIGINT signal. It then sort of drops the hanging socket and resumes normal operations.
Post by Irmen de Jong
(make sure you close the connection correctly or you'll be hogging a thread which may
eventually make the server non responsive)
Irmen
I didn't see any such code in the tests that came with the standard python library (which I took as a starting-point example for writing my script). Maybe it's already taken care of by some lower-level internal stuff?
Yassine Chaouche
2015-04-23 13:39:11 UTC
Hello,

I wanted to post a little update on this thread. The problem is solved, and while debugging my application I learned that it is actually possible to have a multithreaded or multi-process web application in python using only the standard library.

A longer explanation, along with minimal working code and some UML diagrams explaining the insides of the standard python library modules involved in basic web applications can be found here : http://ychaouche.wikispot.org/HowBottleAppsWork
Yassine Chaouche
2015-04-23 14:35:22 UTC
Here is a simple multi-threaded python web application that uses only the standard library modules:

#!/usr/bin/env python
#-*- encoding=utf-8 -*-
import SimpleHTTPServer
import BaseHTTPServer
import SocketServer

class MyServer(SocketServer.ThreadingMixIn,BaseHTTPServer.HTTPServer):
    pass

class Handler(SimpleHTTPServer.SimpleHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header('Content-Type', 'text/html')
        self.end_headers()
        self.wfile.write("Hello world \n")

IP,PORT = "",8010

server = MyServer((IP,PORT),Handler)
server.serve_forever()

It will send Hello world followed by a newline (so that when you invoke curl in the terminal it will nicely put the shell prompt on a new line). You can simulate a non-responsive client with this little script:




import socket
import requests

# This will leave an open connection to our app.
conn = socket.create_connection(('localhost',8010))

# This will never get anything until conn.close() is called.
print requests.get('http://localhost:8010').text.strip()

If you don't inherit from ThreadingMixIn, the application will get stuck when you launch the client script, and any further requests (using curl or wget, for example) will simply be postponed until client.py is killed.

If you inherit from ThreadingMixIn, the application will nicely run the request handler on a new thread, making way for further requests to be handled.
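In Python 3 the same demonstration fits in one self-contained sketch: http.server merges the old SimpleHTTPServer/BaseHTTPServer modules, and ThreadingHTTPServer (3.7+) is the ready-made ThreadingMixIn server, so an idle client no longer freezes everyone else.

```python
import socket
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"Hello world\n"
        self.send_response(200)
        self.send_header('Content-Type', 'text/plain')
        self.send_header('Content-Length', str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):   # keep the demo quiet
        pass

server = ThreadingHTTPServer(("127.0.0.1", 0), Handler)   # port 0: any free port
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# An idle client that connects but never sends a request...
idle = socket.create_connection(("127.0.0.1", port))

# ...does not stop the threaded server from answering a second client.
reply = urllib.request.urlopen("http://127.0.0.1:%d/" % port, timeout=5).read()
print(reply.decode().strip())   # Hello world

idle.close()
server.shutdown()
```

With the plain (non-threading) HTTPServer, the urlopen call here would block behind the idle connection, exactly as described above.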