Check it out here:
http://wiki.rubyonrails.com/rails/pages/Framework+Performance
And don't forget to Digg and del.icio.us!
http://digg.com/programming/Django_vs._Rails_vs._Symfony_Django_is_fastest
http://del.icio.us/url/c44bdc6ac0cd2adfa4eb38af27b9f644
Adrian
--
Adrian Holovaty
holovaty.com | djangoproject.com
By over 30% -- hell yeah!
Now, I've always known in my gut that Django's pretty damn fast, but
seeing it verified by the RoR website...
Priceless :)
Jacob
That's awesome... congrats Django developers!
-ian
And with MySQL, to boot.
--
The Pythonic Principle: Python works the way it does
because if it didn't, it wouldn't be Python.
Andy Dustman wrote:
> On 7/14/06, Jacob Kaplan-Moss <ja...@jacobian.org> wrote:
>
>>On Jul 14, 2006, at 1:18 PM, Adrian Holovaty wrote:
>>
>>>Some folks benchmarked Symfony, Ruby on Rails and Django. Django was
>>>the fastest.
>>
>>By over 30% -- hell yeah!
>>
>>Now, I've always known in my gut that Django's pretty damn fast, but
>>seeing it verified by the RoR website...
>>
>>Priceless :)
>
>
> And with MySQL, to boot.
This is not surprising; it was a "many small hits" benchmark, and MySQL
has long outperformed PostgreSQL in that department. I've also seen
shootouts of Python vs. Perl vs. PHP as backends, and mod_python just
plain hauls ass. ;)
Iain
httperf --server=localhost --port=8080 --uri=/test/ --num-conns=7000
--num-calls=1 --rate=X --timeout=4
Where X ranged from 100 to 700 connections per second. The Django
output was a simple template with 5 variables passed from the view, one
of them a long string (whole page 15 KB).
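For context, here's a rough stand-in for the kind of page under test. This is a sketch only: the variable names are made up, and it uses Python's plain string.Template instead of Django's template engine, just to show the shape of the workload (5 variables, one long string padding the page to roughly 15 KB):

```python
from string import Template

# Hypothetical stand-in for the benchmarked view/template: five
# variables, one of them a long string that dominates the page size.
PAGE = Template(
    "<html><body>"
    "<h1>$title</h1>"
    "<p>$intro</p><p>$author</p><p>$date</p>"
    "<pre>$body</pre>"
    "</body></html>"
)

def render_test_page():
    """Render a page comparable in size to the ~15 KB benchmark reply."""
    return PAGE.substitute(
        title="httperf test",
        intro="Benchmark page",
        author="tester",
        date="2006-07-14",
        body="x" * 15000,  # the long-string variable from the description
    )

page = render_test_page()
print(len(page))  # a little over 15000 bytes, matching the reply size
```

The real test rendered through Django's template engine, of course; this just pins down what "5 variables, whole page 15 KB" means in concrete terms.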
Results:
Errors: no errors when the rate is 100-400 conns/s. For 500 conns/s I
got 24441 errors, for 600: 31758, and for 700: 34804.
Avg connection time (ms): 100: 5.0, 200: 1.8, 300: 1.9, 400: 1.9,
500: 227.3, 600: 350.5, 700: 513.8.
Full logs:
httperf --timeout=4 --client=0/1 --server=localhost --port=8080
--uri=/test/ --rate=100 --send-buffer=4096 --recv-buffer=16384
--num-conns=40000 --num-calls=1
Maximum connect burst length: 1
Total: connections 40000 requests 40000 replies 40000 test-duration
399.992 s
Connection rate: 100.0 conn/s (10.0 ms/conn, <=51 concurrent
connections)
Connection time [ms]: min 1.8 avg 5.0 max 3003.3 median 1.5 stddev 95.1
Connection time [ms]: connect 3.0
Connection length [replies/conn]: 1.000
Request rate: 100.0 req/s (10.0 ms/req)
Request size [B]: 65.0
Reply rate [replies/s]: min 99.8 avg 100.0 max 100.2 stddev 0.1 (80
samples)
Reply time [ms]: response 1.9 transfer 0.0
Reply size [B]: header 142.0 content 15337.0 footer 0.0 (total 15479.0)
Reply status: 1xx=0 2xx=40000 3xx=0 4xx=0 5xx=0
CPU time [s]: user 45.18 system 284.57 (user 11.3% system 71.1% total
82.4%)
Net I/O: 1518.0 KB/s (12.4*10^6 bps)
Errors: total 0 client-timo 0 socket-timo 0 connrefused 0 connreset 0
Errors: fd-unavail 0 addrunavail 0 ftab-full 0 other 0
############################################################################
httperf --timeout=4 --client=0/1 --server=localhost --port=8080
--uri=/test/ --rate=200 --send-buffer=4096 --recv-buffer=16384
--num-conns=40000 --num-calls=1
Maximum connect burst length: 2
Total: connections 40000 requests 40000 replies 40000 test-duration
200.002 s
Connection rate: 200.0 conn/s (5.0 ms/conn, <=4 concurrent connections)
Connection time [ms]: min 1.8 avg 1.8 max 16.8 median 1.5 stddev 0.3
Connection time [ms]: connect 0.0
Connection length [replies/conn]: 1.000
Request rate: 200.0 req/s (5.0 ms/req)
Request size [B]: 65.0
Reply rate [replies/s]: min 199.6 avg 200.0 max 200.4 stddev 0.2 (39
samples)
Reply time [ms]: response 1.8 transfer 0.0
Reply size [B]: header 142.0 content 15337.0 footer 0.0 (total 15479.0)
Reply status: 1xx=0 2xx=40000 3xx=0 4xx=0 5xx=0
CPU time [s]: user 22.20 system 108.07 (user 11.1% system 54.0% total
65.1%)
Net I/O: 3035.9 KB/s (24.9*10^6 bps)
Errors: total 0 client-timo 0 socket-timo 0 connrefused 0 connreset 0
Errors: fd-unavail 0 addrunavail 0 ftab-full 0 other 0
############################################################################
httperf --timeout=4 --client=0/1 --server=localhost --port=8080
--uri=/test/ --rate=300 --send-buffer=4096 --recv-buffer=16384
--num-conns=40000 --num-calls=1
Maximum connect burst length: 2
Total: connections 40000 requests 40000 replies 40000 test-duration
133.333 s
Connection rate: 300.0 conn/s (3.3 ms/conn, <=4 concurrent connections)
Connection time [ms]: min 1.8 avg 1.9 max 12.4 median 1.5 stddev 0.5
Connection time [ms]: connect 0.0
Connection length [replies/conn]: 1.000
Request rate: 300.0 req/s (3.3 ms/req)
Request size [B]: 65.0
Reply rate [replies/s]: min 299.8 avg 300.0 max 300.2 stddev 0.2 (26
samples)
Reply time [ms]: response 1.8 transfer 0.0
Reply size [B]: header 142.0 content 15337.0 footer 0.0 (total 15479.0)
Reply status: 1xx=0 2xx=40000 3xx=0 4xx=0 5xx=0
CPU time [s]: user 10.80 system 53.12 (user 8.1% system 39.8% total
47.9%)
Net I/O: 4553.9 KB/s (37.3*10^6 bps)
Errors: total 0 client-timo 0 socket-timo 0 connrefused 0 connreset 0
Errors: fd-unavail 0 addrunavail 0 ftab-full 0 other 0
############################################################################
httperf --timeout=4 --client=0/1 --server=localhost --port=8080
--uri=/test/ --rate=400 --send-buffer=4096 --recv-buffer=16384
--num-conns=40000 --num-calls=1
Maximum connect burst length: 4
Total: connections 40000 requests 40000 replies 40000 test-duration
100.002 s
Connection rate: 400.0 conn/s (2.5 ms/conn, <=11 concurrent
connections)
Connection time [ms]: min 1.8 avg 1.9 max 21.7 median 1.5 stddev 0.9
Connection time [ms]: connect 0.0
Connection length [replies/conn]: 1.000
Request rate: 400.0 req/s (2.5 ms/req)
Request size [B]: 65.0
Reply rate [replies/s]: min 399.8 avg 400.0 max 400.2 stddev 0.2 (19
samples)
Reply time [ms]: response 1.8 transfer 0.1
Reply size [B]: header 142.0 content 15337.0 footer 0.0 (total 15479.0)
Reply status: 1xx=0 2xx=40000 3xx=0 4xx=0 5xx=0
CPU time [s]: user 5.45 system 25.06 (user 5.4% system 25.1% total
30.5%)
Net I/O: 6071.7 KB/s (49.7*10^6 bps)
Errors: total 0 client-timo 0 socket-timo 0 connrefused 0 connreset 0
Errors: fd-unavail 0 addrunavail 0 ftab-full 0 other 0
############################################################################
httperf --timeout=4 --client=0/1 --server=localhost --port=8080
--uri=/test/ --rate=500 --send-buffer=4096 --recv-buffer=16384
--num-conns=40000 --num-calls=1
Maximum connect burst length: 11
Total: connections 29576 requests 15824 replies 15559 test-duration
83.994 s
Connection rate: 352.1 conn/s (2.8 ms/conn, <=1022 concurrent
connections)
Connection time [ms]: min 1.8 avg 227.3 max 6910.5 median 12.5 stddev
802.1
Connection time [ms]: connect 234.9
Connection length [replies/conn]: 1.000
Request rate: 188.4 req/s (5.3 ms/req)
Request size [B]: 65.0
Reply rate [replies/s]: min 1.2 avg 194.2 max 501.8 stddev 191.1 (16
samples)
Reply time [ms]: response 28.9 transfer 6.8
Reply size [B]: header 142.0 content 15337.0 footer 0.0 (total 15479.0)
Reply status: 1xx=0 2xx=15559 3xx=0 4xx=0 5xx=0
CPU time [s]: user 0.50 system 55.11 (user 0.6% system 65.6% total
66.2%)
Net I/O: 2812.1 KB/s (23.0*10^6 bps)
Errors: total 24441 client-timo 14017 socket-timo 0 connrefused 0
connreset 0
Errors: fd-unavail 10424 addrunavail 0 ftab-full 0 other 0
############################################################################
httperf --timeout=4 --client=0/1 --server=localhost --port=8080
--uri=/test/ --rate=600 --send-buffer=4096 --recv-buffer=16384
--num-conns=40000 --num-calls=1
Maximum connect burst length: 24
Total: connections 23250 requests 8433 replies 8242 test-duration
73.623 s
Connection rate: 315.8 conn/s (3.2 ms/conn, <=1022 concurrent
connections)
Connection time [ms]: min 3.8 avg 350.5 max 6952.9 median 33.5 stddev
957.1
Connection time [ms]: connect 350.7
Connection length [replies/conn]: 1.000
Request rate: 114.5 req/s (8.7 ms/req)
Request size [B]: 65.0
Reply rate [replies/s]: min 0.0 avg 117.7 max 492.9 stddev 146.5 (14
samples)
Reply time [ms]: response 37.8 transfer 14.0
Reply size [B]: header 142.0 content 15337.0 footer 0.0 (total 15479.0)
Reply status: 1xx=0 2xx=8242 3xx=0 4xx=0 5xx=0
CPU time [s]: user 0.56 system 57.56 (user 0.8% system 78.2% total
78.9%)
Net I/O: 1699.5 KB/s (13.9*10^6 bps)
Errors: total 31758 client-timo 15008 socket-timo 0 connrefused 0
connreset 0
Errors: fd-unavail 16750 addrunavail 0 ftab-full 0 other 0
############################################################################
httperf --timeout=4 --client=0/1 --server=localhost --port=8080
--uri=/test/ --rate=700 --send-buffer=4096 --recv-buffer=16384
--num-conns=40000 --num-calls=1
Maximum connect burst length: 17
Total: connections 18938 requests 5386 replies 5196 test-duration
61.615 s
Connection rate: 307.4 conn/s (3.3 ms/conn, <=1022 concurrent
connections)
Connection time [ms]: min 3.5 avg 513.8 max 5704.4 median 32.5 stddev
1116.0
Connection time [ms]: connect 542.6
Connection length [replies/conn]: 1.000
Request rate: 87.4 req/s (11.4 ms/req)
Request size [B]: 65.0
Reply rate [replies/s]: min 0.0 avg 86.6 max 461.2 stddev 134.0 (12
samples)
Reply time [ms]: response 30.1 transfer 13.8
Reply size [B]: header 142.0 content 15337.0 footer 0.0 (total 15479.0)
Reply status: 1xx=0 2xx=5196 3xx=0 4xx=0 5xx=0
CPU time [s]: user 0.32 system 51.55 (user 0.5% system 83.7% total
84.2%)
Net I/O: 1280.3 KB/s (10.5*10^6 bps)
Errors: total 34804 client-timo 13742 socket-timo 0 connrefused 0
connreset 0
Errors: fd-unavail 21062 addrunavail 0 ftab-full 0 other 0
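The per-rate summaries above can be pulled out of raw httperf output mechanically. A small sketch, with regexes keyed to the exact log lines shown above:

```python
import re

def parse_httperf(log):
    """Extract request rate, total errors and avg connection time (ms)
    from httperf's text output."""
    # --rate=N is echoed back on the command line at the top of each run
    rate = int(re.search(r"--rate=(\d+)", log).group(1))
    # "Errors: total N client-timo N ..." summary line
    errors = int(re.search(r"Errors: total (\d+)", log).group(1))
    # "Connection time [ms]: min A avg B max C ..." line
    avg_ms = float(re.search(
        r"Connection time \[ms\]: min [\d.]+ avg ([\d.]+)", log).group(1))
    return rate, errors, avg_ms

# Abbreviated sample in the same format as the 500 conns/s run above.
sample = """httperf --timeout=4 --rate=500 --num-conns=40000
Connection time [ms]: min 1.8 avg 227.3 max 6910.5 median 12.5
Errors: total 24441 client-timo 14017 socket-timo 0
"""
print(parse_httperf(sample))  # (500, 24441, 227.3)
```

Feeding each run's log through this reproduces the error and latency summary at the top of the post.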
Interesting stuff -- thanks for the benchmarks.
I'd note that running Django on a laptop -- especially one with only
512MB of RAM -- is probably going to give pretty anemic results.
Between the slow bus, slow disk, and lack of RAM you'll get pretty
lousy numbers compared to a server.
> Results:
> Errors: no errors when the rate is 100-400. For 500 conns/s I've got
> 24441 errors, for 600: 31758 and for 700: 34804 errors.
Chances are these errors are caused by reaching the connection limit
on your database, not by Django itself. Again, on such a slow
machine you'll have some pretty serious resource contention between
the database and your other software.
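For what it's worth, the fd-unavail counts in the 500-700 conns/s runs (together with the "<=1022 concurrent connections" lines) suggest the httperf client itself also ran out of file descriptors, not just the server side. The per-process limit can be checked, e.g., from Python via the Unix-only resource module:

```python
import resource

# Soft/hard limits on open file descriptors for the current process.
# A benchmarking client like httperf inherits the same soft limit, which
# caps how many concurrent connections it can hold open at once.
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print("soft limit:", soft, "hard limit:", hard)
```

Raising the limit in the shell before running httperf (e.g. `ulimit -n 10000`) helps separate client-side descriptor exhaustion from genuine server overload.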
Still -- 400 req/s on a laptop? That's pretty good as far as I'm
concerned :)
Again, thanks for the numbers.
Jacob
Some more "tests" will come (Django and others :))
A pre-test serving a simple static page through both of them (also
15 KB) would show whether Django's templating engine is indeed faster
than Pylons'. (And it might convince the Pylons team to use mod_python
instead of FastCGI.)
These tests used the built-in development server alone -- no
mod_python, Apache or anything else. I'll test X + lighttpd/Apache soon
(I need to read the docs ;) )