Retry in aiohttp


Mike Rans

Dec 15, 2016, 7:07:56 AM
to aio-libs
Hi all,

I suggest adding retry functionality to aiohttp, e.g. something like the code below, which I generalised from code by unixsurfer in this ticket so that you can pass in a function to execute on status 200.

Cheers,
Mike

import asyncio
import logging

import aiohttp

logger = logging.getLogger(__name__)


HTTP_STATUS_CODES_TO_RETRY = [500, 502, 503, 504]


class FailedRequest(Exception):
    """
    A wrapper of all possible exceptions during an HTTP request.
    """
    code = 0
    message = ''
    url = ''
    raised = ''

    def __init__(self, *, raised='', message='', code='', url=''):
        self.raised = raised
        self.message = message
        self.code = code
        self.url = url

        super().__init__("code:{c} url={u} message={m} raised={r}".format(
            c=self.code, u=self.url, m=self.message, r=self.raised))


async def send_http(session, method, url, *,
                    retries=1,
                    interval=1,
                    backoff=2,
                    read_timeout=300,
                    http_status_codes_to_retry=HTTP_STATUS_CODES_TO_RETRY,
                    fn=None,
                    **kwargs):
    """
    Sends an HTTP request and implements retry logic.

    Arguments:
        session (obj): An aiohttp client session object
        method (str): Method to use
        url (str): URL for the request
        retries (int): Number of times to retry in case of failure
        interval (float): Time to wait before retries
        backoff (int): Multiply interval by this factor after each failure
        read_timeout (float): Time to wait for a response
        http_status_codes_to_retry (List[int]): List of status codes to retry
        fn (Callable): Coroutine function to call on a 200 response
    """
    backoff_interval = interval
    raised_exc = None
    attempt = 0

    if method not in ['get', 'patch', 'post']:
        raise ValueError('method must be one of get, patch or post')

    if retries == -1:   # -1 means retry indefinitely
        attempt = -1
    elif retries == 0:  # Zero means don't retry
        attempt = 1
    else:               # any other value means retry N times
        attempt = retries + 1

    while attempt != 0:
        if raised_exc:
            logger.error('Caught "%s" method:%s url:%s, remaining tries %s, '
                         'sleeping %.2f secs', raised_exc, method.upper(),
                         url, attempt, backoff_interval)
            await asyncio.sleep(backoff_interval)
            # bump the interval for the next possible attempt
            backoff_interval *= backoff
        # logger.info('sending %s %s with %s', method.upper(), url, kwargs)
        try:
            with aiohttp.Timeout(timeout=read_timeout):
                async with getattr(session, method)(url, **kwargs) as response:
                    if response.status == 200:
                        if fn is not None:
                            return await fn(response)
                        return response
                    elif response.status in http_status_codes_to_retry:
                        logger.error(
                            'Received invalid response code:%s url:%s'
                            ' response:%s', response.status, url,
                            response.reason)
                        raise aiohttp.errors.HttpProcessingError(
                            code=response.status, message=response.reason)
                    else:
                        raise FailedRequest(
                            code=response.status,
                            message='Non-retryable response code',
                            raised='aiohttp.errors.HttpProcessingError',
                            url=url)
        except (aiohttp.errors.ClientResponseError,
                aiohttp.errors.ClientRequestError,
                aiohttp.errors.ClientOSError,
                aiohttp.errors.ClientDisconnectedError,
                aiohttp.errors.ClientTimeoutError,
                asyncio.TimeoutError,
                aiohttp.errors.HttpProcessingError) as exc:
            code = getattr(exc, 'code', '')
            raised_exc = FailedRequest(
                code=code, message=exc, url=url,
                raised='%s.%s' % (exc.__class__.__module__,
                                  exc.__class__.__qualname__))
        else:
            raised_exc = None
            break

        attempt -= 1

    if raised_exc:
        raise raised_exc

Andrew Svetlov

Dec 15, 2016, 10:11:16 AM
to aio-libs
No, this functionality is out of scope for aiohttp.
Retrying can be very easy in a particular case but very hard in general.
Too many scenarios might fail: file upload, upload from a given generator, downloading big data, etc.

Mike Rans

Jan 13, 2017, 3:35:12 AM
to aio-libs
The reason I asked about retry is that urllib3, which requests uses, has that functionality built in (and hence it can also be used with grequests). See: http://stackoverflow.com/questions/40417503/applying-retry-on-grequests-in-python/40761250?noredirect=1#comment70405713_40761250

To control which cases to retry, it allows passing in a list of status codes to retry (as does the example above). Would the urllib3 functionality be hard to replicate in aiohttp?
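For reference, the urllib3 retry behaviour being described can be configured roughly like this through requests (a sketch; the mount prefixes and the particular retry settings are illustrative, not a recommendation):

```python
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

# Illustrative policy: up to 3 retries, exponential backoff,
# retrying only on the listed server-side status codes.
retry_policy = Retry(
    total=3,
    backoff_factor=0.5,
    status_forcelist=[500, 502, 503, 504],
)

session = requests.Session()
session.mount('http://', HTTPAdapter(max_retries=retry_policy))
session.mount('https://', HTTPAdapter(max_retries=retry_policy))
```

Every request sent through `session` then retries transparently on the listed status codes and on connection errors.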

Thanks,
Mike
PS: Thanks again for all the work on this excellent library

Nikolay Kim

Jan 13, 2017, 1:17:26 PM
to aio-libs
I'd suggest writing a library for doing retries in aiohttp that covers the possible cases. Then later, when it becomes more mature, we can think about inclusion into the aiohttp core.

multiso...@gmail.com

Jan 21, 2017, 6:26:22 AM
to aio-libs
Here is the riprova library, which works fine with asyncio. Probably this is not what you want, but I have never needed to repeat exactly one request, so a decorator for the user's function is, in my opinion, a good approach.
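The decorator approach can be sketched with nothing but the standard library (the names and defaults below are illustrative, not riprova's actual API):

```python
import asyncio
import functools


def retry(*, attempts=3, interval=0.1, backoff=2, exceptions=(Exception,)):
    """Retry a coroutine function with exponential backoff (sketch)."""
    def decorator(func):
        @functools.wraps(func)
        async def wrapper(*args, **kwargs):
            delay = interval
            for attempt in range(1, attempts + 1):
                try:
                    return await func(*args, **kwargs)
                except exceptions:
                    if attempt == attempts:
                        raise  # out of attempts: re-raise the last error
                    await asyncio.sleep(delay)
                    delay *= backoff
        return wrapper
    return decorator


# Usage: a flaky coroutine that fails twice, then succeeds.
calls = {'n': 0}


@retry(attempts=3, interval=0.01, exceptions=(RuntimeError,))
async def flaky():
    calls['n'] += 1
    if calls['n'] < 3:
        raise RuntimeError('transient failure')
    return 'ok'


result = asyncio.run(flaky())
```

The decorated function is retried as a whole, which sidesteps the hard cases Andrew mentioned (consumed upload generators, partially read bodies): each attempt rebuilds the request from scratch.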

Vladimir Rutsky

Feb 6, 2017, 2:17:13 PM
to multiso...@gmail.com, aio-libs
There is also the "backoff" library, which allows implementing retrying and
supports asyncio: https://github.com/litl/backoff/

Regards,
Vladimir

Mike Rans

Feb 13, 2017, 9:08:46 AM
to aio-libs, multiso...@gmail.com
Thanks for the suggestions!