I've run into the issue shown in the code below, where (as far as I can tell) a natural use of __str__ on Python 2.7 results in a Unicode error. I'm not sure how to write this code so that it works properly on both Python 2 and Python 3; what am I missing?
(Note that this issue happens on Python 2.7 regardless of whether the @python_2_unicode_compatible decorator is present.)
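For reference, my understanding of what the decorator does on Python 2 (based on the six.py line quoted in the traceback below; a rough sketch, not the actual Django/six source) is roughly:

    def python_2_unicode_compatible(klass):
        # On Python 2: keep the class's __str__ as __unicode__, and install a
        # __str__ that UTF-8-encodes whatever __unicode__ returns.
        klass.__unicode__ = klass.__str__
        klass.__str__ = lambda self: self.__unicode__().encode('utf-8')
        return klass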
Models:
from django.db import models
from django.utils.encoding import python_2_unicode_compatible


@python_2_unicode_compatible
class A(models.Model):
    c = models.CharField(max_length=20)

    def __str__(self):
        return self.c


@python_2_unicode_compatible
class B(models.Model):
    a = models.ForeignKey(A)

    def __str__(self):
        return str(self.a)
Failure example:
>>> from test.models import A, B
>>> a = A(c=u'répairer')
>>> a.save()
>>> a.id
1
>>> a1 = A.objects.get(id=1)
>>> a1
<A: répairer>
>>> b = B(a_id=1)
>>> b.save()
>>> b.id
1
>>> b1 = B.objects.get(id=1)
>>> b1
<B: [Bad Unicode data]>
>>> print b1
Traceback (most recent call last):
File "<console>", line 1, in <module>
File "/Users/xof/Documents/Dev/environments/peep/lib/python2.7/site-packages/django/utils/six.py", line 842, in <lambda>
klass.__str__ = lambda self: self.__unicode__().encode('utf-8')
UnicodeDecodeError: 'ascii' codec can't decode byte 0xc3 in position 1: ordinal not in range(128)
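As far as I can trace it, printing b1 calls B.__str__ (the decorator's lambda), which calls B.__unicode__ (my original __str__), which calls str(self.a); on Python 2 that returns A's __unicode__ result encoded to UTF-8 bytes, and the outer .encode('utf-8') then implicitly decodes those bytes as ASCII and blows up on 0xc3. If that reading is right, the failure reduces to this two-liner (plain Python 2, no Django involved):

    # -*- coding: utf-8 -*-
    # What str(self.a) hands back on Python 2: UTF-8 *bytes*, not unicode.
    utf8_bytes = u'répairer'.encode('utf-8')
    # Calling .encode('utf-8') on a bytestring first implicitly decodes it as
    # ASCII, which is exactly the UnicodeDecodeError on byte 0xc3 shown above.
    utf8_bytes.encode('utf-8')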
--
-- Christophe Pettus
x...@thebuild.com