ORM for structured Logs


guettli

unread,
Apr 4, 2017, 8:18:42 AM
to Django users
In the past I was told: Don't store logs in the database.

Time (and hardware) has changed.

I think it is time to store logs where I have great tools for analyzing them.

Storing them in a model that I can access via the Django ORM is my current strategy.

It seems no one has done this before; I could not find such a project so far.

My first draft looks like this:


import datetime

import jsonfield  # django-jsonfield
from django.db import models


class Log(models.Model):
    datetime = models.DateTimeField(default=datetime.datetime.now, db_index=True)
    data = jsonfield.JSONField()
    host = models.CharField(max_length=256, default='localhost', db_index=True)
    system = models.CharField(max_length=256, default='', db_index=True)

I am missing two things:

Missing1: Log level: INFO, WARN, ...

Missing2: A way to store exceptions.
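Both gaps could probably be filled with the standard library: the logging module already defines the canonical level names (which could go into a CharField with choices), and traceback can serialize an exception to text for a TextField. A minimal, Django-free sketch of the idea:

```python
import logging
import traceback

# Level names could be stored in a CharField; the logging module
# already defines the canonical set.
LEVEL_CHOICES = [(logging.getLevelName(lvl), logging.getLevelName(lvl))
                 for lvl in (logging.DEBUG, logging.INFO, logging.WARNING,
                             logging.ERROR, logging.CRITICAL)]


def serialize_exception():
    """Turn the currently handled exception into text for a TextField."""
    return traceback.format_exc()


try:
    1 / 0
except ZeroDivisionError:
    tb = serialize_exception()

print([name for name, _ in LEVEL_CHOICES])
# → ['DEBUG', 'INFO', 'WARNING', 'ERROR', 'CRITICAL']
print("ZeroDivisionError" in tb)
# → True
```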


What do you think?

What's wrong with this, what could be improved?

Regards,
Thomas Güttler





guettli

unread,
Apr 6, 2017, 4:15:53 AM
to Django users
Hello Brick Wall, how are you doing?

Christian Ledermann

unread,
Apr 6, 2017, 4:42:17 AM
to django...@googlegroups.com
On 6 April 2017 at 09:15, guettli <guet...@gmail.com> wrote:
> Hello Brick Wall, how are you doing?

Hello Stonemason.

What is your question?

I do not have a strong opinion on your approach. I don't even know
the problem you are trying to solve, or how big your logs are:
a couple of KB per day, or some GB per hour?

[the brickwall shrugs its shoulders]



--
Best Regards,

Christian Ledermann

Newark-on-Trent - UK
Mobile : +44 7474997517

https://uk.linkedin.com/in/christianledermann
https://github.com/cleder/


<*)))>{

If you save the living environment, the biodiversity that we have left,
you will also automatically save the physical environment, too. But If
you only save the physical environment, you will ultimately lose both.

1) Don’t drive species to extinction

2) Don’t destroy a habitat that species rely on.

3) Don’t change the climate in ways that will result in the above.

}<(((*>

knbk

unread,
Apr 6, 2017, 8:10:58 AM
to Django users
Hi Thomas,

The primary purpose of logging is to catch and examine errors. If something went wrong, you want to know when and why. Logging to a database increases the complexity and greatly increases the number of things that can go wrong. The last thing you want to find out when retracing an error is that you don't have any logs because the logging system failed. You may also need to log errors that happened during startup, before a database connection can be established. Logging to file is the simplest method, and has the least chance of failure. That's why you should always log to file. 

The two options are not mutually exclusive. Like you said, times have changed, and the overhead of storing logs both in a file and in a database is nowadays acceptable. If you have a good reason to store the logs in a database, then go ahead. Just remember that it should be in addition to file-based logging.
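One way to follow that advice is a custom logging.Handler that writes to the database while a normal file or stream handler stays attached to the same logger. A self-contained sketch, with the standard library's sqlite3 standing in for the Django ORM (the table and column names are made up for the example):

```python
import json
import logging
import sqlite3


class SQLiteHandler(logging.Handler):
    """Writes each record to a database, in addition to any other handlers."""

    def __init__(self, conn):
        super().__init__()
        self.conn = conn
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS log "
            "(created REAL, level TEXT, message TEXT, data TEXT)")

    def emit(self, record):
        try:
            self.conn.execute(
                "INSERT INTO log VALUES (?, ?, ?, ?)",
                (record.created, record.levelname, record.getMessage(),
                 json.dumps(getattr(record, "data", {}))))
            self.conn.commit()
        except Exception:
            # If the database is down, fall back to logging's own error
            # handling instead of losing the record silently.
            self.handleError(record)


conn = sqlite3.connect(":memory:")
log = logging.getLogger("demo")
log.setLevel(logging.INFO)
log.addHandler(SQLiteHandler(conn))       # database copy
log.addHandler(logging.StreamHandler())   # file/console copy stays in place

log.warning("disk almost full", extra={"data": {"free_mb": 12}})

rows = conn.execute("SELECT level, message FROM log").fetchall()
print(rows)
# → [('WARNING', 'disk almost full')]
```

The try/except in emit() is the important part: a database outage degrades to a logged logging error while the file handler keeps working.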

Kind regards,

Marten

guettli

unread,
Apr 7, 2017, 5:21:22 AM
to Django users


On Thursday, April 6, 2017 at 10:42:17 UTC+2, Christian Ledermann wrote:
> On 6 April 2017 at 09:15, guettli <guet...@gmail.com> wrote:
> > Hello Brick Wall, how are you doing?
>
> Hello Stonemason.
>
> What is your question?


It was more an idea than a question.

The question could be: What do you think about this idea?

I am lazy and like PostgreSQL. Up to now we store our logs in files, and we are searching for an alternative.

I am unfamiliar with Elasticsearch (the database of the ELK stack).

The ELK stack is very popular, but maybe it is overrated. I don't know. Why not use PostgreSQL?



 
> I do not have a strong opinion on your approach - i don't even know
> the problem you are trying to solve.
> or how big your logs are. a couple of KB per day or some GB per hour?


Traffic does not play any role here. This question is only about the structure (tables and columns), not about the rows.

 
> [the brickwall shrugs its shoulders]



Thank you for your reply.


 

guettli

unread,
Apr 7, 2017, 5:27:38 AM
to Django users
Hi Marten,


On Thursday, April 6, 2017 at 14:10:58 UTC+2, knbk wrote:
> Hi Thomas,
>
> The primary purpose of logging is to catch and examine errors. If something went wrong, you want to know when and why. Logging to a database increases the complexity and greatly increases the number of things that can go wrong. The last thing you want to find out when retracing an error is that you don't have any logs because the logging system failed. You may also need to log errors that happened during startup, before a database connection can be established. Logging to file is the simplest method, and has the least chance of failure. That's why you should always log to file.
>
> The two options are not mutually exclusive. Like you said, times have changed, and the overhead to store logs both in a file and in a database are nowadays acceptable. If you have a good reason to store the logs in a database, then go ahead. Just remember that it should be in addition to file-based logging.


Yes, you are right. During process initialization no db connection exists yet. I don't like redundancy, but here it is needed for higher availability.

Scot Hacker

unread,
Apr 8, 2017, 12:58:59 PM
to Django users
On Tuesday, April 4, 2017 at 5:18:42 AM UTC-7, guettli wrote:
> In the past I was told: Don't store logs in the database.

For general purposes, I agree with this. Logging is a Python standard, logs can be verbose, log rotation solutions are well established (and built in), etc. However, there are certain situations or activities where database logging makes sense, most likely *in addition to* standard logging rather than instead of it. In one of my projects, half a dozen non-technical managers need the ability to track certain types of actions (related to account activations at a school). For this, I developed a simple ORM logging solution that lets those managers search and filter these special logs in the Django admin.

It's not something that really deserves to be its own project, IMO - just a typical thing a dev might do in Django to satisfy an institutional need. But I've put my solution in this gist, in case it's helpful:

 

guettli

unread,
Apr 10, 2017, 6:55:09 AM
to Django users
Thank you for sharing.

Yes, the model is not difficult; that's why everybody has rolled their own so far. Maybe this is the best solution, and there is not much that projects would have in common.

The parts that are in common (ORM, admin interface) are already provided by Django.

It's just a bit of glue code to get it working.

Again, thank you for sharing your solution.

Regards,
  Thomas