How to handle dependency generated during the compiling process


Fabrizio Da Ros

Sep 13, 2016, 10:22:17 AM
to python-doit
Hi, I'm running a project to replace a set of batch scripts and makefiles with an elegant Python implementation.
The goal of the project is to have a build system to compile and link.
I'm using DoIt, which I feel fits the task very well.
I have a main script that sets up the environment, handles other aspects, and then invokes DoIt through
DoitMain( MyLoader() ).run( ... )
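(For context, MyLoader is a custom task loader roughly along these lines; a simplified sketch based on the "custom task loader" example in the doit docs, with a placeholder task instead of the real compile/link tasks:)

import sys

from doit.cmd_base import TaskLoader
from doit.doit_cmd import DoitMain
from doit.task import dict_to_task


class MyLoader(TaskLoader):
    """Build doit Task objects from plain dicts created by our own code."""

    def load_tasks(self, cmd, opt_values, pos_args):
        # placeholder: the real loader generates one compile task per C file
        task_dicts = [{'name': 'hello', 'actions': ['echo hello']}]
        config = {'verbosity': 2}
        return [dict_to_task(d) for d in task_dicts], config


if __name__ == '__main__':
    sys.exit(DoitMain(MyLoader()).run(sys.argv[1:]))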

Everything is working fine (a makefile looks like the Stone Age compared with Python+DoIt), but I have a question.
The build system is fed C and H files.
The first time a C file is compiled, the file_dep property is set to the source file itself (no other information is available).
During compilation a dependency list is generated by the compiler.
On the second run, since the list is available (written to a file), it is read, purged, and used as the 'file_dep' list.
Since the list of dependencies has been modified, it triggers a compile again.
The third run generates a task equal to the second one, so the task is skipped.
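(Simplified, the task generation currently works roughly like this; the gcc -MMD depfile is just an example of how the dependency list gets produced, real paths and flags differ:)

import os

def read_depfile(dep_path):
    # parse a gcc-style depfile: "main.o: main.c foo.h bar.h"
    with open(dep_path) as f:
        text = f.read().replace('\\\n', ' ')
    return text.split(':', 1)[1].split()

def task_compile():
    deps = ['main.c']                      # first run: only the source file
    if os.path.exists('main.dep'):         # later runs: the purged dep list
        deps = read_depfile('main.dep')
    return {
        'actions': ['gcc -c main.c -o main.o -MMD -MF main.dep'],
        'file_dep': deps,
        'targets': ['main.o', 'main.dep'],
    }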

With this implementation it is not possible to avoid the second (and useless) compilation.

I'm wondering how this can be solved.
My idea is to write an 'uptodate' routine with this meta-code:
- the first time, "main.o" does not exist, so it is created from "main.c" along with "main.dep" (the dependency list)
- if "main.c" has changed, compile it (no need to load deps)
- if not, check for modifications in the files listed in the dependency list

In this way file_dep does not get modified and lists only "main.c".
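Something along these lines, perhaps (a rough sketch only; it compares plain mtimes instead of doit's own checksum algorithm, and assumes a gcc-style main.dep):

import os

def deps_unchanged():
    # hypothetical uptodate callable for the "compile main.c" task:
    # up to date only if main.o exists and is newer than every file
    # listed in main.dep
    if not (os.path.exists('main.o') and os.path.exists('main.dep')):
        return False                       # first run: must compile
    obj_mtime = os.path.getmtime('main.o')
    with open('main.dep') as f:
        deps = f.read().replace('\\\n', ' ').split(':', 1)[1].split()
    return all(os.path.getmtime(d) <= obj_mtime for d in deps)

def task_compile():
    return {
        'actions': ['gcc -c main.c -o main.o -MMD -MF main.dep'],
        'file_dep': ['main.c'],            # stays stable across runs
        'targets': ['main.o', 'main.dep'],
        'uptodate': [deps_unchanged],
    }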

Among the routines available (the 'uptodate' section) I didn't find one that reports whether a given file has been modified or not (according to doit's internal algorithm).

Any suggestion is welcome,
Fabrizio.

Eduardo Schettino

Sep 15, 2016, 1:26:28 PM
to python-doit


On Tue, Sep 13, 2016 at 8:59 PM, Fabrizio Da Ros <fab...@gmail.com> wrote:

Everything is working fine (a makefile looks like the Stone Age compared with Python+DoIt), but I have a question.
The build system is fed C and H files.
The first time a C file is compiled, the file_dep property is set to the source file itself (no other information is available).
During compilation a dependency list is generated by the compiler.
On the second run, since the list is available (written to a file), it is read, purged, and used as the 'file_dep' list.
Since the list of dependencies has been modified, it triggers a compile again.
The third run generates a task equal to the second one, so the task is skipped.

With this implementation it is not possible to avoid the second (and useless) compilation.

I'm wondering how this can be solved.
My idea is to write an 'uptodate' routine with this meta-code:
- the first time, "main.o" does not exist, so it is created from "main.c" along with "main.dep" (the dependency list)
- if "main.c" has changed, compile it (no need to load deps)
- if not, check for modifications in the files listed in the dependency list

In this way file_dep does not get modified and lists only "main.c".

Among the routines available (the 'uptodate' section) I didn't find one that reports whether a given file has been modified or not (according to doit's internal algorithm).

Any suggestion is welcome,


There are a few different ways to deal with this; I guess the best one is to add another action that modifies the task's `file_dep`.

Look at this commit from the Nikola project, where it deals with the compilation of reST files.
It is the same situation as yours, where the list of dependencies is generated during the compilation itself.
https://github.com/schettino72/nikola/commit/70a7542e330cc29751f290a5481c10edaceff3d5
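The idea is that the extra action runs right after the compiler and adds the files found in the freshly written depfile to the task's own `file_dep`; doit then saves their signatures at the end of that same run, so the second run already knows about them and is skipped, while later changes to the headers still trigger a rebuild. Something along these lines (untested sketch; it assumes a gcc-style depfile, and a python-action receives the Task instance when it declares a parameter named `task`):

import os

def read_depfile(dep_path):
    # "main.o: main.c foo.h bar.h" -> ['main.c', 'foo.h', 'bar.h']
    with open(dep_path) as f:
        return f.read().replace('\\\n', ' ').split(':', 1)[1].split()

def update_file_dep(task):
    # second action: register the headers discovered by the compiler so
    # their signatures are stored during this very run
    task.file_dep.update(read_depfile('main.dep'))

def task_compile():
    # keep reading the depfile at task-creation time, as you already do
    deps = read_depfile('main.dep') if os.path.exists('main.dep') else ['main.c']
    return {
        'actions': ['gcc -c main.c -o main.o -MMD -MF main.dep',
                    update_file_dep],
        'file_dep': deps,
        'targets': ['main.o', 'main.dep'],
    }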

I guess this is at least the third time this question has come up. It would be great if *someone* contributed an example to the docs.
I thought I had done this already but could not find it :P

cheers

