Hi, I'm running a project to replace a set of batch scripts and makefiles with an elegant Python implementation.
The goal of the project is a build system that compiles and links.
I'm using DoIt, which I find very well suited to the task.
I have a main script that sets up the environment, handles other aspects, and then invokes DoIt through a
DoitMain( MyLoader() ).run( ... )
Everything is working fine (a makefile looks like the Stone Age compared with Python+DoIt), but I have a question.
The build system is fed by C and H files.
The first time a C file is compiled, its file_dep property is set to the source file itself (no other information is available).
During compilation, the compiler generates a dependency list.
On the second run, since that list is available (written to a file), it is read, purged, and used as the 'file_dep' list.
Because the dependency list has changed, the task is triggered and the file is compiled again.
The third run generates a task equal to the second, so the task is skipped.
With this implementation it is not possible to avoid the second (and useless) compilation.
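To make the situation concrete, here is a minimal sketch of the scheme I described (the `parse_dep_file` helper, the "main.dep" file name, and the gcc `-MMD -MF` flags are just illustrative assumptions, not part of my real code):

```python
# Sketch of the current two-run behaviour. Hypothetical names throughout.
import os

def parse_dep_file(text):
    """Parse a gcc -MM style line such as 'main.o: main.c defs.h util.h'
    into a list of dependency paths (joining any backslash continuations)."""
    body = text.replace("\\\n", " ").split(":", 1)[1]
    return body.split()

def task_compile():
    dep_file = "main.dep"
    if os.path.exists(dep_file):
        # Second run: the compiler-generated list becomes file_dep,
        # which differs from the first run and forces a rebuild.
        with open(dep_file) as fh:
            file_dep = parse_dep_file(fh.read())
    else:
        # First run: only the source file itself is known.
        file_dep = ["main.c"]
    return {
        "actions": ["gcc -c main.c -o main.o -MMD -MF main.dep"],
        "file_dep": file_dep,
        "targets": ["main.o"],
    }
```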
I'm wondering how this can be solved.
My idea is to write an 'uptodate' routine following this meta-code:
- the first time, "main.o" does not exist, so it is created from "main.c", along with "main.dep" (the dependency list)
- if "main.c" has changed, compile it (no need to load the deps)
- if not, check for modifications to the files listed in the dependency list
This way file_dep never changes and lists only "main.c".
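The meta-code above could be sketched as a custom uptodate callable. This is only a hand-rolled mtime comparison under my own assumptions (the `deps_unchanged` name, the "main.dep" format, comparing against the object file's timestamp); it deliberately bypasses DoIt's internal change-detection, which is exactly the part I'd prefer to reuse:

```python
# Sketch: an 'uptodate' callable that keeps file_dep stable ("main.c" only)
# and checks the compiler-generated dependency list by hand.
import os

def deps_unchanged(dep_file, obj_file):
    """Return True (task up to date) when every file listed in dep_file
    is older than obj_file; False forces a rebuild."""
    if not (os.path.exists(dep_file) and os.path.exists(obj_file)):
        return False  # first build, or outputs missing: not up to date
    stamp = os.path.getmtime(obj_file)
    with open(dep_file) as fh:
        deps = fh.read().replace("\\\n", " ").split(":", 1)[1].split()
    return all(os.path.exists(d) and os.path.getmtime(d) <= stamp
               for d in deps)

def task_compile():
    return {
        "actions": ["gcc -c main.c -o main.o -MMD -MF main.dep"],
        "file_dep": ["main.c"],  # never changes between runs
        "targets": ["main.o"],
        "uptodate": [lambda: deps_unchanged("main.dep", "main.o")],
    }
```

The drawback is that this reimplements timestamp checking instead of using DoIt's own (checksum-based) algorithm, which is why I'm asking.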
Among the routines available (in the uptodate section) I didn't find one that reports whether a given file has been modified or not (according to DoIt's internal algorithm).
Any suggestion is welcome,
Fabrizio.