Re: converting bsnes-qt to build using redo


Avery Pennarun

Jan 4, 2011, 2:42:12 AM
to Tim Allen, redo
On Mon, Jan 3, 2011 at 11:02 PM, Tim Allen <scre...@froup.com> wrote:
> Anyway, I realise you probably don't want to sign up to code-review
> every use of redo on the planet, but you might be interested in the Make
> -> Redo diff:

I probably don't want to discuss *every* use of redo that ever comes
along in the future, but right now "every use of redo on the planet"
is a pretty small number, I think :)

It's an excellent idea to discuss a few real-life examples here on the
list. I personally want to hear how people end up using the system,
and see if that leads us to come up with any better ways to do things
in redo to make it easier.

> http://gitorious.org/bsnes-qt/bsnes-qt/commit/5888b2f4134ee67c1bb8713a66df92e87382ea79
>
> So, over the Christmas break I converted a friend's somewhat
> idiosyncratic C++ codebase from Make to redo, just as an exercise. I
> also converted it to SCons, by way of comparison.

I tried this out, and this is what happened:

7faf613... /tmp/bsnes-qt $ redo
redo all
redo out/bsnes
redo build-scripts/config
Checking for <libsnes.hpp>...
Checking for library 'snes'...
/usr/bin/ld: cannot find -lsnes
collect2: ld returned 1 exit status
Cannot find a libsnes implementation.
redo: build-scripts/config: exit code 1
redo: out/bsnes: exit code 1
redo: all: exit code 1

It looks kind of cool already, although in my dream world
build-scripts/config should end up being broken into a bunch of
separate .do files that each test for exactly one thing.

Where should I get -lsnes? I see a libsnes.hpp, so maybe this should
be obvious somehow.

> SCons really wants you
> to use its built-in build-rules, which turned out to be a problem for
> this particular code-base.

Well... that's actually kind of good news to me, because that's
exactly why I hate build systems that include their own default build
rules :) I already ranted about this in the redoconf thread, but this
is confirmation that the idea of leaving default rules out of core
redo is a very good idea. You'll simply never find a set of rules
that satisfies everyone, and the eternal search for a "perfect" set of
rules results in each version being subtly incompatible with the last.
I don't want redo to suffer from that.

> I used shell-functions for the compile script instead of
> a separate compilation script because I wasn't sure if a "redo-ifchange"
> spawned inside a subprocess would be able to "see" whatever state it
> needed to update the dependency database.

It is okay to rely on that. I was very careful about it, and I use
that functionality extensively in my own code; sometimes I use shell
functions, sometimes I source files directly into my .do scripts, and
sometimes I generate scripts that get executed; it depends on what
seems most elegant at the time. But you can count on redo-ifchange
working right in all these cases. (Basically it sets some environment
variables, which are inherited by subprocesses. It even works if you
'cd' to somewhere else before calling redo-ifchange, and works as you
want it to: 'cd ..; redo-ifchange foo' will create a dependency on
../foo)

In particular, if you generate a script called "compile", that script
is allowed to call redo-ifchange. It will create a dependency from
whichever target called 'compile' (directly or indirectly) to whatever
the compile script runs redo-ifchange on. (It will *not* create a
dependency of the compile script itself on that target, because that
isn't what you want.)

> Also, because writing
> shell-scripts that write shell-scripts that get all their variable
> escaping right is a pain.

That's true. It's a pain to get right, but if you're compiling a
large number of files, it can pay off in performance, since the shell
ends up doing less work per file. For a small project it probably
isn't worth the complexity though, so you've probably got the right
idea doing it the easy way here.

> I guess the easy solution is just to stick all the source in a "src/"
> directory beside whatever "doc/" and "data/" directories may be.
> Consider my objection retracted.
>
> That said, I didn't actually do anything of the sort with the sample
> project I mentioned above - partially because the existing Make system
> used a different layout that I didn't want to change, and partially
> because there was no obvious way to find the source file that
> corresponded to a given .o name.

I haven't actually looked closely at your build system yet (it's more
fun to read the code after I've seen it work first :)) but "there's no
obvious way to find the source file" is exactly what default*.do files
are for.

You just create default.o.do, and in that file, you try all the
obvious or non-obvious possibilities for where the source file might
be, using whatever search algorithm is relevant for your project.

If each .o file comes from a totally different place, then you could
just have a filename.o.do for each one, which is messy, but it's
equally messy in make. Or if most of your files are in obvious
places, but some are weird exceptions, you could have a default.o.do
for the obvious cases and filename.o.do for each of the exceptions.

Or, if you've got a whole lot of weird exceptions and you don't want
to create a zillion separate .do files, you could have a default.o.do
that reads through a sources.list, something like this:

redo-ifchange sources.list
while read objfile srcfile; do
    [ "$objfile" = "$1$2" ] && break
done <sources.list
redo-ifchange "$srcfile"
./compile "$objfile" "$srcfile"
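The lookup pattern above can be exercised as plain shell, without redo; the object/source pairs below are invented for illustration:

```shell
# Plain-shell sketch of the sources.list lookup used in default.o.do.
# The object/source pairs here are invented for illustration.
cat > sources.list <<'EOF'
foo.o src/foo.cpp
bar.o lib/special/bar.cpp
EOF

target="bar.o"    # in a real default.o.do this would be "$1$2"
srcfile=
while read -r objfile src; do
    if [ "$objfile" = "$target" ]; then
        srcfile=$src
        break
    fi
done <sources.list

echo "$srcfile"    # prints: lib/special/bar.cpp
```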

> I did create a "default.moc.do" file, but that's because the Makefile
> system used Make's ability to construct make-rules at run-time.

What do you mean by "construct make-rules at run-time" exactly? Do
you have an example of the make syntax you're talking about?

> I suppose the redo equivalent of a Makefile that creates build rules at
> runtime would be some kind of "foo.do.do" file. I didn't try it;
> I wasn't sure redo would understand such a concept.

.do.do files are sort of supported, but it's explicitly not *fully*
supported right now, since I find it too confusing to think about
until it's proven absolutely necessary :) The trouble is that
searching for a .do file would then have to potentially *create* a .do
file, which would involve running a .do file, which might involve
*creating* a .do file, etc etc recursively, and that's a bit too mind
melting and has weird edge cases and is quite possibly not what
anybody wants anyway.

However, I don't think you really need that. As I wrote above, a
default.do could simply run whatever rules you want depending on the
output filename. You could even hardcode particular output filenames
in default.do to act a particular way, just as you might do with make:

case $1$2 in
    foo.o) gcc -o foo.o -c ../whatever/foo.c ;;
    chicken.moc) moc ../thingy/chicken.moc.in ;;
esac

And so on, as crazy as you want.

Have fun,

Avery

Tim Allen

Jan 4, 2011, 5:41:11 AM
to Avery Pennarun, redo
On Mon, Jan 03, 2011 at 11:42:12PM -0800, Avery Pennarun wrote:
> On Mon, Jan 3, 2011 at 11:02 PM, Tim Allen <scre...@froup.com> wrote:
> > Anyway, I realise you probably don't want to sign up to code-review
> > every use of redo on the planet, but you might be interested in the Make
> > -> Redo diff:
>
> I probably don't want to discuss *every* use of redo that ever comes
> along in the future, but right now "every use of redo on the planet"
> is a pretty small number, I think :)
>
> It's an excellent idea to discuss a few real-life examples here on the
> list. I personally want to hear how people end up using the system,
> and see if that leads us to come up with any better ways to do things
> in redo to make it easier.

Well, I'm happy to blather about my particular efforts in more detail.
:)

> I tried this out, and this is what happened:
>
> 7faf613... /tmp/bsnes-qt $ redo
> redo all
> redo out/bsnes
> redo build-scripts/config
> Checking for <libsnes.hpp>...
> Checking for library 'snes'...
> /usr/bin/ld: cannot find -lsnes
> collect2: ld returned 1 exit status
> Cannot find a libsnes implementation.
> redo: build-scripts/config: exit code 1
> redo: out/bsnes: exit code 1
> redo: all: exit code 1
>
> It looks kind of cool already, although in my dream world
> build-scripts/config should end up being broken into a bunch of
> separate .do files that each test for exactly one thing.

Today (while composing the previous message) I looked back over djb's
"redo" documents now that I actually understand redo a little better,
and noticed he talks about keeping the settings for "cc" separate from
the settings for "ld" and so forth, so changing the ld configuration
doesn't require recompiling all your .o files, which makes a lot of
sense.

However, I wrote that auto-configuration thing after messing with SCons,
which exposes a somewhat similar API, which is roughly "create a block
of default compilation settings, then each configuration test changes
the block, then write it out afterwards". You'll note my configuration
script follows pretty much exactly the same pattern.

I guess it could be done as a bunch of .do files, if it was easy to
coalesce all the modifications to all the compilation variables.

> Where should I get -lsnes? I see a libsnes.hpp, so maybe this should
> be obvious somehow.

No, it's not at all obvious. Let's see if I can explain this without
getting bogged down in details...

"bsnes" is a Super Nintendo emulator written by one "byuu"
(see http://byuu.org/bsnes for details). Up until very recently, the GUI
and the underlying emulation core were bundled together in one package.
However, in an effort to foster other GUIs, byuu created a (roughly)
C-compatible API named "libsnes" to wrap the emulation core, then ported
the GUI to use it.

"bsnes-Qt" is the standalone GUI based on the "libsnes" API. There's no
standalone package for building libsnes, and the original source package
has a pretty minimalist build-system, so the easiest way to get
a libsnes would be:

git clone git://gitorious.org/bsnes/bsnes.git
cd bsnes
git checkout patches
cd bsnes # (yes, there's a second bsnes directory)
make -j3 profile=performance ui=ui-libsnes
sudo make ui=ui-libsnes install # (installs to /usr/local/)

(you'll need g++-4.5 available)

One thing that has come up in discussions about configuring bsnes-qt is
that since nobody can "apt-get install libsnes" yet, people may well
want to choose whether to link with "-ldl -lsnes" or
"/some/path/libsnes.a". The obvious answer is "write a ./configure
script that accepts options" but I'm not sure how that would interact
with a redo-based auto-configuration system.

> > SCons really wants you
> > to use its built-in build-rules, which turned out to be a problem for
> > this particular code-base.
>
> Well... that's actually kind of good news to me, because that's
> exactly why I hate build systems that include their own default build
> rules :)

It gets better - by "really wants" I mean "SCons requires sophisticated
Python code to create new build rules". If you look at the "scons"
branch of my bsnes-qt repository, it includes a "qt4" plugin[1] that adds
special build rules for Qt4-based projects. It's nearly a thousand lines
long, and I still have to hack around it in the SConstruct file because
this particular project doesn't conform to the usual Qt conventions.

By comparison the "build-scripts/util" file in my "redo" branch is about
60 lines long, half of which is the non-Qt-specific "compile" shell
function. Granted, my solution would be a lot bigger if it was as
portable and comprehensive as the SCons plugin, but at least this
approach was feasible for me to create in an afternoon.

[1] http://gitorious.org/bsnes-qt/bsnes-qt/blobs/scons/site_scons/site_tools/qt4/__init__.py

> > Also, because writing
> > shell-scripts that write shell-scripts that get all their variable
> > escaping right is a pain.
>
> That's true. It's a pain to get right, but if you're compiling a
> large number of files, it can pay off in performance, since the shell
> ends up doing less work per file. For a small project it probably
> isn't worth the complexity though, so you've probably got the right
> idea doing it the easy way here.

By "a pain" I mean I couldn't figure out how to do it properly.

For example, the obvious approach:

case $(uname -m) in
    x86) ;;
    *)
        CCFLAGS="$CCFLAGS -fPIC"
        CXXFLAGS="$CXXFLAGS -fPIC"
        ;;
esac

cat << EOF
CCFLAGS="$CCFLAGS"
CXXFLAGS="$CXXFLAGS"
EOF

In the case statement, if $CCFLAGS happens to have a double-quote
character in it, there's no problem because the shell handles all that
in one expansion pass.

In the here-document, if $CCFLAGS happens to have a double-quote
character in it, you get troubles because the shell doing the expansion
(in this case, parsing the here-document) is a different instance from
the shell that will be executing the resulting command-line.
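To make the failure concrete, here is a minimal reproduction (the flag value is invented):

```shell
# A configuration value containing a double quote:
CCFLAGS='-DVERSION="1.0"'

# The here-document expands $CCFLAGS but re-quotes it naively:
line=$(cat <<EOF
CCFLAGS="$CCFLAGS"
EOF
)
echo "$line"    # CCFLAGS="-DVERSION="1.0""

# When the generated line is evaluated by a second shell instance,
# the embedded quotes close and reopen the string, mangling the value:
eval "$line"
echo "$CCFLAGS"    # -DVERSION=1.0  (the inner quotes were eaten)
```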

Ideally there'd be a shell variable substitution that automatically
shell-quotes the given variable or something like that; possibly such
a function would be another useful addition to the Redo Standard
Library.

> I haven't actually looked closely at your build system yet (it's more
> fun to read the code after I've seen it work first :)) but "there's no
> obvious way to find the source file" is exactly what default*.do files
> are for.

I mean, the .o filename can be unrelated to the original .cpp filename,
and I'd rather not pick a random .cpp file and hope it's the right one. :)

> You just create default.o.do, and in that file, you try all the
> obvious or non-obvious possibilities for where the source file might
> be, using whatever search algorithm is relevant for your project.

That's pretty much what I do with obj/default.moc.do, since there I can
guess what the original filename is, and approximately where it might
be.

Now that I think about it, a default.o.do that hard-coded the list of
source-file -> target-file mappings would still probably be simpler than
my current system of having a foo.o.do file for every object.

> If each .o file comes from a totally different place, then you could
> just have a filename.o.do for each one, which is messy, but it's
> equally messy in make.

If you'll look at the original Makefile, each of the .o build-rules
includes a few hundred characters of haphazardly specified dependencies,
while the "compile" shell-function uses gcc's dependency-output feature
to get them exactly right. So it's not "equally messy in make"; it's
much, much messier. :)

> Or, if you've got a whole lot of weird exceptions and you don't want
> to create a zillion separate .do files, you could have a default.o.do
> that reads through a sources.list, something like this:
>
> redo-ifchange sources.list
> while read objfile srcfile; do
> [ "$objfile" = "$1$2" ] && break
> done <sources.list
> redo-ifchange $srcfile
> ./compile $objfile $srcfile

Yes. In retrospect, that's what I should have done.

> > I did create a "default.moc.do" file, but that's because the Makefile
> > system used Make's ability to construct make-rules at run-time.
>
> What do you mean by "construct make-rules at run-time" exactly? Do
> you have an example of the make syntax you're talking about?

In the original Makefile, it's on lines 101 to 107:

http://gitorious.org/bsnes-qt/bsnes-qt/blobs/patches/Makefile#line101

In this particular package, "*.moc.hpp" files throughout the source tree
are processed into ".moc" files, and dumped into the "obj/" directory.
In Make this is pretty easy, since it's throwing away information:

- recursively find *.moc.hpp files (actually done on line 48)
- for each file "path/to/foo.moc.hpp" generate the name "obj/foo.moc"
- use Make's $(eval) macro to create a rule that declares that
"obj/foo.moc" depends on "path/to/foo.moc.hpp"

In redo, since I start out only knowing the target, I have to
reconstruct the information that Make throws away:

- start with "obj/foo.moc"
- generate the name "foo.moc.hpp"
- use find(1) to search the project for files named "foo.moc.hpp"
- assume the first one we find is the correct source-file.
- redo-ifchange our suspected source.
- build the target from our suspected source.

Since the original Makefile tosses directory names when generating the
target name, it's safe to assume there aren't any potential source files
with identical names. Still, it doesn't feel clean.

> > I suppose the redo equivalent of a Makefile that creates build rules at
> > runtime would be some kind of "foo.do.do" file. I didn't try it;
> > I wasn't sure redo would understand such a concept.
>
> .do.do files are sort of supported, but it's explicitly not *fully*
> supported right now, since I find it too confusing to think about
> until it's proven absolutely necessary :) The trouble is that
> searching for a .do file would then have to potentially *create* a .do
> file, which would involve running a .do file, which might involve
> *creating* a .do file, etc etc recursively, and that's a bit too mind
> melting and has weird edge cases and is quite possibly not what
> anybody wants anyway.

Now that I come to think about it, the scary problem with
meta-build-rules would be distinguishing "this file does not exist and
I don't know how to build it, so I'll stop" from "this file does not exist, and
I haven't yet found a sufficiently meta build-rule for it, so I'll keep
searching." I suppose, since each level of metaness adds three
characters (".do") to the file name redo needs to search for, one could
simply record the longest file-name in the target directory, and abandon
the search when the required build-rule name would be longer than any
existing filename. If you're using the "search parent directories"
branch, you've got to look harder, of course... but any directory has
only a finite number of parent directories, so the search would still
terminate.

If you wanted to build "foo.o", and there was no "foo.o.do" or
"default.o.do", you'd want to check for "foo.o.do.do" or
"default.o.do.do". If you could only find "default.o.do.do", you'd have
to call it as:

default.o.do.do "foo" ".o.do" "foo.o.do-tmp"

...but it might generate either "foo.o.do" or "default.o.do", so you'd
probably have to restart the "does this thing exist, or is there a
build-rule to tell me how to make it" search each time.

So I suspect meta-build-rules (and meta-meta-build-rules, etc.) would be
workable... just rather slow. And, as you suggest, quite possibly not
what anybody wants anyway.

Avery Pennarun

Jan 4, 2011, 6:34:55 AM
to Tim Allen, redo
On Tue, Jan 4, 2011 at 2:41 AM, Tim Allen <scre...@froup.com> wrote:
> On Mon, Jan 03, 2011 at 11:42:12PM -0800, Avery Pennarun wrote:
>> It's an excellent idea to discuss a few real-life examples here on the
>> list.  I personally want to hear how people end up using the system,
>> and see if that leads us to come up with any better ways to do things
>> in redo to make it easier.
>
> Well, I'm happy to blather about my particular efforts in more detail.
> :)

Excellent :)

> Today (while composing the previous message) I looked back over djb's
> "redo" documents now that I actually understand redo a little better,
> and noticed he talks about keeping the settings for "cc" separate from
> the settings for "ld" and so forth, so changing the ld configuration
> doesn't require recompiling all your .o files, which makes a lot of
> sense.
>
> However, I wrote that auto-configuration thing after messing with SCons,
> which exposes a somewhat similar API, which is roughly "create a block
> of default compilation settings, then each configuration test changes
> the block, then write it out afterwards". You'll note my configuration
> script follows pretty much exactly the same pattern.

Yeah. I think that method will cause quite a lot of inelegance.

> I guess it could be done as a bunch of .do files, if it was easy to
> coalesce all the modifications to all the compilation variables.

No problem:

config.do:
    targets=$(for d in config/*.do; do echo ${d%.do}; done)
    redo-ifchange $targets
    cat $targets

Now if you want to be lazy, depend on the file 'config', which you can
source into any shell script. If you want to be efficient, depend on
the individual config files you care about.
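Here's a sketch of that coalescing behaviour as plain shell, without redo (the fragment names and flags are invented):

```shell
# Each config fragment appends to shared variables; concatenating the
# fragments and sourcing the result yields the combined settings.
# Fragment names and flags are invented for illustration.
mkdir -p config
printf '%s\n' 'CFLAGS="$CFLAGS -O2"'   > config/cc
printf '%s\n' 'CFLAGS="$CFLAGS -Wall"' > config/warn

cat config/* > config.sh    # what "cat $targets" produces in config.do

CFLAGS="-g"
. ./config.sh
echo "$CFLAGS"    # prints: -g -O2 -Wall
```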

> "bsnes-Qt" is the standalone GUI based on the "libsnes" API. There's no
> standalone package for building libsnes, and the original source package
> has a pretty minimalist build-system, so the easiest way to get
> a libsnes would be:
>
>    git clone git://gitorious.org/bsnes/bsnes.git
>    cd bsnes
>    git checkout patches
>    cd bsnes # (yes, there's a second bsnes directory)
>    make -j3 profile=performance ui=ui-libsnes
>    sudo make ui=ui-libsnes install # (installs to /usr/local/)

Clearly what we need is a redo rule that contains exactly the above :)

> (you'll need g++-4.5 available)

That pretty much disqualifies me. There are actually programs out
there that depend on a version of g++ *that* new?

> One thing that has come up in discussions about configuring bsnes-qt is
> that since nobody can "apt-get install libsnes" yet, people may well
> want to choose whether to link with "-ldl -lsnes" or
> "/some/path/libsnes.a". The obvious answer is "write a ./configure
> script that accepts options" but I'm not sure how that would interact
> with a redo-based auto-configuration system.

I've been thinking ahead here :)

You can have a config/whatever.do that tries to guess what the options
should be. If you don't like its answer, you can create
config/whatever by hand. redo is smart enough to not overwrite it,
since it knows the file was created from outside redo. Thus it's very
easy to override any file in the build process. (It also prints a
helpful warning so you know why it's not rebuilding.)

> It gets better - by "really wants" I mean "SCons requires sophisticated
> Python code to create new build rules". If you look at the "scons"
> branch of my bsnes-qt repository, it includes a "qt4" plugin[1] that adds
> special build rules for Qt4-based projects. It's nearly a thousand lines
> long, and I still have to hack around it in the SConstruct file because
> this particular project doesn't conform to the usual Qt conventions.

Sweet, I think that counts as redo's first third party testimonial :)

> By "a pain" I mean I couldn't figure out how to do it properly.
>
> For example, the obvious approach:
>
>    case $(uname -m) in
>        x86) ;;
>        *)
>            CCFLAGS="$CCFLAGS -fPIC"
>            CXXFLAGS="$CXXFLAGS -fPIC"
>            ;;
>    esac
>
>    cat << EOF
>    CCFLAGS="$CCFLAGS"
>    CXXFLAGS="$CXXFLAGS"
>    EOF
>
> In the case statement, if $CCFLAGS happens to have a double-quote
> character in it, there's no problem because the shell handles all that
> in one expansion pass.

Yeah, you're right of course, shell quoting is hard. The good news is
you can simply write your shell scripts to not use double quotes - or
else write them to not use single quotes, since people probably use a
lot more double quotes right now - and you won't have to care.
There's also the utterly horrible option:

cat <<EOF
CCFLAGS=$(cat <<GOOF
$CCFLAGS
GOOF)
CXXFLAGS=$(cat <<GOOF
$CXXFLAGS
GOOF)
EOF

(untested, but you get the idea)

> Ideally there'd be a shell variable substitution that automatically
> shell-quotes the given variable or something like that; possibly such
> a function would be another useful addition to the Redo Standard
> Library.

Yes, I could see a justification for that. But let's see redo get
used in a few projects first, then we can always factor out the common
requirements.

>> > I did create a "default.moc.do" file, but that's because the Makefile
>> > system used Make's ability to construct make-rules at run-time.
>>
>> What do you mean by "construct make-rules at run-time" exactly?  Do
>> you have an example of the make syntax you're talking about?
>
> In the original Makefile, it's on lines 101 to 107:
>
> http://gitorious.org/bsnes-qt/bsnes-qt/blobs/patches/Makefile#line101
>
> In this particular package, "*.moc.hpp" files throughout the source tree
> are processed into ".moc" files, and dumped into the "obj/" directory.
> In Make this is pretty easy, since it's throwing away information:
>
>    - recursively find *.moc.hpp files (actually done on line 48)
>    - for each file "path/to/foo.moc.hpp" generate the name "obj/foo.moc"
>    - use Make's $(eval) macro to create a rule that declares that
>      "obj/foo.moc" depends on "path/to/foo.moc.hpp"

Holy sweet mother of bjesus! My eyes! The goggles! They do nothing!

That is quite possibly the wrongest way to do anything in make that I
have ever seen.

> In redo, since I start out only knowing the target, I have to
> reconstruct the information that Make throws away:
>
>    - start with "obj/foo.moc"
>    - generate the name "foo.moc.hpp"
>    - use find(1) to search the project for files named "foo.moc.hpp"
>    - assume the first one we find is the correct source-file.
>    - redo-ifchange our suspected source.
>    - build the target from our suspected source.
>
> Since the original Makefile tosses directory names when generating the
> target name, it's safe to assume there aren't any potential source files
> with identical names. Still, it doesn't feel clean.

Yeah, I might do this instead (untested):

# generate a list of all source files; this needs to run every
# time for safety, but redo-stamp prevents its dependencies
# from being marked as dirty.
sources.list.do:
    redo-always
    find -name '*.moc.hpp' | tee $3 | redo-stamp

# find the source filename of a particular .moc file.
# sources.list changes when *any* filename changes, but
# this particular one might not change, so use redo-stamp here
# too.
default.mocsrc.do:
    redo-ifchange sources.list
    grep "/$1.moc.hpp$" sources.list | tee $3 | redo-stamp

# actually generate the .moc file
default.moc.do:
    redo-ifchange $1.mocsrc
    redo-ifchange $(cat $1.mocsrc)
    moc -o $1$2 $(cat $1.mocsrc)

# generate all the .moc files; note the input redirection, and the
# .hpp suffix stripped by basename so foo.moc.hpp maps to foo.moc
all.do:
    redo-ifchange sources.list
    while read src; do echo $(basename $src .hpp); done <sources.list | xargs redo-ifchange


> Now that I come to think about it, the scary problem with
> meta-build-rules would be distinguishing "this file does not exist and
> I don't know how to build it, so I'll stop" from "this file does not exist, and
> I haven't yet found a sufficiently meta build-rule for it, so I'll keep

> searching." [...]

Yeah, in short, it's crazy :) I'm certain it's possible, but it's a
pain, and explaining how it works is very hard. In fact, I should
probably improve redo to explicitly reject anybody who tries to
redo-ifchange a .do file, since it won't work the way they think it
does.

The speed is probably not that big a deal; stat() is pretty fast
anyway, and this would only be needed if a particular .do file didn't
exist. But the craziness is much more of a concern.

Have fun,

Avery

Tim Allen

Jan 4, 2011, 6:56:58 AM
to Avery Pennarun, redo
(just a brief note to correct a mistake I made)

On Tue, Jan 04, 2011 at 03:34:55AM -0800, Avery Pennarun wrote:
> On Tue, Jan 4, 2011 at 2:41 AM, Tim Allen <scre...@froup.com> wrote:
> > "bsnes-Qt" is the standalone GUI based on the "libsnes" API. There's no
> > standalone package for building libsnes, and the original source package
> > has a pretty minimalist build-system, so the easiest way to get
> > a libsnes would be:
> >
> >    git clone git://gitorious.org/bsnes/bsnes.git
> >    cd bsnes
> >    git checkout patches
> >    cd bsnes # (yes, there's a second bsnes directory)
> >    make -j3 profile=performance ui=ui-libsnes
> >    sudo make ui=ui-libsnes install # (installs to /usr/local/)
>
> Clearly what we need is a redo rule that contains exactly the above :)
>
> > (you'll need g++-4.5 available)
>
> That pretty much disqualifies me. There are actually programs out
> there that depend on a version of g++ *that* new?

bsnes' author is a great fan of C++, and likes to use as much of it as
he can. g++-4.5 adds support for C++0x lambdas, which are used to great
effect in the "phoenix" UI toolkit he's working on.

However, I just checked, and both libsnes and the Qt GUI build fine with
g++-4.4. It's just the newer in-development "phoenix" GUI that requires
the latest g++.

Avery Pennarun

Jan 4, 2011, 7:17:18 AM
to Tim Allen, redo
On Tue, Jan 4, 2011 at 3:56 AM, Tim Allen <scre...@froup.com> wrote:
> bsnes' author is a great fan of C++, and likes to use as much of it as
> he can. g++-4.5 adds support for C++0x lambdas, which are used to great
> effect in the "phoenix" UI toolkit he's working on.
>
> However, I just checked, and both libsnes and the Qt GUI build fine with
> g++-4.4. It's just the newer in-development "phoenix" GUI that requires
> the latest g++.

Well, I'm using Debian-lenny, which only has 4.3. Oh well. I'll have
to just imagine how great this build system would be if only the thing
it were building worked on my computer :)

Have fun,

Avery

Tim Allen

Jan 8, 2011, 11:59:16 PM
to Avery Pennarun, redo
On Tue, Jan 04, 2011 at 03:34:55AM -0800, Avery Pennarun wrote:
> On Tue, Jan 4, 2011 at 2:41 AM, Tim Allen <scre...@froup.com> wrote:
> > I guess it could be done as a bunch of .do files, if it was easy to
> > coalesce all the modifications to all the compilation variables.
>
> No problem:
>
> config.do:
> targets=$(for d in config/*.do; do echo ${d%.do}; done)
> redo-ifchange $targets
> cat $targets

That means all the $targets scripts need to be very careful to add to
variables rather than clobber them. On one hand, that's not hard to do
right; on the other hand, it leads to very confusing and
difficult-to-trace behaviour when done wrong; on the gripping hand,
"welcome to shell programming!"

> > One thing that has come up in discussions about configuring bsnes-qt is
> > that since nobody can "apt-get install libsnes" yet, people may well
> > want to choose whether to link with "-ldl -lsnes" or
> > "/some/path/libsnes.a". The obvious answer is "write a ./configure
> > script that accepts options" but I'm not sure how that would interact
> > with a redo-based auto-configuration system.
>
> I've been thinking ahead here :)
>
> You can have a config/whatever.do that tries to guess what the options
> should be. If you don't like its answer, you can create
> config/whatever by hand. redo is smart enough to not overwrite it,
> since it knows the file was created from outside redo. Thus it's very
> easy to override any file in the build process. (It also prints a
> helpful warning so you know why it's not rebuilding.)

Does minimal/do have the same helpful feature? It speaks of "Removing
previously built files..." which bodes ill.

I've spent some time thinking about how to have a configure script that
integrated nicely with the configuration system you suggest, and I'd
gotten as far as a system like this (excuse the pseudo-code):

config-deps:

    add-dependency c-compiler
    add-dependency glib
    add-dependency gtk2

configure:

    if "--help" in "$@":
        add-dependency () {
            . $BASEDIR/build-scripts/config/$1-help
        }
    else
        add-dependency () {
            . $BASEDIR/build-scripts/config/$1-parse-args
        }
    fi

    . $BASEDIR/config-deps

    redo config

config.do:

    add-dependency () {
        . $BASEDIR/build-scripts/config/$1-guess-config
    }

...but as I kept thinking up problems and devising contorted
shell-functions to get around them (wash, rinse, repeat), I eventually
realised I might perhaps be over-engineering a tad.

I'm sure people will want "./configure" style configuration eventually
(if not immediately); maybe it's worth putting it off for later?

For what it's worth, a friend of mine has ported bsnes-Qt to his own
sh-based configuration system (that integrates with Make rather than
replacing it); I've been getting a few ideas from it, so I thought I'd
mention it.

The bsnes-Qt port is:

https://github.com/Themaister/bsnes-Qt/tree/quickbuild

...and his original "quickbuild" repository is:

https://github.com/Themaister/Quickbuild

> > Ideally there'd be a shell variable substitution that automatically
> > shell-quotes the given variable or something like that; possibly such
> > a function would be another useful addition to the Redo Standard
> > Library.
>
> Yes, I could see a justification for that. But let's see redo get
> used in a few projects first, then we can always factor out the common
> requirements.

I think a proper shell-quoting function would look something like this:

    shquote () {
        # Prints a shell-quoted version of each parameter.
        while [ $# -gt 0 ]; do
            local QUOTED=$(printf '%s\n' "$1" | sed -e "s/'/'\"'\"'/g")
            printf "'%s' " "$QUOTED"
            shift
        done
    }

I'm pretty confident that any sizable shell-programming exercise will
need something like it, so I'll just leave it here for future
generations. :)
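For the record, here's the sort of round-trip check I'd expect it to
pass (a standalone sketch assuming a POSIX sh, repeating the definition
so the snippet runs on its own; note the `g` flag on sed so that every
embedded quote gets escaped, not just the first):

```shell
shquote () {
    # Print a shell-quoted version of each parameter.
    while [ $# -gt 0 ]; do
        QUOTED=$(printf '%s\n' "$1" | sed "s/'/'\"'\"'/g")
        printf "'%s' " "$QUOTED"
        shift
    done
}

# Round trip: quote the arguments, then eval them back into
# positional parameters; nothing should change.
quoted=$(shquote "it's a 'test'" "two  spaces")
eval "set -- $quoted"
printf '%s\n' "$@"
```

If the quoting is right, eval gives back exactly the original words,
embedded quotes, double spaces and all.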



> > Now that I come to think about it, the scary problem with
> > meta-build-rules would be distinguishing "this file does not exist and
> > I don't know how to build it, so I'll stop" from "this file does not
> > exist, and I haven't yet found a sufficiently meta build-rule for
> > it, so I'll keep searching." [...]
>
> Yeah, in short, it's crazy :) I'm certain it's possible, but it's a
> pain, and explaining how it works is very hard. In fact, I should
> probably improve redo to explicitly reject anybody who tries to
> redo-ifchange a .do file, since it won't work the way they think it
> does.

A definite "no" is better than a "maybe", and if you later figure out
how to manage the craziness, you can be sure there are no existing redo
systems that depend on the former behaviour. :)

Avery Pennarun

Jan 9, 2011, 12:25:53 AM
to Tim Allen, redo
On Sat, Jan 8, 2011 at 8:59 PM, Tim Allen <scre...@froup.com> wrote:
> On Tue, Jan 04, 2011 at 03:34:55AM -0800, Avery Pennarun wrote:
>> On Tue, Jan 4, 2011 at 2:41 AM, Tim Allen <scre...@froup.com> wrote:
>> > I guess it could be done as a bunch of .do files, if it was easy to
>> > coalesce all the modifications to all the compilation variables.
>>
>> No problem:
>>
>> config.do:
>>     targets=$(for d in config/*.do; do echo ${d%.do}; done)
>>     redo-ifchange $targets
>>     cat $targets
>
> That means the scripts that generate all the $targets need to be very
> careful about adding to variables rather than clobbering them. On one hand,
> that's not hard to do right, on the other hand it leads to very
> confusing and difficult-to-trace behaviour when done wrong; on the
> gripping hand, "welcome to shell programming!"

Hmm, well, if you're really concerned about that, you could do one of
two things:

a) Have the script that merges config items auto-detect whether more
than one input file sets the same value, and barf if so.

b) Only allow any given config/whatever.do to output a single value.
That might be too restrictive, but it could be very elegant if we
*did* find a way to make it work. "read CONFIG_WHATEVER
<config/whatever" is very clean.
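To make (b) concrete, here's roughly what the consumer side looks like
(a sketch with made-up file names; in real life a config/cc.do would
write its guess to stdout and redo would capture that as the file):

```shell
# Each config file holds exactly one value, one line.
dir=$(mktemp -d)
printf 'gcc\n' > "$dir/cc"    # stand-in for what config/cc.do produced

# Reading a setting back is one builtin -- nothing to parse, nothing
# to accidentally clobber.
read CONFIG_CC < "$dir/cc"
echo "CC is $CONFIG_CC"
```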

>> You can have a config/whatever.do that tries to guess what the options
>> should be.  If you don't like its answer, you can create
>> config/whatever by hand.  redo is smart enough to not overwrite it,
>> since it knows the file was created from outside redo.  Thus it's very
>> easy to override any file in the build process.  (It also prints a
>> helpful warning so you know why it's not rebuilding.)
>
> Does minimal/do have the same helpful feature? It speaks of "Removing
> previously built files..." which bodes ill.

Not exactly. minimal/do just always removes all the files it has ever
produced. It doesn't actually check whether you've changed the
contents of those files or not. If it was very slightly smarter, we
could make it only remove the files it produced during the last build
- and if you manually created one after doing a 'minimal/do clean'
(for example) it would know not to delete it.
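Something like this is all it would take (a sketch only; minimal/do
doesn't actually keep such a list today):

```shell
# Keep a manifest of everything the build produced; 'clean' removes
# only what's in the manifest, so hand-made files survive.
record_built () { printf '%s\n' "$1" >> .built-files; }

clean () {
    [ -f .built-files ] || return 0
    while IFS= read -r f; do rm -f -- "$f"; done < .built-files
    rm -f .built-files
}

# Demo in a scratch directory: one generated file, one hand-made one.
cd "$(mktemp -d)"
echo generated > out.o; record_built out.o
echo handmade > settings    # created by hand, never recorded
clean
# out.o is gone; settings (created outside the build) is untouched.
```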

In any case, minimal/do is, well, minimal. It might make sense to
have something slightly less minimal if people are actually going to
use it as something other than a last resort.

>    configure:
>
>        if "--help" in "$@":
>            add-dependency () {
>                . $BASEDIR/build-scripts/config/$1-help
>            }
>        else
>            add-dependency () {
>                . $BASEDIR/build-scripts/config/$1-parse-args
>            }
>        fi
>
>        . $BASEDIR/config-deps
>
>        redo config
>
>    config.do:
>
>        add-dependency () {
>            . $BASEDIR/build-scripts/config/$1-guess-config
>        }

My first impression is that it sure is a whole lot of code, considering
it doesn't actually do anything yet :)

I think that, to appease the people who feel like they need it, a
./configure that just does this:

    ./do config/all

would be fine. If people really want to override settings, I think
manually writing the overrides to files in config/* is actually much
clearer and easier than running ./configure with different options.
Standard ./configure options have at least these problems:

- --with-whatever vs. --enable-whatever; any given option only works
with one of them, but which one?

- if I make a typo on the command line, it doesn't tell me

- it's unclear whether I should pass things like CC=gcc-4.3 as an
argument or in the environment

- if I want to reconfigure one setting, I have to rerun the entire config script

- the --help output is 99% junk that doesn't actually tell me how to
configure the package I'm using.

On the whole, cloning that UI doesn't seem like it would be very
helpful to the world at large. I won't stop you if you want to try,
of course, but it just doesn't seem like anything anyone will cry too
much about losing.

> For what it's worth, a friend of mine has ported bsnes-Qt to his own
> sh-based configuration system (that integrates with Make rather than
> replacing it); I've been getting a few ideas from it, so I thought I'd
> mention it.

Hmm, it seems to have no documentation and I can't even tell what it
does. And I don't know what you mean by "integrates with make." So I
think I'll wait until there's something more to see there.

> I think a proper shell-quoting function would look something like this:
>
>    shquote () {
>        # Prints a shell-quoted version of each parameter.
>        while [ $# -gt 0 ]; do
>            local QUOTED=$(printf '%s\n' "$1" | sed -e "s/'/'\"'\"'/g")
>            printf "'%s' " "$QUOTED"
>            shift
>        done
>    }
>
> I'm pretty confident that any sizable shell-programming exercise will
> need something like it, so I'll just leave it here for future
> generations. :)

The road to security hole hell is paved with the bones of people who
tried and failed to write correct shell quoting functions. I
personally am too scared of the issue to even try :) I don't see any
obvious flaws in your implementation, but that probably doesn't prove
anything.

>> [rules for .do.do files]


>
> A definite "no" is better than a "maybe", and if you later figure out
> how to manage the craziness, you can be sure there's no existing redo
> systems that depend on the former behaviour. :)

Yeah, true.

Have fun,

Avery
