
Geany, gcc and libraries - Do I have to keep adding them to the build command ?


R.Wieser
Dec 14, 2017, 7:53:58 AM
Hello all,

I'm currently getting my feet wet writing programs in C on an RPi3, using
Geany as the editor. But after having written a few programs I've got a
couple of questions:

1) For a certain program I had to add "-l wiringPi" to the 'build' command.
The problem (to me) is that I can't seem to find any "wiringPi.so" file.
So, what gives ? How does this at all work (how can I follow the
breadcrumbs to the actually-used library file) ?

2) I've now added three library references (GTK, ALSA and wiringPi) to the
build command, and am not liking that at all - I could imagine that after a
year or so I will have a whole slew of them there ... :-(   So, is there
maybe a way to include the library references in the source file ?   It
would certainly make life easier.

3) I've found (on the 'Web) that gcc accepts @file references (holding extra
arguments), but when I use such files they must exist or gcc will error out.
As not all of my programs need extra libraries I don't want to add such an
(empty!) file for each of them. In other words, can I somehow tell gcc (or
bash!) to use (supply) that file, but only if it exists ?


Then again, I'm a novice in this (Linux, C) respect, so maybe there is an
even easier method to handle libraries (.so files) than the above. If so,
please do not hesitate to mention it. :-)

Regards,
Rudy Wieser


Bit Twister
Dec 14, 2017, 8:40:20 AM
On Thu, 14 Dec 2017 13:53:39 +0100, R.Wieser wrote:
> Hello all,
>
> I'm currently wetting my feet with writing programs in C on a RPi3, using
> Geany as the editor. But after having written a few programs I've got a
> couple of questions:
>
>
> 3) I've found (on the 'Web) that gcc accepts @file references (holding extra
> aguments), but when I use such files they must exist or gcc will error out.
> As not all of my programs need extra libraries I don't want to add such an
> (empty!) file for each of them. In other words, can I somehow tell gcc (or
> bash!) to use (suply) that file, but only if it exists ?

I can suggest creating a Makefile which describes what all needs to be
done to generate the target binary.

You then run "make" which reads Makefile and compiles/links your program.

Example Makefile to generate errno from errno.c

$ cat Makefile
#
# Turn on just about everything reasonably possible
#
GCC_WFLAGS = -Wcast-align -Wcast-qual -Wmissing-prototypes \
-Wshadow -Wnested-externs -Wstrict-prototypes \
-Waggregate-return # -Wtraditional #-Wpointer-arith

#
# Comment out as needed...
#
GCC_OPTIM = -O2
GCC_SYMTAB = -ggdb # -I "${QTDIR}/include"
GCC_ANSI = -std=c99
GCC_PEDANT = -pedantic

#
GCC_DEBUG = ${GCC_SYMTAB} ${GCC_OPTIM} ${GCC_ANSI} ${GCC_PEDANT} -Wall -W
CC = gcc
LD = ${CC} #-L /usr/X11R6/lib

LIBS = #-lncurses #-lutil -lXt #-lncurses -lm #-lreadline
OPTS = ${GCC_DEBUG} ${GCC_WFLAGS} ${GCC_FFLAGS}

SRCS = errno.c
OBJS = $(SRCS:.c=.o)
TARGETS = errno

all: $(TARGETS)

.c.o:
	$(CC) $(OPTS) -c $<

errno: ${OBJS}
	${LD} ${OBJS} $(LIBS) $(LDOPTS) -o $@
#***************** end Makefile *********************************

R.Wieser
Dec 14, 2017, 9:20:05 AM
Bit Twister,

> I can suggest creating a Makefile which describes what all needs to be
> done to generate the target binary.
>
> You then run "make" which reads Makefile and compiles/links your program.

I know. But as I mentioned in my post, Geany uses a different method.

Also, how would that effectively be different from my #3 point (using an
@file) ?

Furthermore, "make" is a bitch to use (tried to use it on Windows a couple
of years ago).

Thanks for that example file though.

Regards,
Rudy Wieser


Dan Purgert
Dec 14, 2017, 9:28:14 AM

R.Wieser wrote:
> Hello all,
>
>
> 1) For a certain program I had to add "-l wiringPi" to the 'build'
> command. The problem (to me) is that I can't seem to find any
> "wiringPi.so" file. So, what gives ? How does this at all work (how
> can I follow the breadcrumbs to the actually-used library file) ?

Did you perhaps forget to install one of the -dev packages? Or perhaps
you've not yet compiled / installed something that provides the named
library.

What does `locate wiringPi.so' result in?
>
> 2) I've now added three library references (GTK, ALSA and wiringPi) to
> the build command, and am not liking that at all - I could imagine
> that after a year or so I will have a whole slew if them there ... :-(
> So, is there maybe a way to include the library references to the
> sourcefile ? It would certainly make life easier.

If I'm understanding the question right, it should be something along
the lines of:

#include $header_file

Note that there is some specific syntax you need to use.

- #include <stdio.h>   <- include a *system* header file
- #include "mystdio.h" <- include a header file, searched for first in the
  current directory (and any quote directories), and finally where system
  headers are found.
>
> 3) I've found (on the 'Web) that gcc accepts @file references (holding
> extra aguments), but when I use such files they must exist or gcc will
> error out. As not all of my programs need extra libraries I don't want
> to add such an (empty!) file for each of them. In other words, can I
> somehow tell gcc (or bash!) to use (suply) that file, but only if it
> exists ?

Sounds like you'll want to use a makefile for this one.




--
|_|O|_| Registered Linux user #585947
|_|_|O| Github: https://github.com/dpurgert
|O|O|O| PGP: 05CA 9A50 3F2E 1335 4DC5 4AEE 8E11 DDF3 1279 A281

Dan Purgert
Dec 14, 2017, 9:49:17 AM

R.Wieser wrote:
> Bit Twister,
>
>> I can suggest creating a Makefile which describes what all needs to be
>> done to generate the target binary.
>>
>> You then run "make" which reads Makefile and compiles/links your program.
>
> I know. But as I mentioned in my post, Geany uses a different method.
>
> Also, how would that effectivily be different from my #3 point (using an
> @file) ?

You customize the script (the makefile) per project (or, if necessary,
per source file that you need to compile).

>
> Furthermore, "make" is a bitch to use (tried to use it on Windows a
> couple of years ago).

good news, Linux is not windows; and the shortcomings of that OS do not
necessarily encumber this one.



R.Wieser
Dec 14, 2017, 10:05:52 AM
Dan,

> Did you perhaps forget to install one of the -dev packages,

I'm sorry that I was not clear about it, but the program compiles with that
library designation (without it, it doesn't. Yeah, I actually tried it.
:-) ).

It's really just a question of how the system translates that "wiringPi"
argument into the (I'm guessing here) "libwiringPi.so" that I was able to
find. Is it just an automatic pre- and postfixing of "lib" and ".so", or is
something more going on ?

> What does `locate wiringPi.so' result in?

"bash: locate: command not found" :-( :-)

I did search it thru the file browser though, and was unable to find it (in
the /usr tree) (hence my question).

> If I'm understanding the question right, it should be something
> along the lines of:
>
> #include $header_file

Yes. But that is for the header (.h), not the library (.so) file (I wish
something similar, like "#includelib", was available though).

> Sounds like you'll want to use a makefile for this one.

Nope. All I want is a *simple* system to designate which library files the
source should be linked against. "make" is rather overpowered (and not to
mention: complex) for that.

Regards,
Rudy Wieser


Bit Twister
Dec 14, 2017, 10:37:24 AM
On Thu, 14 Dec 2017 15:19:50 +0100, R.Wieser wrote:
> Bit Twister,
>
>> I can suggest creating a Makefile which describes what all needs to be
>> done to generate the target binary.
>>
>> You then run "make" which reads Makefile and compiles/links your program.
>
> I know. But as I mentioned in my post, Geany uses a different method.
>
> Also, how would that effectivily be different from my #3 point (using an
> @file) ?
>
> Furthermore, "make" is a bitch to use (tried to use it on Windows a couple
> of years ago).

Well, @file would have the same command line arguments you would have
placed in the Makefile.

When you get to coding a large program with several modules, make will
do it faster than your @file.

Just for fun, try making a copy of my Makefile for your app,
do a global change of errno to your app's name, and run make.

Do remember to add any libraries as needed for your app.

Bit Twister
Dec 14, 2017, 10:41:39 AM
On Thu, 14 Dec 2017 16:05:36 +0100, R.Wieser wrote:
>
> "bash: locate: command not found" :-( :-)

You might want to consider installing mlocate package. It is very
handy for locating files on the system.

For instance
$ locate libwiring
$

See, nothing was found. Another example:

$ locate libwir
/usr/lib64/libwireshark.so.8
/usr/lib64/libwireshark.so.8.1.11
/usr/lib64/libwiretap.so.6
/usr/lib64/libwiretap.so.6.0.11

Eef Hartman
Dec 14, 2017, 11:57:26 AM
R.Wieser <add...@not.available> wrote:
> Its really just a question of how the system translates that "wiringPi"
> argument into the (I'm guessing here) "libwiringPi.so" that I was able to
> find. Just an automatic pre- and postfixing of "lib" and ".so", or is
> something more going on ?

1) ld (the linker/loader) looks for both lib<whatever>.a and
lib<whatever>.so in the standard library directories, or in the
directories given by -L<dirname> options (note the CAPITAL L).
The standard directories normally are /lib, /usr/lib and /usr/local/lib,
but on 64-bit systems these can be followed by "64", so /lib64 etc.
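
Those breadcrumbs can be followed from the command line. Two ways to ask
the toolchain where "-l wiringPi" actually resolves (both are standard
gcc/glibc tools; wiringPi is just the example name from this thread):

```shell
# Ask gcc where -lwiringPi would resolve: it prints a full path when the
# library is found in its search directories, and just echoes the bare
# file name back when it is not.
gcc -print-file-name=libwiringPi.so

# List every shared library in the runtime linker's cache whose name
# matches (prints nothing if the library is not in the cache).
ldconfig -p | grep -i wiringpi
```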

>> #include $header_file

No, #include looks for the header (.h) file FOR the definitions of
types and functions, included IN those libraries (and these headers
are searched for in /usr/include).

> something similar, like "#includelib" was available though).

The problem is, the # commands are processed BEFORE the C compiler
(by cpp, the C pre-processor) while the -l (and -L) options are passed
on to ld and only processed AFTER the C compiler has exited.
The resulting .o file doesn't have anything like "which libraries are
needed", just a bunch of unresolved references.
The task of ld is to link all those .o's (there can be a lot of them)
together and then resolve all the remaining references from the
specified libraries (of which only the standard C library is implied;
even the math one HAS to be specified as -lm).
So, yes, there can be a LOT of -l options that are needed for a
certain program.

For instance, gcc itself uses -lm, and e.g. the vim editor has the
following list of (shared) libraries:
$ ldd /usr/bin/vim
linux-gate.so.1 (0xffffe000)
libm.so.6 => /lib/libm.so.6 (0xb7828000)
libncurses.so.5 => /lib/libncurses.so.5 (0xb77da000)
libacl.so.1 => /lib/libacl.so.1 (0xb77d1000)
libgpm.so.2 => /lib/libgpm.so.2 (0xb77cb000)
libdl.so.2 => /lib/libdl.so.2 (0xb77c5000)
libperl.so => /usr/lib/perl5/CORE/libperl.so (0xb7638000)
libpthread.so.0 => /lib/libpthread.so.0 (0xb761e000)
libc.so.6 => /lib/libc.so.6 (0xb7492000)
libpython2.7.so.1.0 => /usr/lib/libpython2.7.so.1.0 (0xb7298000)
/lib/ld-linux.so.2 (0xb789e000)
libattr.so.1 => /lib/libattr.so.1 (0xb7292000)
libnsl.so.1 => /lib/libnsl.so.1 (0xb7277000)
libcrypt.so.1 => /lib/libcrypt.so.1 (0xb7244000)
libutil.so.1 => /lib/libutil.so.1 (0xb7240000)
from which linux-gate.so.1 and /lib/ld-linux.so.2 are standard (they
are always needed for shared object support) and libc.so.6 is the
standard C library. All the rest did need -l options to be included
into the resulting executable.
You see that the ldd command does give the pathnames of the shared
libraries found for this executable.

To automate the build you either need a custom shell script for the
executable you want to build or a makefile (which is close to that but
can be used for multiple executables instead of just a single one).

For a single executable a makefile can be as simple as

executable: <list of .o files>
	gcc <same list> -l<library> ... $(LDFLAGS) -o <executable>

Make will normally create the .o files automatically from the same-named
.c files (if they don't exist yet).
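
Filled in for a concrete (hypothetical) two-file wiringPi program, that
boilerplate shrinks to something like this; note that the recipe line
must start with a real tab character:

```make
# Minimal Makefile for a program "blink" built from blink.c and gpio.c
# (hypothetical file names; -lwiringPi as discussed in this thread).
blink: blink.o gpio.o
	gcc blink.o gpio.o -lwiringPi -o blink
```

Make's built-in rules turn each .c into its .o automatically, so nothing
else is needed.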

R.Wieser
Dec 14, 2017, 12:14:35 PM
Dan,

>> Also, how would that effectivily be different from my #3 point (using
>> an @file) ?
>
> You customize the script (the makefile) per project (or, if necessary,
> per source file that you need to compile).

Again, how would that be DIFFERENT from using an @file ?

> good news, Linux is not windows; and the shortcomings of that OS
> do not necessarily encumber this one.

Thanks for that wisdom. Now, how is that related to "make" ? I hope you
do not mean to insinuate that "make" compiled on Windows is most likely to
be inferior to the same compiled on Linux ... In that case you better have
a talk with whomever designed it ("make" I mean). :-D

Regards,
Rudy Wieser


ray carter
Dec 14, 2017, 12:29:29 PM
Suggest you try comp.sys.raspberry-pi.

Dan Purgert
Dec 14, 2017, 12:35:30 PM

R.Wieser wrote:
> Dan,
>
>> What does `locate wiringPi.so' result in?
>
> "bash: locate: command not found" :-( :-)

mlocate is a godsend :)
>
> I did search it thru the file browser though, and was unable to find
> it (in the /usr tree) (hence my question).
>
>> If I'm understanding the question right, it should be something
>> along the lines of:
>>
>> #include $header_file
>
> Yes. But that is for the header (.h), not the libary (.so) file (wish
> something similar, like "#includelib" was available though).

Ugh, it's been far too long since I've done C (and I was bad back then
too). As I recall though, you kind of had two options for function
declarations.

(1) Include the whole thing in your code, e.g.

/* code here */

int myFunction (int input1, int input2) {
    /* do stuff here */
    return result;
}

/* more code here */

int main (void)

This works, but every time we want to use 'myFunction', we've got to
copy/paste the code into our new project. So, this leads us to:

(2) Write a library, which consists of two components
(2a) The library itself (e.g. myio.c), containing all the functions
and the actual code to do stuff.
(2b) The header file (e.g. myio.h), which provides a listing of the
functions, and some information on "how" they work.

Now, a lot of the rest of this is a few minutes of google, and restating
what I've found in the manner I understand it. I'm quite possibly way
off the mark.

Anyway, since we've decided a library is the way to go, our next
decision is whether to make it a static library ('libmyio.a') or a
shared library ('libmyio.so'). The biggest difference I found between them
is that

- a static library will pull the requisite code into the binary at
compile-time, as if you had put the function into the binary instead
of using a library.

- a shared library will NOT pull the requisite code into the binary,
but simply have a pointer saying "go over here and do this".

There are other differences as well, but they seem to require an
understanding of C that I myself do not possess.

>
>> Sounds like you'll want to use a makefile for this one.
>
> Nope. All I want is a *simple* system to designate which library
> files the source should be linked against. "make" is rather
> overpowered (and not to mention: complex) for that.

As complex as make may be, as I recall it is less burdensome than having
to try remembering all of the compile-time switches to get something
done (especially if it's been a while since you compiled a specific
program). But if you don't want to do that, then I suppose you're "stuck"
with writing the options out at compile time, or perhaps writing a shell
script to handle the commands.



J.O. Aho
Dec 14, 2017, 12:49:54 PM
On 12/14/17 14:40, Bit Twister wrote:
> On Thu, 14 Dec 2017 13:53:39 +0100, R.Wieser wrote:
>> Hello all,
>>
>> I'm currently wetting my feet with writing programs in C on a RPi3, using
>> Geany as the editor. But after having written a few programs I've got a
>> couple of questions:
>>
>>
>> 3) I've found (on the 'Web) that gcc accepts @file references (holding extra
>> aguments), but when I use such files they must exist or gcc will error out.
>> As not all of my programs need extra libraries I don't want to add such an
>> (empty!) file for each of them. In other words, can I somehow tell gcc (or
>> bash!) to use (suply) that file, but only if it exists ?
>
> I can suggest creating a Makefile which describes what all needs to be
> done to generate the target binary.

It may be a lot easier to use cmake instead; it makes using make a lot
easier for everyone.

--

//Aho

R.Wieser
Dec 14, 2017, 2:02:53 PM
Eef,

> No, #include looks for the header (.h) file FOR the definitions
> of types and functions, included IN those libraries

Yes, that's a bit odd as an answer, as he already has shown that the "-l"
argument is the way to specify libraries ... Also, I never mentioned a
problem with header files.

> The problem is, the # commands are processed BEFORE the C
> compiler (by cpp, the C pre-processor) while the -l (and -L) options
> are passed on to ld and only processed AFTER the C compiler has
> exited.

:-) It was an example. Change that "#" to anything else, and the question
would remain the same.

Also, gcc calls the linker, so gcc has full control over what gets delivered
to it, and thus is in the perfect position to generate such arguments from
stuff it could find in the provided sourcefile.

... That it doesn't (seem to) do so is a whole other problem.

> The task of ld is to link all those .o's (there can be a lot of them)

True. And when that moment comes I will probably (again) take another look
at "make". But at the moment I, nor Geany, have any use for it.

> To automatize the build you either need a custom shell script for
> the executable you want to build

Nope. Yes, you would need a shell script. Nope, it would not need to be
specifically for a single program. In the exact same way as "make" does
not need to be tailored to a specific program (mind you, I said "make", not
"makefile").

The whole difference would be that with a shellscript I could read the
embedded libraries info (and possibly also references to other object files)
out of the source file, which is handy for small projects (with multiple of
them in a single folder). Also, the shellscript could easily detect the
presence of a special file holding makefile contents, meaning it could invoke
"make" with that file when it does -- thereby having the best of both
worlds.

Regards,
Rudy Wieser


William Unruh
Dec 14, 2017, 2:17:53 PM
On 2017-12-14, R.Wieser <add...@not.available> wrote:
> Hello all,
>
> I'm currently wetting my feet with writing programs in C on a RPi3, using
> Geany as the editor. But after having written a few programs I've got a
> couple of questions:
>
> 1) For a certain program I had to add "-l wiringPi" to the 'build' command.
> The problem (to me) is that I can't seem to find any "wiringPi.so" file.
> So, what gives ? How does this at all work (how can I follow the
> breadcrumbs to the actually-used library file) ?

Well, you will need to find a Debian library package that supplies wiringPi

>
> 2) I've now added three library references (GTK, ALSA and wiringPi) to the
> build command, and am not liking that at all - I could imagine that after a
> year or so I will have a whole slew if them there ... :-( So, is there
> maybe a way to include the library references to the sourcefile ? It
> would certainly make life easier.

Not sure what your question is. If you have them on your system, then they
will be found.

>
> 3) I've found (on the 'Web) that gcc accepts @file references (holding extra
> aguments), but when I use such files they must exist or gcc will error out.

Of course. You told it to read such a file, and if it is not there, gcc cannot
guess what you wanted.

> As not all of my programs need extra libraries I don't want to add such an
> (empty!) file for each of them. In other words, can I somehow tell gcc (or
> bash!) to use (suply) that file, but only if it exists ?

It cannot supply that file. It just uses it. And you only need to tell gcc
to use it when you want to use it.

William Unruh
Dec 14, 2017, 2:21:09 PM
On 2017-12-14, R.Wieser <add...@not.available> wrote:
> Dan,
>
>>> Also, how would that effectivily be different from my #3 point (using
>>> an @file) ?
>>
>> You customize the script (the makefile) per project (or, if necessary,
>> per source file that you need to compile).
>
> Again, how would that be DIFFERENT from using an @file ?

Because it is far more powerful and flexible. It calls gcc. It is not
something that runs under gcc.

>
>> good news, Linux is not windows; and the shortcomings of that OS
>> do not necessarily encumber this one.
>
> Thanks for that wisdom. Now, how is that related to "make" ? I hope you
> do not mean to insinuate that "make" compiled on Windows is most likely to
> be inferior to the same compiled on Linux ... In that case you better have
> a talk with whomever designed it ("make" I mean). :-D

I think we will leave you to talk with Microsoft as to their implementation
of make.

You thought that it was somehow universal?
>
> Regards,
> Rudy Wieser
>
>

R.Wieser
Dec 14, 2017, 2:22:56 PM
Ray,

> Suggest you try comp.sys.raspberry-pi.

I already tried. They seem to think that helping someone consists of
repeating "read the documentation" ad nauseam, while refusing to even hint
at *which* documentation that should be.

Not funny, especially not when you are just starting with your RPi3, and
have absolutely *no clue* as to what needs to be done to get something to
work (in my case, what I needed to do to be able to create my very first GUI
program). Very frustrating too.

As a result I question the reason for that newsgroup's existence. They
could replace (pretty much) everyone there with a simple "read the
documentation!" replybot and be done with it.

Also, the problem isn't RPi3 related (I would not dream of asking about how
to program GPIO here), but just concerns some Debian (from which Raspbian is
a direct child) tools.

Remarkable though, that with the success of the Raspberry there is just
*one* newsgroup that deals with it. Just as remarkable that there are so
few general Linux programming newsgroups, actually.

Regards,
Rudy Wieser


William Unruh
Dec 14, 2017, 2:24:26 PM
On 2017-12-14, R.Wieser <add...@not.available> wrote:
> Dan,
>
>> Did you perhaps forget to install one of the -dev packages,
>
> I'm sorry that I was not clear about it, but he program compiles with that
> library designation (without it it doesn't. Yeah, I actually tried it.
>:-) ).
>
> Its really just a question of how the system translates that "wiringPi"
> argument into the (I'm guessing here) "libwiringPi.so" that I was able to
> find. Just an automatic pre- and postfixing of "lib" and ".so", or is
> something more going on ?

Nope, nothing more than that. So you do have a libwiringPi.so file?


>
>> What does `locate wiringPi.so' result in?
>
> "bash: locate: command not found" :-( :-)

Install locate.


>
> I did search it thru the file browser though, and was unable to find it (in
> the /usr tree) (hence my question).

It is a library so it would be in one of the lib/ directories (/usr/lib or
/lib)


>
>> If I'm understanding the question right, it should be something
>> along the lines of:
>>
>> #include $header_file
>
> Yes. But that is for the header (.h), not the libary (.so) file (wish
> something similar, like "#includelib" was available though).
>
>> Sounds like you'll want to use a makefile for this one.
>
> Nope. All I want is a *simple* system to designate which library files the
> source should be linked against. "make" is rather overpowered (and not to
> mention: complex) for that.

Makefile IS a "simple" system.

>

R.Wieser
Dec 14, 2017, 3:00:13 PM
Dan,

> mlocate is a godsend :)

I've not heard anything which makes it different, let alone better, than
what I've already been using (the "find file" option in my file browser).
As such I cannot agree (nor disagree) with you (or others).

It also means that I have no reason to even take a look at it (other than
that you guys think it's the best thing since sliced bread - but still
having to second-guess it) ...

>> something similar, like "#includelib" was available though).
>
> Ugh, it's been far too long since I've done C (

My apologies, I expressed myself badly there. :-\

I meant that line as an indication of which library/ies gcc could/should
(internally) generate "-l" arguments for (which it then could pass to the
linker). It's not at all about using static or dynamic libraries.

Probably as I'm way more acquainted with the way (my environment in) Windows
does it: A .INC file for the definitions, and a .LIB file for the
translation between external function names and where to find them in the
.DLL files (after converting to an executable the first two are not needed
anymore).

> As complex as make may be, as I recall it is less burdensome than
> having to try remembering all of the compile-time switches to get
> something done

In other words, your makefiles are most likely carbon copies of each other,
with some minor work done on them to change the target, object and (perhaps)
library filenames.

In that case, why not put the (few) changing parts somewhere else, and put
the static parts in a - never to be looked at again - shellscript (or better
yet, a configuration file) ? :-)

> But if you don't want to do that, then suppose you're "stuck" with
> writing the options out at compile time,

Not really. That is what that "build" command(line) in the Editor (Geany)
is for: to put all the static stuff in there, with placeholders for the
different filenames (in, out and intermediate object files).

Regards,
Rudy Wieser


R.Wieser
Dec 14, 2017, 3:20:51 PM
William,

> Well, you will need to find a Debian library package that supplies
> wiringPi

That's not the problem; those libraries were pre-installed.

The problem is that, to be able to successfully compile a program which uses
wiringPi functions, I have to provide a "-l wiringPi" argument to gcc, and
yet *I* cannot find any "wiringPi.so" (or .o) file anywhere on my machine.

My question (therefore) is, how can that be ? If it's actually there,
where is it ? Or is something else going on, and is it perhaps actually
referring to a "libwiringPi.so" file ? And in that case, how does that
happen ?

> Of course. You told it to read such a file, and if it is not there, gcc
> cannot
<strike>guess what you wanted.</strike> read it.

Duh!

> It cannot supply that file.

That's not what I asked. Read again:
[quote]
In other words, can I somehow tell gcc to use that file, but only
if it exists ?
[/quote]
(I removed the "bash" references, maybe that makes it easier to read)

Regards,
Rudy Wieser


William Unruh
Dec 14, 2017, 3:39:29 PM
On 2017-12-14, R.Wieser <add...@not.available> wrote:
> William,
>
>> Well, you will need to find a Debian library package that supplies
>> wiringPi
>
> Thats not the problem, those libraries where pre-installed.
>
> The problem is that, to be able to successfully compile a program which uses
> wiringPi functions I have to provide a "-l wiringPi" argument to gcc, and
> even though it is, *I* cannot find any "wiringPi.so" (or .o) file anywhere
> on my machine.

And as you discovered, that is because that directive looks for a file
called libwiringPi.so.
I.e. it attaches a "lib" to the front.

The library referred to by name "z" is in the file libz.so or libz.a



>
> My question (therefore) is, how can that be ? If its actually there,
> where is it ? Or is something else going on, and is it perhaps actually
> referring to a "libwiring PI.so" file ? And in that case, how does that
> happen ?

libwiringPi is the file containing the library wiringPi.

>
>> Of course. You told it to read such a file, and if it is not there, gcc
>> cannot
><strike>guess what you wanted.</stike> read it.
>
> Duh!
>
>> It cannot supply that file.
>
> Thats not what I asked. Read again:
> [quote]
> In other words, can I somehow tell gcc to use that file, but only
> if it exists ?
> [/quote]

No, that directive says "read that file"; it does not say "read that file if
it exists". Note that you, presumably, when you type the gcc command, know
whether or not that file exists.

William Unruh
Dec 14, 2017, 3:45:05 PM
On 2017-12-14, R.Wieser <add...@not.available> wrote:
> Dan,
>
>> mlocate is a godsend :)
>
> I've not heard anything which makes it different, let alone better, than
> what I've already been using (the "find file" option in my filebrowser). As

It looks for names which contain the argument, not names which are exactly
the same as the argument.


> such I cannot agree (nor disagree) with you (or others).
>
> It also means that I have no reason to even take a look at it (other than
> why you guys think its the best since sliced bread - but still having to
> second-guess it) ...

You do not have to take advice. But do not be surprised if people stop
offering advice of any kind if you refuse to listen to them.

And your getting snarky does not help your case. You are starting to act like
a troll.

>
>>> something similar, like "#includelib" was available though).
>>
>> Ugh, it's been far too long since I've done C (
>
> My apologies, I expressed myself badly there. :-\
>
> I ment that line as an indication of which library/ies gcc c|should
> (internally) generate "-l" arguments for (which it than could pass to the
> linker). Its not at all about using static or dynamic libraries.
>
> Probably as I'm way more aquainted with the way (my environment in) Windows
> does it: A .INC file for the definitions, and a .LIB file for the
> translation between external function names and where to find them in the
> .DLL files (After converting to an executable the first two are not needed
> anymore).

Yes. Being used to Windows is an impediment.


>
>> As complex as make may be, as I recall it is less burdensome than
>> having to try remembering all of the compile-time switches to get
>> something done
>
> In other words, your makefiles are most likely carbon copies of each other,
> with some minor work done on them to change the target, object and (perhaps)
> library filenames.

No. Makefiles can have all kinds of stuff in them. They are rarely copies of
each other. However an attempt was made to start you off simply.


>
> In that case, why not put the (few) changing parts somewhere else, and put
> the static parts in a - never to be looked again - shellscript (or better
> yet, a configuration file) ? :-)

Why not try to learn something before wandering off into all kinds of
hypotheticals.


>
>> But if you don't want to do that, then suppose you're "stuck" with
>> writing the options out at compile time,
>
> Not really. That is what that "build" command(line) in the Editor (Geany)
> is for: to put all the static stuff in there, with placeholders for the
> different filenames (in,out and intermediate object files).

Again, if you do not want advice, why ask for it? I suspect that none of us use
Geany. So learning how to use it is up to you.

>
> Regards,
> Rudy Wieser
>
>

R.Wieser

unread,
Dec 14, 2017, 4:00:33 PM12/14/17
to
William,

> Because it is far more powerful and flexible.

That's not an answer to the question I asked, I'm afraid. And I could not
care less if that program could fly my granny twice around the world in the
blink of an eye either, as I have a) no need for that b) do not have a
granny.

Also, as I've mentioned a few times now, I think that makefiles are *way* too
complex to be created easily. And no, I said *created*, not copy-pasted.

> I think we will leave you to talk with Microsoft as to their
> implimentation of make.

MS was by far not the only company which wrote such a program, I'm afraid ...

> You thought that it was somehow universal?

Which "it" do you mean ?  Taxes ?  Stupidity perhaps ?  I think they both
are, but am not fully sure about the first ... :-p (yeah yeah, the
original joke was talking about time, I know it)

As for "it" (presuming the program "make") being universal ?  Don't be
daft!  Even Linux has several programs all with their own versions of it,
and then I do not even want to look at all the "obsolete" versions of
yesteryear or the different Linux versions and distributions.

Regards,
Rudy Wieser



R.Wieser

unread,
Dec 14, 2017, 4:19:10 PM12/14/17
to
William

> Nope. That is what it does. So you do have a libwiringPi.so file?

Would it matter ?  As you have made clear, what I think *could* happen,
according to you, doesn't.  The existence of that libwiringPi.so file is
then of no consequence to anyone, right ?

... or is it ? And if it is, should you not try to explain why AS THAT IS
WHAT I'VE BEEN ASKING ALL THIS TIME ?. And yes, I'm getting pissed-off.
How good of you to notice it.

> Install locate.

No. And I'm not going to explain the rather obvious *again* (hint: How did
I find the above library ?).

> It is a library so it would be in one of the lib/ directories (/usr/lib
> or /lib)

Only one of those two ? Even under /usr I found at least four. Under / I
found .... *way* more than twenty.

> Makefile IS a "simple" system.

Just like the moonshuttle is just a "simple" rocket I presume.

Regards,
Rudy Wieser


Eef Hartman

unread,
Dec 14, 2017, 7:16:25 PM12/14/17
to
R.Wieser <add...@not.available> wrote:
> Yes, thats bit odd as an answer, as he already has shown that the "-l"
> argument is the way to specify libraries ...

This was just to counter someone else's argument that the OP should
use #include, which _is_ something different.

> Also, gcc calls the linker, so gcc has full control over what gets

But gcc does NOT read the source file. It just calls cpp, then the C
compiler itself on the OUTPUT of that, and optionally the linker ld, if
you did NOT supply the -c (compile only) argument on the gcc commandline.
To the latter it supplies all of the .o files and other options for
the linker, as supplied on the command line.

> to it, and thus is in the perfect position to generate such arguments
> from stuff it could find in the provided sourcefile.

As I said, the gcc "driver" is just like a script: it doesn't read the
source (and often, with multiple modules, it doesn't even compile but
just puts all the .o files in the commandline for the linker).

If gcc needed the source, you would also need to have the source
of ALL the functions in the libraries (which essentially are just a
bunch of .o files joined together) present, because library functions
CAN call other functions, which may be in other libraries.
That's also why most libraries have a corresponding header file (.h)
to tell the actual compiler the types and headers of the functions
you want to make use of.

For instance, look up the man page for the sine function:
it will tell you to put
#include <math.h>
in the source AND to link the result with the -lm option:
#include <math.h>

double sin(double x);
float sinf(float x);
long double sinl(long double x);

Link with -lm.

BOTH are needed for the sine function to be usable by your program.
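The round trip can be seen in one small script (a sketch; the paths under /tmp are arbitrary). The header satisfies the compiler, -lm satisfies the linker:

```shell
# Write a tiny program that calls sin(); the header declares it,
# the -lm option lets ld find the definition in libm.
cat > /tmp/sine_demo.c <<'EOF'
#include <stdio.h>
#include <math.h>                 /* declaration, for the compiler */

int main(void)
{
    printf("%f\n", sin(0.0));     /* definition comes from libm    */
    return 0;
}
EOF
if command -v gcc >/dev/null 2>&1; then
    gcc -Wall /tmp/sine_demo.c -o /tmp/sine_demo -lm && /tmp/sine_demo
fi
```

Leaving out -lm typically fails at link time with an undefined reference to `sin` (unless the compiler folds the constant call away).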

Eef Hartman

unread,
Dec 14, 2017, 7:35:55 PM12/14/17
to
R.Wieser <add...@not.available> wrote:
> The whole difference would be that with a shellscript I could read the
> embedded libraries info (and possibly also references to other object
> files) out of the source file, which is handy for small projects

So that script would have to be smart enough to know that
#include <stdio.h>
#include <stdlib.h>
do NOT need an extra library (they are for the standard C library,
which ld will use anyway), while
#include <math.h>
DOES (and that the library is called libm.a or .so).

Also note that -l isn't just for .so files, it is for ANY library of
objects (actually it was originally only for .a "archives"; the .so
usage came later as programs were getting too large, so the most common
functions were shared between them and get loaded into RAM as a single
instance for all executables).

Note that the library name isn't always evident from the header and
sometimes the library does not NEED a header file of its own.

There are on most systems more than a thousand different shared objects
(that's what .so means: it is a library of functions, essentially .o
files, that can be shared by multiple executables).
The man page for the function(s) used will normally tell you which
(if any) -l option they will need.
The number of possible functions your program could be using is over
10,000 (for instance the functions of "wiringPi" are only for the
Raspberry Pi, other Linux systems normally won't have this library on
the system).
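For following the breadcrumbs the other way - from a finished executable back to the .so files it actually uses - two commands are worth knowing (a sketch; /bin/sh stands in for any dynamically linked program):

```shell
# ldd lists the shared objects an executable was linked against,
# with the path each one resolved to at load time.
if command -v ldd >/dev/null 2>&1; then
    ldd /bin/sh
fi

# gcc can also report where it would look for a given library file;
# if the library isn't found it just echoes the name back.
if command -v gcc >/dev/null 2>&1; then
    gcc -print-file-name=libm.so
fi
```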

William Unruh

unread,
Dec 14, 2017, 10:27:24 PM12/14/17
to
On 2017-12-14, R.Wieser <add...@not.available> wrote:
> William,
>
>> Because it is far more powerful and flexible.
>
> Thats not an answer to the question I asked I'm afraid. And I could not
> care less if that program could fly my granny twice around the world in the
> blink of an eye either, as I have a) no need for that b) do not have a
> granny.

Nor did my answer imply it could fly your granny anywhere.
"more powerful" does not mean "all powerful".

It allows you to set up compilation of a file or a set of files, and install
them where needed if the compilation is successful. You can always just do
everything by hand. But you asked how to automate the compiling of your
program. You were told how. You wanted more power than the command line gave
you.

You really are a troll.


R.Wieser

unread,
Dec 15, 2017, 2:56:16 AM12/15/17
to
William,

>> Nope. That is what it does. So you do have a libwiringPi.so file?
>
> Would it matter ? As you have made clear, what I think *could* happen,
> according to you, doesn't.

I think I owe you an apology for this, and the following outburst. I
misread what you said, and took it to mean its exact opposite. Hence my, no
doubt confusing, response. Sorry.

I could blame me getting tired or a few other causes, but that does not take
away that I was in the wrong.

Instead, may I thank you for confirming my thoughts about what c|would be
happening (which, as all of this is new to me, I was not at all sure about).
It enables me to take a "good guess" at which library should be added when
only the headerfile is known (and I do not need to consult google for it).

Remarkable though that you are the first, and *only* one to respond to this,
with a supposedly known to everyone here, simple answer (it can't get much
simpler than "yes", no ? :-) ).

Regards,
Rudy Wieser


R.Wieser

unread,
Dec 15, 2017, 4:31:02 AM12/15/17
to
William,

> It looks for names which contain the argument, not names which are
> exactly the same as the argument.

What makes you think that the method I described does it differently ?
<huh?>

And yes, that is _exactly the same_ as how that search option in the filebrowser
I'm using does it.  It's also how I was able to search for "*wiringPi*", and
come up with "libwiringPi.so".

> You do not have to take advice.

Are you sure ?  Even though I've made *very* clear that I do not see
anything in using "make" I still get it thrown at me.  Somehow that feels
like "not accepting 'no' as an answer" to me ...  And yes, that angers me.

> But do not be surprized if people stop offering advice of any
> kind if you refuse to listen to them.

Read the above.  It would be like the pot having a problem with the kettle
being black (aka: hypocrisy).

> And your getting snarky does not help your case. You are starting
> to act like a troll.

Pray tell, what the *fuck* do I have to do to get the people with the "our
solution is the only one, no matter what you want or say" attitude off my back
instead of being "snarky" and "acting like a troll", and I will most likely try
it.

> Yes. being used to Windows is an impediment.

I translate that as "being used to something not being Linux is an
impediment". Which says more about you than it does about me ...

I could as easily accuse you of being a victim/active supporter of a
mono-culture, which gets strengthened by seemingly no one here being able to
think of *anything* else to respond to my question with than "make".

> No. Makefiled can have all kinds of stuff in them.

I did not say otherwise.

> They are rarely copies of each other.

Of course they are not.  As I already said, the filenames change.  And
sometimes you have more or fewer sourcefiles defined.  Everything else ?
Most likely stays the same, as the compiling environment (mostly) doesn't
change.

> Why not try to learn something before wandering off into all
> kinds of hypotheticals.

:-) How come you think what I said there was in any way hypothetical ?
It wasn't. As you could have read just below it (That is what that "build"
command(line) in the Editor (Geany) is for).

> Again if you do not want advice why ask for it.

"advice" eh ?  Is that what you call obnoxious pushing behaviour (while not
taking 'no' for an answer) ?

> I suspect that none of us use Geany.

:-) You're starting to learn.  If you had mentioned that earlier and/or
asked what I'm actually looking at, I could have posted some additional
info, which would possibly have enabled you to come up with an answer
that's actually tailored to my circumstances.

> So learing how to use it is up to you.

Nope.  The question was not about Geany at all.  It was about a "build"
option - which starts with "gcc -Wall", and which could have been, but for
the filenames, executed from the commandline -- which I assume you are
intimately acquainted with.

Regards,
Rudy Wieser

P.s.
I've added a bit of AWK to that above build option, and can now put the
needed library references in the sourcefile.  Problem solved.  Without an
"everything is a nail" 10-pound hammer, I might add.  :-)
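For the curious, one possible shape of such an extraction (the "//#lib:" marker is my own invention, not anything gcc understands; the demo file under /tmp is only for illustration):

```shell
# Keep the needed "-l" names in the source itself, one per marker line,
# and let AWK turn them into linker arguments on the build line.
cat > /tmp/demo.c <<'EOF'
//#lib: wiringPi
//#lib: m
#include <stdio.h>
int main(void) { return 0; }
EOF
libs=$(awk '/^\/\/#lib:/ { printf "-l%s ", $2 }' /tmp/demo.c)
# Echoed for illustration; drop the "echo" to actually compile.
echo gcc -Wall /tmp/demo.c -o /tmp/demo $libs
```

Sources without marker lines simply produce an empty $libs, so the same build line works for programs that need no extra libraries.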


R.Wieser

unread,
Dec 15, 2017, 4:49:58 AM12/15/17
to
Eef,

> But gcc does NOT read the source file, it just calls
> cpp the C compiler itself on the OUTPUT of that

That does not change anything. gcc would be in a perfect position to
either do it itself, or call another program to do it for it.

> When gcc would need the source you also would need to have the
> source of ALL the functions in the libraries

Why ? When I'm providing the library names on the commandline it does not
seem to need them, so why would it need to when I would store those library
names in the sourcefile (where gcc or some other program could extract them)
?

> in the source AND to link the result with the -lm option:

That's not something Geany (or any other "editor" using the same "invoke this
commandline" method) can do for anyone: the used arguments (but for the
filenames) are static.  But hey, if I can store library names in the
sourcefile (and I can) I'm quite confident that I can do the same for
options too.

Regards,
Rudy Wieser

P.s.
I solved my problem by using a bit of AWK. No "make", no external script.


R.Wieser

unread,
Dec 15, 2017, 4:56:49 AM12/15/17
to
Eef,

> So that script would have to be smart enough to know that [snip]

As I mentioned somewhere else in this thread, I was thinking of using
(something like) "#includelib" for it.  (Not my idea, I took my cue from
another programming environment.  <whistle> )

It would allow me to *not* include libraries for stuff like stdio, and also
include libraries for which no headerfile was included. In other words:
flexible.

Regards,
Rudy Wieser


R.Wieser

unread,
Dec 15, 2017, 5:13:56 AM12/15/17
to
William,

> Nor did my answer imply it could fly your granny anywhere.
> "more powerful" does not mean "all powerful".

Whoosh!

(I don't really think so, but if you want to present yourself as stupid,
then that is the answer you're getting).

> It allows you to set up compilation of a file or a set of files, and
> install them where needed if the compilation is successful.

How many times in this thread have I mentioned that I do not need anything
of that kind, because (currently) my projects are tiny ? And you *still*
keep on pressing your POV regardless ? Really ?

> You really are a troll.

Then what would you call someone who *keeps* forcing his POV onto someone
while *refusing* to accept a "no thank you, not for me" ?   Yes, I would
call someone like that (at least) an asshole too.  :-((

As mentioned, I used a bit of AWK and had my problem solved.

And that was all it was about, a problem which needed to be solved to *my*
satisfaction. Not yours, *mine*. The moment you even *think* that you
(somehow) have the right to demand that others do exactly whay you think is
best for them you've already crossed a line. :-(

Regards,
Rudy Wieser


Eef Hartman

unread,
Dec 15, 2017, 5:16:38 AM12/15/17
to
Eef Hartman <E.J.M....@gmail.com> wrote:
> But gcc does NOT read the source file, it just calls

By the way, there are several more frontend drivers like gcc in the GNU
compiler collection, like g++ (which will compile any C++ and c sources
and then call the linker) and gfortran (which will do the same for
Fortran source files, of course). But the linking part (and options)
are the same for all of them and you can even call ld directly if you
want/need to. The major difference between the different frontends is
which libraries they will include automatically in the linker command
line (g++ will always add a -lstdc++ option while gfortran uses
-lgfortran instead). If you have a project with mixed language sources
there is no driver which will automatically include all of the language
dependent libraries, so for e.g. gcc you will have to include both
-lstdc++ and -lgfortran (and possibly others) on the commandline.
Other languages in the GNU compiler collection (which is what gcc means)
are Ada (gnat), Objective C and Go.
From the home page:
The GNU Compiler Collection includes front ends for C, C++,
Objective-C, Fortran, Ada, and Go, as well as libraries for these
languages (libstdc++,...). GCC was originally written as the compiler
for the GNU operating system. The GNU system was developed to be 100%
free software, free in the sense that it respects the user's freedom.

Note that the whole of this is older than Linux (and not exclusively
geared TO it).

Eef Hartman

unread,
Dec 15, 2017, 5:54:35 AM12/15/17
to
R.Wieser <add...@not.available> wrote:
> Why ? When I'm providing the library names on the commandline it
> does not seem to need them, so why would it need to when I would store
> those library names in the sourcefile (where gcc or some other program
> could extract them)

An example: you're using Fast Fourier Transform from the fftw package
(which, by the way, has multiple shared objects) but those functions
may need the math library too. Now as gcc doesn't have the source of
fftw it cannot know for which routines the -lm option is needed, so
now you, in YOUR source file, will have to supply multiple "includelib"
directives to make up for that. The number of those directives is
exactly equal to the number of -l options the commandline would need.
So you just move your problem to another place: you still would need
to know exactly which libraries your program will need (and optionally
in what directories those libraries are located, when not in the
standard set: /lib, /usr/lib or /usr/local/lib).

BTW: this is from the ld man page:
The linker uses the following search paths to locate required shared
libraries:

1. Any directories specified by -rpath-link options.

2. Any directories specified by -rpath options.
The difference between -rpath and -rpath-link is that
directories specified by -rpath options are included in
the executable and used at runtime, whereas the
-rpath-link option is only effective at link time.
Searching -rpath in this way is only supported by native
linkers and cross linkers which have been configured with
the --with-sysroot option.

3. On an ELF system, for native linkers, if the -rpath and
-rpath-link options were not used, search the contents of
the environment variable "LD_RUN_PATH".

4. On SunOS, if the -rpath option was not used, search any
directories specified using -L options.

5. For a native linker, search the contents of the
environment variable "LD_LIBRARY_PATH".

6. For a native ELF linker, the directories in "DT_RUNPATH"
or "DT_RPATH" of a shared library are searched for shared
libraries needed by it. The "DT_RPATH" entries are ignored
if "DT_RUNPATH" entries exist.

7. The default directories, normally /lib and /usr/lib.

8. For a native linker on an ELF system, if the file
/etc/ld.so.conf exists, the list of directories found
in that file.

If the required shared library is not found, the linker will issue a
warning and continue with the link.


PS: the Linux version of "ld" is a native ELF linker, unless you use
a cross-compiler implementation of gcc (then the cross linker for the
target system will be used, of course).

Note that /etc/ld.so.conf is only for shared object files, not for
static libraries (.a files). LD_LIBRARY_PATH is for both.
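LD_LIBRARY_PATH is, for what it's worth, just an ordinary environment variable, usually unset until you export it yourself. A minimal sketch (the /opt/mylibs path is only an example):

```shell
# Show the current value (empty when the variable was never set).
echo "before: '${LD_LIBRARY_PATH:-}'"

# Prepend an extra search directory and export it, so child processes
# (and the dynamic loader of anything started from this shell) see it.
export LD_LIBRARY_PATH="/opt/mylibs${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
echo "after:  '$LD_LIBRARY_PATH'"
```

Set this way it only lasts for the current shell session; a permanent setting would go in a shell startup file.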

Dan Purgert

unread,
Dec 15, 2017, 10:01:56 AM12/15/17
to
Eef Hartman wrote:
> R.Wieser <add...@not.available> wrote:
>> Yes, thats bit odd as an answer, as he already has shown that the "-l"
>> argument is the way to specify libraries ...
>
> This was just to counter someone else's argument that the OP should
> use #include, which _is_ something different.

Mine, since I (obviously) got things crosswise between '#include' and
'linking at compile time'. I really oughta go back and re-learn things.



Dan Purgert

unread,
Dec 15, 2017, 10:44:05 AM12/15/17
to
R.Wieser wrote:
> Dan,
>
>> mlocate is a godsend :)
>
> I've not heard anything which makes it different, let alone better, than
> what I've already been using (the "find file" option in my
> filebrowser). As such I cannot agree (nor disagree) with you (or
> others).

Short version: when using `find', you have to have some idea of where to
look.  On the other hand, `locate' keeps a database of all the files,
and spits back more of a "hey, maybe it's one of these" type answer.
Note that the database update may not necessarily be automatic (I *THINK*
most packages include a weekly cronjob, but I've run into systems where
a manual update was needed to make it work as well).
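The manual refresh mentioned above is a one-liner (it usually needs root; echoed here so nothing is actually touched):

```shell
# locate answers from a prebuilt database; updatedb rebuilds it.
# Drop the "echo" (and run as root) to refresh the database for real.
echo sudo updatedb
```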

As an example, "hmm, where was that stdio file again...?"

... maybe find?
dan@xps-linux: ~$ find . -name "*stdio*"
./.local/share/Steam/ubuntu12_32/filesystem_stdio.so
./.local/share/Steam/steamapps/common/Half-Life/filesystem_stdio_amd64.so
./.local/share/Steam/steamapps/common/Half-Life/filesystem_stdio.so

... maybe locate? [NOTE - swapped "/home/dan/" in the results here for
"~" in order to keep things neat]
dan@xps-linux: ~$ locate "*stdio*"
~/.local/share/Steam/steamapps/common/Half-Life/filesystem_stdio.so
~/.local/share/Steam/steamapps/common/Half-Life/filesystem_stdio_amd64.so
~/.local/share/Steam/ubuntu12_32/filesystem_stdio.so
/usr/include/stdio.h
/usr/include/stdio_ext.h
/usr/include/c++/4.9/cstdio
/usr/include/c++/4.9/ext/stdio_filebuf.h
/usr/include/c++/4.9/ext/stdio_sync_filebuf.h
/usr/include/c++/4.9/tr1/cstdio
/usr/include/c++/4.9/tr1/stdio.h
/usr/include/x86_64-linux-gnu/bits/stdio-ldbl.h
/usr/include/x86_64-linux-gnu/bits/stdio-lock.h
/usr/include/x86_64-linux-gnu/bits/stdio.h
/usr/include/x86_64-linux-gnu/bits/stdio2.h
/usr/include/x86_64-linux-gnu/bits/stdio_lim.h
/usr/lib/x86_64-linux-gnu/perl/5.20.2/CORE/nostdio.h
/usr/share/doc/libnet-ssleay-perl/examples/stdio_bulk.pl
/usr/share/man/man3/stdio.3.gz
/usr/share/man/man3/stdio_ext.3.gz
/usr/share/man/man3/unlocked_stdio.3.gz
/usr/share/man/man3/xdrstdio_create.3.gz


>
>> As complex as make may be, as I recall it is less burdensome than
>> having to try remembering all of the compile-time switches to get
>> something done
>
> In other words, your makefiles are most likely carbon copies of each
> other, with some minor work done on them to change the target, object
> and (perhaps) library filenames.
>
> In that case, why not put the (few) changing parts somewhere else, and
> put the static parts in a - never to be looked again - shellscript (or
> better yet, a configuration file) ? :-)

A makefile is precisely a config file for `make'. As I said, it's been
ages since I've done C. What little "help" I've tried to offer is
probably considerably less helpful than I'd imagined.

Sure, some stuff is static between projects (although, I think in my
case, most of the "static" stuff was the make targets moreso than the
compile-time options).

Come to think of it, a lot of larger projects (say gcc, for example)
also come with a "configure" script. I bet looking at one of those
would also help...


R.Wieser

unread,
Dec 15, 2017, 11:06:59 AM12/15/17
to
Eef,

> By the way, there are several more frontend drivers like gcc in
> the GNU compiler collection,

Thanks, I will keep them in mind.

> Note that the whole of this is older than Linux (and not
> exclusively geared TO it).

:-) Yup.

Regards,
Rudy Wieser


R.Wieser

unread,
Dec 15, 2017, 11:31:38 AM12/15/17
to
Eef,

> so now you in YOUR source file will have to supply multiple
> "includelib" directives as to make up for that.

I do not directly see a problem with that.

> So you just move your problem to another place

And which problem might that be ? Yes, placing those library references
does not magically free me of having to know which libraries I need or where
they are stored. But handing that off (or even trying to) wasn't my
intention.

What was was to be able to keep using the (pretty-much) commandline
arguments approach (just hidden behind the F9 key), while not having to edit
it every time I need another library - and end up with a whole slew, even
though the current program does not need them. In that regard I seem to
have been successfull.

> Note that /etc/ld.so.conf is only for shared object files, not for
> static libraries (.a files). LD_LIBRARY_PATH is for both.

I did not even know such a config file existed, thank you for mentioning it,
and its limitation. As for that LD_LIBRARY_PATH (environment?) variable,
where can I find it (how do I read and/or alter it) ?

... I'm still quite the newbie on Linux and C, don't you think ? :-)

Regards,
Rudy Wieser


William Unruh

unread,
Dec 15, 2017, 12:25:09 PM12/15/17
to
...
> P.s.
> I solved my problem by using a bit of AWK. No "make", no external script.

Of course you can use a dime to drive in a screw, but you are still better off
learning to use a screwdriver.

>
>

R.Wieser

unread,
Dec 15, 2017, 12:42:00 PM12/15/17
to
Dan,

> Short version, when using `find', you have to have some idea of
> where to look

:-) Not really a good argument, I'm afraid.

In the above case nothing stops anyone from starting the search at / --
which is the equivalent of the "I have *no* idea where it is, just search
*everywhere*".

In the case of that locate command (with its database), if you cannot
indicate where to start looking (one way or the other) you can easily be
inundated with lots of irrelevant results, because the results are not in a
specific subtree.

In other words, being able to specify where to start looking is, in my book,
a *good* thing.

Having said that, using a database of course speeds such a search up
considerably.  But as I disabled the equivalent on Windows (fast search) you
will probably know how important that is to me.  :-)

> A makefile is precisely a config file for `make'.

I know. As I've mentioned it, I've tried to use it years ago, on Windows.

> Come to think of it, a lot of larger projects (say gcc, for example)
> also come with a "configure" script. I bet looking at one of those
> would also help...

I don't think so, as those are written to encompass pretty much *every*
eventuality you could think of on a lot of different versions of Linux.
The sheer amount of irrelevant cases (to my current environment) would
probably just block me from being able to see what actually *is* important
(I think I've seen a few).

But pray tell, what is it actually meant for ?  I do not assume that it will
just generate makefiles out of thin air ...

Regards,
Rudy Wieser


R.Wieser

unread,
Dec 15, 2017, 1:05:01 PM12/15/17
to
William,

> Of course you can use a dime to drive in a screw, but you are still
> better off learning to use a screwdriver.

Too bad that the problem wasn't a screw, but you kept demanding that I
should use a screwdriver regardless ...

And as I have mentioned a few times (but gotten ignored *every time*),
*of course* you can try to use a 10-pound hammer to hit a small nail, but you
would be *way* better off choosing a bit smaller, lighter hammer, more
attuned to its task.

Regards,
Rudy Wieser




"William Unruh" <un...@invalid.ca> wrote in message
news:p110hj$h65$1...@dont-email.me...

Dan Purgert

unread,
Dec 15, 2017, 4:43:48 PM12/15/17
to
R.Wieser wrote:
> Dan,
>
>> Short version, when using `find', you have to have some idea of
>> where to look
>
>:-) not really a good argument I'm afraid.
>
> In the above case nothing stops anyone from starting the search at / --
> which is the equivalent of the "I have *no* idea where it is, just search
> *everywhere*".

Indeed, nothing stops you from using `find / [...]' However, that can
take quite a bit longer, not to mention having to also filter out all the
"Permission Denied" errors that you'll be getting.

time (find / -type f -name "*stdio*")

real 0m18.713s
user 0m0.684s
sys 0m1.316s

time (locate "*stdio*")

real 0m0.148s
user 0m0.148s
sys 0m0.000s




>
> In the case of that locate command (with its database), if you cannot
> indicate where to start looking (one way or the other) you can easily be
> innundated with lots of irrelevant results. Because the results are
> not in a specific subtree.

The whole point of 'locate' is "I have no idea where to look, or whether
the one I know about in /some/path is the correct (or only) file".

>
> In other words, being able to specify where to start looking is, in my
> book, a *good* thing.

Which is why we also have find. While they overlap in their
functionality to some degree, they each also have some features that
make them a better choice depending on the situation.

>
>> A makefile is precisely a config file for `make'.
>
> I know. As I've mentioned it, I've tried to use it years ago, on Windows.

Yes, and "on windows" is a poor choice of comparison to "on linux".
You're effectively comparing apples to the sun.
>
>> Come to think of it, a lot of larger projects (say gcc, for example)
>> also come with a "configure" script. I bet looking at one of those
>> would also help...
>
> I don't think so, as those are written to encompass pretty much *every*
> eventuality you could think of on a lot of different versions of Linux.
> The sheer ammount of of irrelevant cases (to my current environment)
> would probably just block me from being able to see what actually *is*
> important (I think I've seen a few).
>
> But pray tell, what is it actually ment for ? I do not assume that it
> will just generate makefiles outof thin air ...

What is what meant for?  'make' is a program that reads the makefile,
which in turn holds the recipes for doing things; make then executes them.

Grabbing a short example from the gnu make manual

foo.o : foo.c defs.h
[tab]cc -c -g foo.c

[NOTE, the first line of a recipe MUST start with a tab (or whatever is
defined in .RECIPEPREFIX); I'm explicitly writing it as [tab] so it is
not simply whitespace]

We have the target 'foo.o', which in turn has prerequisites 'foo.c' and
'defs.h'. Assuming that's all good, we finally run the recipe 'cc -c -g
foo.c'.

Yes - you can do all this on the commandline, why should you go through
the "trouble" to create makefiles? Well, let's say you've got a larger
project, with one of the final recipes being this:

objects := $(patsubst %.c,%.o,$(wildcard *.c))

foo: $(objects)
[tab]cc -o foo $(objects)

There's a couple of things going on here

objects := [...] <-- set the variable "objects" to the value [...]
$(patsubst %.c,%.o,$(wildcard *.c)) <-- look up every file that matches
the wildcard "*.c", and do pattern substitution to replace ".c" with
".o"

foo : $(objects) <-- the target "foo" has the prereq of whatever is
contained in the variable "$(objects)".

cc -o foo $(objects) <-- compile foo


If you only ever compile 'foo' once, you don't gain much. BUT, let's
say you have a problem in bar.o. When you fix it (or well, bar.c), you
would have to manually recompile bar.c AND also recompile the final
program 'foo'. Not too hard, but a step you have to remember
nonetheless; and a step that can be forgotten during rounds of
debugging.

With the makefile, you fix bar.c and re-run make. make recompiles bar.c
into bar.o, and then when it checks the recipe for foo, sees that one of
its components is now newer than the program, so automatically
re-compiles foo for you.



Jasen Betts

Dec 15, 2017, 7:01:08 PM
On 2017-12-15, Eef Hartman <E.J.M....@gmail.com> wrote:
> R.Wieser <add...@not.available> wrote:
>> The whole difference would be that with a shellscript I could read the
>> embedded libraries info (and possibly also references to other object
>> files) out of the source file, which is handy for small projects
>
> So that script would have to be smart enough to know that
> #include <stdio.h>
> #include <stdlib.h>
> do NOT need an extra library (they are for the standard C library,
> which ld will use anyway), while
> #include <math.h>
> DOES (and that the library is called libm.a or .so).


it would also need to know that on some platforms if you use %g in a
printf argument that -lm is also required etc....

> Note that the library name isn't always evident from the header and
> sometimes the library does not NEED a header file of its own.
>
> There are on most systems more than a 1000 different shared objects
> (that's what .so means, it is a library of functions. essentially .o
> files, that can be shared by multiple executables).
> The man page for the function(s) used will normally tell you which
> (if any) -l option they will need.

And then there's "pkg-config" which automates some of that across
different platforms.
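Both points above can be poked at from a shell. A sketch, with guards because none of these tools or libraries are guaranteed to be present (libm and ALSA are only examples):

```shell
# 1) How -lNAME resolves: the linker searches for libNAME.so / libNAME.a
#    in its search path. Peek at the candidates for -lm:
if command -v ldconfig >/dev/null 2>&1; then
  ldconfig -p | grep 'libm\.so' | head -n 2   # linker cache entries
fi
if command -v gcc >/dev/null 2>&1; then
  gcc -print-file-name=libm.so                # path gcc would use for -lm
fi
# 2) pkg-config emits compile and link flags for an installed package:
if command -v pkg-config >/dev/null 2>&1 && pkg-config --exists alsa 2>/dev/null; then
  alsa_flags=$(pkg-config --cflags --libs alsa)
else
  alsa_flags="-lasound"   # typical fallback, not verified here
fi
echo "cc myprog.c $alsa_flags"
```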



--
This email has not been checked by half-arsed antivirus software

Jasen Betts

Dec 15, 2017, 7:31:19 PM
On 2017-12-15, R.Wieser <add...@not.available> wrote:
> Eef,
>
>> So that script would have to be smart enough to know that [snip]
>
> As I mentioned somewhere else in this thread, I was thinking of using
> (something like) "#includelib" for it. (Not my idea, I took my cue from
> another programming environment. <whistle> )

I guess it could be useful in tiny projects, like with less than 10
source files.

The problem you have is that the developers of GCC aren't going to
want to add complexity that makes it harder for them to build GCC.
It also seems like a foot gun: most other stuff in a C file (and all
other preprocessor directives) has module scope, but this one would have
project-global scope with no module scope.

R.Wieser

Dec 16, 2017, 4:35:08 AM
Dan,

> Indeed, nothing stops you from using `find / [...]' However, that
> can take quite a bit longer,

Yes, I already mentioned that.

> not to mention having to also filter out all the
> "Permission Denied" errors that you'll be getting.

I did not get any when I did that (searching for all "lib" entries starting
at "/"). But as I also mentioned, I didn't use "find".

> Yes, and "on windows" is a poor choice of comparison
> to "on linux". You're effectively comparing apples to the
> sun.

Do you mean to tell me that C (or in this case, one of its basic tools)
actually *isn't* portable between different OSes ? :-D

In other words: It's the same tool, just on two different platforms. Do you
have *anything* to substantiate, for this tool, that stance of yours ?

Or is this just an "it isn't Linux, so it *must* be inferior" kind of
stance ? :-(

> What is what meant for? 'make' is a program that reads the makefile,

I *know* what make and its makefile are used for (as I've mentioned a few
times now). The subject there was the "configure script". What is its
function ? What does it *do* ?

Regards,
Rudy Wieser




"Dan Purgert" <d...@djph.net> wrote in message
news:slrnp38gf...@xps-linux.djph.net...

R.Wieser

Dec 16, 2017, 5:16:12 AM
Jasen,

> I guess it could be useful in tiny projects, like with less than
> 10 source files.

*That* many ? I was not even really thinking about more than *maybe* two
... :-)

But yes, if-and-when I arrive at working with bigger, multi-file projects
like that I will have to re-evaluate the tools used, and pick the best for
the requirements at that time. Which could well include using "make".

But as I do not really seem to get the hang of what "make" is doing
(chronologically speaking) with the info in a makefile (and it does not seem
to allow for some in-between scripting) I will probably put that off for as
long as I can.

> The problem you have is that the developers of GCC aren't going
> to want to add complexity that makes it harder for them to build
> GCC.

I was not really expecting that, I just threw it up as a "look at it this
way" blurb, as all other, more down-to-earth possibilities/suggestions
brought up by me seemed to fall upon deaf ears. :-(

And GCC didn't need to be altered at all; just wrapping it in a bit of
(bash) scripting would have been enough. In the end even that wasn't
needed though.
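The kind of wrapper hinted at here takes only a few lines of shell; a sketch, in which the "//link:" marker, the file name, and the libraries are all invented for the illustration:

```shell
# Scan a C source file for "//link: NAME" lines and turn each one into
# a -lNAME option, then show the gcc command that would be run.
dir=$(mktemp -d)
cd "$dir"
cat > blink.c <<'EOF'
//link: wiringPi
//link: m
int main(void) { return 0; }
EOF
libs=$(sed -n 's|^//link: *|-l|p' blink.c | tr '\n' ' ')
echo "gcc blink.c ${libs}-o blink"
```

Replacing the final echo with an actual gcc invocation would make it a working wrapper for single-file programs.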

Regards,
Rudy Wieser


Dan Purgert

Dec 16, 2017, 7:29:59 AM

R.Wieser wrote:
> Dan,
>
>> Yes, and "on windows" is a poor choice of comparison
>> to "on linux". You're effectively comparing apples to the
>> sun.
>
> Do you mean to tell me that C (or in this case, one of its basic tools)
> actually *isn't* portable between different OSes ? :-D

Yes. Because of "Windows", things will be different. If it's all "GNU
Make", then those differences will be kept to an absolute minimum.

However, if you're using different 'make' tools, for example:

- Linux -> GNU Make
- Windows -> Borland Make
- UNIX -> POSIX Make

You would need to compare and contrast their relevant manpages and other
documentation to see what those differences are, and how significant
they may be.

>
>> Or is this just an "it isn't Linux, so it *must* be inferior"
> kind of stance ? :-(

More of "it's Windows, the chances of you having actually used the same
make as is generally used in Linux are small."

>
>> What is what meant for? 'make' is a program that reads the makefile,
>
> I *know* what make and its makefile is used for (as I've mentioned a few
> times now). The subject there was "configure script". What is its
> function ? What does it *do* ?

Short version: it makes configuring the makefile automated (as different
systems, even on the same architecture, may have different shared libs;
not to mention writing for different architectures, etc.).
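What such a configure step boils down to can be sketched with a single crude probe (real configure scripts run hundreds of these and then rewrite the makefile accordingly; the HAVE_* naming follows autoconf convention):

```shell
# Probe: does a trivial program that includes <stdio.h> compile here?
# Record the answer the way configure would, as a HAVE_* flag.
dir=$(mktemp -d)
cd "$dir"
printf '#include <stdio.h>\nint main(void) { return 0; }\n' > conftest.c
if command -v cc >/dev/null 2>&1 && cc -o conftest conftest.c 2>/dev/null; then
  have_stdio=1
else
  have_stdio=0
fi
echo "HAVE_STDIO=$have_stdio"
```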



William Unruh

Dec 16, 2017, 12:52:10 PM
On 2017-12-16, R.Wieser <add...@not.available> wrote:
> Dan,
>
>> Indeed, nothing stops you from using `find / [...]' However, that
>> can take quite a bit longer,
>
> Yes, I already mentioned that.
>
>> not to mention having to also filter out all the
>> "Permission Denied" errors that you'll be getting.
>
> I did not get any when I did that (searching for all "lib" entries starting
> at "/"). But as I also mentioned, I didn't use "find".
>
>> Yes, and "on windows" is a poor choice of comparison
>> to "on linux". You're effectively comparing apples to the
>> sun.
>
> Do you mean to tell me that C (or in this case, one of its basic tools)
> actually *isn't* portable between different OSes ? :-D

No, it is not the same tool. It is two different development environments
that raced off in different directions in developing their tools, even while
using the same name. Unlike "Windows", the name "make" is not trademarked.

>
> In other words: It's the same tool, just on two different platforms. Do you
> have *anything* to substantiate, for this tool, that stance of yours ?

No it is two different tools with the same name and somewhat the same
functionality.

>
> Or is this just an "it isn't Linux, so it *must* be inferior" kind of
> stance ? :-(

No.

>
>> What is what meant for? 'make' is a program that reads the makefile,
>
> I *know* what make and its makefile is used for (as I've mentioned a few
> times now). The subject there was "configure script". What is its
> function ? What does it *do* ?

It sets up make files and compilation environments to use what is available on
the particular machine the program is being compiled on.

Jasen Betts

Dec 16, 2017, 4:01:10 PM
On 2017-12-16, R.Wieser <add...@not.available> wrote:
> Jasen,
>
>> I guess it could be useful in tiny projects, like with less than
>> 10 source files.
>
> *That* many ? I was not even really thinking about more than *maybe* two
> ... :-)


> But yes, if-and-when I arrive at working with bigger, multi-file projects
> like that I will have to re-evaluate the used tools, and pick the best for
> the requirements at that time. Which could well include using "make".

> But as I do not really seem to get a hang of what "make" is doing
> (chronologically speaking) with the info in a makefile (and does not seem to
> allow for some in-between scripting) I will probably put that off for as
> long as I can.

make is mostly a declarative language, and yeah, it can get complex.
It's not in any way limited to running compilers; what it does is
solve the problem of finding the path between the current filesystem state
and the output product without re-doing unnecessary work.

If you have less than 20000 lines of code to compile there's probably
not much to be gained by using make to reduce the recompile to only
the changed files.

>> The problem you have is that the developers of GCC aren't going
>> to want to add complexity that makes it harder for them to build
>> GCC.
>
> I was not really expecting that, I just threw it up as a "look at it this
> way" blurb, as all other, more down-to-earth possibilities/suggestions
> brought up by me seemed to fall upon deaf ears. :-(

I can see its utility in small projects but it seems to be contrary to
what the compiler does, and the way the linker actually works. See the
man page for ld.so, for example "LD_PRELOAD".

> And GCC didn't need to be altered at all, just wrapping it in a bit of
> (bash) scripting would have been enough. In the end even that wasn't
> needed though.

Sure, you can invent an "augmented C" language, the Qt folks seem
happy with their "augmented C++".

Richard Kettlewell

Dec 16, 2017, 5:47:39 PM
MSVC has had a feature looking rather like this for many years.

Issues with the approach include:

* The outcome of a link depends on the order and multiplicity of the
library inputs. Declaring library dependencies (only) within
translation units isn’t sufficient to specify this in a coherent
way.

This is soluble, sort of, but it would be a bit disruptive to the way
libraries normally work.

* It sometimes happens that one symbol may be defined in multiple
libraries, with multiple implementations, selected on a
per-application basis at link time.

This isn’t directly soluble; codebases using this approach would have
to be modified to use some other approach.

* Assuming the above issues and any others to be solved, the end state
still isn’t very good, with interface import based entirely in
textual inclusion, no systematic approach to avoiding name clashes,
etc. More modern languages generally do something better; I don’t
expect C to change in this regard in the foreseeable future.

--
https://www.greenend.org.uk/rjk/

R.Wieser

Dec 18, 2017, 4:07:39 AM
Dan,

> Yes. Because of "Windows", things will be different.

Are you on purpose vague there ?

Yes, *of course* "things" will be different (like the OSes themselves). But
the question is, are those "things" of any relevance to the implementation
of the idea "make" tries to effect.

In that regard I currently do not see, nor have I seen you mention any.

> You would need to compare and contrast their relevant manpages
> and other documentation to see what those differences are, and
> how significant they may be.

Exactly.

>> Or is this just an "it isn't Linux, so it *must* be inferior"
>> kind of stance ? :-(
>
> More of "it's Windows, the chances of you having actually used the
> same make as is generally used in Linux are small."

I don't think you realized you effectively just agreed with my assessment of
your stance.

It's not how good (or bad) any other version on any OS is, it's just that it's
not the version (you are currently(!)) running on Linux. While I can
understand the sentiment, I don't think it's a healthy position from which
to discuss the merits of a program.

> Short version, it makes configuring the makefile automated

Yep, that's pretty much as I assumed. If only "make" had the smarts to
accept some kind of configuration file. :-p

> ... even on the same architecture may have different shared libs;

Ah, I see that Linux has its own version of "DLL hell" implemented. Well
done. :-D

Regards,
Rudy Wieser


R.Wieser

Dec 18, 2017, 5:13:17 AM
William,

> No it is not the same tool.

How the fuck would you know ? You have not even asked which package my
(yesteryear tried) "make" is part of.

http://gcc.gnu.org/install/binaries.html
https://en.wikipedia.org/wiki/MinGW

And yes, that is two references to the gcc suite having been ported to
windows.

>> Or is this just an "it isn't Linux, so it *must* be inferior"
>> kind of stance ? :-(
>
> No.

Ok, "make" on another OS isn't automatically inferior. Then why did my
experience with "make" on Windows get offhandedly rejected ?

In other words, just a "No" doesn't quite cut it here. :-(

And don't say "because of the difference(s)", as I will then point to other
and older (now obsoleted) versions of "make" on (flavors of) Linux.

Or is it just a simple "it's not whatever version I'm currently using"
rejection (which would be almost as bad) ?

Also, it would be a good idea to point out show-stopping differences in
how make/a makefile works under Windows versus on Linux. Who knows, I
could even *learn* something from it. :-)

By the way, I just tried a quick "makefile Windows vs Linux" google, and was
already on page 4 and still found nothing of the kind. Maybe the
differences aren't as big as you think they are ....

> It sets up make files and compilation environments to use what is
> available on the particular machine the program is being compiled on.

While that is a good blurb for a presentation to your higher management, for
a fellow programmer (like me) it's devoid of any actual meaning. Simple
counter-question: Why doesn't "make" do that stuff by itself ?

And *please*, don't answer with "because it can't". That would be the
ob(li)vious answer, and still meaningless.

I can think of a few things, but mine are (still just) educated guesses,
while you are presenting yourself as being "in the know" ...

... Which I would gladly take advantage of. :-)

Regards,
Rudy Wieser


R.Wieser

Dec 18, 2017, 5:57:51 AM
Jasen,

>> what it does is solve the problem of finding the path between current
>> filesystem state and the output product without re-doing unnecessary
>> work.

You could put it that way I suppose. A simpler explanation: it skips
re-compiling sourcefiles which do not seem to have changed since the last
time. Feel free to point out important stuff I'm glossing over with it
though.

> I can see its utility in small projects

Which (currently, as a novice and using Geany) is still what I'm working
with.

>> but it seems to be contrary to what the compiler does,

Huh? The compiler has absolutely *zero* to do with it (other than being
the recipient of the extracted data, that is).

> Sure, you can invent an "augmented C" language, the Qt folks
> seem happy with their "augmented C++".

Do you think that a preprocessor handling, among others, a #include
directive is part of the C language ? I don't. And for the same reason I
do not think that having a second "preprocessor" extract some data from the
sourcefile is either. Not that this matters in any way though

There are two problems here: one is that the *only* answer to my question
seems to be "makefile" -- even though Geany uses a fancied-up version of a
gcc commandline invocation -- and the second that *anything* differing from
what has been done before (deviating from the "normal") is deemed to be bad.

And FYI, the "use this library" data I've added to the sourcefile does not
stop anyone from compiling the exact same file in their own environment.
Provided they mention the needed libraries elsewhere, of course.

Regards,
Rudy Wieser


Dan Purgert

Dec 18, 2017, 7:58:57 AM

R.Wieser wrote:
>> More of "it's Windows, the chances of you having actually used the
>> same make as is generally used in Linux are small."
>
> I don't think you realized you effectively just agreed with my
> assessment of your stance.

No, you completely missed what I said, and are thinking I validated what
you think my stance is.

Even though they share the same name ('make'), the programs are not
necessarily "the same".

>
> It's not how good (or bad) any other version on any OS is, it's just
> that it's not the version (you are currently(!)) running on Linux.
> While I can understand the sentiment, I don't think it's a healthy
> position from which to discuss the merits of a program.

Ignoring the underlying environment, and the changes to behavior it
requires - even for the same program written by the same author - is
even less of a "healthy position" to discuss "merits of a program".
And that's not even considering the possibility that while the names may
overlap, they were written by two different authors, and the resulting
programs have little, if any, commonality.



William Unruh

Dec 18, 2017, 12:28:36 PM
On 2017-12-18, R.Wieser <add...@not.available> wrote:
> Dan,
>
>> Yes. Because of "Windows", things will be different.
>
> Are you on purpose vague there ?
>
> Yes, *of course* "things" will be different (like the OSes themselves). But
> the question is, are those "things" of any relevance to the implementation
> of the idea "make" tries to effect.

Yes. Make on Windows is different from make on a GNU machine. The purpose is
more or less the same; the implementation is different. Just because two
things have the same name does not mean they are the same. A robin in England
is different from a robin in the US, although both have the same name.


>
> In that regard I currently do not see, nor have I seen you mention any.
>
>> You would need to compare and contrast their relevant manpages
>> and other documentation to see what those differences are, and
>> how significant they may be.
>
> Exactly.

So, do it.

>
>>> Or is this just an "it isn't Linux, so it *must* be inferior"
>>> kind of stance ? :-(
>>
>> More of "it's Windows, the chances of you having actually used the
>> same make as is generally used in Linux are small."
>
> I don't think you realized you effectively just agreed with my assessment of
> your stance.
>
> It's not how good (or bad) any other version on any OS is, it's just that it's
> not the version (you are currently(!)) running on Linux. While I can
> understand the sentiment, I don't think it's a healthy position from which
> to discuss the merits of a program.

It has nothing to do with merits or otherwise. They are different. Whether one
is better than another is another question. But they are different. That you
had experience with one 10 years ago does not mean that you know what the
other is like.


>
>> Short version, it makes configuring the makefile automated
>
> Yep, that's pretty much as I assumed. If only "make" had the smarts to
> accept some kind of configuration file. :-p

No. configure is another program, which is NOT part of make. It was written
to make it easy to transfer a makefile from one system to another which
might have libraries with different names, or which might have different
capabilities altogether. It is precisely needed because make on different
systems is not the same, and because the environment in which the
compilation takes place on different systems is not the same.
But it is a different program from "make".


>
>> ... even on the same architecture may have different shared libs;
>
> Ah, I see that Linux has their own version of "DLL hell" implemented. Well
> done. :-D

Whatever you want to call it. But again just because you have given it a name
does not mean you understand it. You seem to be in a magical frame of thought
where names and things have an identity, and if you know the name you know the
thing. It is simply not true of the real world of computer programming.

>
> Regards,
> Rudy Wieser
>
>

William Unruh

Dec 18, 2017, 12:29:30 PM
On 2017-12-18, R.Wieser <add...@not.available> wrote:
> William,
>
>> No it is not the same tool.
>
> How the fuck would you know ? You have not even asked which package my
> (yesteryear tried) "make" is part of.

It does not matter.

William Unruh

Dec 18, 2017, 12:37:46 PM
On 2017-12-18, R.Wieser <add...@not.available> wrote:
>
> There are two problems here: one is that the *only* answer to my question
> seems to be "makefile" -- even though Geany uses a fancied-up version of a
> gcc commandline invocation -- and the second that *anything* differing from
> what has been done before (deviating from the "normal") is deemed to be bad.

Nope. Makefile is NOT the only answer to your question. What you were doing
already is also an answer, and there are a million others as well, where you
for example write your own program to do a subset of what "make" was designed
to do. That seems to be the direction you are comfortable with. Go ahead. Just
do not expect much help from anyone else.

You say: Here is the task I want to accomplish. Others say: here is a standard
tool to accomplish that task. You say: I do not like that tool, please give
me another one.
At that point there is little that anyone can say. It is the standard tool
that does, amongst other things, exactly what you want to accomplish, which is
why people have not found it worthwhile to write some other tool to do the
same thing badly, or in a very limited way. So do what you wish.
Or perhaps the task you want to accomplish is irrelevant and what you want to
do is to fight with others. All I could say to that is that it would be more
profitable for you to find another hobby.


Jasen Betts

Dec 18, 2017, 4:32:59 PM
On 2017-12-18, R.Wieser <add...@not.available> wrote:
> Jasen,
>
>> what it does is solve the proble of finding the path between current
>> filesystem state and the output product without re-doing unnecessary
>> work.
>
> You could put it that way I suppose.

> A simpler explanation: it skips
> re-compiling sourcefiles which do not seem to have changed since the last
> time. Feel free to point out important stuff I'm glossing over with it
> though.

Make is more general; it's not only used to compile source into
libraries and/or binaries. For example, the standard install of
sendmail uses make to rebuild the tables used to look up email aliases
and other data related to its operation.

make is a tool for the controlled conversion of files from one form to
another. It was designed to aid in the compilation of source, but has
found other uses.
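A toy version of that non-compiler use, in the same spirit as the sendmail aliases example (file names made up): a rule that rebuilds a sorted index only when the data file changes.

```shell
# A make rule that transforms data rather than compiling code.
dir=$(mktemp -d)
cd "$dir"
printf 'b\na\n' > names.txt
printf 'names.idx: names.txt\n\tsort names.txt > names.idx\n' > Makefile
if command -v make >/dev/null 2>&1; then
  make >/dev/null 2>&1
else
  sort names.txt > names.idx   # fallback when make is absent
fi
first=$(head -n 1 names.idx)
echo "$first"
```

Running make again after this does nothing until names.txt is touched, exactly as with object files.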

>> I can see its utility in small projects
>
> Which (currently, as a novice and using Geany) is still what I'm working
> with.
>
>> but it seems to be contrary to what the compiler does,
>
> Huh? The compiler has absolutely *zero* to do with it (other than being
> the recipient of the extracted data, that is).


I guess I missed something; when I get time I'll review this thread.


>> Sure, you can invent an "augmented C" language, the Qt folks
>> seem happy with their "augmented C++".
>
> Do you think that a preprocessor handling, among others, a #include
> directive is part of the C language ? I don't.

It seems to me that ISO does.

> And for the same reason I do not think that having a second
> "preprocessor" extract some data from the
> sourcefile is either. Not that this matters in any way though

yeah, not that it matters, but where does the stuff it extracts go?

> There are two problems here: one is that the *only* answer to my question
> seems to be "makefile" -- even though Geany uses a fancied-up version of a
> gcc commandline invocation -- and the second that *anything* differing from
> what has been done before (deviating from the "normal") is deemed to be bad.

I didn't say it was bad.

> And FYI, the "use this library" data I've added to the sourcefile does not
> stop anyone from compiling the exact same file in their own environment.
> Provided they mention the needed libraries elsewhere ofcourse.

It seems to me that it could cause confusion.

I can see it working well in single-module environments, but can't
see it working in multi-module builds.


And as a programmer I've grown to dislike unnecessary special handling
of edge cases.

Jasen Betts

Dec 18, 2017, 4:33:49 PM
There are also alternative build tools: "SCons", "cmake", "icmake".

I have a small project with 5 source files; I build and link it using a
shell script that just builds everything. Even with this inefficient
approach it takes less than 5 seconds to build, and I find shell easier to
debug than make.
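Such a build-everything script can be as small as the sketch below; the file names and the wiringPi library are made up, and a DRYRUN switch lets the commands be previewed without a compiler present:

```shell
#!/bin/sh
# Rebuild the whole project unconditionally. With DRYRUN set, the
# run() helper prints each command instead of executing it.
DRYRUN=1   # unset this to actually compile
run() { if [ -n "$DRYRUN" ]; then echo "$@"; else "$@"; fi; }
cmds=$(
  run cc -c -Wall main.c util.c io.c
  run cc -o myprog main.o util.o io.o -lwiringPi
)
echo "$cmds"
```

The obvious trade-off against make: every run recompiles everything, which is fine at this project size.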

Dan Purgert

Dec 18, 2017, 4:35:49 PM

William Unruh wrote:
> On 2017-12-18, R.Wieser <add...@not.available> wrote:
>> Dan,
>>
>>> Yes. Because of "Windows", things will be different.
>>
>> Are you on purpose vague there ?
>>
>> Yes, *ofcourse* "things" will be different (like the OSes themselves). But
>> the question is, are those "things" of any relevance to the implementation
>> of the idea "make" tries to effect.
>
> Yes. Make on Windows is different from make on a GNU machine. The
> purpose is more or less the same; the implementation is different.
> Just because two things have the same name does not mean they are the
> same. A robin in England is different from a robin in the US, although
> both have the same name.

TBH, you probably should've gone with an African Swallow or a European
Swallow. Otherwise, I agree wholeheartedly.



Dan Purgert

Dec 18, 2017, 4:40:11 PM

William Unruh wrote:
> On 2017-12-18, R.Wieser <add...@not.available> wrote:
>>
>> There are two problems here: one is that the *only* answer to my
>> question seems to be "makefile" -- even though Geany uses a
>> fancied-up version of a gcc commandline invocation -- and the second
>> that *anything* differing from what has been done before (deviating from
>> the "normal") is deemed to be bad.
>
> Nope. Makefile is NOT the only answer to your question. What you were
> doing already is also an answer, and there are a million others as
> well, where you for example write your own program to do a subset of
> what "make" was designed to do. That seems to be the direction you are
> comfortable with. Go ahead. Just do not expect much help from anyone
> else.
>
> [...]
> Or perhaps the task you want to accomplish is irrelevant and what you
> want to do is to fight with others. All I could say to that is that it
> would be more profitable for you to find another hobby.

On the other hand, perhaps what we have is an XY problem ...


R.Wieser

Dec 19, 2017, 2:30:11 AM
Dan,

> Even though they share the same name ('make'), the programs are
> not necessarily "the same".

True. The problem is that you're wielding that as if they *aren't* the
same, and thereby dinging the other version as being inferior. You have
provided zero substantiation for either. And I'm afraid that that simply
does not work for me, sorry.

Regards,
Rudy Wieser


R.Wieser

Dec 19, 2017, 3:44:40 AM
William,

> Yes. Make on Windows is different from make on a gnu machine.

You don't say ! :-)

> The purpose is <strike>more or less</strike> the same, ...

There, fixed that for you.

> ... the implementation is different

I have not yet seen you bring anything forward to substantiate that, I'm
afraid. Also, the "black box" principle: the contents of it (read: the
implementation) do not matter; what goes in and comes out does.

>>> You would need to compare and contrast their relevant manpages
>>> and other documentation to see what those differences are, and
>>> how significant they may be.
>>
>> Exactly.
>
> So, do it.

Ehrm ?? I mentioned that I had some experience with make, and someone (I'm
not sure who anymore) thought otherwise, because not all make versions
are considered equal. Isn't it *their* (your?) duty to come up with
something to substantiate that stance ?

In short, I could (currently, after having tried a couple of times to evoke
something more than vague nothingness) not care less whether they are. If
no one bothers to try to explain how the Linux version is
different/better than the Windows one, then I will simply not be tempted to
reconsider my position at this moment, or any time soon. Which is no skin off
my back, as my current solution works for me. Do you get my drift ?

> No. configure is another program, which is NOT part of make

Are you playing stupid here ? Yes, I KNOW they are not the same. The
(obvious) question is, why ?

> It is precisely needed because make on different systems
> is not the same.

Why ? All it seems to do is figure out which sourcefiles need to be
(re-)compiled, and in which order. Neither of those seems to be a biggie.

And it brings us back to the beginning: if there are so many different "make"
programs around, how come the one(s) on Windows (the one I said I had
tried to work with) is considered non-comparable/bad/you name it ? As it
turns out, according to your own words, *legions* of them are not the same
...


Or was the initial "that's on Windows, you can't compare it" just meant to be
that: there is *no* version of "make" you can compare to any other, as all
are different ? In that case how can *anyone* suggest that using "make"
is the (or THE) solution ? :-D



But let's cut this short shall we, this has gone on long enough. I'm going
to ignore (or at least try to do so) any posts which do not provide
something more than vague blurbs.

If you have something to substantiate the "can't be compared" stance you're
(very!) welcome to post it. Or even better, *how* the Windows versions
essentially differ from the (legion of) versions used on Linux. I could
actually learn something from that (which I certainly have not been doing
from the last number of posts in this thread :-( ).

Regards,
Rudy Wieser


R.Wieser

Dec 19, 2017, 4:26:01 AM
William,

> Nope. Makefile is NOT the only answer to your question.

Of that you can be certain. Also, DUH. I posted that I researched, and
implemented another one.

> Just do not expect much help from anyone else.

I won't anymore. You guys seem to think you can provide answers, but seem
to be unable (unwilling?) to adapt to the pupil's (my) circumstances/wishes.
:-(

> Your say: I do not like that tool, please give me another one.

No, I did not. I said that I didn't understand the chronology of what is
done by that tool, indicating that that is the reason why I didn't wish to
use it. Instead of trying to help me understand, all I got was "but you
*must* use it!" responses. :-(

> At that point there is little that anyone can say

Wrong. As you said yourself, it's not the *only* answer. So, where are the
others ? Other than a vague reference to bash scripting I do not remember
having seen *anything* in that regard.

> All I could say to that is that it would be more profitable to you to
> find another hobby

I don't think so. But it would be more profitable to me to talk to people
who actually try to *answer the question(s)*, instead of assuming they know
*way* better what the poster actually meant (hint: humans *cannot* read
minds) and/or thinking they are allowed to demand their "suggestions" be
followed to the letter.


Helping someone does *not* mean forcing your own beliefs upon him. It
means you provide him with the data to make an informed decision for
himself. If you are unable (or unwilling) to do so, then you should not be
attempting to answer questions to begin with.

Regards,
Rudy Wieser


R.Wieser

Dec 19, 2017, 5:13:17 AM
Jasen,

> Make is more general,

In what way ? I've still not heard anyone mention anything specific in
that regard, even though I have asked several times now ...

Mind you, that does not mean I do not recognise that it's probably the most
*used* method (and I will not even speculate about whether that is because
of usefulness, or just because of "it's there, let's just use it" laziness
:-) ).

> for example the standard install of sendmail uses make to rebuild
> the tables used to look up email aliases and onther data related to
> its operation

I have absolutely *no* idea how that would look in makefile language (read:
although I assume "make" can initiate it, it will probably simply call
another program to do the actual building, as it does for source-file
compilation).

> yeah not that it matters, but where does the stuff it extracts go?

Into the assembled command-line invocation of gcc, as "-l" arguments.

> I didn't say it was bad.

No, you did not. But this group certainly has made it abundantly clear
that using *anything* other than "make" is simply out of the question.
Guess what message that conveys ...

> It seems to me that it could cause confusion.

:-) Even a simple "hello world" program could cause confusion.

> I can see it working well in single-module environments, but
> can't see it working in multiple module builds.

My apologies, but as I've replied to the same concerns several times now I
have no desire to repeat it.

> And as a programmer I've grown to dislike unneccessary special
> handling of edge cases

As a programmer and electronics hobbyist I rather dislike having to use
tools *way* too big for a certain job (coming from someone who, long ago,
used a 100W soldering gun to solder 74xx-series DIL ICs onto experiment
(pertinax) board).

Also, I have zero intention to have a gazillion project folders, each
containing a single(!) source file and its accompanying makefile.

Heck, if "make" had been the *only* answer I would probably have renamed
the makefiles to something like "projectname.mke" (so I could put multiple
of them together with the source files in the same folder) and called make
with that filename as an argument. Which would be another "not as we are
accustomed to" edge case. :-) (and not one mentioned as a possibility
either ... :-( )

Regards,
Rudy Wieser


R.Wieser

Dec 19, 2017, 5:20:01 AM
Jasen,

> I have a small project with 5 source files, I build and link it
> using a shell script that just builds everything.

Which was also one of the possibilities I considered and mentioned.

> even with this inefficient approach

How so ? Perhaps you do not check whether the source file is newer than
the existing object file ? Otherwise it should do the same, and thus be as
fast.

> and I find shell easier to debug than make.

... Which is pretty much the same as what I mentioned as to why I didn't
use makefiles to begin with.

Regards,
Rudy Wieser


Eef Hartman

Dec 19, 2017, 8:28:09 AM
R.Wieser <add...@not.available> wrote:
> In what way ? I've still not heard anyone mention anything specific in
> that regard, even though I have asked several times now ...

You can "make" anything that is dependent on some sort of source, like
libraries, documents (TeX comes to mind), html pages from source text,
or whatever.
The general idea of make is: determine WHAT should be done, by doing
it by hand, and then encode that into a "recipe" in a makefile.

And about your question about "configure": that mostly is a script
which determines WHICH libraries you have, where they are located, and
whether certain features are supported by your compiler or kernel, and
then it mostly generates a "Makefile", geared to YOUR system.

The tools to create a configure script again are
autoconf:
GNU autoconf is an extensible package of m4 macros that produce shell
scripts to automatically configure software source code packages.
These scripts can adapt the packages to many kinds of UNIX-like
systems without manual user intervention. Autoconf creates a
configuration script for a package from a template file that lists the
operating system features that the package can use, in the form of m4
macro calls. You must install the "m4" package to be able to use
autoconf.

and automake:
Automake is a Makefile generator. It was inspired by the 4.4BSD
make and include files, but aims to be portable and to conform to the
GNU standards for Makefile variables and targets. Automake is a Perl
script. The input files are called Makefile.am. The output files are
called Makefile.in; they are intended for use with Autoconf. Automake
requires certain things to be done in your configure.in.
You must install the "m4" and "perl" packages to be able to use
automake.

Note that both are mainly for software developers and _they_ have to
create the generic templates: Makefile.am and configure.ac (or .in).
Even the header (.h) files for the compilation can be "configured".

Example: I use a mp3 player, distributed as source and IN that source
there are files:
config.h.in
mpg123.h.in
libmpg123.pc.in (for the libraries)
libout123.pc.in
mpg123.spec.in (.spec files are used for rpm-based distributions)
Makefile.in
but even the configure.ac and Makefile.am are included, as well as,
of course, the resulting script:
configure

In this approach the Makefile goes through 3 incarnations:
Makefile.am input for automake, which creates
Makefile.in which is adapted by the configure script into
Makefile

William Unruh

Dec 19, 2017, 1:56:15 PM
They are not the same.
I was not dinging the other version as being inferior. I was dinging it as
being different. You stated that you had experience with the Windows version
of make 10 years ago, and did not like it. The Windows version of make 10
years ago is not the same as the gnu version now.

William Unruh

Dec 19, 2017, 2:04:34 PM
On 2017-12-19, R.Wieser <add...@not.available> wrote:
> William,
>
>> Yes. Make on Windows is different from make on a gnu machine.
>
> You don't say ! :-)
>
>> The purpose is <strike>more of less</strike> the same, ...
>
> There, fixed that for you.
>
>> ... the implimentation is different
>
> I have not yet seen you bring anything forward to substanciate that I'm
> afraid. Also, the "black box" principle: The contents of it (read:
> implementation) do not matter. What goes in and comes out does.

Complete codswallop. The implementation does matter. If you think not, then
you have not been using computers.

>
>>>> You would need to compare and contrast their relevant manpages
>>>> and other documentation to see what those differences are, and
>>>> how significant they may be.
>>>
>>> Exactly.
>>
>> So, do it.
>
> Ehrm ?? I mentioned that I had some experience with make, and someone (I'm
> not sure who anymore) thought otherwise, because of not all make versions
> are being considered equal. Isn't it *their* (your?) duty to come up with
> something to underbuild that stance ?

Nope. It is you that is asking for help. Help is offered and you reject it. It
is not up to the helper to do all your work for you. You are perfectly free to
reject the advice, but not to castigate the helper for not doing all your work
for you.


>
> In short, I could (currently, after having tried a couple of times to evoke
> something more than vague nothingness) not care less about if they are. If
> noone bothers about trying to explain how the Linux version is
> different/better than the Windows one than I will simply not be tempted to
> reconsider my position at this moment/within shortly. Which is no skin off
> of my back, as my current solution works for me. Do you get my drift ?

That is fine. It is your right. It is when you give spurious reasons for that
rejection that people have a right to argue with those reasons.


>
>> No. configure is another program, which is NOT part of make
>
> Are you playing stupid here ? Yes, I KNOW they are not the same. The
> (obvious) question is, why ?

In part because of the Unix philosophy -- each program should do one thing
well. They can then be hooked together to do complicated things well.

>
>> It is precisely needed because make on different systems is not the
>> same.
>
> Why ? All it seems to do is to figure out which sourcefiles need to be
> (re-)compiled, and in which order. Neither of those seem to be a biggie.

Fine. DO it.

>
> And it brings us back to the begin: If there are so many different "make"
> programs around, how come that the one(s) on Windows (the one I said I had
> tried to work with) is considered non-comparable/bad/you name it ? As it
> turns out, according to your own words, *legions* of them are not the same
> ...

Yes, they are not the same. Each thinks it can do better than the others.
Whether or not they succeed depends on the skill of the programmer, and the
programmer's ability to figure out what the program should do and how it
should do it.

>
>
> Or was the initial "thats on Windows, you can't compare it" just ment to be
> that: There is *no* version of "make" you can compare to an(y )other, as all
> are different ? In that case how can *anyone* suggest that using "make"
> is the (or THE) solution ? :-D

You are running Linux. There is one (well, actually more than one, but one
called make) on Linux.

>

Eef Hartman

Dec 19, 2017, 4:10:47 PM
R.Wieser <add...@not.available> wrote:
> As for that LD_LIBRARY_PATH (environment?) variable,
> where can I find it (how do I read and/or alter it) ?

That depends on your shell. For bash (and other Bourne shell
compatibles like dash, zsh or ksh):
setting it goes with the command
export LD_LIBRARY_PATH=path[:path]..
(you do NOT have to specify the standard dirs here, but all other
directories in which the linker should search for libraries must be
there in the right order).

Note: this usage of export is a bash extension, a pure Bourne shell
would need TWO commands:
LD_LIBRARY_PATH=path
export LD_LIBRARY_PATH
(the first sets the variable, the 2nd makes it into an environment
one, so that programs (like gcc), started AFTER that by the same shell
will inherit that variable).

In csh and tcsh the syntax is different:
setenv LD_LIBRARY_PATH path[:path]..
(note: no =), which does the setting and the export in one line

Reading it is easiest with the "echo" command:
echo $LD_LIBRARY_PATH
(and that's the same in all shells).

By default it is not set, which will result in an error message in
(t)csh for that echo command (bash will just print a blank line).

R.Wieser

Dec 20, 2017, 6:28:39 AM
William,

> They are not the same.

That's an easy assumption, but not one you have bothered to substantiate.

> I was dinging it as being different.

And how is that RELEVANT ? I've been asking that several times now, but
as of yet you've refused to answer it. I'm not going to waste any more
time on it.

Regards,
Rudy Wieser


R.Wieser

Dec 20, 2017, 6:28:39 AM
Eef,

> That depends on your shell. For bash

Yup. As that seems to be the default shell, I'm using it (no reason to
change to another shell which I also have no experience with :-) ).

> setting it goes with the command
> export LD_LIBRARY_PATH=path[:path]..

That doesn't look like it will survive a reboot. Am I wrong in that ?

If that's true, where should I store it (config file?) when I want to keep it
over reboots ?

> (the first sets the variable, the 2nd makes it into an environment
> one, so that programs (like gcc), started AFTER that by the same
> shell will inherit that variable).

Transferring the local environment variable to a global one. Got you.

Thanks.

Regards,
Rudy Wieser


R.Wieser

Dec 20, 2017, 6:28:39 AM
Eef,

Thanks for your explanation of what autoconf does.

FYI, it is in line with what I thought it would do.

Regards,
Rudy Wieser


R.Wieser

Dec 20, 2017, 6:28:39 AM
William,

> Complete codswallop. The implimentation does matter.

Prove it. I won't be holding my breath for it though.

> Nope. It is you that is asking for help.

Yep. And you are the one challenging the relevance of my knowledge in this
regard, so it's yours to prove.

> Help is offered and you reject it.

Yep, I did. While giving *multiple* reasons for it, *all* of which were
duly ignored. :-((

... as well as indicating other possibilities -- which were, effectively,
also ignored.

... as well as having described the environment for which I needed the
solution -- which has been ignored from the start.

You think you need to complain about something ? In that case you can take
the back seat, as then I'm first.

> That is fine. It is your right.

Thank you!

> It is when you give spurious reasons for that rejection

I'm not (anymore) going to second-guess what you might be thinking of, so
either you quote it, or I'm going to regard it as (yet another) attempt to
put some bullcrap forward. Your choice.

> that people have a right to argue with those reasons.

Agreed. But not the other way around (me towards you) I take it ? Righty
then ...

And I *do* take offence when such an "argument" is then again and again
targeted at a situation I'm not in, which I've mentioned several times and
(obviously) got ignored each time. :-(

Fuck, I even mentioned that I would likely reconsider my position when the
need would arise (read: when my projects would become bigger). Not even
*that* stopped you (guys) from regurgitating the same-old same-old. :-(

> Yes. they are not the same.
....
> You are running linux. There is one ( well actually more than one, but
> one called make) on Linux.

Lol, you're a full-blown moron.

You were arguing that different programs having the same name does not mean
that they are actually the same, and here you are, conflating *all* the
same-named Linux programs, no matter which branch they are from or which
version they have. But at the same time you claim that *no* program of the
same name on Windows can be comparable to *any* of the Linux versions
(something you have yet to provide proof for ...).

No, I'm afraid that to me your credibility in this regard has sunken to
below freezing point.


Shall we agree not to talk to each other anymore ? I mean, you are not
getting from me the respect you obviously deserve as a person who knows how
everything works, and I do not get anything from you which helps me along in
my needs. That is good for neither of us.

Regards,
Rudy Wieser


Wildman

Dec 20, 2017, 11:35:07 AM
On Wed, 20 Dec 2017 12:28:29 +0100, R.Wieser wrote:

> Eef,
>
>> That depends on your shell. For bash
>
> Yup. As thats seems to be the default shell I'm using it (no reason to
> change to another shell which I also have no experience with :-) ).
>
>> setting it goes with the command
>> export LD_LIBRARY_PATH=path[:path]..
>
> That doesn't look like it will survive a reboot. Am I wrong in that ?

You are correct, it will not survive a reboot.

> If thats true, where should I store it (config file?) when I want to keep it
> over reboots ?

Place the commands in /etc/rc.local. Set the execute bit
if it is not already set.

--
<Wildman> GNU/Linux user #557453
The cow died so I don't need your bull!

William Unruh

Dec 20, 2017, 12:17:32 PM
On 2017-12-20, Wildman <best...@yahoo.com> wrote:
> On Wed, 20 Dec 2017 12:28:29 +0100, R.Wieser wrote:
>
>> Eef,
>>
>>> That depends on your shell. For bash
>>
>> Yup. As thats seems to be the default shell I'm using it (no reason to
>> change to another shell which I also have no experience with :-) ).
>>
>>> setting it goes with the command
>>> export LD_LIBRARY_PATH=path[:path]..
>>
>> That doesn't look like it will survive a reboot. Am I wrong in that ?
>
> You are correct, it will not survive a reboot.
>
>> If thats true, where should I store it (config file?) when I want to keep it
>> over reboots ?
>
> Place the commands in /etc/rc.local. Set the execute bit
> if it is not already set.

That would be pretty useless. The user's setting would almost certainly
overwrite that.
Try putting it into your own .bashrc or .bash_profile so it is there for you
each time you log in or start the bash shell. (if you put it into
.bash_profile you need to check .bashrc to make sure it does not override what
you did in .bash_profile)
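A sketch of what such an addition could look like; the lines and the
$HOME/lib directory are hypothetical, stand-ins for whatever library path
is actually needed:

```shell
# Hypothetical additions to ~/.bash_profile (illustrative only):
# extend LD_LIBRARY_PATH once per login, keeping any existing value
# in front (the ${VAR:+...} form adds the colon only when VAR is
# already set), then pull in ~/.bashrc for the interactive settings.
export LD_LIBRARY_PATH="${LD_LIBRARY_PATH:+$LD_LIBRARY_PATH:}$HOME/lib"
[ -f "$HOME/.bashrc" ] && . "$HOME/.bashrc"
```

Sourcing .bashrc last means its settings win, which is exactly the
ordering caveat mentioned above.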

>

Wildman

Dec 20, 2017, 1:14:01 PM
On Wed, 20 Dec 2017 17:17:30 +0000, William Unruh wrote:

> On 2017-12-20, Wildman <best...@yahoo.com> wrote:
>> On Wed, 20 Dec 2017 12:28:29 +0100, R.Wieser wrote:
>>
>>> Eef,
>>>
>>>> That depends on your shell. For bash
>>>
>>> Yup. As thats seems to be the default shell I'm using it (no reason to
>>> change to another shell which I also have no experience with :-) ).
>>>
>>>> setting it goes with the command
>>>> export LD_LIBRARY_PATH=path[:path]..
>>>
>>> That doesn't look like it will survive a reboot. Am I wrong in that ?
>>
>> You are correct, it will not survive a reboot.
>>
>>> If thats true, where should I store it (config file?) when I want to keep it
>>> over reboots ?
>>
>> Place the commands in /etc/rc.local. Set the execute bit
>> if it is not already set.
>
> That would be pretty useless. The user's setting would almost certainly
> overwrite that.

I have a path modification in /etc/rc.local here and it works ok, but
every system can be different so you may very well be right. I would say
to the OP: try it and see.

> Try putting it into your own .bashrc or .bash_profile so it is there for you
> each time you log in or start the bash shell. (if you put it into
> .bash_profile you need to check .bashrc to make sure it does not override what
> you did in .bash_profile)
>
>>





Richard Kettlewell

Dec 20, 2017, 5:50:10 PM
Eef Hartman <E.J.M....@gmail.com> writes:
> R.Wieser <add...@not.available> wrote:
>> As for that LD_LIBRARY_PATH (environment?) variable,
>> where can I find it (how do I read and/or alter it) ?
>
> That depends on your shell. For bash (and other Bourne shell
> compatibles like dash, zsh or ksh):
> setting it goes with the command
> export LD_LIBRARY_PATH=path[:path]..
> (you do NOT have to specify the standard dirs here, but all other
> directories in which the linker should search for libraries must be
> there in the right order).

What on earth is this supposed to achieve? LD_LIBRARY_PATH primarily
affects the behavior of the runtime linker; the only effect on the
build-time linker is in the special case of dependencies between shared
libraries.

If you want to add directories to the build-time linker search path, use
the -L option.

$ ls -l d
total 8
-rwxrwxr-x 1 richard richard 7848 Dec 20 22:36 libfoo.so
$ gcc -o u u.o -lfoo
/usr/bin/ld: cannot find -lfoo
collect2: error: ld returned 1 exit status
$ LD_LIBRARY_PATH=`pwd`/d gcc -o u u.o -lfoo
/usr/bin/ld: cannot find -lfoo
collect2: error: ld returned 1 exit status
$ gcc -Ld -o u u.o -lfoo
$

--
https://www.greenend.org.uk/rjk/

Eef Hartman

Dec 20, 2017, 7:32:14 PM
R.Wieser <add...@not.available> wrote:
> That doesn't look like it will survive a reboot. Am I wrong in that ?

In fact it doesn't even survive a "shell exit" as the variable still
is local to the specific shell/window it was set IN.

> If thats true, where should I store it (config file?) when I want
> to keep it over reboots ?

The normal place, already mentioned by others, is your .profile or
.bash_profile:
>When bash is invoked as an interactive login shell, or as a
>non-interactive shell with the --login option, it first reads
>and executes commands from the file /etc/profile, if that file
>exists. After reading that file, it looks for ~/.bash_profile,
>~/.bash_login, and ~/.profile, in that order, and reads and executes
>commands from the first one that exists and is readable.
(remark: .profile is the generic Bourne shell name, the other two,
of course, are bash-specific.
In Debian, the shell used by _scripts_ often is dash, which is less
powerful but starts up faster).

Note that shells started in windows _without_ the login option may,
depending on the distribution, not do this but execute .bashrc instead.
Many distros will execute both for "login shells", but this is not
the generic way bash works:
>When an interactive shell that is not a login shell is started, bash
>reads and executes commands from ~/.bashrc, if that file exists.
(both quotes from the bash man page).

> Transferring the local environment variable to a global one. gotyou.

Global for THAT shell, indeed.

BTW: often your window environment will read your .profile and so make
the settings in it global for the whole environment.
This IS dependent on the DE (and the way of starting it), though.

I myself use a .xsession to start the X session and IN that explicitly
source my .profile (which will first execute the .bashrc too):
test -r ~/.bashrc && . ~/.bashrc
alias bye='clear;logout'
so all general settings are in my .bashrc, the ones specific for a
LOGIN shell are in the .profile (logout, of course, can only be done
by a login shell).

R.Wieser

Dec 21, 2017, 2:14:58 AM
Wildman,

> Place the commands in /etc/rc.local. Set the execute
> bit if it is not already set.

Thank you for both naming the file and its location, as well as the warning
to have it executable.

Regards,
Rudy Wieser


R.Wieser

Dec 21, 2017, 2:14:58 AM
William,

> That would be pretty useless. The user's setting would almost
> certainly overwrite that.

In other words, the order in which the changes from those files are applied
is:
#1 /etc/rc.local
#2 .bashrc
#3 .bash_profile

With the latter possibly overwriting the earlier ones.

Any idea why there are three such locations ? I could imagine one holding
global settings and one the user's own preferences, but somehow I get the
idea that all three are user-specific ...

Regards,
Rudy Wieser


R.Wieser

Dec 21, 2017, 2:20:28 AM
Wildman,

> I would say to OP, try it and see.

:-) I could do that I suppose. I think that fewer than 10 permutations
would be needed to figure out the order in which those three locations are
handled (for me, at this moment).

Then again, that would be the easy part. I would also like to know why
there are three to begin with (the significance of each of them).

Regards,
Rudy Wieser


Eef Hartman

Dec 21, 2017, 2:43:33 AM
R.Wieser <add...@not.available> wrote:
> #1 /etc/rc.local

Is executed only once, after the (re)boot, contains "things to be
started" for the whole system.
BTW: the path is distro-specific, in my system it is
/etc/rc.d/rc.local # Local system initialization script.
Normally doesn't contain any user settings.
In fact it often is totally empty and it can be changed only by the
root user (after which change you have to reboot, of course).

> #2 .bashrc

At "start a shell" for the user whose home dir it is IN
(open another command-line window, "shell out" from your editor).

> #3 .bash_profile

The same, but only when the shell is a login shell (window started
with the -l option or the shell itself gets the --login option).
Of course always used for the text consoles as there the shell IS
started by the login process (getty/login/passwd).
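For the curious, bash itself can report which of the two cases a shell is
in. This assumes bash and its `shopt` builtin; the --noprofile/--norc
flags just keep the startup files from producing output of their own:

```shell
#!/bin/sh
# bash records whether it was started as a login shell; -l forces
# login mode.  A login shell reads ~/.bash_profile, a non-login
# interactive one reads ~/.bashrc.
bash --norc             -c 'shopt -q login_shell && echo login || echo non-login'
bash --noprofile --norc -lc 'shopt -q login_shell && echo login || echo non-login'
```

The first line prints "non-login", the second "login".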

R.Wieser

Dec 21, 2017, 2:51:22 AM
Eef,

> In fact it doesn't even survive a "shell exit" as the variable
> still is local to the specific shell/window it was set IN.

Which one of the variables are you talking about there, the local original,
or the exported one ?

Or do I have to look at that "export" method as one that simply transfers
data from a child to a parent shell/window (meaning exporting from a
top-level shell/window does not do anything) ?

>> Transferring the local environment variable to a global one. gotyou.
>
> Global for THAT shell, indeed.

Yup. Forgot that Linux has several available, and (thus) that I need to be
specific.

> This IS dependant on the DE (and the way of starting it), though.

Apologies, what is "DE" short for ? Data Environment ?

> test -r ~/.bashrc && . ~/.bashrc

As far as I can tell this means to first check if the file exists, and if
it does, then .... I'm not sure about the second part: the "&&" is followed
by a single dot and the filename. What does that dot there mean/do ?

Regards,
Rudy Wieser


R.Wieser

Dec 21, 2017, 2:56:49 AM
Eef,

> Is executed only once, [snip]

Thanks, that clarifies it. That is, I think I may conclude that the latter
two are mutually exclusive (from the system's perspective).

Regards,
Rudy Wieser


Eef Hartman

Dec 21, 2017, 5:19:40 AM
R.Wieser <add...@not.available> wrote:
> Which one of the variables are you taking about there, the local
> origional, or the exported one ?

Both. The "exported" one will be exported ONLY to "child" processes,
so processes started by that shell after the setting.

> from a child to a parent shell/window

No, you can never export "upwards", the environment for a parent
process is fixed and cannot be influenced by its children.
That's why it is important to set "session-wide" settings already in
the startup OF the session.

> Yup. Forgot that Linux has several available, and (thus) to be specific.

In this case I meant "instances" of the shell. So if you have three
"terminal windows" open, each has its own instance of (probably) the
same type of shell, but with independent environments.

> Aplogies, what is "DE" short for ? Data Environment ?

Desktop Environment, the whole of the "windows" look and feel.
Well-known DE's are KDE, Gnome, its derivations Mate and Unity and
XFCE.

>> test -r ~/.bashrc && . ~/.bashrc
> As far as I can tell this means to first check if the file exists,

AND is readable, otherwise it wouldn't be able to do "the rest".

> a single dot and the filename. What does that dot there mean/do ?

That is called "sourcing": telling the current shell (instance) to
execute the commands IN that file, rather than doing it in a _sub-shell_.
Your .profile (or .bash_profile) is also sourced, so that its settings
will apply to the current shell.
(t)csh uses the explicit command "source"; as an extension, bash can use
that too instead of the dot.
But I'm used to writing generically for the shell, so I try to avoid
"bash-isms", because I also worked a lot in System V Unix, which has
the Posix shell as the default, not bash.
And, as I said before, some distros like Debian will use dash in
scripts, instead OF bash (bash = Bourne Again SHell, dash = Debian
Almquist SHell).
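The difference the dot makes can be shown in a few lines (the settings
file is invented for this demo):

```shell
#!/bin/sh
# A variable set by a script run as a sub-shell is lost; the same
# file sourced with "." changes the current shell.
f=$(mktemp)
printf 'DEMO_VAR=from_file\n' > "$f"

sh "$f"                               # sub-shell: the change is lost
echo "after sub-shell: [${DEMO_VAR}]" # prints: after sub-shell: []

. "$f"                                # sourced: the change sticks
echo "after sourcing:  [${DEMO_VAR}]" # prints: after sourcing:  [from_file]
```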

R.Wieser

Dec 21, 2017, 5:51:50 AM
Eef,

> The "exported" one will be exported ONLY to "child" processes,
> so processes started by that shell after the setting.

Thanks. So "exporting" an environment variable just marks it for
inheritance (and does not even affect any already-running children). I
thought it would possibly be actually *exported* (to some other place:
parent, global environment, or even a file) :-)

> No, you can never export "upwards",

Understood.

>> Aplogies, what is "DE" short for ? Data Environment ?
>
> Desktop Environment,

:-) Shows you how much I've yet got to learn that I didn't even get that
...

> AND is readable, otherwise it wouldn't be able to do "the rest".

Ackk... So that was what the "-r" was for. Not for "recursive" (I was
already wondering about that, but nothing a quick 'man' wouldn't have
solved).

> That is called "sourcing", tell the current shell (instance) to
> execute the commands IN that file, not do it by a _sub-shell_.
> Your .profile (cq .bash_profile) is also sourced, so that settings
> will apply to the current shell.

Understood. And now that you've explained it, it's rather logical: if it
were executed in a sub-shell the changes would be lost as soon as it
exited and returned.

Somehow I thought that it would try to execute the current directory (as
the dot was at the position where I would expect an executable), and I
could not make any sense of that ....

> And, as I said before, some distro's like Debian will use dash in
> scripts, instead OF bash (bash = Bo(u)rn(e) Again SHell, dash -
> Debian Almquist SHell).

In the back of my mind I knew where the "bash" name came from. Can't
remember having heard of "dash" before though.

Regards,
Rudy Wieser


Eef Hartman

Dec 21, 2017, 6:09:15 AM
R.Wieser <add...@not.available> wrote:
> I think I may conclude that the latter
> two are mutually exclusive (from the systems perspective).

Not always; some distros (and my trick in the .profile) will execute
both, in THAT order (.bashrc before .profile).
But bash itself considers them to be alternatives: only one will be
searched for and executed, depending on the "login" status OF that
shell (instance).

BTW: a lot of the bash extensions above the Bourne (and Posix) shell
come from the way csh (and tcsh) do things (and others from the Korn
shell, ksh). To get back on track: the export var=value is directly
analogous to csh's setenv command.

Historical context: Unix started out with the Bourne shell (actually
there have been shells before that, but they weren't powerful enough
and have disappeared), but with the capability of using other ones.
When the University of California, Berkeley, developed their own Unix
(BSD = Berkeley Software Distribution), they wanted to make shell
programming more like the language C, so BSD unixes use the C-shell
as default.
Bash is a later, open-source enhancement of the Bourne shell; tcsh
(Tenex C-SHell) is an open-source enhancement of the C-shell (with
features of the Tenex O/S command processor added).
Linux normally makes both (bash and tcsh) available, as well as the
Korn shell (since it came out of copyright), which is a later
enhanced shell, but not all the way compatible with the Bourne shell.

William Unruh

Dec 21, 2017, 12:33:03 PM
On 2017-12-21, R.Wieser <add...@not.available> wrote:
> William,
>
>> That would be pretty useless. The user's setting would almost
>> certainly overwrite that.
>
> In other words, the order in which the changes from those files are applied
> is:
> #1 /etc/rc.local
> #2 .bashrc
> #3 .bash_profile

No. /etc/rc.local -- run by root when the computer boots up. I suspect that at
best it will set an environment variable for root, but not for other users,
though I am not positive about this.

.bash_profile -- run at login as user. It often calls .bashrc.
.bashrc -- run whenever a bash shell is started by any user.

Note that if any of those do not exist:
rc.local -- has no fallback; if it is missing, nothing replaces it.
.bash_profile -- the default /etc/profile is run, and it calls /etc/profile.d/*.sh
.bashrc -- /etc/bashrc is run instead
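The "it often calls .bashrc" part is usually done with a couple of lines like these in ~/.bash_profile (a common convention, not something bash enforces):

```shell
# Typical ~/.bash_profile fragment: pull in the per-shell
# settings from ~/.bashrc so login shells get them too.
if [ -f "$HOME/.bashrc" ]; then
    . "$HOME/.bashrc"
fi
```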

William Unruh

unread,
Dec 21, 2017, 12:34:48 PM12/21/17
to
On 2017-12-21, Eef Hartman <E.J.M....@gmail.com> wrote:
> R.Wieser <add...@not.available> wrote:
>> #1 /etc/rc.local
>
> Is executed only once, after the (re)boot, contains "things to be
> started" for the whole system.
> BTW: the path is distro-specific, in my system it is
> /etc/rc.d/rc.local # Local system initialization script.
> Normally doesn't contain any user settings.
> In fact it often is totally empty and it can be changed only by the
> root user (after which change you have to reboot, of course).

No. You only need to reboot IF you want the changes applied immediately. You
can usually apply the changes from the command line so you would not have to
reboot.

William Unruh

unread,
Dec 21, 2017, 12:40:34 PM12/21/17
to
On 2017-12-21, R.Wieser <add...@not.available> wrote:
> Eef,
>
>> In fact it doesn't even survive a "shell exit" as the variable
>> still is local to the specific shell/window it was set IN.
>
> Which one of the variables are you talking about there, the local original,
> or the exported one ?
>
> Or do I have to look at that "export" method as one to simply transfer data
> from a child to a parent shell/window (meaning exporting from a top-level
> shell/window does not do anything) ?

No. export means "preserve this environment variable when a child is called"
There is no way of setting an environment variable in a parent process.
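That one-way street is easy to demonstrate: a child shell may export whatever it likes, the parent's environment stays as it was.

```shell
#!/bin/bash
# The child shell sets and exports a variable...
bash -c 'export CHILD_VAR=set_in_child'
# ...but the parent never sees it:
echo "CHILD_VAR is '${CHILD_VAR}'"
```

This prints CHILD_VAR is '' (assuming CHILD_VAR wasn't already in your environment to begin with).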


...
> As far as I can tell this means to first check if the file exists, and if it
> does then .... I'm not sure about the second part: The "&&" is followed by
> a single dot and the filename. What does that dot there mean/do ?

./ is the current directory.

So, search for the file that follows in the current directory.
../ is the parent directory of the current directory (unless the current
directory is /, in which case ../ is the current directory)
>

Richard Kettlewell

unread,
Dec 21, 2017, 2:38:14 PM12/21/17
to
William Unruh <un...@invalid.ca> writes:
> On 2017-12-21, R.Wieser <add...@not.available> wrote:
>>> That would be pretty useless. The user's setting would almost
>>> certainly overwrite that.
>>
>> In other words, the order in which the changes from those files are applied
>> is:
>> #1 /etc/rc.local
>> #2 .bashrc
>> #3 .bash_profile
>
> No. /etc/rc.local -- run by root when the computer boots up. I suspect
> that at best it will set an environment variable for root, but not for
> other users, though I am not positive about this.

It won’t set an environment variable for anyone. Even if it did setting
LD_LIBRARY_PATH won’t have the effect the OP is apparently seeking; see
my other posting.

--
https://www.greenend.org.uk/rjk/

R.Wieser

unread,
Dec 22, 2017, 6:21:13 AM12/22/17
to
William,

> No. /etc/rc.local -- run by root when [snip]

Thank you. Useful to know.

Regards,
Rudy Wieser


R.Wieser

unread,
Dec 22, 2017, 6:21:13 AM12/22/17
to
Richard,

> Even if it did setting LD_LIBRARY_PATH won't have the effect
> the OP is apparently seeking;

My apologies, but I was not seeking an(y) effect.

I was just trying to figure out what "exporting" that variable was
needed/used for, and then what the scope of such a variable is, and how
extending its lifetime works. In short, just some basic "this is how you
deal with - what I now recognise as - environment variables" stuff.

But yes, I will, at some time in the future, take a peek at what that
variable is used for, to see if it will be helpful for me in any way.
Though I think that Google will have lots of info on it. :-)

Regards,
Rudy Wieser


R.Wieser

unread,
Dec 22, 2017, 6:21:13 AM12/22/17
to
William,

> No. export means "preserve this environment variable when a child is
> called"

I'm sorry, but that line raises more questions than it solves. Like
what "preserve this environment variable" (as in, keeping it safe) has to do
with "when a child is called" ...

But don't worry, I think I got what it's for from what Eef told me.

> ./ is the current directory. So, search for the file that follows in the
> current directory.

That does not make sense. At least, not when I see the script's filename
(following the on-its-own dot) being preceded by a "home folder" tilde --
which seems to indicate the file should be looked for there.

Mind you, I'm not trying to challenge you here. It's just that I can't
(logically) follow what you are trying to tell me ... :-\

Regards,
Rudy Wieser


Eef Hartman

unread,
Dec 22, 2017, 9:40:30 AM12/22/17
to
R.Wieser <add...@not.available> wrote:
>> ./ is the current directory.
>> So, search for the file that follows in the current directory.
>
> That does not make sense.

No, it doesn't. The single dot just means "current" and in this case
it stands for "do in the current shell" (the contents of the next file).
The filename CAN have a path prefix, so it doesn't have to be in the
current DIRectory.

Note that a "sourced" script, so a script executed THIS way, doesn't
need to be executable, while a normal script does.

William Unruh

unread,
Dec 22, 2017, 11:31:35 AM12/22/17
to
On 2017-12-22, R.Wieser <add...@not.available> wrote:
> William,
>
>> No. export means "preserve this environment variable when a child is
>> called"
>
> I'm sorry, but that line raises more questions than it solves. Like
> what "preserve this environment variable" (as in, keeping it safe) has to do
> with "when a child is called" ...

It preserves it in the sense that when the child is called, it is called with
that environment variable assigned its current value.
A child does not inherit the whole environment. Just that part which has been
exported.
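A two-line demonstration of that difference between a plain shell variable and an exported one:

```shell
#!/bin/bash
PLAIN=hello          # shell variable, not exported
export SHARED=world  # environment variable, exported
# The child shell inherits only the exported one:
bash -c 'echo "PLAIN=$PLAIN SHARED=$SHARED"'
# prints: PLAIN= SHARED=world
```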


>
> But don't worry, I think I got what its for from what Eef told me.
>
>> ./ is the current directory. So, search for the file that follows in the
>> current directory.
>
> That does not make sense. At least, not when I see the script's filename
> (following the on-its-own dot) being preceded by a "home folder" tilde --
> which seems to indicate the file should be looked for there.

When you see
"./" it means the current directory.
If you see " . " (i.e. a dot with a space in front of and behind it) it is
short for the bash command "source" (see man bash for what that does).
If you see it in the middle of a path, apple/banana/./pear, it is a
do-nothing symbol. (It means the current directory when you have gotten to
apple/banana, which is the directory apple/banana.)

../ means the parent directory of the current one.
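Those three spellings can be tried out mechanically; a throw-away sketch (the apple/banana names are just the example from above, created under a mktemp directory):

```shell
#!/bin/bash
# "." inside a path is a do-nothing component, "./" names the
# current directory, and "../" names its parent.
top=$(mktemp -d)
mkdir -p "$top/apple/banana"
echo pear > "$top/apple/banana/fruit"

cat "$top/apple/./banana/./fruit"                  # the "."s change nothing
( cd "$top/apple/banana" && cat ./fruit )          # ./  = current directory
( cd "$top/apple/banana" && cat ../banana/fruit )  # ../ = up one level
# prints "pear" three times
```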

William Unruh

unread,
Dec 22, 2017, 11:41:13 AM12/22/17
to
On 2017-12-22, Eef Hartman <E.J.M....@gmail.com> wrote:
> R.Wieser <add...@not.available> wrote:
>>> ./ is the current directory.
>>> So, search for the file that follows in the current directory.
>>
>> That does not make sense.
>
> No, it doesn't. The single dot just means "current" and in this case
> it stands for "do in current shell" (contents of the next file).
> The filename CAN have a path prefix, so doesn't have to be in current
> DIRectory.

Yes, . on its own means "source"; ./ means "current directory".


>
> Note that a "sourced" script, so a script executed THIS way, doesn't
> needs to be executable, while a normal script does.

It need not be a script at all. It may be just a list of bash statements.
Thus

/tmp/test
--------------------
export APPLE=pear
-------------------

. /tmp/test
will run the command
export APPLE=pear
as if it were issued on the command line

/tmp/test
will do nothing except give an error, because /tmp/test is not a valid
script: a script must start with a line telling the system what shell to
use to interpret the contents.

If you set up
/tmp/test1
-----------------------
#!/bin/bash
export APPLE=pear
--------------------
and made /tmp/test1 executable. then
you could still do
. /tmp/test, and that first line would be interpreted as a comment line and
ignored, and the second would be run under the current bash.

But /tmp/test1 would invoke another bash shell which would interpret the
lines. (It would be useless, though: that environment assignment would occur
within the new bash shell, and when the program /tmp/test1 exited, that new
bash shell would exit and the environment assignment would die, since
parents (your current bash shell) do not inherit anything from the child
(the new bash shell which ran that assignment).)

So, . and ./ mean different things.
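The whole difference fits in one runnable sketch (using a mktemp file instead of the literal /tmp/test; note that in practice bash tends to fall back to running a shebang-less executable file in a child shell rather than erroring out, and the key point is that the assignment dies with that child):

```shell
#!/bin/bash
f=$(mktemp)
echo 'export APPLE=pear' > "$f"

. "$f"                           # sourced: runs in THIS shell
echo "after sourcing:  APPLE=$APPLE"

unset APPLE
chmod +x "$f"
"$f"                             # executed: runs in a child shell
echo "after executing: APPLE=$APPLE"
rm -f "$f"
```

It prints APPLE=pear after sourcing, but an empty APPLE after executing.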

William Unruh

unread,
Dec 22, 2017, 11:46:01 AM12/22/17
to
On 2017-12-22, R.Wieser <add...@not.available> wrote:
> Richard,
>
>> Even if it did setting LD_LIBRARY_PATH won't have the effect
>> the OP is apparently seeking;
>
> My apologies, but I was not seeking an(y) effect.
>
> I was just trying to figure out what "exporting" an that variable was
> needed/used for, and than what the scope of such a variable is, and how
> extending its lifetime works. In short, just some basic "this how you deal
> with - what I now recognise as - environment variables" stuff.

Well, you started off this thread with a specific problem in compiling a
program.

>
> But yes, I will, at some time in the future, take a peek at what that
> variable is used for, to see if it will be helpfull for me in any way.
> Though I think that google will have lots of info on it. :-)

It lists the directories which are searched by the dynamic loader for library
routines when the dynamic loader loads a program. The directories are
searched in the order specified, so if two directories contain a library
with the same name, the one in the earlier directory wins.
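Because order matters, the usual idiom is to prepend your own directory while keeping whatever was already there (/opt/mylibs below is a made-up example path, not something from this thread):

```shell
#!/bin/bash
# Prepend a (hypothetical) directory to the loader search path;
# ${VAR:+...} appends the old value, colon-separated, only if it was set.
export LD_LIBRARY_PATH="/opt/mylibs${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
echo "$LD_LIBRARY_PATH"
```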


>
> Regards,
> Rudy Wieser
>
>

Chris Elvidge

unread,
Dec 22, 2017, 12:51:32 PM12/22/17
to
Lets try to clear up dots and exports:

I've put /home/chris/temp into the PATH temporarily

$ echo $PATH
/home/chris/temp:/opt/firefox:/home/chris/bin:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games:/sbin:/usr/sbin

In this directory (temp) there is a file called .echo_j
$ cat .echo_j
#!/bin/bash
echo $0
echo "j= $j"

The dot in front of the file name just means it is hidden from the
normal ls command viz:
$ ls -l
total 0

But it shows up in ls -A
$ ls -lA
total 4.0K
-rwxr-xr-x 1 chris chris 26 Dec 22 17:03 .echo_j

Move .echo_j to echo_j and
$ ls -l
total 4.0K
-rwxr-xr-x 1 chris chris 26 Dec 22 17:03 echo_j

Now I set j equal to "test" (echo $j shows it)
$ j=test
$ echo $j
test

The first . in this command means source (i.e. run in the current shell)
$ . echo_j
bash
j= test
$ source echo_j
bash
j= test

But when the file is run as an executable, it opens a sub (or child)
shell (where j is not set)
$ echo_j
/home/chris/temp/echo_j
j=

Export the variable and it is available in the sub (child) shell
$ export j
$ echo_j
/home/chris/temp/echo_j
j= test

Close and reopen the shell to remove the temp directory from the PATH
Change to temp directory
$ ls -l
total 4.0K
-rwxr-xr-x 1 chris chris 34 Dec 22 17:34 echo_j

Now run the file
$ echo_j
bash: echo_j: command not found

I.e. current directory is not in the PATH

Run the file echo_j in the current directory (that's the ./ before the
command)
$ ./echo_j
./echo_j
j=

Now set j again
$ j=test
$ ./echo_j
./echo_j
j=

$ export j
$ ./echo_j
./echo_j
j= test


I hope this helps

--

Chris Elvidge, England