Neko
Extract all the files from the archive to a directory.
Choose 'Open Workspace' from the File menu, navigate to where you stored the
Crafty files, and open makefile.nt.
It will mention something about a wrapper project; just click 'Yes'.
Select Project/Settings then the General tab, and change the build command
to
nmake /f makefile.nt
Select 'Build All' from the 'Build' menu, and you should get an .exe file.
You can modify some of the parameters in the makefile to optimize a little,
but this should get you started.
(I've just done it today, and it does work..)
Regards
Simon
"Neko" <neutral_...@yahoo.com> wrote in message
news:bpvpfh$1t7d8k$1...@ID-198243.news.uni-berlin.de...
19.6 will not compile on DOS. No future versions will. Older versions
should compile OK however...
> Neko
--
Robert M. Hyatt, Ph.D. Computer and Information Sciences
hy...@uab.edu University of Alabama at Birmingham
(205) 934-2213 136A Campbell Hall
(205) 934-5473 FAX Birmingham, AL 35294-1170
> 19.6 will not compile on DOS. No future versions will. Older versions
> should compile OK however...
Well, I got 19.6 to compile under Visual C++ 6.0, and it runs in a DOS
console, but I guess it is technically a Windows program, right?
It is a 32-bit console program.
Tom Veldhouse
Dr. Hyatt,
Have you considered perhaps using the GNU based configure scripts
(autoconf/automake) for crafty rather than a targeted makefile?
Tom Veldhouse
Yes.
> Dr. Hyatt,
> Tom Veldhouse
I did at one point. But so many systems don't have that installed, that
it created yet another installation issue. Remember that this code runs
on all flavors of unix by every vendor I know of, plus other systems as
well...
Robert Hyatt <hy...@crafty.cis.uab.edu> wrote in message news:<bq0vtk$34e$1...@juniper.cis.uab.edu>...
>
> 19.6 will not compile on DOS. No future versions will. Older versions
> should compile OK however...
One thing I noticed in 19.6 is that there is no "cygwin" target in the
Makefile, although "make help" reports one.
Is it possible to compile 19.6 under cygwin? I tried a couple of targets
(generic, linux) in the hope that it would be close enough, however both fail.
(generic fails very early, on the first gcc invocation; it looks like $(opt)
is empty:
gcc -D -c searchr.c
<command line>:9:1: macro names must be identifiers
)
Regards,
Mark
--
ma...@oakden.org
>Well if you have nmake and Visual C++ it is pretty easy.
>
>extract all the files to a directory from the archive.
Thanks for the tip !!!
--
WebWalker
webw...@eudoramail.com
PGP Key ID : 0xB3F1A279
No problem, anytime :)
Simon
>> Dr. Hyatt,
>> Have you considered perhaps using the GNU based configure scripts
>> (autoconf/automake) for crafty rather than a targeted makefile?
>> Tom Veldhouse
>I did at one point. But so many systems don't have that installed, that
>it created yet another installation issue. Remember that this code runs
>on all flavors of unix by every vendor I know of, plus other systems as
>well...
Robert, neither autoconf nor automake need to be installed on any
system other than the system on which the 'configure' script is built.
You seem to be under some misunderstanding about how autoconf/automake
work.
I've shown you (by private email) that Crafty would not build
without some manual editing using Sun's C & C++ compilers on a Sun,
gcc on a Sun, SGI's cc on an SGI Octane, IBM's C compiler on an
RS/6000, HP's compilers on a Dec Alpha, gcc on an HP PA-RISC, and an SGI
Octane running IRIX. Quite simply, the Makefile is seriously broken
for UNIX systems.
The 'configure' script (which gets distributed) MUST be created on
a system which has both autoconf and automake installed. But once that
is done, the configure script is distributed and does not require
autoconf, automake, or anything else. It is a simple shell script,
just requiring /bin/sh.
As I've said in a private email, I've built a package of mine 'atlc'
on a Sony Playstation 2, a Cray YMP-EL supercomputer running UNIXCOS,
Debian Linux, Slackware Linux, Gentoo Linux, Redhat Linux, Suse Linux,
IBM's AIX, Apple's OS X for Mac, HP's HP-UX (both PA-RISC and
Itanium), SGI's IRIX, Sun's Solaris, SCO's UNIXWare, HP's Tru64,
NetBSD, OpenBSD and FreeBSD.
Some of those systems (the SGI Octane, the Dec Alpha, the HP PA-RISC,
the IBM RS/6000) are owned by me personally. NONE of them have
autoconf or automake installed, yet my programs built on them all. I
purposely don't install those, gcc, or any of the normal GNU things, to
check that the program builds on a basic system. The Cray Y-MP, which I
don't personally own, has neither program.
The 'configure' script, which is a /bin/sh executable, was built on a
Sun on which both autoconf and automake were present. So as long as
'configure' is built on a system with autoconf and automake, the user
has no need to have perl, autoconf, automake, or anything much
else installed in order to install the program. You supply them the
'configure' script, which is a standard /bin/sh executable.
You seem to be under the impression that the end-user needs autoconf,
configure, or automake in order to install the software. Nothing could
be further from the truth. These are only needed on the system on
which the configure script is built.
>>> Dr. Hyatt,
>>> Have you considered perhaps using the GNU based configure scripts
>>> (autoconf/automake) for crafty rather than a targeted makefile?
>>> Tom Veldhouse
>>I did at one point. But so many systems don't have that installed, that
>>it created yet another installation issue. Remember that this code runs
>>on all flavors of unix by every vendor I know of, plus other systems as
>>well...
> Robert, neither autoconf nor automake need to be installed on any
> system other than the system on which the 'configure' script is built.
> You seem to be under some misunderstanding about how autoconf/automake
> work.
Sorry, I really didn't think when I wrote that. What I meant was that
previous attempts simply didn't work. IE too many people have multiple
versions of GCC, the libraries, etc. installed, and that produced many
problems when we played with autoconf/automake. IE the configure
script would break, often in really bizarre ways, because of malformed
installations. IE see what happens when you have multiple
compilers that are incompatible (gcc 2.x and 3.x for example).
It was also fairly complicated to get it even close to working on multiple
machines. IE the various tests have to define various macros, and that
also got messy. It has the advantage of eliminating some of the spaghetti
if defined() preprocessor stuff I currently have, but it introduces its
own form of pasta into the code.
> I've shown you (by private email) that Crafty would not build
> without some manual editing using Sun's C & C++ compilers on a Sun,
> gcc on a Sun, SGI's cc on an SGI Octane, IBM's C compiler on an
> RS/6000, HP's compilers on a Dec Alpha, gcc on an HP PA-RISC, and an SGI
> Octane running IRIX. Quite simply, the Makefile is seriously broken
> for UNIX systems.
It is more an issue of the source rather than the Makefile. What happens
is that someone sends suggested changes, and they often break some rarely-used
target without my knowing. IE I have some sparcs, but I _never_ build crafty
on them as they are slower than our slowest PC systems running linux.
> The 'configure' script (which gets distributed) MUST be created on
> a system which has both autoconf and automake installed. But once that
> is done, the configure script is distributed and does not require
> autoconf, automake, or anything else. It is a simple shell script,
> just requiring /bin/sh.
Yes, but it also depends on the user to have a lot of things pre-defined
that might not be done. It is very hard to debug user problems when using
the current code. configure can blow up and that is _much_ harder to track
down remotely...
For example, "configure" frequently breaks on packages we get here, often
with bizarre compiler errors that make no sense. They ultimately link back
to a bad environment setting that is using compiler A with include files
B, to produce crap (C). :)
> As I've said in a private email, I've built a package of mine 'atlc'
> http://atlc.sourceforge.net/
> on a Sony Playstation 2, a Cray YMP-EL supercomputer running UNIXCOS,
> Debian Linux, Slackware Linux, Gentoo Linux, Redhat Linux, Suse Linux,
> IBM's AIX, Apples's OS X for Mac, HP's HP-UX (both PA-RISC and
> Itanium), SGI's IRIX, Sun's Solaris, SCO's UNIXWare, HP's Tru64,
> NetBSD, OpenBSD and FreeBSD.
> Some of those systems (the SGI Octane, the Dec Alpha, the HP PA-RISC,
> the IBM RS/6000) are owned by me personally. NONE of them have
> autoconf or automake installed, yet my programs built on them all. I
> purposely don't install those, gcc, or any of the normal GNU things, to
> check that the program builds on a basic system. The Cray Y-MP, which I
> don't personally own, has neither program.
> The 'configure' script, which is a /bin/sh executable, was built on a
> Sun on which both autoconf and automake were present. So as long as
> 'configure' is built on a system with autoconf and automake, the user
> has no need to have perl, autoconf, automake, or anything much
> else installed in order to install the program. You supply them the
> 'configure' script, which is a standard /bin/sh executable.
I realize that. As I said, when I wrote my quick response, I really
wasn't thinking. The problem was the remote support.
> You seem to be under the impression that the end-user needs autoconf,
> configure, or automake in order to install the software. Nothing could
> be further from the truth. These are only needed on the system on
> which the configure script is built.
No... I just wrote poorly.
IE remember that I have installed _thousands_ of packages. Xboard is
a good example, which uses configure, and which does break on occasion
due to environment issues.
I'm not against using this approach at all, although it will be
problematic for windows users that now use nmake.
I rather feel that if they have a broken installation, they will find
this on every program they try to build and need to sort out their
system.
> It was also fairly complicated to get it even close to working on multiple
> machines. IE the various tests have to define various macros, and that
> also got messy. It has the advantage of eliminating some of the spaghetti
> if defined() preprocessor stuff I currently have, but it introduces its
> own form of pasta into the code.
Sorting through someone else's code is hard, to say the least, as I'm
sure you know, so I'm not going to try. But it seems many of the
macros and #defines you define could be dispensed with. Looking in
chess.h for example I see:
#if defined(FreeBSD)
#  undef  HAS_64BITS    /* machine has 64-bit integers / operators */
#  define HAS_LONGLONG  /* machine has 32-bit/64-bit integers */
#  define UNIX          /* system is unix-based */
#endif
#if defined(SGI)
#  undef  HAS_64BITS    /* machine has 64-bit integers / operators */
#  define HAS_LONGLONG  /* machine has 32-bit/64-bit integers */
#  define UNIX          /* system is unix-based */
#endif
#if defined(SUN)
#  undef  HAS_64BITS    /* machine has 64-bit integers / operators */
#  define HAS_LONGLONG  /* machine has 32-bit/64-bit integers */
#  define UNIX          /* system is unix-based */
#endif
Clearly here you are defining HAS_64BITS and HAS_LONGLONG, but
autoconf and automake would define for you instead
HAVE_LONG_LONG if the compiler supports 'long long'
and set
SIZEOF_LONG = 8
if the compiler supports 64-bit longs. Note I say the compiler, since
it's quite possible the machine is 64-bit (like the modern Suns),
whereas the compiler won't be able to generate 64-bit code, and so
needs 'long long'. Of course 'long long' is by no means standard, but
at least a configure script will test whether the compiler accepts it.
Clearly if one builds on a modern Sun running Solaris with a modern
compiler, the machine will be truly 64-bit. Yet when one does a 'make
solaris' or 'make solaris-gcc' your Makefile will define SUN, and so
set these incorrectly. Clearly your 'make solaris' and 'make
solaris-gcc' were set up for 32-bit Suns, not 64-bit ones.
I don't know for sure, but I assume that if you build FreeBSD on Itanium,
or another 64-bit machine, then again these are not defined properly.
This day and age you can't really assume that operating
system X will be 32- or 64-bit, or that any particular architecture
(such as SPARC) is 32 or 64 bits. In the case of Solaris, it can be
32-bit SPARC, 64-bit SPARC, or 32-bit x86.
Before releasing 'atlc'
I switch on a number of UNIX boxes I have (32-bit RS/6000 running AIX,
64-bit Dec Alpha running Tru64, 64-bit SGI Octane running IRIX, 64-bit
HP C3000 running HP-UX, 32-bit PC running Linux, 64-bit Sun running
Solaris) and run a small script I wrote that basically copies
'atlc-x.y.z.tar.gz' to the machine by secure copy (scp), configures it,
makes it, runs 'make check' and sends me back the results. By using
ssh, this can all be done non-interactively.
If it fails to build on one, I investigate why. Sometimes I test on
other machines I have (old SPARC running NetBSD), or some systems such
as FreeBSD, Linux etc on Itaniums and Alphas at
I've found that by testing on a number of platforms, bugs can
sometimes show up on one that don't appear on another, yet are waiting
to bite me. I recently found an issue where the results were not 100%
consistent on an RS/6000 running AIX if configured for multi-threaded
use. I found there was in fact a bug in the algorithm, which means it
needs a rethink. Whilst it only appears to show itself under AIX, I
know it can theoretically appear on any OS, so I have disabled
multi-threading in the latest release and will re-enable it later.
Yet I've managed to avoid a single line of code that has any
#ifdef SYSTEM_X
by the use of autoconf and automake. I don't use any assembler code,
which is of course always going to be system dependent, but that
should not be that hard to handle, although I've not tried it myself.
The autoconf and automake mailing lists are quite helpful places.
> > I've shown you (by private email) that Crafty would not build
> > without some manual editing using Sun's C & C++ compilers on a Sun,
> > gcc on a Sun, SGI's cc on an SGI Octane, IBM's C compiler on an
> > RS/6000, HP's compilers on a Dec Alpha, gcc on an HP PA-RISC, and an SGI
> > Octane running IRIX. Quite simply, the Makefile is seriously broken
> > for UNIX systems.
>
> It is more an issue of the source rather than the Makefile. What happens
> is that someone sends suggested changes, and they often break some rarely-used
> target without my knowing. IE I have some sparcs, but I _never_ build crafty
> on them as they are slower than our slowest PC systems running linux.
Would it not be unnecessary for people to send you those changes if
the code were written in a less system-dependent manner? I could send
you patches so it builds on a 64-bit Sun, or an Alpha running Tru64,
but none should really be necessary.
> > The 'configure' script (which gets distributed) MUST be created on
> > a system which has both autoconf and automake installed. But once that
> > is done, the configure script is distributed and does not require
> > autoconf, automake, or anything else. It is a simple shell script,
> > just requiring /bin/sh.
>
> Yes, but it also depends on the user to have a lot of things pre-defined
> that might not be done. It is very hard to debug user problems when using
> the current code. configure can blow up and that is _much_ harder to track
> down remotely...
What does the user have to have predefined? The configure script will
try looking for a compiler such as cc or gcc (you can set an order of
preference), so the user does not need to define CC. Likewise it will
try to see what flags the compiler will accept, so the user does not need
to define CFLAGS. I'm sure it's a lot easier than trying to use the
Makefile, which would not build first time on any one of the 8 or so
systems I tested it for you on.
> For example, "configure" frequently breaks on packages we get here, often
> with bizarre compiler errors that make no sense. They ultimately link back
> to a bad environment setting that is using compiler A with include files
> B, to produce crap (C). :)
But how is that any different from what your Makefile does? Surely if
compiler A is set up to use include files for compiler B, there is
always going to be hassle. I have both gcc and Sun's cc on this
UltraSPARC and have had relatively few problems. 'crafty' gave me far
more problems than any other command-line package I've used. Obviously big
graphical packages such as OpenOffice can be more of a hassle,
as there are usually lots of dependencies.
> > The 'configure' script, which is a /bin/sh executable, was built on a
> > Sun on which both autoconf and automake were present. So as long as
> > 'configure' is built on a system with autoconf and automake, the user
> > has no need to have perl, autoconf, automake, or anything much
> > else installed in order to install the program. You supply them the
> > 'configure' script, which is a standard /bin/sh executable.
>
> I realize that. As I said, when I wrote my quick response, I really
> wasn't thinking. The problem was the remote support.
I can't help but feel in the longer run it would make your life
easier!
> IE remember that I have installed _thousands_ of packages. Xboard is
> a good example, which uses configure, and which does break on occasion
> due to environment issues.
I had far less hassle building xboard than I did crafty, and xboard
is graphical in nature, which one would expect to be more difficult.
> I'm not against using this approach at all, although it will be
> problematic for windows users that now use nmake.
I've had a user build 'atlc', which is basically a UNIX program, under
Windoze using 'Cygwin'.
That threw up one or two issues I easily fixed. I'd used
fopen(somefile, "r"), but since it was a binary file, I needed to use
"rb" on Windoze. But UNIX systems silently ignore the "b", so it makes
no difference. Since I compute a checksum of files for test purposes,
I needed to allow two different checksums on the text files, due to
the usual CR or CR/LF issue. But once those two were resolved, there
was no problem building atlc on Windoze either. (The method I used to
read/write binary files does not depend on the endianness of the
machine.)
Don't take it the wrong way Robert; 'crafty' is an excellent program,
and no doubt many are pleased you have made it open-source. But I'm
sure there must be many people who download it, look for a README, but
don't find one. Then they look for a 'configure' script, realise one is
not there, so type 'make', then see a whole load of totally confusing
messages, and just give up. That is a real shame, as it's an excellent
program, once one has figured out how to build it.
David Kirkby.
> I rather feel that if they have a broken installation, they will find
> this on every program they try to build and need to sort out their
> system.
I would agree. However, pretty "green" users still want to play
chess, and they generate a _ton_ of questions, which is what I was
talking about.
> and set
> SIZEOF_LONG = 8
I agree, since we now have plenty of 64-bit machines around, and in the
case of the Opteron, it can be either, just like the Sun, which makes
things even messier.
> Before releasing 'atlc'
> http://atlc.sourceforge.net/
> #ifdef SYSTEM_X
It is more difficult when performance counts. IE look at the many ways
FirstOne() is done. I suppose it is possible to handle that by simply
testing each possible way of doing it...
>> > The 'configure' script (which gets distributed) MUST be created on
>> > a system which has both autoconf and automake installed. But once that
>> > is done, the configure script is distributed and does not require
>> > autoconf, automake, or anything else. It is a simple shell script,
>> > just requiring /bin/sh.
>>
>> Yes, but it also depends on the user to have a lot of things pre-defined
>> that might not be done. It is very hard to debug user problems when using
>> the current code. configure can blow up and that is _much_ harder to track
>> down remotely...
> What does the user have to have predefined? The configure script will
> try looking for a compiler such as cc or gcc (you can set an order of
> preference), so the user does not need to define CC. Likewise it will
> try to see what flags the compiler will accept, so the user does not need
> to define CFLAGS. I'm sure it's a lot easier than trying to use the
> Makefile, which would not build first time on any one of the 8 or so
> systems I tested it for you on.
The user has to have the right environment variables set to point to the
right includes, libraries, ld files, etc...
>> For example, "configure" frequently breaks on packages we get here, often
>> with bizarre compiler errors that make no sense. They ultimately link back
>> to a bad environment setting that is using compiler A with include files
>> B, to produce crap (C). :)
>
> But how is that any different from what your Makefile does? Surely if
> the compiler A is set up to use include files for compiler B, there is
> always going to be hassle. I have both gcc and Suns cc on this
> UltraSPARC and have had relatively few problems. 'crafty' gave me far
> more problems than other command line package I've used. Obviously big
> graphical based packages such as OpenOffice can be more of a hassle,
> as there are usually lots of dependancies.
>
>
>> > The 'configure' script, which is a /bin/sh executable, was built on a
>> > Sun on which both autoconf and automake were present. So as long as
>> > 'configure' is built on a system with autoconf and automake, the user
>> > has no need to have perl, autoconf, automake, or anything much
>> > else installed in order to install the program. You supply them the
>> > 'configure' script, which is a standard /bin/sh executable.
>>
>> I realize that. As I said, when I wrote my quick response, I really
>> wasn't thinking. The problem was the remote support.
> I can't help but feel in the longer run it would make your life
> easier!
As I said to you via email, I'm certainly willing to look at it again,
as my first cut at this was _way_ early in the Crafty development, back
around 1995.
> David Kirkby.
--