I just merged the nona-gpu branch into trunk. This is a present for
Andrew on a very special day for him :)
Details of how I made the merge are documented at
<http://panospace.wordpress.com/2009/08/02/preparing-the-next-release/>
Now for releases/2009.2:
* The Windows SDK will need to be updated for a new dependency: GLEW
* currently GPU stitching is available only from the command line (nona
with the switch -g).
* I did some initial investigation on adding support in the GUI. More
will follow.
* Even without GUI support, I think we have something worth releasing.
* I suggest we branch out a release codeline as soon as trunk is
confirmed to build without breaking existing functionality on the
major platforms - Linux / Windows / OSX.
Yuv
thanks for looking into this.
T. Modes wrote:
> The trunk does not compile on windows. (I asked for help some time
> ago, but there was no reaction.)
yes, this is critical for release and I guess there will be some
reaction now. I also estimate that this is relatively easy to fix with
the combined brain power on this list.
> * include <sys/time.h> not found
would something like:
#ifdef _WIN32
#include <winsock.h>
#else
#include <sys/time.h>
#endif
work?
> Finally copy glut32.dll to hugin/bin-folder.
the current policy for the Windows binary is to avoid DLL Hell and link
statically. This should not be too difficult to fix. I recall doing this
for GLEW while helping James integrate the fast preview a year ago. I
hope to find the time to fix my Windows build chain next week and have a
look into this if it has not been already solved by somebody else.
> Now the trunk compiles and a first test run was successful.
that's already a significant step forward. Can you provide a patch file
of what you did?
It would also be helpful to have some feedback about the status on OS X
from the many people working on that platform.
> Could someone (Andrew?) look at the points in ImageTransformsGPU.cpp
> and Interpolator.h? I don't know if my quick fixes are correct
the whole point of putting things in trunk is for people to have a look.
One test is to run nona with the -g switch and without it on the same
images and compare outputs.
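The comparison itself can be as simple as a per-pixel diff once both outputs are decoded to raw buffers. A minimal sketch (the function name is illustrative; decoding the TIFFs into buffers is outside its scope, and vigra, which Hugin already uses, can do that part):

```cpp
// Sketch: compare two remapped outputs pixel by pixel and report the
// maximum absolute per-channel difference. Both buffers are assumed
// to be already decoded to interleaved 8-bit samples of identical
// dimensions; a mismatch in size means the images are not comparable.
#include <cstddef>
#include <cstdint>
#include <cstdlib>
#include <vector>

int maxChannelDiff(const std::vector<uint8_t>& a,
                   const std::vector<uint8_t>& b)
{
    if (a.size() != b.size())
        return -1;  // dimensions differ: not comparable
    int worst = 0;
    for (std::size_t i = 0; i < a.size(); ++i) {
        int d = std::abs(static_cast<int>(a[i]) - static_cast<int>(b[i]));
        if (d > worst)
            worst = d;
    }
    return worst;
}
```

A result of 0 means CPU and GPU remapping agree bit-for-bit; small nonzero values would point at rounding differences in the shader path rather than outright breakage.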
thanks again for moving forward so fast on this!
Yuv
yes. this is very much along the lines of what I was thinking. however
my thinking is still incomplete and instead of continuing to fiddle with
the code I stopped and took a broader look.
I see two ways to approach the problem:
1. forward-thinking: start with a GUI mockup and dig into the code to
implement the functionalities until it trickles down to the actual
project execution. your mockup is the start. this is what I did until I
came across an inconsistency that should not exist: the -g option is passed
to the Makefile generated when saving the project, but not to the
Makefile generated when hitting "stitch now".
2. backward-thinking: start with the end, i.e. with the project
execution, and work upstream to the user defined parameters in the GUI.
I was actually able to *partially* pass the -g parameters to a Makefile:
The saved .pto.mk file was correct, so CLI stitching would work, and
probably also PTbatcher (I admit I have not used/studied that part of
Hugin). But when I hit the "stitch now" button, the temporary Makefile
created differs from the saved one and it does not work.
Then I thought backward, digging into the code to the points where the
Makefiles are executed and where they are generated. To my shock, they
are different.
And I think I found at least one significant issue: Hugin relies on
wxWidgets to get/set preferences. This dependency handicaps CLI tools,
which simply do not read these preferences and "forget" about settings
such as the tmp location or celeste's threshold. That is neither
understandable nor acceptable for a user who expects his preferences
to be honored.
wxWidgets has the advantage of abstracting the details of different
operating systems in dealing with the preferences: they are stored in
~/.hugin on Linux (and OSX) systems; and in the registry on Windows.
Fully transparent to Hugin. Very convenient, but with limitations:
unless we find a way to parse and implement these preferences in the CLI
tools, the current way of setting, reading and storing preferences is
unacceptable.
It is a major design bug that hitting "stitch now" or "save" in the
Hugin GUI on the same system and with the same project yields two
different Makefiles.
We need a consistent way to set/store/get preferences that is
independent of wxWidgets and that can be used by *all* tools - whether
GUI or CLI.
generating and reading .pto and .pto.mk files should always yield the
same results.
I see two solutions to the problem.
The comprehensive solution would be a new library (as in: consider
whether there are solutions already used by other Open Source projects
before writing one from scratch) to deal with setting/storing/reading
preferences, linkable from both the GUI and the CLI tools. The
workaround would be a parser that reads the existing preference
settings wherever they live (~/.hugin file or Hugin registry key) to
make them available to the CLI tools.
I personally tend more toward the "new" solution. A good one would also
free us of the Windows registry and save the preferences in a .ini file,
and maybe even move to an XML file format (for both preferences and
project files), making it easier to maintain and expand.
In the meantime, trying to pass arguments such as -g will necessarily
be a dirty job.
Yuv
Tom Sharpless wrote:
> That is a good policy with respect to many of the open source
> libraries included in the Hugin code base, that tend to be in a
> constant state of flux. Building them along with Hugin is a sensible
> defense against "version Hell".
>
> It is not a good policy with respect to stable, standardized system
> components such as GLUT.
mixing static and dynamic linking is confusing.
moreover, can you guarantee that the user won't go into the installation
folder and fiddle with those DLLs? or that some anti-virus software
won't go ballistic?
> I would suggest the following policy. If you can get a prebuilt
> standard DLL you need to run Hugin, from a reputable developer, use
> that. Build from source only when you wrote or modified the source.
this would require changes in the CMake build (which is already far from
robust) and of the related SDK.
if somebody feels like cleaning up, a robust CMake build should let the
builder switch between static and dynamic linking with a parameter. Then
we can have static and dynamic builds next to each other, compare pros
and cons, gain some experience, and make an informed decision.
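Such a switch could be little more than a CMake option; a sketch under the assumption of a single switchable library (the option name, library targets, and variables are hypothetical, not the actual Hugin CMakeLists):

```cmake
# Sketch: let the builder choose static vs. dynamic linking.
option(BUILD_STATIC_LIBS "Link third-party libraries statically" ON)

if(BUILD_STATIC_LIBS)
  set(GLUT_LIBRARY glut_static)
else()
  set(GLUT_LIBRARY glut32)
  # a dynamic build must also ship the DLL next to the binaries
  install(FILES ${GLUT_DLL} DESTINATION ${BINDIR})
endif()
```

With something like this in place, static and dynamic builds really could sit next to each other from the same tree.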
Yuv
T. Modes wrote:
> It works only partially. The compiler complains about missing
> gettimeofday. This function is not available on windows. Here we need
> a replacement. (I found a file, but without clear licence, see my
> previous post).
http://www.winehq.org/pipermail/wine-devel/2003-June/018082.html
says GPL. I'm still on Linux, have not tried it.
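For reference, such replacements usually have this shape. A hedged sketch, not the wine-devel code itself: the Windows branch converts GetSystemTimeAsFileTime() ticks (100 ns since 1601) to the Unix epoch; names prefixed my_ are illustrative, and the Windows branch is untested here:

```cpp
// Sketch of a gettimeofday() replacement for Windows. On Windows,
// derive the time from GetSystemTimeAsFileTime(); elsewhere, fall
// back to the POSIX call. The constant is the offset in microseconds
// between 1601-01-01 (FILETIME epoch) and 1970-01-01 (Unix epoch).
#ifdef _WIN32
#include <windows.h>
struct my_timeval { long long tv_sec; long tv_usec; };
inline int my_gettimeofday(my_timeval* tv)
{
    FILETIME ft;
    GetSystemTimeAsFileTime(&ft);
    unsigned long long t =
        ((unsigned long long)ft.dwHighDateTime << 32) | ft.dwLowDateTime;
    t /= 10;                      // 100-ns ticks -> microseconds
    t -= 11644473600000000ULL;    // 1601-01-01 -> 1970-01-01
    tv->tv_sec  = (long long)(t / 1000000ULL);
    tv->tv_usec = (long)(t % 1000000ULL);
    return 0;
}
#else
#include <sys/time.h>
struct my_timeval { long long tv_sec; long tv_usec; };
inline int my_gettimeofday(my_timeval* tv)
{
    timeval t;
    const int r = gettimeofday(&t, 0);
    tv->tv_sec  = t.tv_sec;
    tv->tv_usec = t.tv_usec;
    return r;
}
#endif
```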
> I tried both, but with -g switch I get an output image with only one
> line of pixels (all other pixels are "empty"). So a comparison is
> really difficult ;-)
I would say comparison is really easy. Unfortunately it tells us that
whatever happened is not right yet.
Yuv
T. Modes wrote:
> could you test if the patch works under linux or if it breaks.
once the typo at line 47 in ImageTransformsGPU.cpp is fixed, it builds.
I think you can commit it.
> With nona -g the remapped images sometimes have a slightly different width:
this is because a multiple of 8 yields faster GPU transfers. The
question is: can we identify the padding pixels and crop them away from
the result after it comes out of the GPU?
we'll have to look at this while polishing up for release. There are use
cases where the padding pixels are a nuisance (e.g. if the result should
be an equirectangular of 398x199, 400x199 won't do). I filed a bug
report:
<https://sourceforge.net/tracker/?func=detail&aid=2831574&group_id=77506&atid=550441>
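If nona keeps the originally requested width around, the padding columns are trivial to identify; a minimal sketch of the rounding involved (the helper name is illustrative, not Hugin code):

```cpp
// Sketch: round an image width up to the next multiple of 8 for GPU
// transfer. Keeping the original width alongside the padded one lets
// the caller crop the padding columns away after readback: the valid
// data is simply the first `w` columns of each row.
inline int padWidth(int w, int align = 8)
{
    return ((w + align - 1) / align) * align;
}
```

For the equirectangular case from the bug report, a requested width of 398 would be padded to 400, and the final output should crop back to 398.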
I'm currently on a laptop that does not have a useful GPU to test:
$ nona -g -o gputest.jpg _MG_8768-_MG_8873.pto
get fences failed: -1
param: 6, val: 0
nona: using graphics card: Tungsten Graphics, Inc Mesa DRI Intel(R)
915GM GEM 20090326 2009Q1 RC2 x86/MMX/SSE2
nona: extension GL_ARB_fragment_shader = false
nona: extension GL_ARB_vertex_shader = false
nona: extension GL_ARB_shader_objects = false
nona: extension GL_ARB_shading_language_100 = false
nona: extension GL_ARB_texture_rectangle = true
nona: extension GL_ARB_texture_border_clamp = true
nona: this graphics card lacks the necessary extensions for -g.
nona: sorry, the -g flag is not going to work on this machine.
Makes me think of the preference settings for Hugin: would it be useful
to run these tests when starting Hugin, and gray out the preference if
the GPU can't be used?
Yuv
Panini shows some very useful information "About your OpenGL
Implementation" in the "about Panini" dialog.
Either a tab in the preferences dialog or the "about" dialog would be a
great place to list some technical information about the system / hardware.
Maybe there is also a way to suggest a max. amount of RAM for enblend or
other settings.
Carl
Guido
you will need to tell CMake to copy it from a folder in the SDK to
INSTALL/FILES/bin
something like:
INSTALL(FILES glew.dll DESTINATION ${BINDIR})
> test scenario to check the build result. I didn't follow the discussion
> about the broken build in depth.
the build is fixed now. to test the build, just cd to the binaries
folder and start nona with GPU and with a test project:
nona -g -o output.tif project.pto
> 1. How can I test whether glut integration is working well?
see above.
> 2. a. Where should the glut dynamic library be copied?
if you really want to, into the same folder where nona is copied. but
make dynamic linking optional, with static linking as the default, please.
> 2.b. Or should we use a static lib instead?
I would prefer static libs first. Once 2009.2 is out, we can look into
moving the Windows version from full static build to either full dynamic
or mixed build.
There is already enough on the plate for 2009.2. And for Windows.
I committed today Ryan's Visual Studio 2008 x64 compile fixes. That's
half of the patches to produce a Windows 64bit version - a request that
has been in the air for quite some time and which has more significant
impact on the users than how libraries are linked.
If I am not mistaken the second half of his patches [0] are for the SDK,
that will need an update.
Speaking of the SDK, I don't like the current status. Unless my memory
betrays me, the original SDK had all the binaries to build Hugin *and*
Enfuse-Enblend. Even if it didn't, I think a proper SDK should include
them, because Hugin requires Enfuse-Enblend.
So bottom line: there is a lot of pent-up work and I don't think that
we'll do ourselves or our Windows users a favor by venturing now into
changing how libraries are linked.
Yuv
[0] -
<https://sourceforge.net/tracker/?func=detail&aid=2789320&group_id=77506&atid=550443>
Habi
Please find attached a small SDK extension. Simply rename the file
glut.txt to glut.zip and extract it to the SDK directory. There should
be a directory glut afterwards.
Furthermore the CMakeLists.txt which resides in root directory of the
trunk has to be modified. I attached my current one.
Thomas, maybe you can test whether it fixes your build problems. If so,
I will update the SDK and the files in the repository.
Thanks,
Guido
thanks
>> Speaking of the SDK, I don't like the current status. Unless my memory
>> betrays me, the original SDK had all the binaries to build Hugin *and*
>> Enfuse-Enblend. Even if it didn't, I think a proper SDK should include
>> them, because Hugin requires Enfuse-Enblend.
>>
> I only maintain the 32-bit part of the SDK, because I don't have a 64-bit
> OS. I don't mind if Ryan will provide an extended/modified SDK for
> 64-bit. For instance in terms of an archive that will contain all
> additional files which can easily be copied into my base SDK.
this is similar to how I see it: we need an official SDK that contains
your contributions, Ryan's contributions, and the contributions of
whoever else does it. If it makes sense to have two separate SDKs
(32bit/64bit) so be it. But the official SDK should be a complete and
stable one.
> The second
> point (enblend/enfuse) is likely religious :-) I followed the approach
> to use stable releases of all additional applications and libraries.
> Thus I take the, by now perhaps outdated, stable version of
> enblend/enfuse. If there is any other version which is likely stable I
> can integrate it. Currently I cannot see such version on the
> enblend/enfuse homepage. I know you prefer another build chain approach
> in terms of a bleeding edge environment.
I don't see it as a religious choice of mutually exclusive preferences :-)
There are two distinct types of SDK with two distinct purposes:
- a stable SDK, for the purpose of building rock-solid stable releases
- a bleeding-edge SDK, for the purpose of developing the future, giving
feedback to upstream libraries and code providers, and adding new
features to Hugin.
You are focused on the stable SDK. We might differ on the details, but
in principle I agree with you. We need a stable, *complete* SDK for
this, to which you, Ryan, and others can contribute. I find what is
currently your SDK to be incomplete, even just for a pure 32bit build
(which is what I do; although I have an XP_64 license, it is currently
not installed).
At this moment my experience with enblend-enfuse 3.2 is that it is not
stable, even if it is the official "stable" distribution. I would
replace it with a current build from Christoph Spiel's staging branch. I
don't know if you would prefer to leave it as-is or to step back to 3.0,
or anything in between.
>> So bottom line: there is a lot of pent-up work and I don't think that
>> we'll do ourselves or our Windows users a favor by venturing now into
>> changing how libraries are linked.
>>
>>
> Well that's right, but I know about the library problems on windows (DLL
> hell), and before I choose one of two ways I wanted to get some
> impressions which way would be preferred.
both ways: the static link for stable and the dynamic link for bleeding
edge ;-)
Yuv
Guido Kohlmeyer wrote:
> Yuval Levy schrieb:
>> the build is fixed now. to test the build, just cd to the binaries
>> folder and start nona with GPU and with a test project:
>>
>> nona -g -o output.tif project.pto
>>
> I've tested a static glut library on my machine but unfortunately my
> graphic card does not support the mandatory OpenGL methods.
can you post a binary of your nona-gpu for download so that Windows
users with an appropriate video card can give it a try?
I am particularly keen to hear about speed improvements (or not), i.e.
have the user report their system's performance when doing:
nona -o output.tif project.pto
vs.
nona -g -o output.tif project.pto
we need to know about the system:
- CPU (model, frequency, RAM, etc.)
- GPU (model, frequency, RAM, etc.)
note that nona will automatically use all the cores in a system, but if
you want to be sure, just add -t N, with N being the number of threads
to be used (2 on dual-core machines, 4 on quad-core, 8 on a Core i7,
which has 4 cores with 2 threads per core).
> Please find attached a small SDK extension.
thanks
Yuv
does it take long until it returns from execution?
Yuv
I uploaded an archived version of nona to the panotools webspace:
http://hugin.panotools.org/testing/hugin/nona.zip
It is based on my current build environment.
Due to the fact that a lot of progress is being made using GPU power, it
is time to get a new machine. My laptop is more and more unstable (maybe
a broken motherboard) and my desktop is much older.
Guido Kohlmeyer wrote:
> I uploaded an archived version of nona to the panotools webspace:
> http://hugin.panotools.org/testing/hugin/nona.zip
thank you. I'll boot into Windows tomorrow and try it. On Ubuntu I did
not have much luck.
> Due to the fact that a lot of progress is being made using GPU power,
> it is time to get a new machine. My laptop is more and more unstable
> (maybe a broken motherboard) and my desktop is much older.
seems that we are in the same boat. laptop (HP Compaq nc6120) has a dry
contact on the ICH7. Useless in Windows; in Ubuntu I can still use it
reasonably well, but I have to apply some pressure / massage from time
to time to get the contacts going again. desktop: AMD X2 6000+ with 8GB
RAM. Probably buying a video card to complement the onboard nVidia 6150
would do, but which video card? I have no clue about today's video card
market (the last one I bought was in 2002, still AGP). I'm no gamer, but
a nice video card that can be used for nona-gpu and maybe Photoshop
would be nice.
who recommends what and why?
Yuv