Fwd: Google Summer of Code 2011 Announced

Yuval Levy

Jan 25, 2011, 7:45:03 AM
to hugi...@googlegroups.com
does anybody care to take care of this?
Yuv

---------- Forwarded Message ----------

Subject: Google Summer of Code 2011 Announced
Date: January 24, 2011, 05:21:01 pm
From: Carol Smith
To: Google Summer of Code Announce

Hi all,

We're pleased to announce that Google Summer of Code will be happening for
its seventh year this year. Please check out the blog post [1] about the
program and read the FAQs [2] and Timeline [3] on Melange for more
information.

[1] -
http://google-opensource.blogspot.com/2011/01/google-summer-of-code-announced-at-lca.html
[2] -
http://www.google-melange.com/document/show/gsoc_program/google/gsoc2011/faqs
[3] -
http://www.google-melange.com/document/show/gsoc_program/google/gsoc2011/timeline

Cheers,
Carol


-----------------------------------------


David Haberthür

Jan 25, 2011, 4:05:44 PM
to hugi...@googlegroups.com
On 25.01.2011, at 13:45, Yuval Levy wrote:
> does anybody care to take care of this?
> Yuv

Hey
I'm supposedly a bit less busy with work this summer, so I'd like to help wherever possible to give back something to hugin et al.
I don't think I can (and would) manage the effort alone, but I'd be happy to help wherever I can: Where can I start?
For starters, I've made a wiki-page for the upcoming Summer of Code: http://wiki.panotools.org/GSOC_2011
Habi

Yuval Levy

Jan 25, 2011, 11:28:57 PM
to hugi...@googlegroups.com

thanks, Habi!

There is a wealth of process information from previous years in the Wiki.

If there was one thing that I would improve process-wise this year, it would
be using blueprints [0] on Launchpad to track/develop ideas and student
proposals.

For this to happen, somebody could collect (copy&paste) the project
ideas from the past four editions that have not materialized yet and are still
pertinent or just partially implemented and turn each of them into a blueprint
[0]. A good place to start is the lists at [1][2][3][4].

For those that do not have Vetting Exercises (we introduced them in 2010), or
for those whose Vetting Exercises are outdated, we'd need developers' help to
specify new Vetting Exercises.

The student application template [5] can be re-used with an update/refresh.

You want to start calling for mentors; and prepare for the availability of the
organization application form from Google early March.

You may find some useful information from the past [6] to fill it in. Last year,
IIRC, I emailed Bruno some more text from the 2009 application. I can try
to look for that in my sent emails folder; and I don't know if Bruno / James /
Tim kept track of the answers to the application form last year. Google may
change / fine-tune the questions.

This is all I can think of that can be done in preparation of the bid.

Thank you for picking this up!
Yuv


[0] https://blueprints.launchpad.net/hugin
[1] http://wiki.panotools.org/SoC2007_projects#Processing_of_very_large_images
[2] http://wiki.panotools.org/SoC_2008_ideas
[3] http://wiki.panotools.org/SoC_2009_idea
[4] http://wiki.panotools.org/SoC_2010_ideas
[5] http://wiki.panotools.org/SoC2009_Application_Template
[6] http://wiki.panotools.org/SoC2007_application


David Haberthür

Jan 27, 2011, 7:31:56 AM
to hugi...@googlegroups.com
>> Hey
>> I'm supposedly a bit less busy with work this summer, so I'd like to help
>> wherever possible to give back something to hugin et al. I don't think I
>> can (and would) manage the effort alone, but I'd be happy to help wherever
>> I can: Where can I start? For starters, I've made a wiki-page for the
>> upcoming Summer of Code: http://wiki.panotools.org/GSOC_2011 Habi
>
> thanks, Habi!
You're welcome, for now it's just a declaration of intent :)

> There is a wealth of process information from previous years in the Wiki.
>
> If there was one thing that I would improve process-wise this year, it would
> be using blueprints [0] on Launchpad to track/develop ideas and student
> proposals.
>
> For this to happen, somebody could collect (copy&paste) the project
> ideas from the past four editions that have not materialized yet and are still
> pertinent or just partially implemented and turn each of them into a blueprint
> [0]. A good place to start is the lists at [1][2][3][4].

I think collecting and converting the ideas in the wiki to blueprints is a good idea. I'll try to pick up the loose ends around there soon.

> For those that do not have Vetting Exercises (we introduced them in 2010), or
> for those whose Vetting Exercises are outdated, we'd need developers' help to
> specify new Vetting Exercises.

This is something I'd like to delegate: does hugin's GSoC need new and/or updated vetting exercises?

> The student application template [5] can be re-used with an update/refresh.

I'll also look into this and "convert" it to a new page on the wiki.

> You want to start calling for mentors; and prepare for the availability of the
> organization application form from Google early March.

See my upcoming email

> This is all I can think of that can be done in preparation of the bid.

Thanks for the pointers, I've now got some work to do.

> Thank you for picking this up!

Even if I volunteered to start, I'd really rather not do this completely on my own, since I didn't closely follow the process last year. I'd be glad for some more shoulders to put some weight on...
Habi

Jeffrey Martin

Jan 27, 2011, 10:58:44 AM
to hugi...@googlegroups.com
If I may, can I suggest continuing the "vertical line and/or horizon detector" project? I think a few people can back me up in saying that this kind of thing (even if not 100% reliable) is sorely missing, and that its absence makes truly automated streetview-type panos all but impossible.

Jeffrey

David Haberthür

Jan 27, 2011, 11:46:07 AM
to hugi...@googlegroups.com
Hey Jeffrey

On 27.01.2011, at 16:58, Jeffrey Martin wrote:

> If I may, can I suggest continuing the "vertical line and/or horizon detector" project? I think a few people can back me up in saying that this kind of thing (even if not 100% reliable) is sorely missing, and that its absence makes truly automated streetview-type panos all but impossible.

Good idea. Care to add it at https://blueprints.launchpad.net/hugin?
Habi

Jeffrey Martin

Jan 27, 2011, 12:00:34 PM
to hugi...@googlegroups.com

Bruno Postle

Jan 27, 2011, 6:08:18 PM
to Hugin ptx
On Thu 27-Jan-2011 at 13:31 +0100, David Haberthür wrote:

>> Thank you for picking this up!

> Even if I volunteered to start, I'd really rather not do this
> completely on my own, since I didn't closely follow the process
> last year. I'd be glad for some more shoulders to put some weight
> on...

We will help. I did the admin for the summer of code last year and
will do it again if necessary. Though if someone else wants to do
it I can help with support/backup (if we get accepted that is).

--
Bruno

Jeffrey Martin

Jan 28, 2011, 3:07:20 AM
to hugi...@googlegroups.com
I have another idea - what about a new blender (or improving Enblend)? I am always a bit bummed when I think about how superior smartblend is - it's 5 years old and at points very slow, but it's still better than e.g. Enblend.

Bruno Postle

Jan 28, 2011, 5:03:59 PM
to Hugin ptx
On Fri 28-Jan-2011 at 00:07 -0800, Jeffrey Martin wrote:

> I have another idea - what about a new blender (or improving
> Enblend)? I am always a bit bummed when I think about how superior
> smartblend is - it's 5 years old and at points very slow, but it's
> still better than e.g. Enblend.

One thing we have learnt with previous projects is that there isn't
much time and students need to get going straight away - so we need
a well-defined task, even if the details aren't resolved.

A 'better enblend' isn't going to happen unless we know exactly what
is wrong and what is needed to fix it.

e.g. I strongly suspect that we would get faster and better quality
stitching if remapping and blending were combined - i.e. decompose
the image pyramid for each input photo, remap each pyramid layer
separately, splice them together, and finally reassemble the
pyramid. However this is just a suspicion, and without doing some
tests, it would be unfair to ask a student to work on it.

--
Bruno

Jeffrey Martin

Jan 29, 2011, 3:10:43 AM
to hugi...@googlegroups.com
Yes, PTStitcherNG does this, and it is extremely fast. But the blending is not so smart, and Helmut remarked somewhere that due to the nature of how it works (stitch and blend in one step) it couldn't be too smart. And if it makes a mistake in the beginning of the process, then "it's all over".

Sure, "a better enblend" is too vague.  Well, how about this - we collect examples of bad blending (from enblend) and better blending (e.g. from smartblend) and make a project to improve the current Enblend? I wish I had been collecting such examples over the years, but I haven't been. It's not too late to start though. Does anyone think this is a good idea?

But yes, a new "stitcher/blender" that speeds everything up (and hopefully does better seam optimization as well) is a great idea.

David Haberthür

Jan 31, 2011, 12:11:25 PM
to hugi...@googlegroups.com
Hey All.

> For this to happen, somebody could collect (copy&paste) the project
> ideas from the past four editions that have not materialized yet and are still
> pertinent or just partially implemented and turn each of them into a blueprint
> [0].  A good place to start is the lists at [1][2][3][4].
I think collecting and converting the ideas in the wiki to blueprints is a good idea. I'll try to pick up the loose ends around there soon.

I did some copy-pasting today. Several ideas from the wiki are now also living on https://blueprints.launchpad.net/hugin.
Several ideas are (as far as I understand) no longer relevant; please tell me if I forgot important ones.
Deliberately not copied over are
- http://wiki.panotools.org/SoC_2008_ideas#Lens_Database Is this still relevant? How is Lensfun et al. doing?
- http://wiki.panotools.org/SoC_2009_idea#hugin_RAW_support
- http://wiki.panotools.org/SoC_2009_idea#Python_Bindings What about this? There have been quite a few emails on the list which I didn't follow. Should this be added as a blueprint?
- http://wiki.panotools.org/SoC_2010_ideas#Zooming_for_fast_preview Still relevant?
- http://wiki.panotools.org/SoC_2010_ideas#Threading_for_Hugin Ditto

> You want to start calling for mentors; and prepare for the availability of the
> organization application form from Google early March.
See my upcoming email

BTW: I didn't see any feedback from https://groups.google.com/d/topic/hugin-ptx/XeypwPGzf-8/discussion. Did any possible mentors see this email, or did it fly under their radar?

Habi

T. Modes

Jan 31, 2011, 3:30:28 PM
to hugin and other free panoramic software
Hi Habi,


> Deliberately not copied over are
> - http://wiki.panotools.org/SoC_2008_ideas#Lens_Database Is this still
> relevant? How is Lensfun et al. doing?

Yes, still relevant.

> -http://wiki.panotools.org/SoC_2009_idea#hugin_RAW_support

Here I don't know.

> - http://wiki.panotools.org/SoC_2009_idea#Python_Bindings What about this?
> There have been quite a few emails on the list which I didn't follow.
> Should this be added as a blueprint?

The main functionality is implemented in its own branch and it is
working.

> -http://wiki.panotools.org/SoC_2010_ideas#Zooming_for_fast_preview
> Still relevant?

Yes. The fast preview window can be further improved.

> - http://wiki.panotools.org/SoC_2010_ideas#Threading_for_Hugin Ditto

Loading images into the cache with threads has already been solved in
Hugin, so this is not relevant.

I added some more blueprints for these issues.

> BTW: I didn't see any feedback from https://groups.google.com/d/topic/hugin-ptx/XeypwPGzf-8/discussion. Did any
> possible mentors see this email, or did it fly under their radar?

I've seen the mail. But I don't know yet if I have time. It depends
also on the proposals.

Thomas

dmg

Jan 31, 2011, 7:07:07 PM
to hugi...@googlegroups.com
On Sat, Jan 29, 2011 at 5:10 PM, Jeffrey Martin <360c...@gmail.com> wrote:
> Yes, PTStitcherNG does this, and it is extremely fast. But the blending is
> not so smart, and Helmut remarked somewhere that due to the nature of how it
> works (stitch and blend in one step) it couldn't be too smart. And if it
> makes a mistake in the beginning of the process, then "it's all over".
>

If Helmut's previous code is any indication, I presume the speed comes
partially from not using the OO vigra libraries, which
make nona significantly slower than PTmender at doing simple remapping.

I just did a very simple test. My computer has an SSD, hence I think
the results are skewed:

This is PTmender, run 3 times:
Voluntary Context Switches 2; real 2.72; sys 0.05
Voluntary Context Switches 2; real 2.73; sys 0.09
Voluntary Context Switches 1; real 2.73; sys 0.06

And this is Nona:

Voluntary Context Switches 18; real 7.98; sys 0.27
Voluntary Context Switches 19; real 7.94; sys 0.31
Voluntary Context Switches 25; real 8.19; sys 0.31


They both use the same script file. If anybody wants to try them on
real hard drives, that would be great. This is a simple
remapping with 4 input files (and the output is TIFF).


\time -f 'Voluntary Context Switches %w; real %e; sys %S' nona -o rip script.txt
\time -f 'Voluntary Context Switches %w; real %e; sys %S' PTmender -o rip script.txt

(I used the backslash to avoid alias expansion)

If anybody can run these tests on a regular hard drive, that would be
great. I have the feeling that IO will increase nona's time,
but that is only a hunch:

http://turingmachine.org/~dmg/temp/test.zip

--
--dmg

---
Daniel M. German
http://turingmachine.org

Rogier Wolff

Feb 1, 2011, 9:35:22 AM
to hugi...@googlegroups.com

On Mon, Jan 31, 2011 at 12:30:28PM -0800, T. Modes wrote:
> > -http://wiki.panotools.org/SoC_2009_idea#hugin_RAW_support
>
> Here I don't know.

I don't think that hugin should move to support raw formats.

On the input side, users should convert images to the appropriate
bit-depth tiff files, and use those.

On the output side, a camera has a business writing DNG files. It has
a sensor with a Bayer pattern, a specific ADC bit depth, etc. Hugin
can just write TIFF and that should be it. As I understand DNG, hugin
would have to come up with a lot of "default" and "controversial" data
to conform to DNG. Suppose someone didn't fix the camera settings, so
the shots were taken with apertures from f/5.6 to f/11 and shutter
times from 1/320th to 1/640th. I'm assuming that those fields are
mandatory in DNG. It makes sense for a camera to fill those in. But for Hugin?

On the other hand, Hugin might be expanded to allow input-plugins. If
we specify that an input-plugin should accept the filename as the
first argument and should provide PPM output on its standard out,
users can specify say pngtopnm as the input plugin for the PNG format.
Simple shell scripts can become wrappers around programs that don't
conform to the hugin-input-plugin-api.
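
An untested sketch of such a wrapper, just to make the convention concrete
(pngtopnm here is only an example converter, and the plugin calling
convention itself is of course still just my proposal):

#!/bin/sh
# hypothetical hugin input plugin: argument 1 is the input file name,
# the decoded image goes to standard output as PPM/PNM
exec pngtopnm "$1"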

Input plugins can have a companion application that provides metainfo.
Consider using "exiftool" as the interface spec.....

(of course 16bit PPM should be supported, as formats may have
such bit depths)

And on the output side, hugin might be expanded to use EXIF in jpg
output formats, and DNG-like-metainfo in TIFF output to specify things
that ARE relevant and obviously deducible. For example, the time the
picture was taken is not completely defined but it can be taken to be
the time of the first shot, the last shot, an average over all shots
or maybe the median. But it is useful information that should be
carried over one way or another from the source images to the
destination images.
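
Just to illustrate with a hypothetical example (the file names are made up),
carrying the capture time of the first shot over into the stitched TIFF could
be as simple as an exiftool call:

exiftool -TagsFromFile IMG_0001.JPG -DateTimeOriginal panorama.tif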

Is BigTiff (tiff > 4Gbytes) support
- already implemented?
- already on the list?
- not on the list?
- too easy for a SOC project?


Roger.

--
** R.E....@BitWizard.nl ** http://www.BitWizard.nl/ ** +31-15-2600998 **
** Delftechpark 26 2628 XH Delft, The Netherlands. KVK: 27239233 **
*-- BitWizard writes Linux device drivers for any device you may have! --*
Q: It doesn't work. A: Look buddy, doesn't work is an ambiguous statement.
Does it sit on the couch all day? Is it unemployed? Please be specific!
Define 'it' and what it isn't doing. --------- Adapted from lxrbot FAQ

Jeffrey Martin

Feb 1, 2011, 9:59:35 AM
to hugi...@googlegroups.com
i agree about raw input / dng output - it's no business of hugin.

what about PSB output?

paul womack

Feb 1, 2011, 10:29:27 AM
to hugi...@googlegroups.com
Rogier Wolff wrote:
>
> On Mon, Jan 31, 2011 at 12:30:28PM -0800, T. Modes wrote:
>>> -http://wiki.panotools.org/SoC_2009_idea#hugin_RAW_support
>>
>> Here I don't know.
>
> I don't think that hugin should move to support raw formats.

There is one plausible (but only just) reason.

I was looking into super-resolution processing,
where multiple images that are offset by fractional
pixel distances are merged to give an image with a resolution
(not just pixel count) higher than the original sensor.

The paper (*) claimed that higher quality results were
obtained when the interpolation of the Bayer matrix
and the interpolation for super-resolution
were done "in one operation", which is a bit like Hugin's trick
of composing all of its image transforms together before
sampling/interpolating the source image just once.

Too marginal to be a real justification IMHO, but I thought
it might be of theoretical interest.

BugBear

(*) I'll try to find the original paper if anyone really wants
the evidence.

kfj

Feb 1, 2011, 11:49:10 AM
to hugin and other free panoramic software


On 1 Feb., 15:35, Rogier Wolff <rew-googlegro...@BitWizard.nl> wrote:

> On the other hand, Hugin might be expanded to allow input-plugins.  If
> we specify that an input-plugin should accept the filename as the
> first argument and should provide PPM output on its standard out,
> users can specify say pngtopnm as the input plugin for the PNG format.
> Simple shell scripts can become wrappers around programs that don't
> conform to the hugin-input-plugin-api.

Input plugins sound fine to me. The work on the Python plugin
interface has advanced nicely recently, so I'd even propose thinking
about using Python plugins for the job.

There are enticing possibilities in looking at raw data, though - e.g.
having twice the resolution in the green channel without any camera-
specific messing having touched the data. And all raw data could be
taken in as DNG, making it a less daunting task than trying to
accommodate every manufacturer's format.

Kay

Gnome Nomad

Feb 1, 2011, 11:11:47 PM
to hugi...@googlegroups.com
Rogier Wolff wrote:
> On Mon, Jan 31, 2011 at 12:30:28PM -0800, T. Modes wrote:
>>> -http://wiki.panotools.org/SoC_2009_idea#hugin_RAW_support
>> Here I don't know.
>
> I don't think that hugin should move to support raw formats.

I agree. There are plenty of programs around to convert various raw
formats to TIFF or JPG. I think the most the wiki could benefit from is
listing some common raw formats and names of programs that can convert them.

--
Gnome Nomad
gnome...@gmail.com
wandering the landscape of god
http://www.cafepress.com/otherend/

Gnome Nomad

Feb 1, 2011, 11:13:46 PM
to hugi...@googlegroups.com

I use Bibble Lite. It can convert a 48-bit RAW file into a 48-bit TIFF
file, so the raw data is there for Hugin to use.

Carl von Einem

Feb 2, 2011, 4:19:31 AM
to hugi...@googlegroups.com
Gnome Nomad wrote on 02.02.11 05:11:
> (...) I think the most the wiki could benefit from is listing some

> common raw formats and names of programs that can convert them.

I just searched for 'raw' in the panotools wiki ->
<http://wiki.panotools.org/RAW>
In the paragraph "External links" follow the first link "w:Camera raw at
Wikipedia". Choose "Software support" from the TOC.

Direct URL is <http://en.wikipedia.org/wiki/Camera_raw#Software_support>

Cheers,
Carl

Gnome Nomad

Feb 2, 2011, 5:27:55 AM
to hugi...@googlegroups.com

Thanks.

Felix Hagemann

Feb 3, 2011, 4:24:16 PM
to hugi...@googlegroups.com
On 1 February 2011 01:07, dmg wrote:
> I just did a very simple test. My computer has an SSD, hence I think
> the results are skewed:
>
> This is PTmender, run 3 times:
>  Voluntary Context Switches 2; real 2.72; sys 0.05
>  Voluntary Context Switches 2; real 2.73; sys 0.09
>  Voluntary Context Switches 1; real 2.73; sys 0.06
>
> And this is Nona:
>
> Voluntary Context Switches 18; real 7.98; sys 0.27
> Voluntary Context Switches 19; real 7.94; sys 0.31
> Voluntary Context Switches 25; real 8.19; sys 0.31
>
> [...]

>
> If anybody can run these tests on a regular hard drive, that would be
> great. I have the feeling that IO will increase nona's time,
> but that is only a hunch:

PTMender:
Voluntary Context Switches 2; real 9.14; sys 0.24
Voluntary Context Switches 4; real 9.82; sys 0.18
Voluntary Context Switches 3; real 10.50; sys 0.14

Nona:
Voluntary Context Switches 13; real 43.98; sys 0.40
Voluntary Context Switches 1; real 60.32; sys 0.49
Voluntary Context Switches 1; real 56.58; sys 0.44

This is on a regular (and quite slow) hard drive on a fairly slow
system. Obviously I need something beefier for stitching...
Hope it helps,
Felix

David Haberthür

Feb 4, 2011, 10:06:51 AM
to hugi...@googlegroups.com
Hey Thomas.

>> Deliberately not copied over are
>> - http://wiki.panotools.org/SoC_2008_ideas#Lens_Database Is this still
>> relevant? How is Lensfun et al. doing?
>
> Yes, still relevant.
>

>> -http://wiki.panotools.org/SoC_2010_ideas#Zooming_for_fast_preview
>> Still relevant?
>
> Yes. The fast preview window can be further improved.

Thanks for adding those two ideas to the blueprints at launchpad.
Habi

David Haberthür

Feb 4, 2011, 10:12:05 AM
to hugi...@googlegroups.com
Hey Jeffrey

On 28.01.2011, at 09:07, Jeffrey Martin wrote:

> I have another idea - what about a new blender (or improving Enblend)? I am always a bit bummed when I think about how superior smartblend is - it's 5 years old and at points very slow, but it's still better than e.g. Enblend.

Could you expand a bit on this idea and maybe even add it to https://blueprints.launchpad.net/hugin?
Or if you (or others) put a bit more metaphorical meat on the bones I can add it there.
Habi

David Haberthür

Feb 4, 2011, 10:30:31 AM
to hugi...@googlegroups.com
Hey Daniel

> I just did a very simple test. My computer has an SSD, hence I think
> the results are skewed:
>
> This is PTmender, run 3 times:
> Voluntary Context Switches 2; real 2.72; sys 0.05
> Voluntary Context Switches 2; real 2.73; sys 0.09
> Voluntary Context Switches 1; real 2.73; sys 0.06
>
> And this is Nona:
>
> Voluntary Context Switches 18; real 7.98; sys 0.27
> Voluntary Context Switches 19; real 7.94; sys 0.31
> Voluntary Context Switches 25; real 8.19; sys 0.31
>
> They both use the same script file. If anybody wants to try them on
> real hard drives, that would be great. This is a simple
> remapping with 4 input files (and the output is TIFF).
>
>
> \time -f 'Voluntary Context Switches %w; real %e; sys %S' nona -o rip script.txt
> \time -f 'Voluntary Context Switches %w; real %e; sys %S' PTmender -o rip script.txt


On OS X the "time" command in the console doesn't seem to know the "-f" flag, so I let it run without it

PTmender
loligo:Downloads habi$ time /Applications/Hugin.app/Contents/MacOS/PTmender -o rip script.txt
real 0m3.626s
user 0m2.967s
sys 0m0.137s

loligo:Downloads habi$ time /Applications/Hugin.app/Contents/MacOS/PTmender -o rip script.txt
real 0m4.267s
user 0m3.023s
sys 0m0.143s

loligo:Downloads habi$ time /Applications/Hugin.app/Contents/MacOS/PTmender -o rip script.txt
real 0m3.296s
user 0m2.935s
sys 0m0.126s

loligo:Downloads habi$ time /Applications/Hugin.app/Contents/MacOS/PTmender -o rip script.txt
real 0m3.626s
user 0m2.984s
sys 0m0.141s

---
nona:
loligo:Downloads habi$ time /Applications/Hugin.app/Contents/MacOS/nona -o rip script.txt
real 0m13.295s
user 0m21.331s
sys 0m0.294s

loligo:Downloads habi$ time /Applications/Hugin.app/Contents/MacOS/nona -o rip script.txt
real 0m13.191s
user 0m21.433s
sys 0m0.291s

loligo:Downloads habi$ time /Applications/Hugin.app/Contents/MacOS/nona -o rip script.txt
real 0m12.843s
user 0m21.325s
sys 0m0.291s

loligo:Downloads habi$ time /Applications/Hugin.app/Contents/MacOS/nona -o rip script.txt
real 0m12.903s
user 0m21.326s
sys 0m0.296s


Does that help?
Habi

David Haberthür

Feb 4, 2011, 10:36:10 AM
to hugi...@googlegroups.com

On 01.02.2011, at 15:59, Jeffrey Martin wrote:

> i agree about raw input / dng output - it's no business of hugin.
>
> what about PSB output?

In the end I also agree about the RAW input, it's best left to other software.
Why would PSB (a quick google detour hinted it's PhotoShop) be desired? Is a layered TIFF not something you can import into PhotoShop?
Do you have any idea if there are (free/libre) libraries that can write according to Adobe's specifications [1]?
Habi


[1]: http://www.adobe.com/devnet-apps/photoshop/fileformatashtml/

Bruno Postle

Feb 4, 2011, 4:53:18 PM
to Hugin ptx
On Fri 04-Feb-2011 at 16:36 +0100, David Haberthür wrote:
> On 01.02.2011, at 15:59, Jeffrey Martin wrote:
>
> > i agree about raw input / dng output - it's no business of
> > hugin.
> >
> > what about PSB output?
>
> In the end I also agree about the RAW input, it's best left to
> other software.

There would only be an advantage with being able to open RAW files
if Hugin also outputted a RAW file (e.g. DNG). With this Hugin
wouldn't need any of the GUI complication of a RAW converter, and
users could do all the photo 'developing' after stitching instead of
before.

An alternative workflow that was suggested by one of the rawstudio
developers would be to convert the RAW shots to 16bit Prophoto RGB
TIFF using rawstudio (without any adjustments), stitch with Hugin to
16bit TIFF, and then load the result into rawstudio to make
adjustments and finish it off - I'm not sure how near we are to
being able to do this.

> Why would PSB (a quick google detour hinted it's PhotoShop) be
> desired? Is a layered TIFF not something you can import into
> PhotoShop?
> Do you have any idea if there are (free/libre) libraries that can

> write according to Adobe's specifications?
> http://www.adobe.com/devnet-apps/photoshop/fileformatashtml/

This is an interesting development; previously Adobe only let you
see the PSD/PSB spec if you faxed them promising never to reveal any
details (i.e. you were not allowed to use it in Open Source
software).

Someone needs to start a free 'libpsb' before we can use it in
Hugin, but I guess this itself could be a SoC project.

--
Bruno

Jim Watters

Feb 4, 2011, 9:13:13 PM
to hugi...@googlegroups.com
On 2011-02-04 5:53 PM, Bruno Postle wrote:

> On Fri 04-Feb-2011 at 16:36 +0100, David Haberthür wrote:
>
>> Why would PSB (a quick google detour hinted it's PhotoShop) be desired? Is a
>> layered TIFF not something you can import into PhotoShop?
I believe Photoshop does not recognize cropped tiff images. So every layer
would be full width and height. An import filter would be one way to fix this
bug in Photoshop (if it still exists).

>> Do you have any idea if there are (free/libre) libraries that can write
>> according to Adobe's specifications?
>> http://www.adobe.com/devnet-apps/photoshop/fileformatashtml/
>
> This is an interesting development; previously Adobe only let you see the
> PSD/PSB spec if you faxed them promising never to reveal any details (i.e. you
> were not allowed to use it in Open Source software).
>
> Someone needs to start a free 'libpsb' before we can use it in Hugin, but I
> guess this itself could be a SoC project.
>

I found these two projects, but they are not too active.
https://sourceforge.net/projects/libpsd/
https://sourceforge.net/projects/openpsd/

Our big use of the PSB format would be saving layered raster files. This is
a small subset of the functionality of the PSB file format.

Looking at the document, there are very few locations where the structure or
layout for PSB differs from PSD. In all these locations the number of
bytes used to define a width or length has doubled, usually from 4 to 8 bytes.


--
Jim Watters
http://photocreations.ca

Jeffrey Martin

Feb 7, 2011, 3:20:19 PM
to hugi...@googlegroups.com
https://blueprints.launchpad.net/hugin/+spec/improve-seam-optimization

but then I saw you made one also. Can you please delete mine (or yours) ?


Jeffrey Martin

Feb 7, 2011, 3:21:12 PM
to hugi...@googlegroups.com
PSB because that is the only sane way to make large gigapixels. I have made numerous images larger than 4GB and the only real way to edit them is if they are in PSB format.

David Haberthür

Feb 7, 2011, 4:03:22 PM
to hugi...@googlegroups.com

On 07.02.2011, at 21:20, Jeffrey Martin wrote:

> https://blueprints.launchpad.net/hugin/+spec/improve-seam-optimization
>
> but then I saw you made one also. Can you please delete mine (or yours) ?

I've marked my entry superseded by yours, since you've added more information. Just out of curiosity: couldn't you have done it yourself, or do I have more "power" than you?
Habi

Jeffrey Martin

Feb 7, 2011, 4:59:48 PM
to hugi...@googlegroups.com
I do not see a delete button there, only these:

Change details
Mark superseded
Re-target blueprint

Subscribe
Subscribe someone else

Anyway, thanks!

Bruno Postle

Feb 7, 2011, 5:09:51 PM
to Hugin ptx
On Mon 07-Feb-2011 at 12:21 -0800, Jeffrey Martin wrote:
> PSB because that is the only sane way to make large gigapixels. I
> have made numerous images larger than 4GB and the only real way to
> edit them is if they are in PSB format.

This is a problem if the only tool that can edit PSB files is
Photoshop. We would at least need to see a plan for adding PSB
support to GIMP.

--
Bruno

Yuval Levy

Feb 7, 2011, 5:34:29 PM
to hugi...@googlegroups.com

I don't see this as a problem. It's chicken and egg. If a proper library can
be implemented, it is only a matter of time until the new output format
supported by Hugin is also supported by other Libre tools.

On a related topic (large images): any interest to continue the VIPS project
idea from a few years ago? maybe using GEGL? that would complement the new
file format nicely.

Yuv


Jeffrey Martin

Feb 8, 2011, 4:59:11 AM
to hugi...@googlegroups.com

Sorry but there is always some kind of problem further down the line.

Maybe GIMP will support PSB files eventually.

Until then, can we nuzzle satan a bit and just have PSB output which currently requires photoshop to open/edit ? :-)


David Haberthür

Feb 8, 2011, 8:13:34 AM
to hugi...@googlegroups.com
Ciao Jeff

On Mon, Feb 7, 2011 at 22:59, Jeffrey Martin <360c...@gmail.com> wrote:
> I do not see a delete button there, only these:
>
> Change details
> Mark superseded
> Re-target blueprint
>
> Subscribe
> Subscribe someone else
>
> anyway, thanks!

I've marked my blueprint superseded by yours, so it's still there,
just superseded :)
Habi

David Haberthür

Feb 8, 2011, 8:16:55 AM
to hugi...@googlegroups.com
> On a related topic (large images):  any interest to continue the VIPS project
> idea from a few years ago?  maybe using GEGL?  that would complement the new
> file format nicely.

I've added some bits and pieces of the old GSOC-stuff to
https://blueprints.launchpad.net/hugin/+spec/processing-of-very-large-images,
maybe someone should add/update details regarding GEGL.
Habi

David Haberthür

Feb 8, 2011, 8:27:49 AM
to hugi...@googlegroups.com
Hey Bruno

It's there as a blueprint; if it scratches someone's itch to have these
files, then he/she can implement it.
I am not too familiar with GIMP, but can it not open arbitrarily big
(TIFF) files if the RAM is large enough? So supporting PSB files
would only be an advantage to Photoshop users and no disadvantage to
GIMP users.
Habi

Jeffrey Martin

Feb 8, 2011, 1:46:33 PM
to hugi...@googlegroups.com
i'm not aware of any program that can generate a tiff larger than 4gb which can then be opened by any other program.

not to be snarky, but what is wrong with supporting photoshop users? it is a standard. and gimp should have PSB support.

honestly, try making large gigapixels. if you're stuck with making tiff files and gimp, you are not going to get very far.

Bruno Postle

Feb 8, 2011, 4:27:03 PM
to Hugin ptx
On Tue 08-Feb-2011 at 10:46 -0800, Jeffrey Martin wrote:
>i'm not aware of any program that can generate a tiff larger than 4gb which
>can then be opened by any other program.

I understand that if you use a libtiff with 'bigtiff' support this
just works.
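
For example, with the libtiff 4.x tools something like this should rewrite a
classic TIFF as a BigTIFF (if I remember the -8 switch of tiffcp correctly;
the file names are just placeholders):

tiffcp -8 stitched.tif stitched-big.tif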

>not to be snarky, but what is wrong with supporting photoshop users?

You are welcome to implement PSB, and if it is done well it will
probably be merged immediately. Though you should notice that it is
enblend/enfuse that produces the final image, not Hugin.

>it is a standard. and gimp should have PSB support.

PSB isn't a 'standard', until a few months ago adobe were actively
trying to prevent tools like GIMP from supporting PSB.

>honestly, try making large gigapixels. if you're stuck with making tiff
>files and gimp, you are not going to get very far.

Hugin is open source software, it exists within an ecosystem of
other open source projects that benefit by sharing things like image
libraries.

--
Bruno

Roger Howard

Feb 8, 2011, 4:51:22 PM
to hugi...@googlegroups.com

On Feb 8, 2011, at 1:27 PM, Bruno Postle wrote:

> On Tue 08-Feb-2011 at 10:46 -0800, Jeffrey Martin wrote:
>> i'm not aware of any program that can generate a tiff larger than 4gb which
>> can then be opened by any other program.
>
> I understand that if you use a libtiff with 'bigtiff' support this just works.

You're both right... adding support is easy enough, but few of the apps many of us use in imaging workflows have added it. Note that the official release of libtiff still doesn't have BigTIFF support - you must use the experimental 4.x build to get it.


>
>> not to be snarky, but what is wrong with supporting photoshop users?
>
> You are welcome to implement PSB, and if it is done well it will probably be merged immediately. Though you should notice that it is enblend/enfuse that produces the final image, not Hugin.
>
>> it is a standard. and gimp should have PSB support.
>
> PSB isn't a 'standard', until a few months ago adobe were actively trying to prevent tools like GIMP from supporting PSB.

Has Adobe ever gone after PTGUI? I've never heard that Adobe was trying actively to prevent anyone from using PSB; they have created hurdles to *fully* supporting PSB (and PSD) outside of their own circle, but I've never heard of them going after an independent implementation of at least basic PSD and PSB files (by basic I mean bitmap data, not proprietary layer types).


>
>> honestly, try making large gigapixels. if you're stuck with making tiff
>> files and gimp, you are not going to get very far.
>
> Hugin is open source software, it exists within an ecosystem of other open source projects that benefit by sharing things like image libraries.

Speaking with two different voices:

As a user, I just care about what works, and currently PSB support is essential for seamless large-file support. I use Photoshop, which does not have BigTIFF support, and there's no solid indication yet that it ever will. Chris Cox has been involved with BigTIFF since the start, but he's been unable to promise any official support. For me, PSB support in all my tools is the only pragmatic solution right now.

As a professional in the imaging and digital asset management world, I'd love to see bigtiff support bloom; I have a background in cultural heritage and TIFF is a highly regarded format for preservation and exchanging image (and other) data between platforms/applications.

But my pragmatic hat wins, right now - I want to generate a huge image file from my stitcher and bring it into Photoshop, and right now the only option is PSB.

Won't someone think of the users? :)

Roger Howard

Feb 8, 2011, 5:01:16 PM
to hugi...@googlegroups.com

FWIW...

I believe you can build both IM and GIMP with bigtiff support by linking against the libtiff 4.x

So it's possible to:

Generate huge PSB (from Hugin or whatever)
Convert PSB to BigTIFF using IM (IM has had PSB support since March 2010)
Use BigTIFF in GIMP

Of course, you could in theory go the other way - generate BigTIFF from Hugin
Convert BigTIFF to PSB using ImageMagick
Use PSB in Photoshop

Note, I don't have a build of GIMP or ImageMagick linked against libtiff 4.x, so this is just speculation.
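
In command terms the round trip would be something like this (again untested
and the file names are made up; TIFF64: is the prefix that should select
BigTIFF output in an ImageMagick built against libtiff 4.x, and I'm assuming
IM's PSB support covers both reading and writing):

convert pano.psb TIFF64:pano.tif    # PSB -> BigTIFF
convert pano.tif pano.psb           # BigTIFF -> PSB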

-R


Bruno Postle

Feb 8, 2011, 5:18:10 PM
to Hugin ptx
On Tue 08-Feb-2011 at 13:51 -0800, Roger Howard wrote:
>On Feb 8, 2011, at 1:27 PM, Bruno Postle wrote:
> > PSB isn't a 'standard', until a few months ago adobe were
> > actively trying to prevent tools like GIMP from supporting PSB.
>
> Has Adobe ever gone after PTGUI? I've never heard that Adobe was
> trying actively to prevent anyone from using PSB; they have
> created hurdles to *fully* supporting PSB (and PSD) outside of
> their own circle, but I've never heard of them going after an
> independent implementation of at least basic PSD and PSB files (by
> basic I mean bitmap data, not proprietary layer types).

Yes, the PSB spec was always available for software like ptgui, but
to get hold of it you had to sign an NDA effectively promising not
to implement it in open source software. This is the sole reason
why GIMP (and Hugin etc...) don't have PSB support.

>Won't someone think of the users? :)

Nobody ever said Hugin shouldn't support PSB, I'm a user and I think
PSB support would be great if GIMP supported it too.

--
Bruno

Jeffrey Martin

Feb 8, 2011, 5:26:19 PM
to hugi...@googlegroups.com
Ok, thanks Roger for your ideas.

The fact is that today, there is no longer this nastiness around implementing PSB support - right?

Bruno, I didn't mean to say that PSB is a standard, I meant to say that Photoshop itself is a standard (like it or not)

And hey, at least they have a large file format that works, and is used elsewhere (in PTGui and Autopano Giga, for example)

Anyway, if there will be no PSB output in Hugin, I think it would be a shame. As I said before, creating huge files without it just isn't possible. 

Roger Howard

Feb 8, 2011, 5:37:06 PM
to hugi...@googlegroups.com

On Feb 8, 2011, at 2:18 PM, Bruno Postle wrote:

> On Tue 08-Feb-2011 at 13:51 -0800, Roger Howard wrote:
>> On Feb 8, 2011, at 1:27 PM, Bruno Postle wrote:
>> > PSB isn't a 'standard', until a few months ago adobe were actively trying to prevent tools like GIMP from supporting PSB.
>>
>> Has Adobe ever gone after PTGUI? I've never heard that Adobe was trying actively to prevent anyone from using PSB; they have created hurdles to *fully* supporting PSB (and PSD) outside of their own circle, but I've never heard of them going after an independent implementation of at least basic PSD and PSB files (by basic I mean bitmap data, not proprietary layer types).
>
> Yes, the PSB spec was always available for software like ptgui, but to get hold of it you had to sign an NDA effectively promising not to implement it in open source software. This is the sole reason why GIMP (and Hugin etc...) don't have PSB support.

I was under the impression - maybe just guessing - that PSB support was added without the official Adobe specification. I know ImageMagick added it without - there's no reason IM, Hugin, or GIMP can't add support through a cleanroom implementation - Adobe only controls their spec documents, there's nothing illegal about implementing it without the NDA, you just won't get the docs.

>
>> Won't someone think of the users? :)
>
> Nobody ever said Hugin shouldn't support PSB, I'm a user and I think PSB support would be great if GIMP supported it too.

I think it'd be great whether or not GIMP supports it; I use Photoshop.
Sorry for the snark, though - that last comment was meant in jest, mostly.

Bruno Postle

Feb 8, 2011, 5:37:27 PM
to Hugin ptx
On Tue 08-Feb-2011 at 14:01 -0800, Roger Howard wrote:
>
> I believe you can build both IM and GIMP with bigtiff support by
> linking against the libtiff 4.x
>
> So it's possible to:
>
> Generate huge PSB (from Hugin or whatever)
> Convert PSB to BigTIFF using IM (IM has had PSB support since
> March 2010)
> Use BigTIFF in GIMP
>
> Of course, you could in theory go the other way - generate BigTIFF
> from Hugin
> Convert BigTIFF to PSB using ImageMagick
> Use PSB in Photoshop
>
> Note, I don't have a build of GIMP or ImageMagick linked against
> libtiff 4.x, so this is just speculation.

It would be great to have some definitive info on all the options.

For ages we have supported VIFF, which doesn't have any size
limitations, but I'm not aware that anyone has ever tried it.

--
Bruno

Yuval Levy

Feb 8, 2011, 5:58:06 PM
to hugi...@googlegroups.com
On February 8, 2011 05:37:06 pm Roger Howard wrote:
> I was under the impression - maybe just guessing - that PSB support was
> added without the official Adobe specification. I know ImageMagick added
> it without - there's no reason IM, Hugin, or GIMP can't add support
> through a cleanroom implementation - Adobe only controls their spec
> documents, there's nothing illegal about implementing it without the NDA,
> you just won't get the docs.

Whether by reverse engineering (is that what you mean with cleanroom?) or by
following the specs to the letter, one important ingredient is lacking:
motivation.

The experience of four years Summer of Code (and of decades of Open Source
development in general) demonstrated clearly that the most successful projects
are those driven by the needs and wishes of the developer/student himself; not
by the wishes of the user community or any other third party.

Of course some developers are motivated by money and anybody is welcome to
hire such a developer and contribute the result.


> > Nobody ever said Hugin shouldn't support PSB, I'm a user and I think PSB
> > support would be great if GIMP supported it too.
>
> I think it'd be great whether or not GIMP supports it

I would like to see PSB support in Hugin, and I don't think it should be made
dependent on the support in any other tool.

That said, I don't need PSB support as much as I need zooming in the preview
window, automatic detection of vertical lines, and improved blending / seam
placement (just to name three). Others may have different opinion, and
allocate their resources accordingly.

Yuv


Rogier Wolff

Feb 8, 2011, 7:15:43 PM
to hugi...@googlegroups.com

On Tue, Feb 08, 2011 at 05:58:06PM -0500, Yuval Levy wrote:
> On February 8, 2011 05:37:06 pm Roger Howard wrote:
> > I was under the impression - maybe just guessing - that PSB support was
> > added without the official Adobe specification. I know ImageMagick added
> > it without - there's no reason IM, Hugin, or GIMP can't add support
> > through a cleanroom implementation - Adobe only controls their spec
> > documents, there's nothing illegal about implementing it without the NDA,
> > you just won't get the docs.

> Whether by reverse engineering (is that what you mean with
> cleanroom?) or by following the specs to the letter, one important
> ingredient is lacking: motivation.

Official cleanroom protocol is that one set of engineers looks at the
files and draws up a specification. Another set of engineers gets only
that specification and implements it.

In practice, especially in open source, there will be one engineer who
looks at the files to be imported/exported and tries to mimic
them. It helps a lot if he has software available to generate test
cases (for import) or to test-load the generated output files (for the
export function).

Now, when you are in certain jurisdictions, the end user license
agreement may be valid when it says: "You have the right to have this
program on your computer as long as you agree not to reverse engineer
any part of it." Without looking I'm confident that Adobe didn't
forget that clause.

In the EU, such clauses are forbidden by law. Every EU citizen is
allowed to reverse engineer or de-compile as long as it is to make a
product that provides interoperability. (reverse engineering for
example the smoothing algorithm to make the gimp smooth function as
nice as in photoshop is still illegal (as in, the EULA can legally ask
you to refrain from such activity). There is no interoperability
involved here!)

Roger.

--
** R.E....@BitWizard.nl ** http://www.BitWizard.nl/ ** +31-15-2600998 **
** Delftechpark 26 2628 XH Delft, The Netherlands. KVK: 27239233 **
*-- BitWizard writes Linux device drivers for any device you may have! --*
Q: It doesn't work. A: Look buddy, doesn't work is an ambiguous statement.
Does it sit on the couch all day? Is it unemployed? Please be specific!
Define 'it' and what it isn't doing. --------- Adapted from lxrbot FAQ

Roger Howard

Feb 8, 2011, 9:12:10 PM
to hugi...@googlegroups.com

On Feb 8, 2011, at 2:58 PM, Yuval Levy wrote:

> On February 8, 2011 05:37:06 pm Roger Howard wrote:
>> I was under the impression - maybe just guessing - that PSB support was
>> added without the official Adobe specification. I know ImageMagick added
>> it without - there's no reason IM, Hugin, or GIMP can't add support
>> through a cleanroom implementation - Adobe only controls their spec
>> documents, there's nothing illegal about implementing it without the NDA,
>> you just won't get the docs.
>
> Whether by reverse engineering (is that what you mean with cleanroom?) or by
> following the specs to the letter, one important ingredient is lacking:
> motivation.

I'm not even sure it would require reverse engineering, as that work has already been done in ImageMagick, for instance.

As for motivation, that's why I'm commenting. If no one could be swayed by words on mailing lists, we wouldn't have much use for them :)

> The experience of four years Summer of Code (and of decades of Open Source
> development in general) demonstrated clearly that the most successful projects
> are those driven by the needs and wishes of the developer/student himself; not
> by the wishes of the user community or any other third party.

And I do believe that as the panoramic world continues to push the limits in terms of resolution, while file format support may not seem entirely sexy, it is something bound to affect many people at some point, and perhaps eventually get under the skin of a Hugin dev.

Personally, I have no stake in PSB; since there is at least a potential workflow involving BigTIFF > ImageMagick > PSB, then perhaps just using the 4.x branch of libtiff is enough to call it a day. I vastly prefer TIFF in general. I just think a panoramic package is incomplete these days without a viable huge-file workflow.

> Of course some developers are motivated by money and anybody is welcome to
> hire such a developer and contribute the result.

Totally. I'm not coming in insisting it be done, only speaking up for the idea. I know Hugin is driven by the motivations of the few contributors; I'm not one, so I'm only voicing an opinion, not daring to demand.

>
>
>>> Nobody ever said Hugin shouldn't support PSB, I'm a user and I think PSB
>>> support would be great if GIMP supported it too.
>>
>> I think it'd be great whether or not GIMP supports it
>
> I would like to see PSB support in Hugin, and I don't think it should be made
> dependent on the support in any other tool.
>
> That said, I don't need PSB support as much as I need zooming in the preview
> window, automatic detection of vertical lines, and improved blending / seam
> placement (just to name three). Others may have different opinion, and
> allocate their resources accordingly.

Exactly.

My only intent for stepping into this conversation was to voice why I believe the support matters, and why I think it's attainable. If that gets the interest or changes the focus of a contributor, then great. Otherwise, Hugin will go on. I've lurked on the list for several years and I've seen it made abundantly clear that Hugin will be what it is, not what others want it to be. I've seen the pushback before, and since, no, I'm not willing to offer funding, and no I'm not willing to commit my time as a developer, all I can offer is my perspective.

D M German

Feb 9, 2011, 2:16:07 AM
to hugi...@googlegroups.com

>> Whether by reverse engineering (is that what you mean with cleanroom?) or by
>> following the specs to the letter, one important ingredient is lacking:
>> motivation.

Roger> I'm not even sure it would require reverse engineering, as that work has already been done in ImageMagick, for instance.

Are all of you aware of PTtiff2psd? It is not perfect, but does the
job for 8-bit images. It is in desperate need of a facelift.

--dmg


--
--
Daniel M. German
http://turingmachine.org/
http://silvernegative.com/
dmg (at) uvic (dot) ca
replace (at) with @ and (dot) with .

Lukáš Jirkovský

Feb 9, 2011, 5:34:25 AM
to hugi...@googlegroups.com
> On a related topic (large images):  any interest to continue the VIPS project
> idea from a few years ago?  maybe using GEGL?  that would complement the new
> file format nicely.
>
> Yuv
>

I think I should chime in with some findings about GEGL and why I
think it's not a good idea to use it. We have been examining GEGL (and
VIPS too) at university for your project. I wasn't the one in charge
of doing it, but I talked with the guy who was (side note: I had a
shallow look at GEGL myself, too).

GEGL has a great interface for use within an image editing application.
But it's not very suitable for use in Hugin. First, it's written in C.
Given the fact that Hugin is written in C++, it would be necessary to
write some wrapper around GEGL. Second, it's aimed at applications
like GIMP. The operations in GEGL are stored in a graph. This is good
when you are implementing operations which allow you to work with
multiple layers and channels, e.g. when you want your software to allow
taking one channel as a mask, blurring it, and concurrently taking the
input image, doing some operations on it, and later applying the mask.
But Hugin doesn't need such an interface. The worst thing is that the guy
who played with GEGL said that the caching doesn't work very well and
sometimes it can take a lot of memory.

I don't know much about VIPS, but it seems to be in better shape
and to provide a nice C++ interface.

Lukas

Jim Watters

Feb 9, 2011, 6:47:18 AM
to hugi...@googlegroups.com
On 2011-02-04 10:13 PM, Jim Watters wrote:

> On Fri 04-Feb-2011 at 16:36 +0100, David Haberthür wrote:
>> Do you have any idea if there are (free/libre) libraries that can write
>> according to Adobe's specifications?
>> http://www.adobe.com/devnet-apps/photoshop/fileformatashtml/
> Looking at the document, there are very few locations where the structure or
> layout for PSB differs from PSD. In all these locations the number of
> bytes used to define a width or length has doubled, usually from 4 to 8 bytes.

I am about halfway through implementing PSB in libpano. I am making all the PSD
functions dual-purpose: they can either be forced to write PSB via a flag, or do so
automatically if either width or height is greater than 30000 pixels.

Now that panotools has switched to Mercurial, can someone give a quick
overview of Mercurial using the TortoiseHg GUI?

Lukáš Jirkovský

Feb 9, 2011, 7:00:45 AM
to hugi...@googlegroups.com
> If there was one thing that I would improve process-wise this year, it would
> be using blueprints [0] on Launchpad to track/develop ideas and student
> proposals.

I wanted to add my idea about defining straight lines (from the
beginning of December - now I realize how long ago it was) to the
blueprints. Now I want to add it, but there's something that confuses
me. I thought blueprints are something like a very simple wiki, but
from the descriptions shown when registering a new blueprint it
seems to me that it's only supposed to be a list of proposed ideas
with a short description of each. Should I add the complete proposal to the
blueprint summary no matter what is in its description, or should I
rather create a page in the panotools wiki with a more detailed
description and link to it?

Lukas

Yuval Levy

Feb 9, 2011, 7:51:44 AM
to hugi...@googlegroups.com
On February 9, 2011 07:00:45 am Lukáš Jirkovský wrote:
> from the descriptions shown when registering a new blueprint it
> seems to me that it's only supposed to be a list of proposed ideas
> with a short description of each.

Yes. The entry there is a summary of the use case, relatively immutable. Not
the place to work on the specs.

The field "specs URL" points to a wiki page where others can collaborate with
you on the specs.


> Should I add the complete proposal to the
> blueprint summary no matter what is in its description, or should I
> rather create a page in the panotools wiki with a more detailed
> description and link to it?

I would suggest you articulate the (immutable) idea in the Summary, leaving
the details for a linked wiki page where others can add their ideas to yours
in the detailed specs.

When the specs have achieved a milestone update the Whiteboard of the
Blueprint (not visible on registration, but visible and editable on the
blueprint's page itself). See an example at [0].

If I understand you correctly, you already have an initial spec. Very good.
Put it on the Whiteboard, so that it shows that the Blueprint is at a more
advanced stage than the simple idea.

The wiki is for drafting, the Whiteboard is for approving. You will approve
into your Whiteboard specs only those suggestions from the wiki that make
sense to you.

Here are some links to excellent specs [1]. All in wikis, but if you write
yours as a set of screen mockups in Inkscape or present them as a YouTube
video, feel free to link that instead.

And here is a good text on how to write good specs [2].

HTH
Yuv

[0] https://blueprints.launchpad.net/ubuntu/+spec/desktop-maverick-gnome
[1] https://help.launchpad.net/GreatSpecifications
[2] https://help.launchpad.net/WritingGoodSpecifications


Yuval Levy

Feb 9, 2011, 7:54:41 AM
to hugi...@googlegroups.com
On February 9, 2011 05:34:25 am Lukáš Jirkovský wrote:
> I think I should chime in with some findings about GEGL and why I
> think it's not a good idea to use it.

can you record these findings somewhere where they don't get lost (wiki,
blueprint, etc.)?

thanks for sharing
Yuv


Yuval Levy

Feb 9, 2011, 8:02:54 AM
to hugi...@googlegroups.com
On February 9, 2011 06:47:18 am Jim Watters wrote:
> Now that panotools has switched to Mercurial, can someone give a quick
> overview of Mercurial using the TortoiseHg GUI?

there is some applicable information at [0]

the first main difference, coming from subversion, is that your commits and
checkouts are local. You need to explicitly pull/push from/to the central
server.

Before starting your work, do a "pull". Not doing so may create a new head
(if the development on the central repo has moved further) and require you to
perform an extra merge operation (usually very easy, but it is preferable not
to get there).

When you work on your own, it is "commit" just like in SVN, with the
difference that these commits are not yet shared with the world.

When you are ready to share, use the "push" command.

The next main difference is that branches do not live in separate folders.
The "update" command takes the branch name as an argument and you can switch
from the "default" branch (trunk in SVN speak) to any other branch and back
within the same folder. The operations are all local and very fast (disk
access is the limiting factor).
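
In command-line terms (TortoiseHg essentially wraps these), a typical session
looks roughly like this ("somebranch" is just a placeholder):

hg pull -u             # fetch new changesets and update the working copy
hg commit -m "change"  # local commit, not yet visible to anyone else
hg push                # publish your local commits to the central repository
hg update somebranch   # switch to another branch within the same folder
hg update default      # and back to default (trunk in SVN speak)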

If you have other questions about how to perform specific operations, just
ask.

Yuv

[0] http://wiki.panotools.org/Hugin_translation_guide#The_first_time


Yuval Levy

Feb 9, 2011, 8:15:30 AM
to hugi...@googlegroups.com
On February 8, 2011 09:12:10 pm Roger Howard wrote:
> I've lurked on the list for several years and I've seen
> it made abundantly clear that Hugin will be what it is, not what others
> want it to be.

actually: Hugin is to you what you make it for yourself. Nobody will ever
prevent you from going after your dreams and modifying it to suit your needs.
The only string attached is that if you do distribute it, you must make your
modifications available in source code form.

The codebase is as free of constraints as possible: technical, legal, human,
and metaphysical.

We are still bound to SourceForge hosting for the central code repository; and
to Launchpad hosting for the central "todo list"; but the binding is soft and
can be easily undone.

You can set up an instance of the full repository on almost any webspace. The
set of requirements is very low.

Last month I had to implement an Hg repo for a client. When SourceForge came
under attack, I also implemented one on my local web space, just to have a
workable plan in case the SourceForge outage became serious. It's plain
easy. I'll refine it and share documentation at some point (it's not that
urgent now, other than to give Kay access without signing up to the draconian
T&Cs that he does not agree with).


> I've seen the pushback before

The pushback is against demands. Keep offering your perspective; it is
welcome, and as long as there are no expectations there won't be any
disappointments, and sometimes maybe even positive surprises.

Yuv

signature.asc

Lukáš Jirkovský

unread,
Feb 9, 2011, 9:16:02 AM2/9/11
to hugi...@googlegroups.com

Hi Yuv,
Thank you for giving me insight into how blueprints are used. For now
I've registered a blueprint [1] and created a wiki page [2] with the complete
proposal. The proposal is open for suggestions; I have already included some
from the hugin-ptx thread referenced on the wiki page.

If nobody objects, I'd like to add a link to it on
http://wiki.panotools.org/GSOC_2011

Lukas

[1] https://blueprints.launchpad.net/hugin/+spec/straight-line-ui
[2] http://wiki.panotools.org/Straight_Line_UI_proposal

Jim Watters

unread,
Feb 10, 2011, 2:32:56 PM2/10/11
to hugi...@googlegroups.com
On 2011-02-09 9:02 AM, Yuval Levy wrote:
> On February 9, 2011 06:47:18 am Jim Watters wrote:
>> Now that panotools has switched to Mercurial,
>> can someone give a quick overview of Mercurial using the TortoiseHg GUI?
> there is some applicable information at [0]
Thank you.

> The next main difference is that branches do not live in separate folders.
> The "update" command takes the branch name as an argument, and you can switch
> from the "default" branch (trunk in SVN speak) to any other branch and back
> within the same folder. The operations are all local and very fast (disk
> access is the limiting factor).

I am looking at creating my own branch to add my changes to.
After not finding anything in TortoiseHg to do this, I went to the command prompt.
It looks like the default branch is optimizeroptions instead of libpano.
Is there a tool for TortoiseHg like the Repository Browser for TortoiseSVN?
The Repository Explorer allows me to see the changes on a particular branch but
cannot create one.

... ok, I found the menu item "Update...", which allows switching between branches
and shows the current branch. Every time I switch to a branch, do I need to do a
Synchronize|Pull?

Is there a way in TortoiseHg to create a new branch?

> Yuv
>
> [0] http://wiki.panotools.org/Hugin_translation_guide#The_first_time

Yuval Levy

unread,
Feb 11, 2011, 8:29:49 AM2/11/11
to hugi...@googlegroups.com
On February 10, 2011 02:32:56 pm Jim Watters wrote:
> ... ok, I found the menu item "Update...", which allows switching between
> branches and shows the current branch. Every time I switch to a branch, do I
> need to do a Synchronize|Pull?

no, just an update. I am sorry I am guiding you blindly through this, I don't
have TortoiseHg installed.

On the command line, to list all branches:

hg branches

to switch to a specific branch:

hg up -C BRANCH_NAME

To understand the Synchronize|Pull functionality: pushing/pulling is a
separate step from checking in/out. This is what most people coming from SVN
stumble over.

When you check in/out you are performing local operations on your local repo.

When you pull, you synchronize changes from the central repo to your repo.

When you push, you publish your changes to the central repo.
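
Put together, a rough sketch of a typical round trip (the commit message is
just an example):

hg pull                              # bring in changes from the central repo
hg update                            # update the working copy to the newest
                                     # revision of the current branch
hg commit -m "describe your change"  # record your edits locally
hg push                              # publish your commits to the central repo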


> Is there a way in TortoiseHg to create a new branch?

I hope some TortoiseHg user will step in. I recently bought a netbook with
Windows 7, but the only operation I have performed on the Windows side of
things so far was to resize its partition to make space for Kubuntu.

On the command line, it is

hg branch NAME_YOUR_NEW_BRANCH

At the first commit after that, the branch will be created; and at the first
push after that, the branch will become available on the central server.
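
For example, with a hypothetical branch name "my-feature" (recent Mercurial
versions may also require the --new-branch option on the first push):

hg branch my-feature                    # mark the working copy so the next
                                        # commit starts a new branch
hg commit -m "start my-feature branch"  # the first commit actually creates it
hg push --new-branch                    # publish the new branch to the server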

Yuv

signature.asc

Jim Watters

unread,
Feb 19, 2011, 9:39:08 AM2/19/11
to hugi...@googlegroups.com
On 2011-02-04 10:13 PM, Jim Watters wrote:
> On 2011-02-04 5:53 PM, Bruno Postle wrote:
>> On Fri 04-Feb-2011 at 16:36 +0100, David Haberthür wrote:
>>
>>> Why would PSB (a quick google detour hinted it's Photoshop) be desired? Is a
>>> layered TIFF not something you can import into Photoshop?
> I believe Photoshop does not recognize cropped TIFF images, so every layer
> would be full width and height. An import filter would be one way to fix this
> bug in Photoshop (if it still exists).

>
>>> Do you have any idea if there are (free/libre) libraries that can write
>>> according to Adobes Specifications?
>>> http://www.adobe.com/devnet-apps/photoshop/fileformatashtml/
>>
>> This is an interesting development, previously Adobe only let you see the
>> PSD/PSB spec if you faxed them promising never to reveal any details (i.e.
>> you were not allowed to use it in Open Source software).
>>
>> Someone needs to start a free 'libpsb' before we can use it in Hugin, but I
>> guess this itself could be a SoC project.
>>
> I found these two projects, but they are not too active.
> https://sourceforge.net/projects/libpsd/
> https://sourceforge.net/projects/openpsd/
>
> Our main use of the PSB format would be for saving layered raster files. This
> is a small subset of the functionality of the PSB file format.

>
> Looking at the document, there are very few locations where the structure or
> layout of PSB differs from PSD. In all these locations the number of bytes
> used to define a width or length has doubled, usually from 4 to 8 bytes.
Google just announced that Google Docs Viewer now supports Adobe Photoshop PSD
files.
http://googledocs.blogspot.com/2011/02/12-new-file-formats-in-google-docs.html
But now that I look, I cannot see that Google Docs is an open-source project. :(

dmg

unread,
Feb 19, 2011, 2:30:50 PM2/19/11
to hugi...@googlegroups.com, Yuval Levy
On Fri, Feb 11, 2011 at 5:29 AM, Yuval Levy <goo...@levy.ch> wrote:
>
>   hg up -C BRANCH_NAME
>

Do not use -C; use -c. If you use -C, it will blindly discard your local
changes, while -c will warn you if you have uncommitted ones.
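
For reference, the two flags side by side (BRANCH_NAME is a placeholder):

hg up -c BRANCH_NAME   # --check: refuse to switch if there are uncommitted changes
hg up -C BRANCH_NAME   # --clean: discard uncommitted changes, then switch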


--
--dmg

---
Daniel M. German
http://turingmachine.org

Gnome Nomad

unread,
Feb 19, 2011, 6:27:00 PM2/19/11
to hugi...@googlegroups.com

It isn't as far as I know. Check LibreOffice or OpenOffice. I have no
idea if they support PSD/PSB files, but LibreOffice is the newer release
and includes some features that OpenOffice doesn't include yet.

--
Gnome Nomad
gnome...@gmail.com
wandering the landscape of god
http://www.cafepress.com/otherend/

Yuval Levy

unread,
Feb 20, 2011, 1:46:19 PM2/20/11
to hugi...@googlegroups.com
On February 19, 2011 02:30:50 pm dmg wrote:
> On Fri, Feb 11, 2011 at 5:29 AM, Yuval Levy <goo...@levy.ch> wrote:
> > hg up -C BRANCH_NAME
>
> Do not do -C

Depends on the intended outcome. With -C you make sure that you are on a
clean and fresh checkout of the branch. That is paramount for preparing a
tarball for distribution or for building a predictable binary, and for
preventing mix-ups between branches.

The bottom line is that you want to know what you are doing. The -C option
protects the newbie from the consequences of a non-clean tree, while teaching
more advanced users like you (maybe the hard way) to better manage their
unversioned modifications inside the checkout.

I personally still prefer to work with different branches cloned / checked out
to different subdirectories. Disk space is cheap nowadays, and keeping each
branch clean in its own right prevents a lot of wasted time.
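
A sketch of that workflow; the repository URL, directory names, and branch
name below are all placeholders:

hg clone CENTRAL_REPO_URL hugin-default              # one clone on "default"
hg clone -u my-branch hugin-default hugin-my-branch  # local clone updated
                                                     # to "my-branch"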

Yuv

signature.asc