It reminds me of the approach of a package I once used called autostitch.
There is a link at the following URL to download Photosynth.
http://www.cnn.com/SPECIALS/2009/44.president/inauguration/themoment/
w..
Well, it sure doesn't look "3D" to me, not at all. And, just to show you
that dust knows no bounds, take a look at the top left of the set in the
blue sky...
It isn't supposed to look 3-D. The point is to be able to move around
and view a scene from many perspectives. Perhaps D-Mac could use this
to finally finish his stepped-out linear pano?
It's hardly 3D. It's stitched-together panoramas. I wonder what
Microsoft PAID for this ad?
Hmmmm... It said 3D when I downloaded that crap. Whatever, it's not very
impressive, but does show possibility, as long as you clean the damned
sensor. That was ugly dust.
How's the new cam working out? I'm fixin' to order one in February.
OMG!! You mean it was product placement?!
Yeah, I wondered that, as well. Truly unimpressive, for now. We'll see where
it goes.
Ron Hunter wrote:
I really liked using autostitch. The CNN use of Photosynth used something like
11,000 images, many if not most from cell phones, using an autostitch-type
algorithm. Being able to walk through a scene and watch the perspective change
opens a lot of possibilities when a scene is scanned with multiple 5D2s, for
example. It starts to open the era of terabit images.
w..
Walter Banks wrote:
11,000 people with cell phones is one thing, but the same technology processing
good-quality images as a starting point starts to make sense. Look at this as
an early technology test.
The sheer volume of data in some of the potential uses is staggering. There has
been a scanning laser system used in multi-car accident investigations and
archaeological site scanning that has distance information but low resolution
and little color or surface information. The potential to extract this
information from high-resolution images opens a whole new way of viewing an
image.
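The crude version of pulling distance back out of ordinary overlapping
photos would look something like this. Again only a sketch, using
block-matching disparity in OpenCV; the focal length and baseline are
invented numbers, and real work would need calibrated, rectified cameras:

import cv2
import numpy as np

# hypothetical rectified left/right frames of the same scene
left = cv2.imread("left.jpg", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.jpg", cv2.IMREAD_GRAYSCALE)

# the block matcher returns disparity in 1/16-pixel fixed point
matcher = cv2.StereoBM_create(numDisparities=128, blockSize=15)
disparity = matcher.compute(left, right).astype(np.float32) / 16.0

focal_px = 3000.0    # assumed focal length in pixels
baseline_m = 0.5     # assumed camera separation in metres

# depth = focal * baseline / disparity, wherever disparity is valid
valid = disparity > 0
depth_m = np.zeros_like(disparity)
depth_m[valid] = focal_px * baseline_m / disparity[valid]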
Send Bret out with his FAB 5D2 and a spare hard disk drive and start snapping
images; stitched together, they would create a virtual image that could reach
a terapixel or more.
w..
What a poor suggestion. If he did it, then every photo would have the wrong
focus, exposure, and color shift. Garbage in, garbage out. They need
real data for a project like that, not the distorted trash that he records. A
terapixel image of blurs, wrong exposures, and shifted colors. Just what the
world needs.
NOT
"PierceJ." wrote:
> >Send Bret out with his FAB 5D2 and a spare hard disk drive and start snapping
> >images; stitched together, they would create a virtual image that could reach
> >a terapixel or more.
>
> What a poor suggestion. If he did it, then every photo would have the wrong
> focus, exposure, and color shift. Garbage in, garbage out. They need
> real data for a project like that, not the distorted trash that he records. A
> terapixel image of blurs, wrong exposures, and shifted colors. Just what the
> world needs.
Are you up to the task? I am in no position to judge based on the
samples I have of your work.
w..
"PierceJ." wrote:
> They need
> real data for a project like that, not the distorted trash that he records. A
> terapixel image of blurs, wrong exposures, and shifted colors. Just what the
> world needs.
They would be adequate for a technology test. :)
w..
I challenge you to produce a pic of mine with poor focus.
He's likely to define "focus" in such a way that holograms have "poor
focus".
--
--John
to email, dial "usenet" and validate
(was jclarke at eye bee em dot net)
>> What a poor suggestion. If he did it, then every photo would have the wrong
>> focus, exposure, and color shift.
>
> I challenge you to produce a pic of mine with poor focus.
>
I don't think anyone has ever taken a picture of you,
in or out of focus, other than yourself?