Dear Skanect Users,
We are happy and proud to announce Skanect 1.6.
This is a long-overdue release that includes many performance improvements, visual tweaks, and bug fixes, along with brand-new features, of which "Uplink" is probably the most important.
Uplink allows you to use the Structure Sensor wirelessly with Skanect.
You can learn more about Uplink by watching this video: http://youtu.be/RLQA6wTOYf8
And if you don't know about the Structure Sensor yet, you should really check it out! It's here: http://structure.io
Uplink is activated when you start Skanect on a computer that has no sensor plugged in. When this happens, your operating system may warn you that Skanect is acting as a server. This is normal: simply allow Skanect to access the local network, and you won't see the message again.
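If you would rather not wait for the firewall dialog, the same permission can usually be granted ahead of time from Terminal on Mac OS X. This is only a sketch, and the application path is an assumption (adjust it to wherever Skanect is installed on your machine); it uses the built-in application-firewall tool and requires administrator rights:

```shell
# Pre-approve Skanect in the OS X application firewall so the
# "accept incoming network connections?" prompt never appears.
# Assumed install path: /Applications/Skanect.app
sudo /usr/libexec/ApplicationFirewall/socketfilterfw --add /Applications/Skanect.app
sudo /usr/libexec/ApplicationFirewall/socketfilterfw --unblockapp /Applications/Skanect.app
```

On Windows, the equivalent prompt comes from Windows Firewall; allowing Skanect once for your current network has the same effect.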
Other New Features
Mac OS X Improvements
Last but not least, you can grab the new binaries here:
We hope that you will enjoy this new release as much as we enjoyed preparing it for you!
NT, for the Occipital team.
PS: The Windows 32-bit version is still due; we apologize for the delay, and will ship it next Monday.
A little feedback about my experience with 1.6:
On Mac it is magic: I was getting 1-2 fps (in 1.5 through 1.6 beta 3); in the final release I get 14-20 fps.
Same 2011 MacBook Air with an i7.
It scans a lot faster (now I can go mobile with the Kinect), but processing now takes twice as long as it used to.
On a PC with a quad-core CPU and a GeForce 630, I did not see any improvement in 1.6 compared to 1.5 or 1.6 beta 3.
I also tried loading data scanned on the Mac into the older version, but when opening it on Win7 x64 it shows just an empty box.
It attempts some offline reconstruction, but the same empty box remains.
Has the file format changed? Or some internal structure?
Adam from Occipital here. I’d like to take the opportunity to update you guys on future Structure Sensor updates, and try to answer a few questions from this thread.
We have a great application in the works that will calibrate the precise relationship between your Structure Sensor and iPad camera. We’re very excited to release it, and hope you will all find it worth the wait. We’re finishing it now, and we’ll post a preview on the Structure Forums within a few days, and then submit it to the App Store. Once you use this application, every single app built with the Structure SDK will have access to aligned depth and color.
Next, we’ll update the SDK’s Scanner sample to incorporate the new aligned color feature.
Right around the same time, we’ll update Uplink to use the aligned color stream, which will bring the iPad’s color stream right into Skanect.
Bob - Thanks for your feedback. It looks like you were not a Kickstarter backer, which means that we haven’t sent you a link to the Software Status Dashboard. We created this to help our earliest backers stay apprised of when new software becomes available - including depth and color alignment for the Scanner demo and Uplink/Skanect. Thanks to your feedback, we’re going to make sure our post-Kickstarter customers get sent a link to the dashboard, too. For now, here’s the dashboard: http://structure.io/software-dashboard.
Asimov_inc - As for the results you’ve received so far, can you please share some with us at structure <at> occipital <dot> com? Perhaps we can figure out what’s happening and help you get better results; often a quick tip is all it takes. Scanning software is always getting smarter, but it sometimes still takes a few tips to get the most out of it.
Mark - as James said, the itseez3D model texturing is completely real. The Itseez team has done a fantastic job with texturing. Since they’re doing a custom texturing post-process, they actually avoid the need for pre-calibration, which means you won’t even need to wait for the new calibration app to begin capturing.
To see a handful of models captured by others, check them out via Sketchfab: https://sketchfab.com/models?q=itseez3d
James - thanks very much for your kind comments. We really appreciate all of your support, and we are fans of all the great things you’ve already done with the Structure Sensor, and with Skanect before that.
For any deeper discussion, we’re always available to answer your questions at structure <at> occipital <dot> com.
Thank you for your considered response.
I was in fact an original Kickstarter backer - Silver, with the iPad Air mount and hacker cable.
I am still not clear from your response when we might be able to colour-scan meshes... Having a calibration tool is interesting; however, on its own it does not actually do anything useful.
I have tried itseez3D four times, but it just sits there saying "sending". Even if I could get it to work, it just does not do what I need it to do, and what I was led to believe the Structure Sensor and iPad Air would deliver.
I understand the requirement for it.
Personally I would like to start working with something that was less than perfect but was actually closer to a practical tool.
Thanks for the level response dude, I don't generally get all flamey.