Another Photofly project, on a smaller scale. I rendered it out as a short movie.
Autodesk has created an online prototype that can build a rough 3D mesh of an object from photographs the user takes. Using a shared photo-processing network, the software takes about 10-15 minutes to stitch the pictures together into the final 3D model.
For more information and tutorials about photoFly: http://labs.autodesk.com/utilities/photo_scene_editor/
Below are a couple of pictures showing what the actual building looked like.
For the software to output a good 3D mesh, the more pictures you take, the better the quality of the model; preferably more than 50. If you can get full 360-degree coverage of the object, plus some low- and high-angle shots, you will get that Google Maps building look.
One thing to take note of: all pictures shot at the location should have the same exposure (which makes it easier for the software to stitch them together) and a high f-stop (for sharp detail throughout).
I manually renamed all of my photographs and saved them as .tif files before the software compiled them. The resulting low-poly mesh was quite good.
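The renaming step above can be scripted instead of done by hand. Here is a minimal sketch, assuming a single folder of source photos; the folder name, prefix, and naming scheme are my own illustrative choices, not anything Photofly requires. Note that renaming only changes the extension, so an actual format conversion to TIFF would need an image library such as Pillow.

```python
# Hypothetical sketch: batch-rename a folder of photos to a sequential
# naming scheme (building_001.tif, building_002.tif, ...) before upload.
# This does NOT convert the image data; it only renames the files.
from pathlib import Path

def rename_photos(folder, prefix="building", ext=".tif"):
    """Rename every file in `folder` sequentially, sorted by filename."""
    photos = sorted(p for p in Path(folder).iterdir() if p.is_file())
    renamed = []
    for i, photo in enumerate(photos, start=1):
        target = photo.with_name(f"{prefix}_{i:03d}{ext}")
        photo.rename(target)
        renamed.append(target.name)
    return renamed
```

A run over a folder containing `IMG_1.jpg` and `IMG_2.jpg` would leave behind `building_001.tif` and `building_002.tif`.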
After re-rendering with a higher-resolution mesh, the final render actually looks pretty decent as a whole. It makes me wonder how many interesting objects could be databased online, for schools, libraries, or even museums to catalogue specimens, anatomy, and so on. It is also useful for quick renders of concept film sets and animation sets.
Autodesk has published the software in its testing phase, which makes it cost-efficient compared to hacking a Kinect to use as a scanning device. Although it takes much longer to produce a 3D mesh than a Kinect does, it is easily accessible wherever you have Internet access.