Last Friday I attended the free GEM NI event ‘3D Printing and Scanning for the Museum and Heritage Sector’ at the Ulster Museum. The speaker was John Meneely of Queen’s University, who has extensive experience using 3D laser scanning technology in the cultural heritage sector. He’s currently working on his PhD monitoring the weathering of limestone in built and natural heritage using 3D scanning techniques, and teaches digital documentation in heritage science to Master’s students. You can follow him on Facebook and Instagram as 1manscan or Twitter @1manscan to keep up to date with his adventures in 3D scanning.
John talked us through the different ways of capturing 3D images, focusing on equipment that museums and heritage sites would already have access to, such as digital cameras and smartphones. The first method, structure from motion, can be as simple as a series of photographs taken from different angles around a museum object, which a free app can stitch together by finding pixels the photos have in common and building a 3D model. There are completely free, easy-to-use apps available, such as Autodesk 123D Catch. This 3D scan of an Easter Island figure shows the points around the object from which the images were taken to produce the final result.
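To get a feel for what’s going on under the hood, here’s a toy sketch (my own illustration in Python, not the code any of these apps actually use) of the key step: once the app knows where two photos were taken from and spots the same feature in both, it can triangulate the feature’s 3D position by finding where the two viewing rays nearly meet.

```python
import math

def triangulate(c1, d1, c2, d2):
    """Closest point to two camera rays.

    c1, c2 are camera positions; d1, d2 are unit directions from each
    camera toward the same feature. Returns the midpoint of the
    shortest segment between the rays: the triangulated 3D point.
    """
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    w = tuple(x - y for x, y in zip(c1, c2))
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w), dot(d2, w)
    denom = a * c - b * b  # near zero when the rays are parallel
    s = (b * e - c * d) / denom  # parameter along ray 1
    t = (a * e - b * d) / denom  # parameter along ray 2
    p1 = tuple(ci + s * di for ci, di in zip(c1, d1))
    p2 = tuple(ci + t * di for ci, di in zip(c2, d2))
    return tuple((x + y) / 2 for x, y in zip(p1, p2))

# Toy check: a feature at (0, 0, 5) photographed from two positions
p = triangulate((0, 0, 0), (0, 0, 1),
                (4, 0, 0), tuple(x / math.sqrt(41) for x in (-4, 0, 5)))
```

A real structure-from-motion pipeline repeats this for thousands of matched features at once, while also solving for the unknown camera positions.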
I really liked the idea of crowdsourcing structure from motion: combining a selection of images taken by different users to create the final model. This would be a great opportunity for a museum to ask visitors to upload photographs of their favourite objects and exhibitions to produce a 3D map of the museum, and also to evaluate the most popular exhibits and the routes visitors take through the building. The photo below shows structured-light 3D scanning, recently used by the Smithsonian to create a 3D scan of Obama for his presidential portrait. You can read more about the creation of the 3D Obama on the Smithsonian website.
This type of scanning technology is becoming more popular and accessible, with the launch of structure sensors that clip onto iPhones and iPads, and Google’s Project Tango, a structured-light scanner built into a Google phone. To me this sounded like a great way for Google to have all those phone users upload their data and 3D-model entire cities and countries as the popularity of the technology spreads. Reflectance transformation imaging is carried out with a fixed camera on a tripod and a light that moves around an object, and can produce detailed surface images of paintings and artworks. As well as being used to determine the authenticity of artworks, it also reveals some of the process behind the final piece, showing how the paint is layered onto the canvas and highlighting changes made to the work.
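The idea of recovering surface detail from a fixed camera and a moving light can be illustrated with a little toy calculation (again my own sketch, using a simple Lambertian lighting model, not the real RTI software): if you know three light directions and how bright a point on the surface appeared under each, you can solve a small linear system for the surface normal at that point.

```python
def det3(m):
    """Determinant of a 3x3 matrix given as nested lists."""
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

def surface_normal(lights, brightness):
    """Solve lights @ n = brightness for the normal n (Cramer's rule).

    lights: three light-direction vectors; brightness: how bright the
    same surface point looked under each light, assuming a Lambertian
    model where brightness = normal . light.
    """
    d = det3(lights)
    n = []
    for col in range(3):
        m = [row[:] for row in lights]
        for r in range(3):
            m[r][col] = brightness[r]  # replace one column with brightness
        n.append(det3(m) / d)
    return tuple(n)

# Toy example: three lights, and the brightness a flat, upward-facing
# point (true normal (0, 0, 1)) would show under each of them
lights = [[1.0, 0.0, 0.5], [0.0, 1.0, 0.5], [0.5, 0.5, 1.0]]
normal = surface_normal(lights, [0.5, 0.5, 1.0])
```

Doing this for every pixel gives a normal map of the whole surface, which is what lets the technique show brush strokes and layering so vividly.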
John told us about the many uses of 3D scanning in the heritage and museum sector, from the reproduction of missing objects to the analysis of delicate ones. Once an object has been 3D scanned you can investigate measurements and details without having to handle the original object again, which makes conserving delicate artefacts easier. Laser scanning can create amazingly detailed scans, such as this image of Mount Stewart before the recent renovations started.
There is also this scan of the Marble Arch Caves, which gives a great sense of the depth and scale of the caves. The scan images also mean that people who are unable to visit the caves in person can still interact with the heritage and experience the site, albeit in a different way. The 3D scans can also be used by both natural and built heritage sites to monitor the weathering and degradation of buildings and structures, by comparing scans taken over a period of time.
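Comparing two scans of the same surface comes down to measuring how far each point in the newer scan sits from the older one. As a toy illustration (my own brute-force sketch; real tools use much faster nearest-neighbour searches over millions of points):

```python
import math

def mean_change(old_scan, new_scan):
    """Crude weathering estimate: for every point in the newer scan,
    find the distance to its nearest neighbour in the older scan,
    then average. Brute force, so only sensible for small clouds."""
    def nearest(p, cloud):
        return min(math.dist(p, q) for q in cloud)
    return sum(nearest(p, old_scan) for p in new_scan) / len(new_scan)

# Toy example: a flat stone face that has receded by 2 units
# between two surveys
before = [(x, y, 0.0) for x in range(5) for y in range(5)]
after = [(x, y, -2.0) for x in range(5) for y in range(5)]
change = mean_change(before, after)
```

Repeat the survey every year and the trend in that number tells you how quickly the stone is being lost.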
There are plenty of software options, such as MeshLab, that will convert your 3D scan into a printable 3D object. Why would I want to print it, I hear you ask? Well, who wouldn’t want their own miniature copy of their favourite museum object, heritage site, or local landmark? 3D printers build up your chosen shape either from thin layers of extruded plastic, with a laser that solidifies powder, or with UV light that cures a liquid polymer. Here’s an example of a wooden 3D printed chess piece, demonstrated by Kylie.
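The file that eventually goes to the printer is often just a long list of triangles. As a small illustration (my own sketch, writing the simple ASCII flavour of the common STL mesh format):

```python
def write_ascii_stl(path, triangles):
    """Write a list of triangles, each a tuple of three (x, y, z)
    vertices, as an ASCII STL file. Normals are written as 0 0 0;
    most slicing software recomputes them from the vertex order."""
    with open(path, "w") as f:
        f.write("solid scan\n")
        for tri in triangles:
            f.write("  facet normal 0 0 0\n    outer loop\n")
            for v in tri:
                f.write("      vertex %g %g %g\n" % v)
            f.write("    endloop\n  endfacet\n")
        f.write("endsolid scan\n")

# One triangle is enough to produce a file a slicer will open
write_ascii_stl("tri.stl", [((0, 0, 0), (1, 0, 0), (0, 1, 0))])
text = open("tri.stl").read()
```

A scanned museum object would have hundreds of thousands of these facets, but the format is exactly the same.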
You can even get a desktop RepRap plastic 3D printer that can print 90% of its own parts! For me, the most exciting part was that there is a website where you can share your 3D model files with others, and download and print a whole range of objects. I think it would be fun to choose a few objects to print and curate your own 3D printed exhibition of mini objects. John also mentioned an environmentally friendly option: a shredder that turns recyclable plastic bottles into filament for the printer.
Critics of 3D scanning technology and the digitisation of collections sometimes comment that making museum collections available in this way will lead to fewer people visiting museums in person. However, the opposite has been observed, with visitor numbers increasing as people want to see the objects in real life, as happened with the Horniman Museum’s walrus. It also opens up museums to people who are unable to visit in person due to disability, finances, or distance from the museum.
I found John’s talk really informative and interesting, and I was excited to try out some of the free apps he mentioned to create my own 3D scan. John did a great job of explaining the technologies involved and also sharing practical examples of his own work that we could all relate to. At the end of the talk John showed us a hologram made from a 3D scan, which was very clear. We also got to ask lots of questions about practical uses of the technology and even explore some of the scans he’s created in more detail.