Artifact Digitization Best Practices
Several best practices exist for file formatting, sharing, and metadata creation when digitizing a museum's collection of artifacts. Many of these have been standardized by the US government through the Library of Congress, the Smithsonian, and the National Archives. When it comes to technology and process, there is less of a one-size-fits-all model: depending on budgetary, personnel, and time constraints, different techniques may be required. Museums also have access to a growing number of 3D digitization services ranging from affordable to quite expensive. Read on for the full rundown.
STORAGE & SHARING BEST PRACTICES
The US government maintains a set of resources for artifact digitization, with the Library of Congress offering fellowships and educational materials for the purpose. The Library publishes an annual list of recommended file formats for various media types, ranked by preference and accompanied by metadata guidelines. The US Federal Agencies Digital Guidelines Initiative also publishes several lists of best practices, organized by artifact type.
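To make the metadata side concrete: one widely used descriptive metadata scheme in the cultural heritage field is Dublin Core. The record below is a hypothetical illustration of what a minimal Dublin Core-style record for a digitized artifact might look like; the field choices and values are my own assumptions, not taken from any of the guidelines discussed here.

```python
# Illustrative Dublin Core-style descriptive metadata for one digitized
# artifact, serialized as JSON. The specific fields and values here are
# hypothetical examples; Dublin Core itself is a real, widely used
# metadata element set.
import json

record = {
    "dc:title": "Ceramic vessel, fragment",
    "dc:creator": "Unknown",
    "dc:date": "ca. 1850",
    "dc:format": "image/tiff",
    "dc:identifier": "OBJ-2024-0001",
    "dc:rights": "Public domain",
}

print(json.dumps(record, indent=2))
```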
The Smithsonian offers the most succinct distillation of the file formatting and metadata best practices recommended by the government:
"Images: 6,000 pixels along the long axis (minimum 600 ppi), RGB uncompressed TIFF format.
Audio: Uncompressed Broadcast Wave Format (BWF; WAV), 16 bit depth, sampling rate of 96 kHz or 44.1 kHz for spoken word
Video: MPEG 4:2:2 and MJPEG (MXF wrapper)."
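As a sketch of how the image spec above might be enforced during intake, the function below checks a scan's parameters against the quoted Smithsonian guidance. The thresholds come directly from the quote; the function name, inputs, and the decision to treat these as hard minimums are my assumptions for illustration.

```python
# Minimal sketch: validate scan parameters against the Smithsonian image
# guidance quoted above (6,000 px long axis, minimum 600 ppi, RGB,
# uncompressed TIFF). Function name and inputs are hypothetical.

def meets_smithsonian_image_spec(width_px, height_px, ppi, mode, fmt):
    """Return True if a scan's parameters satisfy the quoted guidance."""
    long_axis = max(width_px, height_px)
    return (
        long_axis >= 6000     # 6,000 pixels along the long axis
        and ppi >= 600        # minimum 600 ppi
        and mode == "RGB"     # RGB color
        and fmt == "TIFF"     # TIFF container (compression not checked here)
    )

print(meets_smithsonian_image_spec(6200, 4100, 600, "RGB", "TIFF"))  # True
print(meets_smithsonian_image_spec(3000, 2000, 300, "RGB", "JPEG"))  # False
```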
Finally, the National Archives offers a list of general best practices for museum collection digitization, quoted verbatim below:
"Digitize at the highest resolution appropriate to the nature of the source material and to avoid re-digitizing and re-handling of the originals in the future.
Digitize an original or first generation (negative rather than print) of the source material to get the highest quality image.
Create and store a master image file that can be used to produce copied image files and serve a variety of current and future user needs.
Use compression techniques and file formats that conform to current technology standards — particularly those in the cultural preservation areas.
Create backup copies of all files and store on servers that have an off-site backup strategy.
Create meaningful and intuitive metadata for image files or collections.
Store digital files in an appropriate server environment.
Document a migration strategy for transferring data across generations of technology.
Plan for future technological developments."
These guidelines are as close to industry standard as I could find; the government describes its recommendations as "based on several professional standards and best practice guidelines," and they appear applicable outside government-run institutions as well.
PROCESS & TECHNICAL BEST PRACTICES
In general, the process to digitize a museum collection involves scanning images and objects and uploading those scans to a computer. However, based on an institution's budget and personnel, different technologies might be more useful. Options include overhead scanners, flatbed scanners, and even digital cameras.
Librarians at Colorado State University tested their $55,000 overhead scanner against a $4,000 DSLR camera and found that the DSLR actually gave better results with many document types. This British guide describes a similar setup; the DSLR rig is more portable and takes up less space. The guide recommends "a fixed lens with a focal length of 50 to 60mm," and conveniently, this type of lens often comes standard with a DSLR.
The British Museum is a leader in digitizing its collection of 3D objects. While more specialized technology for this purpose exists (laser scanners and 3D scanners), the British Museum is using a technique called photogrammetry, which involves using software to stitch together pictures from several angles into a 3D model.
A review I found links to several photogrammetry software suites that produce similar results; the most accessible of these is Autodesk's. Autodesk's ReCap (reality capture) software runs $40 a month or $300 a year for a single-user license, and a free trial is available. I also found two pay-to-view articles that discuss advances in, and best practices for, 3D object digitization, which could be of use.
Some museums are also beginning to experiment with VR and AR for digitization, but this technology is too new to have associated best practices.
Finally, the EU, through a joint research project, created an archaeology-focused scanning software suite called Presious. The software can speed up the artifact scanning process through predictive imaging and can even show the missing pieces of broken objects.
One additional note: while this guidebook doesn't touch on digitization best practices specifically, it does seem like a good resource for putting together an overall digital strategy for a museum and might be worth reading.
The US government has aggregated a large set of best practices for digitizing artifacts, including recommended file formats, metadata, and sharing practices. There's less of a standard when it comes to technology and process for digitization, but a few basic techniques will be generally applicable (scanning with overhead or flatbed scanners or taking images with a digital camera). Several software suites exist for 3D object digitization, the most accessible of which is Autodesk's ReCap photogrammetry software package.