by Henry-Louis Guillaume, Arnaud Schenkel, Benjamin van Nieuwenhove and Melody Ducastelle
Photogrammetry exploits the principle of parallax between several photographs of the same subject taken from different positions and angles. These parallax differences allow photogrammetry programs to determine, trigonometrically, the coordinates of the reference and tie points used to build a three-dimensional digital model. It is therefore necessary to define a shooting procedure that covers the whole subject in a geometrically correct manner, avoiding any omission or gap in coverage.
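To make the parallax principle concrete, here is a minimal sketch, with made-up focal length, baseline and disparity values, of how depth can be recovered from the parallax between two idealized, aligned photographs. Real photogrammetry programs generalize this to many unaligned images and thousands of tie points; the function name and numbers below are purely illustrative.

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Depth of a point from its parallax between two aligned photographs."""
    if disparity_px <= 0:
        raise ValueError("a matched point must show positive disparity")
    # For a rectified stereo pair: Z = f * B / d.
    return focal_px * baseline_m / disparity_px

# A feature shifted by 40 px between two viewpoints 0.5 m apart,
# shot with a focal length equivalent to 2000 px, lies at 25.0 m.
z = depth_from_disparity(2000.0, 0.5, 40.0)
```

The larger the parallax between the two shots, the closer the point: this inverse relationship is why coverage from sufficiently varied positions matters.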
Because the principle seems simple at first glance, many people take up photogrammetry, either to test a new technique or out of a passion for 3D modeling. The recurring problem with this technique stems, on the one hand, from the fact that it seems easy to set up and that everyone appears able to produce realistic three-dimensional reconstructions intuitively. On the other hand, people who commission photogrammetric work are not necessarily aware of the investment in time, photographic and computing equipment, software and experience that this technique requires to produce work of professional quality. The result is the emergence, on many platforms such as Sketchfab, of photogrammetric productions that seldom meet the quality criteria we discuss below.
It is therefore necessary to be able to explain, to everyone, the difference between quality photogrammetry and amateur photogrammetry, in order to justify the cost of professional work that follows a technical and scientific methodology.
Firstly, photogrammetry can be produced from images taken with any device capable of shooting photographs, but the quality and precision of the final model depend essentially on the images themselves. Photogrammetric shooting equipment should be chosen for its flexibility in adapting to the constraints of the shoot as well as for its hardware and software characteristics. Strictly speaking, quality photogrammetry depends, in addition to the correct use of the equipment and the experience of the operator, on a camera with a very high-definition sensor and a wide dynamic range, fully controllable in manual mode, equipped with a calibrated autofocus and with extensive connectivity for managing triggering systems and quality light sources, and compatible with good lenses and a stable, practical tripod. Photogrammetric processing requires a powerful workstation with ample RAM and storage and one or more powerful graphics cards: the processing time depends on the power of the machine. It is therefore necessary to explain the paramount importance of the equipment, the investment it represents and the operator's mastery of photography, because image quality translates into better definition, a larger amount of information, better colour and light management, better control of the depth of field, elimination of the various types of blur and sensor noise, management of lens distortion upstream and downstream through regular calibration, and precise handling under the constraints of the shoot.
Any professional must then be able to explain, clearly and understandably, the process by which the images obtained during the shoot are acquired and processed. Some photogrammetrists consider that raw, unretouched images are a solid basis that can be exploited directly to achieve a faithful 3D reconstruction. Yet from the moment they press the shutter, professional photographers are conscious of producing a truncated 2D image of reality, dependent on the exposure calculated by the camera under given conditions and on the distortions produced by the lens used. It is therefore imperative to explain to anyone interested in this reconstruction technique the importance of the several hours of retouching applied to the image batches so that they are interpreted as well as possible by the reconstruction programs. This yields colorimetric and photometric homogeneity, an improvement in detail leading to a greater number of tie points during the geometric reconstruction, and a more precise mesh that avoids the drift arising from the optical architecture of the lenses used.
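As a toy illustration of the batch homogenization discussed above, and not of the authors' actual workflow, the hypothetical helper below equalizes mean exposure across a batch of images represented as numpy arrays. Real retouching works on RAW files under full colour management; this sketch only shows why photometric consistency across the batch matters to the reconstruction program.

```python
import numpy as np

def homogenize_exposure(images):
    """Scale each image so its mean brightness matches the batch-wide mean."""
    means = [float(img.mean()) for img in images]
    target = sum(means) / len(means)
    return [img * (target / m) for img, m in zip(images, means)]

# Two toy "images" shot at different exposures end up photometrically aligned.
rng = np.random.default_rng(0)
batch = [rng.uniform(0.0, 1.0, size=(4, 4, 3)) * s for s in (0.5, 1.0)]
out = homogenize_exposure(batch)
```

After the call, every image in the batch shares the same mean luminance, which is the kind of homogeneity that helps matching across photographs.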
Once the model is produced, the quality of the final work can easily be demonstrated in several respects. Photogrammetric programs in fact perform only part of the work: the production of a point cloud and/or a raw, unusable high-definition 3D model. This raw model must necessarily be treated with 3D computer graphics techniques: decimation (reduction of the polygonal weight of the model), retopology (rational reorganization of the mesh), relighting, correction of the mesh, correction of the texture, generation of the different maps needed to create a complex shader transcribing reality according to the model and the aims of the project (colour map, normal map, displacement map, elevation map, glossiness, roughness, shadow map, AO map, etc.), UV mapping, and so on. It is imperative to compare a raw model that has undergone no correction, apart from a decimation making it exportable to an online viewer, with a final model retouched upstream and downstream.
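Of the treatments listed above, decimation is the easiest to illustrate in code. The following is a naive vertex-clustering sketch, not the tool chain professionals actually use (dedicated 3D packages implement far better error-driven simplification): vertices falling in the same grid cell are merged into their centroid and degenerate faces are discarded.

```python
import numpy as np

def decimate_by_clustering(vertices, faces, cell_size):
    """Reduce polygon count by merging vertices that share a grid cell."""
    keys = np.floor(vertices / cell_size).astype(np.int64)
    uniq, inverse = np.unique(keys, axis=0, return_inverse=True)
    inverse = inverse.reshape(-1)  # guard against NumPy-version shape quirks
    counts = np.bincount(inverse, minlength=len(uniq)).astype(float)
    # Each new vertex is the centroid of the original vertices in its cell.
    new_vertices = np.stack(
        [np.bincount(inverse, weights=vertices[:, d]) / counts for d in range(3)],
        axis=1,
    )
    new_faces = inverse[faces]
    # Drop faces whose corners collapsed onto fewer than three distinct vertices.
    keep = (
        (new_faces[:, 0] != new_faces[:, 1])
        & (new_faces[:, 1] != new_faces[:, 2])
        & (new_faces[:, 0] != new_faces[:, 2])
    )
    return new_vertices, new_faces[keep]

# Four vertices (two nearly coincident) and three triangles collapse
# to three vertices and a single triangle on a 0.5-unit grid.
verts = np.array([[0.0, 0.0, 0.0], [0.01, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
faces = np.array([[0, 1, 2], [0, 2, 3], [0, 1, 3]])
new_verts, new_faces = decimate_by_clustering(verts, faces, 0.5)
```

Uncontrolled decimation of this kind destroys detail, which is precisely why the retopology and texture-reprojection steps described above must accompany it.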
Here is an example of a comparison between a model without any pre- or post-processing and the same model after various treatments:
- Difference between an uncorrected model and a corrected model (photographic and colorimetric correction and delighting of the image batch, addition of a modified normal map for details, relighting in the web player):
- Difference between the topology and texture of an unworked mesh and a mesh and texture that have undergone controlled decimation, retopology and UV mapping:
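One of the maps involved in these comparisons, the normal map, can be derived from a height (displacement) map. The sketch below is a hedged illustration with an arbitrary `strength` factor and a trivially flat test map, not the procedure used for the models shown; real texture baking happens in dedicated 3D software.

```python
import numpy as np

def normal_map_from_height(height, strength=1.0):
    """Per-pixel unit normals (x, y, z) derived from a 2-D height map."""
    gy, gx = np.gradient(height.astype(float))  # slopes along rows and columns
    nz = np.ones_like(height, dtype=float)
    normals = np.dstack([-gx * strength, -gy * strength, nz])
    return normals / np.linalg.norm(normals, axis=2, keepdims=True)

# A flat height map yields normals pointing straight out of the surface.
flat = np.zeros((4, 4))
n = normal_map_from_height(flat)  # every normal is (0, 0, 1)
```

Encoded into an RGB texture, such normals let a low-polygon decimated mesh shade as if the high-definition detail were still present.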
| Step | Detail |
|------|--------|
| Device with a high-resolution sensor | |
| Camera and optical calibration | Autofocus calibration and creation of adapted optical-correction presets |
| Shooting in RAW | For direct use or for retouching and development |
| Removal of shadows and speculars (delighting) | Hardware or software removal of cast shadows and reflections |
| Use of a colorimetric chart for true colour calibration | |
| Correction of the distortions produced by the optics | Corrected image batches before photogrammetry, for standardization |
| Dense point cloud generation | Obtaining the dense point cloud by photo alignment |
| Processing and cleaning of the dense point cloud | Colorization, cleaning, density processing, addition of field data |
| Meshing of the raw model | Meshing of the dense point cloud to obtain the high-definition raw 3D model |
| Texturing of the raw model | Projection of the raw texture onto the high-definition model |
| Correction of the mesh | Cleaning of excess points and polygons, filling of holes, smoothing |
| Creation of the shader | Extraction and creation of the texture maps composing the material, plus corrections |
| Decimation | Reduction of the polygonal weight of the model according to its final use |
| Retopology | Reorganization and rationalization of the polygonal mesh |
| Texture reprojection | Reprojection of the corrected high-definition texture onto the decimated mesh |
| Assembly of the maps of the final shader | Reorganization and rationalization of the final texture, plus retouching |
| Levels of detail | Creation of different levels of detail of the model for use in real-time engines |
| Archiving | Storage of the acquisitions, the raw and retouched models and intermediate data for any subsequent work |
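To make the point-cloud cleaning step listed above concrete, here is a toy statistical-outlier-removal sketch: points whose mean distance to their nearest neighbours is far above the average are discarded. This is an assumed, brute-force illustration; real dense clouds hold millions of points and require spatial indexing and the dedicated tools of the photogrammetric programs.

```python
import numpy as np

def remove_outliers(points, k=4, std_ratio=2.0):
    """Keep points whose mean k-NN distance lies within std_ratio sigmas."""
    diffs = points[:, None, :] - points[None, :, :]
    dists = np.sqrt((diffs ** 2).sum(axis=-1))
    np.fill_diagonal(dists, np.inf)  # ignore each point's distance to itself
    knn_mean = np.sort(dists, axis=1)[:, :k].mean(axis=1)
    threshold = knn_mean.mean() + std_ratio * knn_mean.std()
    return points[knn_mean <= threshold]

rng = np.random.default_rng(1)
cloud = rng.normal(0.0, 0.05, size=(200, 3))      # dense surface patch
cloud = np.vstack([cloud, [[5.0, 5.0, 5.0]]])     # one stray reconstruction artefact
cleaned = remove_outliers(cloud)                  # the stray point is dropped
```

Removing such strays before meshing prevents isolated artefacts from producing spikes and floating fragments in the raw 3D model.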
Although seasoned amateurs can produce quality photogrammetry, a professional production will take care to archive every step of the work so that the raw data can be redeployed or the model improved for another use: correcting or augmenting the models by adding topographic or lasergrammetric data in order to extract specific information such as calculations of distances, surfaces or volumes, orthophotographs, or point clouds for various analyses. A quality production also guarantees compatibility with many 3D modeling programs, 3D printing, digital milling machines, BIM managers, pre-computed or real-time rendering engines, video game engines and VFX compositing, respecting the standards of each of these sectors.
After these material and technical considerations, it is important to emphasize that any computer manipulation requires the use of programs: the latter carry license prices of several thousand euros, justified all the more by an after-sales service that can solve many problems arising in the course of certain projects. It must therefore be possible to provide, to any client who asks, proof of compliance of the software licenses; the amortization of the license prices is obviously included in the invoicing of a quality project. Accepting work done with pirated licenses is tantamount to participating in piracy.