Meet the iReal 3D Mapping Software

08/24/22

Following high-precision laser 3D scanning with Scantech scanners, the iReal 3D team at Scantech offers a brand-new solution for high-definition texture mapping in post-processing: the iReal 3D smart mapping software.

The software uses photogrammetry, AI image recognition, texture fusion, and seam line editing technologies to achieve efficient and accurate mapping.

The software is available for download.


The software is highly integrated: traditional post-processing requires several 3D software packages working together and places high demands on the operator's skill and experience, whereas iReal 3D handles the whole texture-mapping workflow in one tool.

It is also highly automated, with an automation rate of over 95%, which makes it easy to use. No advanced professional background is required of the post-processing operator, and because most of the processing runs automatically, one person can run multiple computers and process several models at the same time.

Finally, it is highly efficient: mapping an object with a mid-resolution mesh now takes only about half an hour, compared with at least 2-5 hours for traditional processing. Together, these advantages significantly improve the production efficiency of high-definition texture maps and greatly reduce production cost.

In terms of fineness, the iReal 3D smart mapping software preserves the full image resolution with zero loss and restores every texture detail on the object to the greatest possible extent.

● Highly automated and intelligent: more than 95% of the work is handled automatically by the software

● Simple operation, easy to use: even with no prior experience, you can get started after one day of training and become proficient within a week of use

● Efficient output: processing a mid-resolution model (300,000-1,000,000 faces) or a high-resolution model takes only about 30 minutes

● High-definition, lossless quality: supports 4K, 8K, and 16K ultra-high-resolution texture map output to ensure mapping quality

● Precise mapping: natural and seamless texture fusion; texture accuracy ≤ 0.1 mm; color uniformity (color consistency between multiple photos) ≥ 95%; image blemishes (foreign-object mapping, highlights, lens contamination, etc.) ≤ 0.01%

The main process of making high-definition color 3D models

Step 1: Data Collection

3D Scanning

Photo Collection

Step 2: Model & Photo Pre-processing

Model Pre-processing

Photo Color Adjustment

Step 3: Automatic Extraction of Photo Subjects

Through AI image recognition technology, the main subject of every photo is extracted automatically (the white area is the automatically recognized subject); the result is used in the next step, the relative orientation of photos.

Automatic extraction of photo subjects
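
The article does not specify which AI model performs this extraction. Purely as an illustration of the idea of subject extraction, here is a minimal sketch using OpenCV's classical GrabCut segmentation to produce a white subject mask; the synthetic photo and the initial rectangle are placeholder assumptions.

```python
import cv2
import numpy as np

# Minimal sketch of subject extraction only, not the iReal 3D AI model:
# OpenCV's classical GrabCut segments a rough foreground and outputs a white
# subject mask. A synthetic photo is generated here so the sketch runs as-is;
# with real data you would load the photo instead.

rng = np.random.default_rng(0)
photo = np.full((480, 640, 3), 230, dtype=np.uint8)            # bright background
cv2.rectangle(photo, (220, 140), (420, 340), (60, 80, 120), -1)  # dark "object"
photo = cv2.add(photo, rng.integers(0, 12, photo.shape, dtype=np.uint8))  # mild noise

mask = np.zeros(photo.shape[:2], dtype=np.uint8)
bgd_model = np.zeros((1, 65), dtype=np.float64)                 # GrabCut internal models
fgd_model = np.zeros((1, 65), dtype=np.float64)
rect = (160, 100, 340, 300)                                     # rough rectangle around the subject

cv2.grabCut(photo, mask, rect, bgd_model, fgd_model, 5, cv2.GC_INIT_WITH_RECT)

# Foreground and probable-foreground pixels become white, everything else black.
subject_mask = np.where((mask == cv2.GC_FGD) | (mask == cv2.GC_PR_FGD), 255, 0).astype(np.uint8)
cv2.imwrite("subject_mask.png", subject_mask)
```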

Step 4: Relative Orientation of Photos

Import all photos with one click; the relative positions of hundreds of photos are calculated automatically within 5-10 minutes, completing the relative orientation of all photos.

Using the texture features of the subject areas shared between photos, the software automatically computes and sorts the relative spatial positions of all photos, then builds a sequence of virtual cameras and a sparse feature point cloud for all photos.

Import hundreds of photos with one click
Sequence virtual camera
Reconstructed sparse point cloud
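
For readers unfamiliar with relative orientation, the sketch below shows the general idea on just two photos with standard OpenCV tools (feature matching, essential-matrix estimation, pose recovery). It is not the software's implementation; the file names and the assumed focal length are placeholders.

```python
import cv2
import numpy as np

# Illustrative two-view sketch of relative orientation, not the software's pipeline:
# match texture features between two photos, estimate the essential matrix, and
# recover the relative camera pose.

img1 = cv2.imread("photo_0001.jpg", cv2.IMREAD_GRAYSCALE)       # placeholder file names
img2 = cv2.imread("photo_0002.jpg", cv2.IMREAD_GRAYSCALE)
assert img1 is not None and img2 is not None, "replace with two real overlapping photos"

# Detect and describe texture features in both photos
sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)

# Match descriptors and keep only unambiguous matches (Lowe's ratio test)
matcher = cv2.BFMatcher()
good = []
for pair in matcher.knnMatch(des1, des2, k=2):
    if len(pair) == 2 and pair[0].distance < 0.75 * pair[1].distance:
        good.append(pair[0])

pts1 = np.float32([kp1[m.queryIdx].pt for m in good])
pts2 = np.float32([kp2[m.trainIdx].pt for m in good])

# Assumed intrinsics: focal length in pixels, principal point at the image center
h, w = img1.shape
K = np.array([[1500.0, 0, w / 2], [0, 1500.0, h / 2], [0, 0, 1]])

# Essential matrix with RANSAC, then the relative rotation and translation direction
E, inlier_mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, threshold=1.0)
_, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=inlier_mask)
print("relative rotation:\n", R)
print("relative translation direction:", t.ravel())
```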

Step 5: Absolute Orientation – Manual Alignment

Select a photo and place the 3D model at the same size and position as in the photo. An intelligent algorithm then analyzes the similar structural features of the photo and the model, completes the automatic precise alignment of that single photo to the model, and finally constructs the spatial mapping relationship between the model and the photo automatically with one click.
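
Conceptually, aligning a single photo to the 3D model means estimating that photo's camera pose relative to the model. A generic way to express this, shown only as an illustrative sketch rather than the software's method, is a Perspective-n-Point (PnP) solve from matched 2D-3D points; the correspondences below are synthesized from an assumed pose so the example runs on its own.

```python
import cv2
import numpy as np

# Hedged sketch, not the software's method: recover a photo's camera pose from
# matched 3D model points and their 2D image locations (Perspective-n-Point).

object_points = np.array([                       # points on the scanned model (mm)
    [0, 0, 0], [100, 0, 0], [100, 80, 0],
    [0, 80, 0], [50, 40, 30], [20, 60, 15],
], dtype=np.float64)

K = np.array([[2000.0, 0, 960], [0, 2000.0, 540], [0, 0, 1]])   # assumed intrinsics
dist = np.zeros(5)                                               # assume no lens distortion

# Ground-truth pose used only to synthesize the "photo" measurements
rvec_true = np.array([0.1, -0.2, 0.05])
tvec_true = np.array([-40.0, -30.0, 600.0])
image_points, _ = cv2.projectPoints(object_points, rvec_true, tvec_true, K, dist)
image_points = image_points.reshape(-1, 2)

# Recover the camera pose from the 2D-3D correspondences
ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, dist)
R, _ = cv2.Rodrigues(rvec)                        # rotation matrix: model -> camera
print("solved:", ok)
print("recovered t:", tvec.ravel(), " (true:", tvec_true, ")")
```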

Step 6: Absolute Orientation – Fine Registration

Fine registration

Step 7: Smart Mapping

1. Intelligently identify wrong textures (specular highlight areas, out-of-focus/blurred areas, non-object areas) and automatically eliminate reflective textures.

2. Each triangular face of the mesh can be associated with multiple photos taken from different angles; the software automatically computes and ranks their texture quality (a weighted score over parameters such as sharpness, saturation, and viewing angle) and selects the higher-weighted, high-quality textures for intelligent mapping, reducing the manual work of removing highlights and out-of-focus blurred areas.

3. After mapping succeeds, texture seam lines are generated automatically, and any wrong textures that were not completely removed can be edited in real time, achieving seamless texture fusion of the multi-angle images on the 3D model.

Smart Mapper
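
As a hedged illustration of the weighted selection described in point 2, the sketch below scores the candidate photos for a single mesh face by sharpness, saturation, and viewing angle, then picks the best one. The weights, the normalization, and the candidate data are assumptions, not the software's actual parameters.

```python
import cv2
import numpy as np

# Illustrative scoring of texture candidates for one mesh face, not iReal 3D's code.

def sharpness(patch_gray):
    """Variance of the Laplacian: higher means sharper (less out-of-focus blur)."""
    return cv2.Laplacian(patch_gray, cv2.CV_64F).var()

def saturation(patch_bgr):
    """Mean saturation channel in HSV, normalized to 0..1."""
    hsv = cv2.cvtColor(patch_bgr, cv2.COLOR_BGR2HSV)
    return float(hsv[:, :, 1].mean()) / 255.0

def view_angle_score(face_normal, view_dir):
    """1.0 when the camera looks at the face head-on, 0.0 at grazing angles."""
    n = face_normal / np.linalg.norm(face_normal)
    v = view_dir / np.linalg.norm(view_dir)
    return max(0.0, float(np.dot(n, -v)))

def face_score(patch_bgr, face_normal, view_dir, w_sharp=0.4, w_sat=0.2, w_angle=0.4):
    gray = cv2.cvtColor(patch_bgr, cv2.COLOR_BGR2GRAY)
    s = min(sharpness(gray) / 500.0, 1.0)            # crude normalization of sharpness
    return w_sharp * s + w_sat * saturation(patch_bgr) + w_angle * view_angle_score(face_normal, view_dir)

# Illustrative candidates: (photo id, image patch covering the face, camera view direction)
rng = np.random.default_rng(0)
normal = np.array([0.0, 0.0, 1.0])
candidates = [
    ("photo_A", rng.integers(0, 255, (32, 32, 3), dtype=np.uint8), np.array([0.0, 0.0, -1.0])),
    ("photo_B", rng.integers(0, 255, (32, 32, 3), dtype=np.uint8), np.array([0.7, 0.0, -0.7])),
]
best = max(candidates, key=lambda c: face_score(c[1], normal, c[2]))
print("Selected texture source:", best[0])
```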

Step 8: Real-time Seam Line Editing

Visualization, real-time editing, and refinement:

For texture maps that are not perfectly matched, or for local areas affected by reflections, out-of-focus blur, and the like, you can use the seam line editing tool to make precise refinements in real time.

The time to recalculate the texture can be kept within 1 second.

Real-time seam line editing: before
Real-time seam line editing: after
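
The article does not say how the sub-second recalculation is achieved. Purely as an illustrative assumption, one common way to keep such an edit fast is to recompute only the faces whose source photo changed, as in this minimal sketch.

```python
import numpy as np

# Minimal sketch (not the software's internals): only the texels of faces whose
# source photo changed ("dirty" faces) are re-sampled, instead of rebuilding the
# whole texture atlas.

ATLAS = np.zeros((1024, 1024, 3), dtype=np.uint8)              # texture atlas

# Hypothetical bookkeeping: source photo per face and the atlas rows owned by
# each face (a real UV layout is irregular; simple bands are used here).
face_labels = {0: "photo_A", 1: "photo_A", 2: "photo_B"}
face_rows   = {0: slice(0, 300), 1: slice(300, 600), 2: slice(600, 900)}
photo_color = {"photo_A": (200, 180, 160), "photo_B": (190, 185, 170)}  # stand-in pixels

def edit_seam(edited_faces, new_label):
    """Reassign the chosen faces to another photo and refresh only their texels."""
    dirty = [f for f in edited_faces if face_labels[f] != new_label]
    for f in dirty:
        face_labels[f] = new_label
        ATLAS[face_rows[f], :] = photo_color[new_label]        # re-sample just this region
    return dirty

print("Faces recomputed:", edit_seam([1, 2], "photo_A"))       # -> [2]; face 1 already used photo_A
```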

Step 9: Photoshop Plug-in

Photoshop plug-in

Step 10: One-click Color Leveling and Feathering

Intelligent fusion processing of the texture maps ensures a natural result with uniform color:

1. The Poisson blending algorithm is used to complete the color calculation

2. Feathering is applied to the texture in the seam line area

One-click color leveling and feathering: before
One-click color leveling and feathering: after
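
The two techniques named above can be illustrated with standard tools: OpenCV's seamlessClone performs Poisson blending, and feathering can be approximated by blending across a blurred seam mask. The sketch below uses synthetic flat-colored patches as stand-ins and is not the software's implementation.

```python
import cv2
import numpy as np

# Hedged sketch of (1) Poisson blending and (2) feathering along a seam mask.

# Two flat-colored "texture patches" with slightly different colors
patch_a = np.full((200, 200, 3), (180, 170, 160), dtype=np.uint8)
patch_b = np.full((200, 200, 3), (150, 175, 190), dtype=np.uint8)

# 1) Poisson blending: paste patch_b's center region into patch_a so the colors
#    blend smoothly instead of leaving a hard edge.
mask = np.zeros((200, 200), dtype=np.uint8)
cv2.circle(mask, (100, 100), 60, 255, -1)
poisson = cv2.seamlessClone(patch_b, patch_a, mask, (100, 100), cv2.NORMAL_CLONE)

# 2) Feathering: blur the binary seam mask and use it as a soft alpha for blending.
alpha = cv2.GaussianBlur(mask, (31, 31), 0).astype(np.float32) / 255.0
alpha = alpha[:, :, None]                                      # broadcast over color channels
feathered = (alpha * patch_b + (1.0 - alpha) * patch_a).astype(np.uint8)

cv2.imwrite("poisson_blend.png", poisson)
cv2.imwrite("feathered_blend.png", feathered)
```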

iReal 3D Mapping Software – Specifications

Data format: 720° digital 3D model
Output format: *.obj, *.ply, *.fbx, *.stl, *.off and other universal 3D formats
Texture resolution: 16384 × 16384, 8192 × 8192, 4096 × 4096
Mapping error: ≤ 0.08 + 0.02 × D/300 mm, where "D" is the largest dimension of the item in mm (e.g., for D = 300 mm the error is ≤ 0.10 mm)
Average color difference: CIEDE2000 ≤ 5
Image blemishes: foreign-object mapping, light spots, lens contamination, etc. ≤ 0.01%
Color uniformity: after multiple photos are mapped, light and color are unified so that the overall color of the object is consistent
Mapping edges: blend naturally and seamlessly
Mapping accuracy: ≤ 0.1 mm