Optimal Parameters Selection for 3D-Mechanical Surface Measuring
Equipment Based on the Structured Light Gray Code
Nguyen Thi Kim Cuc*, Nguyen Van Vinh, Nguyen Thanh Hung, Pham Xuan Khai
Hanoi University of Science and Technology – No. 1, Dai Co Viet Str., Hai Ba Trung, Ha Noi, Viet Nam
Received: June 16, 2017; Accepted: November 03, 2017
Abstract
3D reconstruction techniques based on structured light are widely used in industry and academia. The method offers advantages such as speed and accuracy. The measurement precision of a given system depends on its geometrical parameters, the characteristics of the surface under test, and the specifications of the devices. This paper investigates the effect of several parameters on the resolution of 3D surface measurement equipment based on the Gray code technique, especially the relative position of the camera and the projector. On this basis, the paper proposes the selection of optimum parameters for the measurement system. Experimental results determine the optimum conditions for the system over the measuring range (w × h = 400 mm × 300 mm) to obtain the best z-resolution of 0.67 mm.
Keywords: 3D measuring devices, coded patterns, structured light, computer vision.
1. Introduction
Optical 3D surface measurement is the process of digitizing an object's surface using optical sensors. With advances in computing, the measurement has become easier and faster. In recent decades, the structured light method has been extensively studied and applied in many fields of modern technology such as precision manufacturing, health sciences, security, and entertainment [1].
Typically, a structured light measurement system consists of a single camera and a single projector that projects a light pattern onto the measured surface. The light pattern is encoded by an intensity function. Depth information is extracted by measuring the distortion between the projected pattern and the pattern observed in the captured image. The target object should remain stationary during the scanning process. The 3D position of the object is then obtained by triangulation.
Binary-code and Gray-code patterns have been generated by techniques such as analog fringe generation (e.g., gratings) [2, 3] and binary image coding with a projector [4, 5]. Both binary coding approaches offer a simple optical arrangement, simple computation, and robustness. Moreover, compared with grating projection, digital fringe projection tends to be more flexible, easier, and faster [6].
*Corresponding author: Tel.: (+84) 966087567
Email: cuc.nguyenthikim@mail.hust.edu.vn
The binary coding method was first proposed by Posdamer and Altschuler [7], who sequentially project n patterns to encode 2^n stripes in binary code. Each stripe in the final pattern carries a unique binary code word. In these techniques, only two illumination levels are used, coded as 0 and 1: the symbol 0 corresponds to black, while 1 corresponds to fully illuminated white. However, all pixels on the same stripe share the same binary code, so to obtain the coordinates of the measurement points it is necessary to determine the centers or boundaries of the stripes on the object before applying the triangulation algorithm. The advantage of Gray code is that consecutive code words have a Hamming distance of one, making the code more robust against noise [8].
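As a minimal illustration (ours, not from the paper), the following Python sketch converts stripe indices to reflected Gray code and verifies the Hamming-distance-one property:

```python
# Minimal sketch (not from the paper): binary/Gray-code conversion,
# illustrating that consecutive Gray code words differ in exactly one bit.

def binary_to_gray(n: int) -> int:
    """Convert a binary number to its reflected Gray code."""
    return n ^ (n >> 1)

def gray_to_binary(g: int) -> int:
    """Invert the Gray coding by cumulative XOR of the shifted bits."""
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n

if __name__ == "__main__":
    words = [binary_to_gray(i) for i in range(8)]   # 3-bit example: 8 stripes
    print([format(w, "03b") for w in words])        # 000 001 011 010 110 111 101 100
    # Consecutive code words have a Hamming distance of one:
    assert all(bin(a ^ b).count("1") == 1 for a, b in zip(words, words[1:]))
    assert all(gray_to_binary(binary_to_gray(i)) == i for i in range(256))
```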
The maximum number of stripes that can be projected is limited by the pixel resolution of the projector. A larger number of patterns yields a finer resolution; however, approaching this limit is not recommended because the camera cannot always resolve such narrow stripes [9].
The structured light used in this paper is Gray coding with projection patterns consisting only of alternating white and black stripes. The paper presents a method for determining the optimum parameters of a 3D measuring system using structured light, in order to minimize the height resolution z_min while keeping the design compact.
2. Measurement principle
Figure 1 shows the principle of a structured light measuring system: sequential Gray-coded patterns are projected onto the measured object, and a camera captures the images of the patterns on the object.
The measurement process is controlled by a computer. The computer connected to the projector controls the sequential encoded patterns projected onto the measured object and processes the images obtained from the camera. Software is programmed to project the structured light patterns, control the movement of the part under test, acquire the images, and reconstruct the 3D profile of the measured object.
Fig. 1. Schematic diagram of measurement system.
Fig. 2. Stripe-edge detection in the Gray code pattern (a) and the phase map on the CCD camera (b)
Fig. 3. Optical Geometry of the measuring system [3]
In order to remove the one-bit decoding error of pixel-center-based Gray code, to increase the accuracy of image sampling point location, and to preserve the correspondence between image sampling points and object sampling points, an encoding and decoding method based on the stripe edges of the Gray code is used.
Due to the influence of ambient light and the reflectance of the measured surface, the white and black stripes in the image can vary across different regions of the image. The structured light Gray code method therefore projects both normal and inverted stripe patterns to counteract the lighting conditions. The stripe edge is then located by finding the intersections of the normal and inverted images.
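A minimal sketch of this edge localization (our own illustration; the paper gives no code) finds the zero crossings of the difference between the normal and inverted captures along one image row, refined to sub-pixel accuracy by linear interpolation:

```python
# Illustrative sketch (ours, not the authors' code): locate stripe edges as
# the zero crossings of the difference between normal and inverted captures.
import numpy as np

def stripe_edges(normal: np.ndarray, inverse: np.ndarray) -> list[float]:
    """Return sub-pixel edge positions along one image row."""
    d = normal.astype(np.float64) - inverse.astype(np.float64)
    edges = []
    for i in range(len(d) - 1):
        if d[i] == 0:
            edges.append(float(i))
        elif d[i] * d[i + 1] < 0:            # sign change between pixels
            t = d[i] / (d[i] - d[i + 1])     # linear interpolation
            edges.append(i + t)
    return edges

# Synthetic row: one bright stripe between pixels 40 and 80
row = np.zeros(128); row[40:80] = 200.0
print(stripe_edges(row, 200.0 - row))        # ~[39.5, 79.5]
```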
During 3D reconstruction, the gray level output by the projector affects the quality of the image captured by the camera. Therefore, an appropriate illuminance range must be determined to obtain the best image quality and the highest resolution.
As shown in Figure 2, when, for example, three Gray code patterns are projected, 2^3 − 1 edges are generated. The reference plane is divided into 2^3 = 8 stripes, each of a certain width, whose projection sequence gives it a unique intensity code represented by a code word. Reading the sample images in the order G1, G2, G3, the intensity of the first stripe from the left is white, white, white, so the corresponding code word is 111.
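The following sketch (ours) generates such a stack of Gray-coded stripe patterns and verifies that each pixel's code word decodes back to its stripe index; note that whether the first stripe reads 111 or 000 depends on the chosen bit convention:

```python
# Sketch under our own conventions: generate n Gray-coded stripe patterns
# and decode the code word of each pixel back to its stripe index.
import numpy as np

def gray_patterns(n_bits: int, width: int) -> np.ndarray:
    """Stack of n_bits stripe patterns (1 = white, 0 = black), width pixels wide."""
    stripe = np.arange(width) * (2 ** n_bits) // width   # stripe index per pixel
    gray = stripe ^ (stripe >> 1)                        # Gray-code each stripe index
    bits = [(gray >> k) & 1 for k in reversed(range(n_bits))]
    return np.stack(bits).astype(np.uint8)               # shape (n_bits, width)

def decode_pixel(bits: np.ndarray) -> int:
    """Recover the stripe index from the per-pattern bits of one pixel."""
    g = 0
    for b in bits:                 # reassemble Gray value, MSB first
        g = (g << 1) | int(b)
    n = 0
    while g:                       # Gray -> binary by cumulative XOR
        n ^= g
        g >>= 1
    return n

pats = gray_patterns(3, 1024)      # 3 patterns -> 2**3 = 8 stripes
assert all(decode_pixel(pats[:, x]) == x * 8 // 1024 for x in range(1024))
```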
The measuring system is characterized by several extrinsic parameters. The field of view (w × h) of the device is the area that can be seen by both the projector and the camera. This area depends mainly on system parameters such as the distance b between the camera and the projector, the angle α between the camera axis and the projector axis, and the distance L from the centers of the camera and the projector to the reference plane (R).
Each pixel on the camera's CCD is identified by its row i and column j. The CCD, of size Ci × Cj, is called the image plane.
The algorithm determines the 3D coordinates of the object by triangulation. The coordinates of a point on the reference plane are determined from the corresponding point on the image plane by two parameters: the pixel coordinates and the code word obtained by projecting the sequential patterns onto the plane R.
When the Gray-coded light is projected onto the plane R, the phase map of the reference plane (RP) is obtained. When an object (here presenting a parallelepipedic shape) is placed on the plane R, the phase map of the object (OP) is obtained. The height of the object is determined by the phase difference between corresponding pixels of the phase maps RP and OP.
As shown in Figure 3, point A lies on the plane R. When there is no object, point A is seen at pixel A'(i, j) on the CCD and carries the code word of point B'' on the projector's DMD. Following the example of Figure 2 with n = 3, the code word of point A is the bit sequence (0 1 0) of pixel B''. When an object is placed on the plane R, ray O'A intersects the object at point C(x, y), which is therefore still seen at pixel A'(i, j) on the CCD. Point C is illuminated along the projector ray O''E, which would strike the plane R at point D; point D maps to pixel B'(i', j) on the CCD.
The height z(x, y) of point C on the object is evaluated as the distance CH. The distance A'B' on the image plane is determined first. When the stripes are parallel to the x axis, A'B' = Δi = i − i'. The corresponding distance AB = Δy = y − y' on the plane R is given by [3]:

$$\overline{AB} = \Delta y = \frac{h}{C_i}\,\Delta i \qquad (1)$$

where C_i is the vertical height of the CCD sensor (expressed in pixels when Δi is a pixel-index difference).
The two triangles ACB and O'CO'' are similar, so the following relationship holds:

$$\frac{\overline{CH}}{L - \overline{CH}} = \frac{\overline{AB}}{\overline{O'O''}} \qquad (2)$$

where CH = z(x, y) and O'O'' = b. Solving for the height gives:

$$z(x, y) = \frac{L\,\Delta y}{\Delta y + b} \qquad (3)$$
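As an illustration of Eqs. (1) and (3), the following sketch (our own; the parameter values are those arrived at in Section 3, and C_i is taken as the number of CCD rows so that Δi counts pixels) computes the height from a pixel disparity:

```python
# Sketch of the triangulation in Eqs. (1) and (3); the parameter values are
# illustrative, matching the system dimensions derived later in the paper.

def height_from_disparity(delta_i: float, h: float, c_i: float,
                          L: float, b: float) -> float:
    """z(x, y) from the pixel disparity delta_i between reference and object maps."""
    delta_y = (h / c_i) * delta_i        # Eq. (1): pixel shift -> shift on plane R
    return L * delta_y / (delta_y + b)   # Eq. (3): similar triangles ACB ~ O'CO''

# One-pixel disparity with h = 300 mm, 960 rows, L = 820 mm, b = 160 mm:
print(height_from_disparity(1, h=300.0, c_i=960.0, L=820.0, b=160.0))  # ~1.6 mm
```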
Because the projected stripes are parallel to the x direction, pixels with the same y coordinate carry the same Gray code word. Thus the resolution along the x axis depends only on the resolution of the CCD matrix. Below we consider the dependence of the resolution on the y coordinate. The minimum measurable value of z(x, y), z_min, is expressed by:

$$z_{min} = \frac{L\,\Delta y_{min}}{\Delta y_{min} + b} \qquad (4)$$
According to formula (4), the resolution is set by the minimum distance AB = Δy_min. From Eq. (1), the distance A'B' is smallest when A'B' = Δi_min = 1 (A' and B' are two adjacent pixels), giving Δy_min = h/C_i. Formula (4) then becomes:

$$z_{min} = \frac{L\,h}{h + b\,C_i} \qquad (5)$$
From formula (5), the z-resolution can be improved (z_min reduced) by decreasing L and h and by increasing b. The parameters L and h are interdependent through the optical properties of the camera.
Normally, the vertical projection dimension is smaller than the horizontal one (h < w), so the short image direction h is considered. The value of h must be large enough to observe the whole object and small enough to optimize the measurement resolution.
As shown in Figure 4, the projection height h is related to the height C_i of the CCD sensor through the distance L and the focal length f of the camera lens by [10]:

$$\frac{h}{C_i} = \frac{L - f}{f} \qquad (6)$$
Substituting formula (6) into (5) gives:

$$z_{min} = \frac{L\,(L - f)}{b f + (L - f)} \qquad (7)$$
Eq. (7) suggests that the measurement resolution can be improved by reducing the parameter L or increasing the parameter f. However, if h is constant, the ratio L/f is fixed by Eq. (6), so L and f must be calculated together. Finally, Eq. (7) is used to determine the value of the parameter b that optimizes the measurement resolution. Care must be taken in choosing b: the larger the projection angle onto the reference plane, the wider and more deformed the projected stripes become, which limits the decoding accuracy.
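To make the trade-off concrete, the following sketch (our own) evaluates Eq. (5) for several baselines b, using the L, h, and row-count values arrived at in Section 3; the model alone predicts a monotone improvement with b, whereas the experiments below show that stripe deformation caps the usable baseline:

```python
# Sketch: evaluate the z_min trend of Eq. (5) for a range of baselines b.
# The model alone favours ever-larger b; in practice stripe deformation
# limits the usable baseline (see the experiments in Section 3).

def z_min(L: float, h: float, rows: float, b: float) -> float:
    """Eq. (5) with C_i taken as the number of CCD rows (Delta_i in pixels)."""
    return L * h / (h + b * rows)

for b in (80.0, 120.0, 160.0, 200.0):
    print(f"b = {b:5.1f} mm -> z_min = {z_min(820.0, 300.0, 960.0, b):.2f} mm")
# b = 80 mm gives ~3.19 mm, decreasing monotonically as b grows
```

Note that the 0.67 mm measured in Section 3 is better than this whole-pixel bound because the stripe edges are localized with sub-pixel accuracy.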
3. Experiments and discussion
The system is composed of a DLP projector (InFocus N104) working at 1024 × 768 pixels, a camera (DFK 41BU02) with a resolution of 1280 × 960, and a standard PC for implementing the algorithms.
Fig. 4. Imaging through the camera lens
As illustrated in Figure 5, the whole experimental system is based on the geometric constraints above. The table is designed to be rotatable so that the scanned area on the object is maximized, the main movement being rotation in the plane R.
An experiment was carried out to select the illumination value for high resolution. The gray level of the program generating the binary pattern was varied from 10 to 255, and the illuminance of the pattern on the reference plane was measured. During the experiment, the ambient light had an average illuminance of 18 lux and the average ambient temperature was 25 °C. According to Figure 6, when patterns are projected at low illumination intensities, the signal-to-noise ratio of the system decreases and, therefore, depth in regions of low reflectivity cannot be obtained. On the other hand, when high-intensity patterns are projected, depth in regions of high reflectance cannot be recovered due to pixel saturation. Therefore, gray-level values are selected in the range of 100 to 200.
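A simple way to automate this check is sketched below (our own illustration, not the authors' procedure): given captures of an all-white and an all-black projected frame, pixels that are saturated or lack contrast are flagged as undecodable; the thresholds sat and min_contrast are assumed values:

```python
# Illustrative check (ours): given captures of an all-white and an all-black
# projected frame, flag pixels that are saturated or too dark to decode.
import numpy as np

def usable_mask(white: np.ndarray, black: np.ndarray,
                sat: int = 250, min_contrast: int = 30) -> np.ndarray:
    """True where the white/black contrast supports reliable binarization."""
    w = white.astype(np.int32)
    b = black.astype(np.int32)
    return (w < sat) & (w - b >= min_contrast)

# Synthetic example: one saturated highlight and one dark corner
white = np.full((4, 4), 180, np.uint8); white[0, 0] = 255; white[3, 3] = 40
black = np.full((4, 4), 20, np.uint8)
print(usable_mask(white, black))   # False at the saturated and dark pixels
```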
To determine the projection screen of the projector, off-axis projection is used, symmetric horizontally and vertically. The horizontal and vertical projection angles β and γ are shown in Fig. 1. Because the projector has a zoom lens, a variety of offset ratios is possible. To obtain a sharp projection on the reference plane, the projector focus must be adjusted for each distance. It is therefore necessary to fix the scaling factor of the projection and to determine experimentally the relationship between the projection distance and the projection width.
The distance L was varied, with the projector refocused so that the image on the plane R remained sharp, in order to determine the horizontal and vertical projection area w × h; from this, the relationship between the distance L and the vertical projection dimension h is calculated. The relationship between L and the viewing area w × h was measured at an offset ratio of 100%. The data were processed and the graphs drawn in MS Excel 2010.
According to Figure 7, the relationships of w and h to L are nearly linear, with correlation coefficients R² = 0.9986 and R² = 0.9979, respectively. It is therefore possible to determine the half projection angle in each direction.
The half projection angles in the horizontal and vertical directions can be calculated according to the following formulas:

$$\frac{\beta}{2} = \arctan\frac{w}{2L} \approx 13.8^\circ$$

$$\frac{\gamma}{2} = \arctan\frac{h}{2L} \approx 10.3^\circ$$
Based on the angle γ, it is possible to determine the distance L at which the vertical projection dimension is h = 300 mm:

$$L = \frac{h}{2\tan(\gamma/2)} = \frac{300}{2\tan 10.3^\circ} \approx 820 \text{ (mm)}$$
Fig. 5. Experimental system.
Fig. 6. Relationship between illuminance and gray level
Fig. 7. Relationship between the distance L and the projection dimensions w and h
Journal of Science & Technology 122 (2017) 022-027
26
Given L = 820 mm, the horizontal projection dimension can be calculated as:

$$w = 2L\tan\frac{\beta}{2} = 2 \times 820 \times \tan 13.8^\circ \approx 400 \text{ (mm)}$$
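The following short script (ours) re-checks these geometric values; small deviations from the rounded 820 mm and 400 mm are expected:

```python
# Numeric check of the geometry above (our own script; the half-angles are
# the values read from the fitted graphs, so the results land close to the
# paper's rounded 820 mm and 400 mm).
import math

half_beta = 13.8    # horizontal half projection angle, degrees
half_gamma = 10.3   # vertical half projection angle, degrees
L = 300.0 / (2 * math.tan(math.radians(half_gamma)))  # distance for h = 300 mm
w = 2 * L * math.tan(math.radians(half_beta))         # width at that distance
print(f"L ~ {L:.0f} mm, w ~ {w:.0f} mm")              # ~826 mm and ~405 mm
```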
For the experimental system using a camera of resolution 1280 × 960 with an actual pixel size of 4.65 μm, the lens can be calculated and selected so that the recording area of the camera covers the projection area of the projector on the reference plane. The actual sizes of the camera CCD in the vertical and horizontal directions are:
C_i = 960 × 4.65 × 10⁻³ = 4.464 mm
C_j = 1280 × 4.65 × 10⁻³ = 5.952 mm
From Figure 4 and Eq. (6), the focal length of the camera lens needed to image an area of height h = 300 mm at L = 820 mm is:

$$f = \frac{L\,C_i}{C_i + h} = \frac{820 \times 4.464}{4.464 + 300} \approx 12.01 \text{ (mm)}$$
The closest standard focal length, f = 12 mm, is used to optimize the design.
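A quick numeric check of the sensor dimensions and the focal-length choice (our own script, using Eq. (6)):

```python
# Our own numeric check of the sensor size and focal-length selection above.
pixel = 4.65e-3                 # pixel size, mm
c_i = 960 * pixel               # vertical CCD size: ~4.464 mm
c_j = 1280 * pixel              # horizontal CCD size: ~5.952 mm
L, h = 820.0, 300.0
f = L * c_i / (c_i + h)         # from Eq. (6): h / c_i = (L - f) / f
print(f"C_i = {c_i:.3f} mm, C_j = {c_j:.3f} mm, f = {f:.2f} mm")  # f ~ 12.02 mm
```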
With the distance L and the camera focal length f identified above, the distance b was varied experimentally to determine the relationship between b and the resolution of the equipment. The distance between the camera and the projector was changed from as close as possible to as far as possible, and at each setting a standard step-height specimen was measured.
With b varying from 80 to 200 mm, measurements were taken and point clouds acquired. From the 3D point cloud data, the distances between points in the x, y, and z directions determine the resolution. The average resolution in each direction was obtained from distance measurements repeated 25 times. The 3D point clouds were processed and the resolution measured in the Geomagic 2012 software.
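Since the paper measured the resolution in Geomagic, the sketch below is only a stand-in (our own, with a synthetic grid cloud) showing how a per-axis resolution can be estimated as the mean spacing between consecutive distinct coordinates:

```python
# Hedged sketch (the paper used Geomagic 2012): estimate per-axis resolution
# as the mean gap between consecutive distinct coordinate values in the cloud.
import numpy as np

def axis_resolution(points: np.ndarray, axis: int, tol: float = 1e-6) -> float:
    """Mean gap between consecutive unique coordinates along one axis."""
    vals = np.unique(np.round(points[:, axis] / tol) * tol)
    return float(np.diff(vals).mean())

# Synthetic cloud on a 0.5 x 0.5 x 1.6 mm grid
g = np.mgrid[0:10, 0:10, 0:5].reshape(3, -1).T * np.array([0.5, 0.5, 1.6])
print([round(axis_resolution(g, k), 2) for k in range(3)])   # [0.5, 0.5, 1.6]
```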
The relationship between the average resolution (x, y, z) and the distance b is shown in Figure 9. The average resolution along the x and y axes changes little as b varies; thus the resolution in the x and y directions does not depend on the distance between the camera and the projector.
However, the average resolution along the z axis varies considerably with the camera–projector distance. The graph shows that as the parameter b increases from 80 mm to 160 mm, the z resolution improves from 1.12 mm to 0.67 mm. For b between 160 and 170 mm, the system produces errors and the z resolution suddenly degrades. The reason is that the angle between the projector and the reference plane becomes too large, widening the stripes beyond the allowable value, so the system cannot compensate for the error and recover the true coordinates. At b = 160 mm, the z resolution reaches its minimum of 0.67 mm; b = 160 mm is thus the experimentally determined optimum. The graph also shows that the standard deviation of the resolution along each axis is quite small.
From the calculation and characterization of the binary-code structured light measurement system, the paper proposes the following procedure for optimizing the system parameters to obtain the best z-resolution:
Fig. 8. 3D point clouds of the step-height object
Fig. 9. Relationship between the average resolution along each axis and the distance b
1. Investigate the projector to determine the measuring area and the distance from the projector to the calibration screen that suit the requirements.
2. For a symmetric projector, determine the half projection angles, from which the required projection width fixes the distance L.
3. From the size of the projection area, select the appropriate focal length of the camera lens so that the whole projection area is captured.
4. Determine the optimum distance between the camera and the projector to achieve the best resolution and a compact design.
4. Conclusion
The calculation and determination of the system parameters for reconstructing a 3D surface with good z resolution have been presented. From the experimental results, the optimal parameters L = 820 mm, f = 12 mm, and b = 160 mm were determined for the maximum measuring range of 400 mm × 300 mm. The device can reconstruct the 3D surface of an object, as shown in Figure 10, with an optimal z resolution of 0.67 mm.
In this paper, the dependence of the height resolution on the parameters involved in the measurement has been studied. It has been shown how the resolution requirement determines the geometric relationship between the camera and the projector. The experimental results confirm that, by calculating and selecting the system parameters in this way, the best z resolution can be obtained, consistent with the theory.
Acknowledgment
This research is funded by the Hanoi University
of Science and Technology (HUST) under project
number T2016-PC-066.
References
[1] J. Salvi, J. Pagès, and J. Batlle, "Pattern codification strategies in structured light systems," Pattern Recognit., vol. 37, pp. 827–849, 2004.
[2] Q. Zhang, X. Su, L. Xiang, and X. Sun, “3-D shape
measurement based on complementary Gray-code
light,” Opt. Lasers Eng., vol. 50, no. 4, pp. 574–579,
2012.
[3] G. Sansoni, S. Corini, S. Lazzari, R. Rodella, and F.
Docchio, “Three-dimensional imaging based on
Gray-code light projection: characterization of the
measuring algorithm and development of a measuring
system for industrial applications.,” Appl. Opt., vol.
36, no. 19, pp. 463–472, 1997.
[4] T. Peng and S. K. Gupta, “Algorithms and models for
3D shape measurement using digital fringe
projection,” Mech. Eng., p. 249, 2006.
[5] H. B. Wu, Y. Chen, M. Y. Wu, C. R. Guan, and X. Y.
Yu, “3D Measurement Technology by Structured
Light Using Stripe-Edge-Based Gray Code,” J. Phys.
Conf. Ser., vol. 48, pp. 537–541, 2006.
[6] J. Batlle, E. Mouaddib, and J. Salvi, "Recent progress in coded structured light as a technique to solve the correspondence problem: a survey," Pattern Recognit., vol. 31, no. 7, pp. 963–982, 1998.
[7] J. L. Posdamer and M. D. Altschuler, “Surface
measurement by space-encoded projected beam
systems,” Comput. Graph. Image Process., vol. 18,
no. 1, pp. 1–17, 1982.
[8] P. Gupta, "Gray Code Composite Pattern Structured Light Illumination," Master's thesis, University of Kentucky, 2007.
[9] S. Rusinkiewicz, O. Hall-Holt, and M. Levoy, "Real-time 3D model acquisition," in Proc. SIGGRAPH '02, ACM Trans. Graph., vol. 21, no. 3, pp. 438–446, 2002.
[10] S. Zhang, High-Speed 3D Imaging with Digital Fringe Projection Techniques, Optical Sciences and Applications of Light, vol. 1. CRC Press, 2016.
Fig. 10. 3D map reconstruction results: a) image of the measured object; b) point cloud reconstruction; c) point cloud registration