United States Patent: Higgins et al.
US 7,889,905 B2
Feb. 15, 2011

(65) Prior Publication Data: US 2007/0013710 A1, Jan. 18, 2007

(51) Int. Cl.
    G06K 9/00 (2006.01)
    G06K 9/32 (2006.01)
    A61B 5/05 (2006.01)
(52) U.S. Cl. .......... 600/424; 382/130

(56) References Cited

U.S. PATENT DOCUMENTS
    4,791,934 A    12/1988    Brunnett
    (Continued)

FOREIGN PATENT DOCUMENTS
    KR    20020041577    6/2002

OTHER PUBLICATIONS
    Deligianni, F., A. Chung, and G. Yang. Patient-specific ..., vol. 9, no. 5, pp. 215-226 (2004).
    (Continued)

(57) ABSTRACT

... feed. Given one frame from this video feed, and starting from an initial guess of viewpoint, the video frame is warped to the nearest viewing site of the reference source. An image difference is computed between the warped video frame and the reference image. The viewpoint is updated via ...
[Front-page figure: block diagram of the registration method, showing reference images, interior airway tree surfaces, reference depths, and reference gradients together with the incoming video frame, the warped image, and the parameter update.]
U.S. PATENT DOCUMENTS (Continued)

[Two columns of cited U.S. patent and application documents, 1998-2008, naming among others Nafis et al., Raab, Chen et al., Vining, Evans et al., Tenhoff, Bucholz, Navab, Kaufman et al., Grimson et al., Murphy et al., Aiger et al., Cosman, Acker, Williams, O'Donnell et al., Pratt, Lee et al., Fenster et al., Schneider, Shahidi et al., Simon et al., Gilboa et al., Weisman et al., Kerrien et al., Krishnan, Van Muiswinkel et al., Johnson et al., Avila et al., Avinash et al., Kleen et al., Geiger, Reeves et al., and Guendel.]

FOREIGN PATENT DOCUMENTS

WO    WO-2006076789    7/2006

OTHER PUBLICATIONS

Roberson, R.E. and P.W. Likins. A Linearization Tool for Use with ...
Helferty, J.P., A.J. Sherbondy, A.P. Kiraly, and W.E. Higgins. Com...
... bronchoscopy and ..., 221, No. 2, ...
Maurer, C.R., J.M. Fitzpatrick, M.Y. Wang, R.L. Galloway, Jr., R.J. Maciunas, and ...
... been offered for sale, publicly used, and/or published prior to the filing date of this application).
Shoji, H., K. Mori, J. Sugiyama, Y. Suenaga, J. Toriwaki, H. Takabatake, and H. Natori. Camera motion tracking of real ..., pp. 674-679.
Sherbondy, A.J., A.P. Kiraly, A.L. Austin, J.P. Helferty, S. Wan, J.Z. Turlington, T. Yang, C. Zhang, E.A. Hoffman, and G. McLennan. Virtual bronchoscopic approach for combining 3D CT and endoscopic video. Proceedings of SPIE 2000, vol. 3978, No. 104.
Helferty, J.P., A.J. Sherbondy, A.P. Kiraly, J.Z. Turlington, E.A. Hoffman, G. McLennan, and W.E. Higgins. Experiments in virtual ..., pp. 961-964.
... 1989.
... IEEE Transactions on Medical Imaging 2004, vol. 23, No. 11, pp. 1365-1379.
... for Pulmonary Peripheral Lesions. Chest 2006, vol. 130, No. 2, pp. 559-566.
... 25(5):578-589, 2003.
Mori, et al., A method for tracking the camera motion of real ...
Higgins, et al., Integrated bronchoscopic video tracking and 3D CT registration for virtual bronchoscopy, Medical Imaging 2003: Physiology and Function: Methods, Systems, and Applications.
* cited by examiner
[Sheet 1 of 2, FIG. 1: block diagram of the registration method, showing reference images, interior airway tree surfaces, reference depths, and reference gradients feeding the incoming video frame, the warped image, and the parameter update.]
[Sheet 2 of 2, FIGS. 2A-2F and FIGS. 3A-3C: drawing panels; FIGS. 2A-2F show source images and results for virtual-to-real registration.]
STATEMENT OF GOVERNMENT SPONSORSHIP

... problem.

... [7].

... eters of the system, and, in the case of the homography, can ...
... may both be real, virtual, or one real with the other virtual. The set of reference images may be endoscopic, derived from a ...

... able.

The invention makes it possible to completely determine the rigid transformation between multiple sources at real-time or near real-time frame-rates in order to register the two sources.
Given the above setup, the goal is to locate the source of the feed. Given one frame from this video feed, and starting from an initial guess of viewpoint, the real-time video frame is warped to the nearest viewing site of the reference source.
... update.

... algorithm [5].

5. Steps 2-4 are repeated for each frame until the viewpoint converges or the next video frame becomes available.
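For illustration only, the procedure above can be phrased as a small driver routine. The sketch below is a minimal reading of the loop, assuming that the site-selection, warping, and viewpoint-update operations are supplied as callables; none of the names are taken from the patent.

```python
import numpy as np
from typing import Callable, Sequence

def register_frame(frame: np.ndarray,
                   sites: Sequence[object],
                   p0: np.ndarray,
                   nearest_site: Callable,
                   warp_to_site: Callable,
                   update_viewpoint: Callable,
                   max_iters: int = 50,
                   tol: float = 1e-4) -> np.ndarray:
    """Iterate steps 2-4 for a single video frame until the viewpoint converges.

    frame            -- current real video frame
    sites            -- precomputed reference viewing sites (images, depths, gradients)
    p0               -- initial guess of the six viewpoint parameters
    nearest_site     -- callable choosing the reference site closest to the viewpoint
    warp_to_site     -- callable warping the frame toward that site at viewpoint p
    update_viewpoint -- callable returning a parameter update from the image difference
    """
    p = np.asarray(p0, dtype=float)
    for _ in range(max_iters):
        site = nearest_site(sites, p)                 # pick the nearest viewing site
        warped = warp_to_site(frame, site, p)         # warp the frame toward that site
        delta_p = update_viewpoint(warped, site, p)   # update from the image difference
        p = p + delta_p
        if np.linalg.norm(delta_p) < tol:             # converged; wait for the next frame
            break
    return p
```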
The objective function used in [5, 6] is the sum squared difference (SSD) between the pixel intensities of the two images:

E = Σ_{u,v} [I_v(u, v; p + Δp) − I_r(u, v)]²    (1)
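Evaluating (1) is a one-liner in array form; the snippet below is only an illustration, with I_v_warped standing for the rendered image at the trial viewpoint and I_r for the real frame (names chosen for this example, not taken from the patent).

```python
import numpy as np

def ssd(I_v_warped: np.ndarray, I_r: np.ndarray) -> float:
    """Sum of squared pixel differences, as in objective (1)."""
    diff = I_v_warped.astype(float) - I_r.astype(float)
    return float(np.sum(diff * diff))

# toy usage with two random 64x64 images
rng = np.random.default_rng(0)
print(ssd(rng.random((64, 64)), rng.random((64, 64))))
```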
... guided bronchoscopy; FIGS. 2A-2F show source images and results for virtual-to-real registration; specifically, FIG. 2A shows a real video ...
... and v are the row and column indices, and I_r is the real RB ...
... depth map of the real image Z_r. Solving for the Gauss-Newton parameter update associated with (4) yields

Δp = H⁻¹ Σ_{u,v} [∇I_v ∂W/∂p]ᵀ [I_r(u, v) − I_v(u, v; p)]    (2)

H = Σ_{u,v} [∇I_v ∂W/∂p]ᵀ [∇I_v ∂W/∂p]    (3)

where ... in vector p.
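A NumPy sketch of a single Gauss-Newton step in the spirit of (2) and (3): the steepest-descent images are formed from the image gradients and the warp Jacobian, the 6x6 Hessian approximation is accumulated from them, and the update is obtained by solving a small linear system. The array names and shapes are assumptions made for the example, not the patent's notation.

```python
import numpy as np

def gauss_newton_step(I_v, I_r, grad_u, grad_v, J_p):
    """One Gauss-Newton viewpoint update.

    I_v, I_r       -- (H, W) virtual and real images at the current viewpoint p
    grad_u, grad_v -- (H, W) image gradients along rows and columns
    J_p            -- (H, W, 2, 6) Jacobian of the warped coordinates with respect
                      to the six warp parameters (three Euler angles, three translations)
    """
    H, W = I_v.shape
    grads = np.stack([grad_u, grad_v], axis=-1).reshape(H * W, 1, 2)
    J = J_p.reshape(H * W, 2, 6)
    sd = np.matmul(grads, J).reshape(H * W, 6)   # steepest-descent images
    hess = sd.T @ sd                             # 6x6 Gauss-Newton Hessian approximation
    err = (I_r - I_v).reshape(H * W)             # pixel-wise image difference
    return np.linalg.solve(hess, sd.T @ err)     # the update delta_p of (2)
```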
... descent images ...

... costly algorithm.

... ing the virtual viewpoint towards the real viewpoint using the ...

... function, all the equations presented thus far are general and ...

... important also to note that the warp in this case is dependent on the ...

... is:

... site.
with three Euler rotation angles and three translations with respect to the nearest reference view.

...

where ∇_u I and ∇_v I are the image gradients with respect to the rows and columns of the image, and J_p is the Jacobian of the warped coordinates with respect to p and thus can be found by differentiating u' and v' from (7) with respect to each of the warp parameters and evaluating it at a particular current value of p. In the case of the inverse compositional algorithm, the image derivatives are always evaluated at p = 0 and thus the Jacobian is constant for each reference viewing site:

...
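Because the derivatives are evaluated at p = 0, the steepest-descent images and the Hessian can be computed once per reference viewing site and reused for every incoming video frame. The sketch below assumes a hypothetical ReferenceSite container holding the precomputed gradients and Jacobian; the structure and field names are illustrative, not from the patent.

```python
import numpy as np
from dataclasses import dataclass

@dataclass
class ReferenceSite:
    image: np.ndarray      # (H, W) reference image at this viewing site
    grad_u: np.ndarray     # (H, W) gradient along image rows
    grad_v: np.ndarray     # (H, W) gradient along image columns
    jacobian: np.ndarray   # (H, W, 2, 6) warp Jacobian evaluated at p = 0

def precompute_site(site: ReferenceSite):
    """Cache the steepest-descent images and inverse Hessian for one site."""
    H, W = site.image.shape
    grads = np.stack([site.grad_u, site.grad_v], axis=-1).reshape(H * W, 1, 2)
    J = site.jacobian.reshape(H * W, 2, 6)
    sd = np.matmul(grads, J).reshape(H * W, 6)
    hess_inv = np.linalg.inv(sd.T @ sd)
    return sd, hess_inv

# With sd and hess_inv cached, each frame's update reduces to
# hess_inv @ (sd.T @ error), where error is the flattened image difference.
```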
where R is the rotation matrix defined by the Euler angles (θ_α, θ_β, θ_γ), u and v are the columns and rows of the image, f is the focal length, and Z is the entry on the depth map Z corresponding ...

... (9)
... depth map Z_r, and therefore large errors in the image warping. Under such circumstances, the forward gradient descent method governed by (1-2) may be better suited to the problem.

... (11)
In order to apply the warping function, at each pixel coordinate ...

... (10)

... Δp.

(4) requires that the coordinate locations be the same for both images. The resultant array must therefore be interpolated ...

... depths.
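A sketch of the depth-based warping step described here: each pixel is back-projected with its depth, moved through the rigid transform given by the Euler angles and translations, re-projected with the focal length f, and the scattered samples are interpolated back onto the regular pixel grid so the two images can be compared at the same coordinate locations. The pinhole conventions, the Euler-angle order, and the use of scipy.interpolate.griddata are choices made for this example, not details taken from the patent.

```python
import numpy as np
from scipy.interpolate import griddata
from scipy.spatial.transform import Rotation

def warp_with_depth(image, depth, p, f):
    """Warp `image` by p = (theta_a, theta_b, theta_g, tx, ty, tz) using the
    per-pixel depth map and focal length f, then resample onto the original grid."""
    p = np.asarray(p, dtype=float)
    H, W = image.shape
    v, u = np.meshgrid(np.arange(W), np.arange(H))       # column and row indices
    Z = depth
    X = (v - W / 2.0) * Z / f                            # back-project to 3-D
    Y = (u - H / 2.0) * Z / f
    pts = np.stack([X, Y, Z], axis=-1).reshape(-1, 3)
    R = Rotation.from_euler("xyz", p[:3]).as_matrix()    # angles in radians
    pts = pts @ R.T + p[3:]                              # apply the rigid transform
    v_new = f * pts[:, 0] / pts[:, 2] + W / 2.0          # re-project to the image plane
    u_new = f * pts[:, 1] / pts[:, 2] + H / 2.0
    # interpolate the scattered warped samples back onto the regular (u, v) grid
    return griddata(np.stack([u_new, v_new], axis=-1), image.reshape(-1),
                    (u, v), method="linear", fill_value=0.0)
```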
Finally, we turn to the calculation of the steepest-descent images ∂I/∂p ...

... chain rule: ...
in order to emphasize dark areas, which tend to have more ...

... (13)

... before.

... (14)
[Table: viewpoint parameters θ_α, θ_β, θ_γ (deg) for the initial viewpoint, the nearest reference site, the registered result, and the ground truth, together with the registration error.]
...ized images.

Examples

... depth maps Z_v.
Virtual-to-Real Registration

The virtual-to-real registration was performed using pyramid decomposition starting from level 3 and ending at level 1. To account for the difference in intensity characteristics between the imaging sources, the weighted normalized cross-correlation (12) was used as the objective function, with weights w_{u,v} chosen as

... (15)
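As a rough illustration of a weighted normalized cross-correlation objective of the kind referred to in (12), the snippet below weights darker pixels more heavily before normalizing; the simple inverse-intensity weight is only a stand-in for whatever weighting (15) actually specifies.

```python
import numpy as np

def weighted_ncc(I_v, I_r, eps=1e-6):
    """Weighted normalized cross-correlation between a virtual rendering I_v
    and a real frame I_r, with larger weights on darker reference pixels."""
    I_v = I_v.astype(float)
    I_r = I_r.astype(float)
    w = 1.0 / (I_r + eps)          # illustrative weight emphasizing dark areas
    w = w / w.sum()
    mu_v = np.sum(w * I_v)
    mu_r = np.sum(w * I_r)
    cov = np.sum(w * (I_v - mu_v) * (I_r - mu_r))
    var_v = np.sum(w * (I_v - mu_v) ** 2)
    var_r = np.sum(w * (I_r - mu_r) ** 2)
    return cov / np.sqrt(var_v * var_r + eps)
```

In a coarse-to-fine scheme such as the level 3 to level 1 pyramid mentioned above, the same objective would simply be evaluated on downsampled copies of both images at each level.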
REFERENCES

... 1997.

... airway.

10. A method of registering video frames of a body lumen received from an endoscope inserted into the body lumen to ...
... function.
15. The method of claim 10 wherein the optimization technique is based on a Gauss-Newton parameter update.

16. The method of claim 10 wherein the step of receiving at least one video frame from an endoscope comprises receiving a live bronchoscopic video frame.

17. The method of claim 16 wherein the computed set of reference images of the body lumen comprises a 3-D model of a bronchial tree.