1. A medical system comprising:
an image data acquisition unit that acquires image data corresponding to a three-dimensional (3D) image of a living body including an object;
a user input unit that receives user input information; and
a processor that generates 3D data by using the image data, generates the 3D image by using the 3D data, detects 3D geometry information corresponding to a 3D caliper from the 3D data based on the user input information, sets the 3D caliper on the 3D image based on the detected 3D geometry information, and creates measurement information.
2. The medical system of claim 1, wherein the user input information comprises first input information needed for setting a region of interest (ROI) corresponding to the 3D caliper on the 3D image and second input information needed for selecting the object.
3. The medical system of claim 2, wherein the processor sets a reference value corresponding to the object according to a reference value preset for the object, based on the second input information.
4. The medical system of claim 2, wherein the user input information further comprises third input information needed for setting a reference value that will be used to determine a position of the 3D caliper on the 3D image in a depth direction, and
wherein the processor sets a reference value corresponding to the object based on the second input information and the third input information.
5. The medical system of claim 3, wherein the processor sets an observation plane consisting of a plurality of pixels based on the 3D data, detects a pixel corresponding to the ROI from the observation plane based on the first input information, projects a virtual ray from the detected pixel onto the 3D data, samples the virtual ray at preset sampling intervals and acquires a sampling point and a sampling value corresponding to the sampling point, detects a voxel corresponding to the reference value from the 3D data based on the sampling value, and sets 3D geometry information of the detected voxel as 3D geometry information of the ROI.
6. The medical system of claim 5, wherein the processor cumulatively adds the sampling value along the propagation direction of the virtual ray and detects as the voxel corresponding to the reference value a first voxel where a cumulative sum of the sampling value is greater than or equal to the reference value.
7. The medical system of claim 5, wherein the processor compares the sampling value with a preset threshold value to detect a sampling value that is greater than or equal to the preset threshold value, cumulatively adds the detected sampling value along the propagation direction of the virtual ray, and detects as the voxel corresponding to the reference value a first voxel where a cumulative sum of the sampling value is greater than or equal to the reference value.
8. The medical system of claim 5, wherein the processor compares the sampling value with the reference value to detect a first sampling value that is greater than or equal to the reference value and detects a voxel corresponding to the detected first sampling value as the voxel corresponding to the reference value.
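Claims 6 through 8 recite three variants of detecting the voxel that corresponds to the reference value from samples taken along the virtual ray: by plain cumulative sum, by thresholded cumulative sum, and by the first sample crossing the reference value. A minimal sketch of that logic follows; all names are illustrative assumptions, and a 1-D list stands in for the values sampled along the ray.

```python
# Sketch of the voxel-detection variants in claims 6-8. `samples` stands
# for the values obtained by sampling a virtual ray at preset intervals
# through the 3D data; the returned index identifies the voxel along the ray.

def detect_by_cumulative_sum(samples, reference):
    """Claim 6: first sample where the cumulative sum reaches the reference."""
    total = 0.0
    for i, value in enumerate(samples):
        total += value
        if total >= reference:
            return i
    return None  # reference never reached along this ray

def detect_by_thresholded_sum(samples, reference, threshold):
    """Claim 7: only samples >= threshold contribute to the cumulative sum."""
    total = 0.0
    for i, value in enumerate(samples):
        if value >= threshold:
            total += value
            if total >= reference:
                return i
    return None

def detect_by_first_crossing(samples, reference):
    """Claim 8: first sample whose own value reaches the reference."""
    for i, value in enumerate(samples):
        if value >= reference:
            return i
    return None
```

For example, `detect_by_cumulative_sum([10, 20, 30, 40], 50)` returns index 2, since the running sum first reaches 50 at the third sample (10 + 20 + 30 = 60).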
9. The medical system of claim 5, wherein the processor sets a virtual plane on the 3D image based on the 3D geometry information, sets a two-dimensional (2D) caliper having depth information in a depth direction of the 3D image on the virtual plane as the 3D caliper, based on the 3D geometry information, measures a length of the object by using the 2D caliper to create length information, and creates measurement information including the length information and the depth information.
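Claim 9 describes the measurement step: a 2D caliper placed on the virtual plane carries depth information, the length between its endpoints yields length information, and both are combined into measurement information. A brief sketch under those assumptions (the field names are illustrative, not the patent's own notation):

```python
import math

# Sketch of the measurement step in claim 9: the length of the object is
# measured between the two endpoints of a 2D caliper set on the virtual
# plane, and combined with the caliper's depth in the 3D image.

def caliper_measurement(p1, p2, depth):
    """Return measurement information combining length and depth."""
    length = math.dist(p1, p2)  # Euclidean distance between caliper endpoints
    return {"length": length, "depth": depth}
```

For example, endpoints (0, 0) and (3, 4) on the virtual plane give a length of 5.0, paired with whatever depth the caliper carries.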
10. The medical system of claim 5, wherein the processor projects at least two virtual rays having different angles from the detected pixel.
11. The medical system of claim 5, wherein the processor also detects the 3D geometry information by using perspective ray casting.
12. The medical system of claim 1, wherein the user input information comprises first input information needed for setting a virtual plane on the 3D image, second input information needed for changing a 3D position of the virtual plane, and third input information needed for setting a region of interest (ROI) corresponding to the 3D caliper on the 3D image.
13. The medical system of claim 12, wherein the processor sets a virtual plane based on the first input information, changes the 3D position of the virtual plane based on the second input information to place the virtual plane in the 3D data, sets the ROI on the 3D image based on the third input information, and detects 3D geometry information of the ROI from the 3D image in which the virtual plane has been set, based on the 3D data.
14. The medical system of claim 13, wherein the processor sets a virtual observation plane consisting of a plurality of pixels based on the 3D data, detects a pixel corresponding to the ROI from the virtual observation plane, projects a virtual ray from the detected pixel onto the 3D data to detect a voxel at which the virtual ray and the virtual plane meet each other, and sets 3D geometry information of the detected voxel as 3D geometry information of the ROI.
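Claim 14 locates the ROI voxel by intersecting a virtual ray, projected from a pixel of the observation plane, with the user-positioned virtual plane. A minimal geometric sketch of that intersection; the point-plus-normal plane representation and all names are illustrative assumptions:

```python
import numpy as np

# Sketch of the ray/virtual-plane intersection in claim 14: a virtual ray
# projected from a detected pixel meets the virtual plane, and the voxel at
# the intersection supplies the 3D geometry information of the ROI.

def ray_plane_voxel(origin, direction, plane_point, plane_normal):
    """Return the 3D point where the ray meets the plane, or None if it misses."""
    origin = np.asarray(origin, dtype=float)
    direction = np.asarray(direction, dtype=float)
    plane_normal = np.asarray(plane_normal, dtype=float)
    denom = np.dot(plane_normal, direction)
    if abs(denom) < 1e-9:           # ray parallel to the plane
        return None
    t = np.dot(np.asarray(plane_point) - origin, plane_normal) / denom
    if t < 0:                        # plane lies behind the observation plane
        return None
    return origin + t * direction    # round to voxel indices as needed
```

For instance, a ray from the origin along +z meets the plane z = 5 at (0, 0, 5), whose voxel would carry the ROI's 3D geometry information.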
15. The medical system of claim 14, wherein the processor sets a 2D caliper having depth information in a depth direction of the 3D image on the virtual plane as the 3D caliper, based on the 3D geometry information, measures a length of the object by using the 2D caliper to create length information, and creates measurement information including the length information and the depth information.
16. The medical system of claim 12, wherein the processor creates a cross-sectional image corresponding to the virtual plane by using the 3D data, sets the ROI on the cross-sectional image based on the third input information, and performs measurement based on the ROI to create measurement information.
17. A method of providing measurement information, the method comprising:
acquiring image data corresponding to a three-dimensional (3D) image of a living body including an object;
creating 3D data by using the image data;
creating the 3D image by using the 3D data;
receiving user input information;
detecting 3D geometry information corresponding to a 3D caliper from the 3D data based on the user input information; and
setting the 3D caliper on the 3D image based on the detected 3D geometry information to create measurement information.
18. The method of claim 17, wherein the user input information comprises first input information needed for setting a region of interest (ROI) corresponding to the 3D caliper on the 3D image and second input information needed for selecting the object.
19. The method of claim 18, wherein in the detecting of 3D geometry information corresponding to a 3D caliper from the 3D data, a reference value corresponding to the object is set according to a reference value preset for the object, based on the second input information.
20. The method of claim 18, wherein the user input information further comprises third input information needed for setting a reference value that will be used to determine a position of the 3D caliper on the 3D image in a depth direction, and
wherein in the detecting of 3D geometry information of a 3D caliper from the 3D data, a reference value corresponding to the object is set based on the second input information and the third input information.
21. The method of claim 19, wherein the detecting of 3D geometry information of a 3D caliper from the 3D data comprises:
setting an observation plane consisting of a plurality of pixels based on the 3D data;
detecting a pixel corresponding to the ROI on the observation plane based on the first input information;
projecting a virtual ray from the detected pixel onto the 3D data;
sampling the virtual ray at preset sampling intervals and acquiring a sampling point and a sampling value corresponding to the sampling point;
detecting a voxel corresponding to the reference value from the 3D data based on the sampling value; and
setting 3D geometry information of the detected voxel as 3D geometry information of the ROI.
22. The method of claim 21, wherein the detecting of a voxel corresponding to the reference value comprises cumulatively adding the sampling value along the propagation direction of the virtual ray and detecting as the voxel corresponding to the reference value a first voxel where a cumulative sum of the sampling value is greater than or equal to the reference value.
23. The method of claim 21, wherein the detecting of a voxel corresponding to the reference value comprises:
comparing the sampling value with a preset threshold value to detect a sampling value that is greater than or equal to the preset threshold value;
cumulatively adding the detected sampling value along the propagation direction of the virtual ray; and
detecting as the voxel corresponding to the reference value a first voxel where a cumulative sum of the sampling value is greater than or equal to the reference value.
24. The method of claim 21, wherein the detecting of a voxel corresponding to the reference value comprises comparing the sampling value with the reference value to detect a first sampling value that is greater than or equal to the reference value and detecting a voxel corresponding to the detected first sampling value as the voxel corresponding to the reference value.
25. The method of claim 21, wherein the setting of the 3D caliper on the 3D image based on the detected 3D geometry information to create measurement information comprises:
setting a virtual plane on the 3D image based on the 3D geometry information;
setting a two-dimensional (2D) caliper having depth information in a depth direction of the 3D image on the virtual plane as the 3D caliper, based on the 3D geometry information;
measuring a length of the object by using the 2D caliper to create length information; and
creating measurement information including the length information and the depth information.
26. The method of claim 21, wherein the projecting of a virtual ray from the detected pixel onto the 3D data further comprises projecting at least two virtual rays having different angles from the detected pixel.
27. The method of claim 21, wherein the detecting of 3D geometry information of a 3D caliper from the 3D data further comprises detecting the 3D geometry information by using perspective ray casting.
28. The method of claim 17, wherein the user input information comprises first input information needed for setting a virtual plane on the 3D image, second input information needed for changing a 3D position of the virtual plane, and third input information needed for setting a region of interest (ROI) corresponding to the 3D caliper on the 3D image.
29. The method of claim 28, wherein the detecting of 3D geometry information corresponding to a 3D caliper from the 3D data comprises:
setting a virtual plane based on the first input information;
changing the 3D position of the virtual plane based on the second input information to place the virtual plane in the 3D data;
setting the ROI on the 3D image based on the third input information; and
detecting 3D geometry information of the ROI from the 3D image in which the virtual plane has been set, based on the 3D data.
30. The method of claim 29, wherein the detecting of 3D geometry information of the ROI from the 3D image comprises:
setting a virtual observation plane consisting of a plurality of pixels based on the 3D data;
detecting a pixel corresponding to the ROI from the virtual observation plane;
projecting a virtual ray from the detected pixel onto the 3D data to detect a voxel at which the virtual ray and the virtual plane meet each other; and
setting 3D geometry information of the detected voxel as 3D geometry information of the ROI.
31. The method of claim 30, wherein the setting of the 3D caliper on the 3D image based on the detected 3D geometry information to create measurement information comprises:
setting a 2D caliper having depth information in a depth direction of the 3D image on the virtual plane as the 3D caliper, based on the 3D geometry information;
measuring a length of the object by using the 2D caliper to create length information; and
creating measurement information including the length information and the depth information.
32. The method of claim 28, further comprising:
creating a cross-sectional image corresponding to the virtual plane by using the 3D data;
setting the ROI on the cross-sectional image based on the third input information; and
performing measurement based on the ROI to create measurement information.

The claims below are in addition to those above.
All references to claims which appear below refer to the numbering after this sentence.

1. A cushioned wristband with printable laminated label, said wristband comprising a layer of cushion material for contacting the wearer’s wrist, a strap extending to one side of said cushion material, a laminated label for attachment to the wristband, said laminated label having at least one slot through which said strap may be inserted, an attachment for securing the strap so that it remains inserted through the at least one slot, and a fixative for closely positioning said strap to said label.
2. The wristband of claim 1 wherein the fixative comprises a patch of adhesive applied to the strap.
3. The wristband of claim 2 wherein the attachment comprises a layer of one of either hook or loop material backing covering at least a portion of the cushion material, and wherein the strap has a surface of the other of the hook or loop material.
4. The wristband of claim 3 wherein the label has a pair of slots, one on either side thereof, and wherein the strap is sufficiently long to be inserted through each of said slots and beyond to be adhered by said adhesive patch to the label as the wristband is applied to a user.
5. The wristband of claim 4 wherein the label comprises a printable face stock portion and a laminating portion, said laminating portion being approximately twice the size of said printable face stock portion so that the laminating portion may be folded over to substantially self-laminate the face stock portion.
6. The wristband of claim 5 wherein the backing layer is loop material and the strap is hook material.
7. The wristband of claim 6 wherein the label is separable from a two-layer business form, the form being arranged for printing of the face stock portion by a printer under computer control prior to separation from the form.
8. The wristband of claim 7 wherein the strap is attached to the loop material and arranged so that after insertion through at least one of the label slots the strap may be wrapped around the user’s wrist and past the opposite edge of the wristband for attachment to the backing so that the user’s wrist is completely encircled by the cushion material and the label is located outside the wristband.
9. A cushioned wristband including an information bearing label, the wristband having a carrier comprised of an inside layer of cushion material and an outside layer of either hook or loop material, and a strap affixed near an edge of the carrier, the strap having a layer of the other of either the hook or loop material applied to a first side thereof and a fixative applied on the other side of the strap, the strap being inserted through at least one slot formed in the label to thereby position the label to contact the fixative and thereby become affixed thereto and the wristband then wrap around the user’s wrist for attachment of the strap to the outside layer as the wristband is applied to a user’s wrist.
10. The wristband of claim 9, said label comprising a self-laminating label formed from two plies of material, said two plies comprising a face ply area for receiving a printed image and a laminating ply for over-laminating the face ply area.
11. The wristband of claim 10 wherein the laminating ply includes two slots, the two slots being arranged on opposing sides of the face ply area so that the fixative contacts and affixes to the non-imaged side of the face ply area.
12. The cushioned wristband of claim 11 wherein said outside layer is loop material and the strap is hook material, and wherein the carrier is elongated and the strap is affixed to said carrier near an end thereof.
13. The cushioned wristband of claim 12 wherein said fixative comprises a patch of adhesive extending along the strap and for substantially the entirety of the length of the face ply area of said label.
14. A cushioned wristband comprising a wristband carrier, said wristband carrier having a cushion material surface and a loop material surface, a hook strap affixed to the carrier and extending to one side thereof, and a laminated label having a pair of slots arranged along opposite edges thereof, said laminated label having a fixative extending along at least a portion of its length, and said slots being adapted to receive the hook strap therethrough and to bring the label in close proximity to said fixative.
15. The cushioned wristband of claim 14 wherein said fixative comprises a patch of hook material so as to become affixed to the loop material surface as the strap is wrapped about the wristband carrier.
16. The cushioned wristband of claim 14 wherein said laminated label has a face ply area for receiving a printed image and a laminating ply for laminating the face ply area, the slots being located on opposite sides of said face ply, said slots being formed in said laminating ply.
17. The cushioned wristband of claim 16 wherein said label is formed in a business form, the business form having at least two plies, and wherein each of said face ply area and said laminating ply are formed by die cuts in said two plies.
18. In a cushioned wristband label assembly, the assembly including a cushioned carrier, a strap extending therefrom and an information bearing label, the strap having a length for extending through a pair of slots arranged on opposing sides of the label to bring the label closely adjacent the strap, the improvement comprising a fixative interposed between said label and said strap to adhere and securely position the location of the label with respect to the strap.
19. The cushioned wristband label assembly of claim 18 wherein the fixative is interposed between the label and either one or both of said strap and cushioned carrier.
20. The cushioned wristband label assembly of claim 19 wherein a layer of one of a hook or loop material is applied to the cushioned carrier and the other material is applied to the strap and wherein the fixative comprises a patch of the other material applied to the label so that as the strap is wrapped about the cushioned carrier to attach the wristband assembly to a person's appendage, the label and strap affix to the cushioned carrier.
21. The cushioned wristband label assembly of claim 18 wherein the fixative comprises a patch of adhesive applied to the label.
22. The cushioned wristband label assembly of claim 18 wherein the fixative comprises a patch of adhesive applied to the strap.